ELK
Splunk

We use a proprietary vendor product that migrates data into an HDFS store via RabbitMQ-based collectors and dumps it in raw form. From there I have access to all the usual "big data" tools, although I'm not using Flume just yet; we're still trying to get a handle on operationalizing all the various big data components so that data science developers can focus on development instead of operations and support of the hardware/software ecosystem.
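The "land it raw" step of a pipeline like that can be sketched in a few lines. This is a generic stand-in, not the vendor's code: a local date-partitioned file plays the role of the HDFS sink, the RabbitMQ broker wiring is omitted, and all names are illustrative.

```python
import datetime
import pathlib

def land_raw(body: bytes, root: pathlib.Path, now=None) -> pathlib.Path:
    """Append one collector message, unmodified, to a date-partitioned file.

    Sketch only: in the real pipeline `root` would be an HDFS path and
    `body` would arrive from a RabbitMQ consumer callback.
    """
    now = now or datetime.datetime.now(datetime.timezone.utc)
    partition = root / f"dt={now:%Y-%m-%d}"
    partition.mkdir(parents=True, exist_ok=True)
    target = partition / "events.log"
    with target.open("ab") as sink:
        # Raw form: no parsing or transformation, just newline framing.
        sink.write(body.rstrip(b"\n") + b"\n")
    return target
```

Keeping the landing step dumb like this is what lets the downstream "big data" tools reprocess the originals however they like later.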

Kevin D Dienst


From:        Joe Wulf <joe_wulf@yahoo.com>
To:        "linux-audit@redhat.com" <linux-audit@redhat.com>
Date:        12/14/2015 10:51 AM
Subject:        Re: New draft standards
Sent by:        linux-audit-bounces@redhat.com


Steve,

The last place I was at used Splunk heavily and then transitioned to dual-routing a substantial portion of the logs from across the infrastructure to ELK as well.

-Joe


From:        Steve Grubb <sgrubb@redhat.com>
To:        F Rafi <farhanible@gmail.com>; "linux-audit@redhat.com" <linux-audit@redhat.com>
Sent:        Monday, December 14, 2015 10:34 AM
Subject:        Re: New draft standards


But I guess this gives me an opportunity to ask the community what tools they
are using for audit log collection and viewing? It's been a couple of years since
we had this discussion on the mailing list and I think some things have changed.

Do people use ELK?
Apache Flume?
Something else?

It might be possible to write a plugin to translate the audit logs into the
native format of these tools.
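The core of such a plugin would be parsing auditd's key=value records into structured documents (JSON, in ELK's case). A minimal sketch, assuming whitespace-separated key=value fields as auditd writes them; the sample record is illustrative and the parser is deliberately simplified (it would mishandle quoted values containing spaces, which ausearch handles properly):

```python
import json

def parse_audit_record(line: str) -> dict:
    """Turn one auditd key=value record into a dict.

    Simplified: splits on whitespace, so quoted values containing
    spaces are not handled; a real plugin would need a proper lexer.
    """
    fields = {}
    for token in line.strip().split():
        if "=" not in token:
            continue
        key, _, value = token.partition("=")
        fields[key] = value.strip('"')
    return fields

# A shortened SYSCALL record of the kind auditd writes (illustrative values).
record = ('type=SYSCALL msg=audit(1450108273.123:456): arch=c000003e '
          'syscall=59 success=yes exe="/usr/bin/ls"')
print(json.dumps(parse_audit_record(record)))
```

The resulting JSON could be shipped to Elasticsearch via Logstash or the bulk API; a Splunk or Flume sink would serialize the same dict differently.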



-Steve


--
Linux-audit mailing list
Linux-audit@redhat.com
https://www.redhat.com/mailman/listinfo/linux-audit