On Tuesday, December 15, 2015 09:12:54 AM Burn Alting wrote:
> I use a proprietary ELK-like system based on ausearch's -i option. I
> would like to see some variant outputs from ausearch that package
> events into parse-friendly formats (json, xml) and also incorporate
> the local transformations Steve proposes. I believe this would be the
> most generic solution to support centralised log management.
> I am travelling now, but can write up a specification for review next week.
Yes, please do send something to the mailing list for people to look at and
comment on.
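
To make the idea concrete, below is a rough sketch in Python of the sort of
event packaging being described. It assumes that events in ausearch output are
separated by lines of "----" and that record fields are space-separated
key=value pairs; interpreted values containing spaces (such as the timestamps
-i produces) would need real quoting rules, so treat the parsing as
illustrative only, not as what the eventual specification would mandate.

    import json
    import subprocess

    def ausearch_events(args=("-i",)):
        """Run ausearch and yield one dict per event.

        Assumes events are separated by lines of '----' and that
        record lines carry space-separated key=value fields. Values
        with embedded spaces are not handled, and duplicate keys
        across records keep only the first value seen.
        """
        out = subprocess.run(["ausearch", *args],
                             capture_output=True, text=True,
                             check=True).stdout
        for chunk in out.split("----"):
            event = {}
            for line in chunk.strip().splitlines():
                for field in line.split():
                    if "=" in field:
                        key, _, value = field.partition("=")
                        event.setdefault(key, value)
            if event:
                yield event

    if __name__ == "__main__":
        # Emit one JSON object per event, ready for a log shipper.
        for ev in ausearch_events():
            print(json.dumps(ev))

Having ausearch itself emit this natively would avoid every site reinventing
(and getting subtly wrong) this kind of ad hoc parsing.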
If anyone wants to help influence future direction and does not want to do it
on the list, please contact me off-list and let me know how you aggregate logs.
We have to address central log aggregation and I would like to see what the
majority are using to know where effort would be best spent.
I did run across this page in my survey:
http://buildoop.github.io/
It mentions audit log processing. No idea if anyone is using this either.
-Steve
On 15 Dec 2015 4:13 am, <Kevin.Dienst@usbank.com> wrote:
> ELK
> Splunk
>
> We use a proprietary vendor product that migrates data into an HDFS store
> via RabbitMQ-based collectors and dumps it in raw form. From there I have
> access to all the usual "big data" tools, although I'm not using Flume just
> yet; we're still trying to get a handle on operationalizing the various
> big data components so that data science developers can focus on development
> instead of operations and support of the hardware/software ecosystem.
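
For anyone unfamiliar with the collector pattern described above, here is a
minimal Python sketch of one leg of it: a RabbitMQ consumer (using pika) that
appends raw events to a spool file. The queue name and spool path are made up,
and the real pipeline presumably writes into HDFS rather than a local file;
this only shows the shape of the consumer.

    import pika  # RabbitMQ client; connection details here are illustrative

    SPOOL = "/var/spool/audit-raw.log"  # local stand-in for the HDFS sink

    def on_message(channel, method, properties, body):
        # Store the event untouched, one per line, i.e. "in raw form".
        with open(SPOOL, "ab") as spool:
            spool.write(body.rstrip(b"\n") + b"\n")
        channel.basic_ack(delivery_tag=method.delivery_tag)

    connection = pika.BlockingConnection(
        pika.ConnectionParameters(host="localhost"))
    channel = connection.channel()
    channel.queue_declare(queue="audit-events", durable=True)  # hypothetical
    channel.basic_consume(queue="audit-events", on_message_callback=on_message)
    channel.start_consuming()

Keeping the collector this dumb is the design point: all interpretation and
enrichment happens downstream, where the "big data" tooling lives.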