Splunk Search

Searching for Linux audit events?

Contributor

I am trying to generate some reports for linux audit events.

From what I understand, Linux auditing can generate multiple lines of event log for a single task/action, and related events are identified either by their session ID or PID. Each line of an event can have different types of fields.

I would like to know if anyone has experience setting up Linux auditing and generating reports from it. Would you mind sharing the searches you've used and the types of reports you generate?

Re: searching for linux audit events?

Communicator

I don't think this is too hard. In my example I want to see all similar events that occur within 5 seconds of the first, grouped by the 'auid'.

  1. I first configure Splunk to recognize the 'auid' field.

  2. I then use the 'transaction' command to group by 'auid', including events within 5 seconds of the first:

eventtype=audit | transaction fields=auid maxspan=5s

To do this based on PID or SID, substitute those fields in the same search.
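For instance, grouping by PID instead would look like this (a sketch along the same lines; it assumes a pid field is already extracted from your audit events):

```
eventtype=audit | transaction fields=pid maxspan=5s
```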

HTH

Re: searching for linux audit events?

Contributor

Thanks.

So did you generate your reports based on auid, PID, and SID, thus having three reports?

I'm not sure about grouping similar events by auid, since a single action can generate and trigger several events, and those events may contain more than one auid...

Re: searching for linux audit events?

Communicator

I was just using auid as an example. Grouping based on PID and/or SID probably makes the most sense.

Re: searching for linux audit events?

New Member

Try grouping by 'msg', which contains a unique date/time and audit identifier for each set of log messages. For example:

eventtype=audit | transaction msg

Or, you can try doing what the *NIX app's built-in rlog.sh does, and pipe the audit log through "ausearch -i" to get human-readable output, one record per audit event. I'm trying to get this approach working, but there are some bugs in the shell script that I'm trying to fix. For more info about that, see this question.
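To illustrate why 'msg' works as a grouping key, here's a quick sketch on synthetic raw auditd lines (the identifiers and commands are invented for illustration). The first two records share one msg=audit(epoch:serial) identifier, which is exactly what "transaction msg" groups on:

```shell
# Synthetic raw auditd records: two of the three lines share the same
# msg=audit(epoch:serial) identifier. Extract the distinct identifiers.
printf '%s\n' \
  'type=SYSCALL msg=audit(1364481363.243:24287): syscall=59 exe="/usr/bin/scp"' \
  'type=EXECVE msg=audit(1364481363.243:24287): argc=2 a0="scp" a1="file.txt"' \
  'type=SYSCALL msg=audit(1364481400.101:24290): syscall=2' |
sed -n 's/.*msg=\(audit([0-9.]*:[0-9]*)\).*/\1/p' | sort -u
```

This prints the two unique identifiers, one per line, showing that three log lines collapse into two logical events.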

Re: searching for linux audit events?

Communicator

Did you get this working properly?

Re: searching for linux audit events?

Explorer

I don't really have an answer, but I'm having to do the same types of searches at my location.

Did the above solutions work?
Thanks,
Al

Re: searching for linux audit events?

Explorer

In my experience the most useful fields for doing transactions are msg, auid, and ses. But there's way too much data coming in to run transaction on everything (slow!), so I created a subsearch that picks out the msg IDs I'm interested in, searches those, and then runs transactions. After that I table a subset of the fields to make it easier to look at.

Here's an example that looks for a user SCP'ing a file and the arguments passed to the scp command:

sourcetype=auditd [search sourcetype=auditd syscall=execve exe=/usr/bin/scp | table msg] | transaction msg | table _time, auid, exe, a0, a1, a2, a3

Re: searching for linux audit events?

Communicator

I took a look at the Splunk_TA_nix 5.1.2 script rlog.sh and made a change to it:

I got rid of the grep -v part, so the line that reads:

       awk -v START=$SEEK -v OUTPUT=$SEEK_FILE 'NR>START { print } END { print NR > OUTPUT }' $AUDIT_FILE | tee $TEE_DEST | /sbin/ausearch -i 2>/dev/null | grep -v "^----"

now reads:

            awk -v START=$SEEK -v OUTPUT=$SEEK_FILE 'NR>START { print } END { print NR > OUTPUT }' $AUDIT_FILE | tee $TEE_DEST | /sbin/ausearch -i 2>/dev/null

This gives you a "----" delimiter between events.
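As a quick sanity check (using made-up ausearch -i style output, not a live audit log), you can see how the "----" lines mark event boundaries, the same boundary the BREAK_ONLY_BEFORE setting below keys on:

```shell
# Made-up ausearch -i style output: each event starts with a "----" separator.
# Counting the separator lines counts the events.
printf '%s\n' \
  '----' \
  'type=SYSCALL msg=audit(03/28/2013 14:36:03.243:24287) : syscall=execve' \
  'type=EXECVE msg=audit(03/28/2013 14:36:03.243:24287) : argc=2' \
  '----' \
  'type=SYSCALL msg=audit(03/28/2013 14:37:10.101:24290) : syscall=open' |
grep -c '^----'
```

Five lines of output become two events, split exactly at the "----" markers.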

Then I made my own sourcetype for this:

[linux:audit]
BREAK_ONLY_BEFORE = ----
DATETIME_CONFIG =
MAX_TIMESTAMP_LOOKAHEAD = 23
NO_BINARY_CHECK = true
TIME_FORMAT = %m/%d/%Y %T.%3N
TIME_PREFIX = msg=audit\(
category = Operating System
description = Auditd Events
disabled = false
pulldown_type = true

Seems to be working in my lab for now.
Thanks @foxyfred for the idea.