Getting Data In

Parsing a SAP audit log

hannanp
Path Finder

We are trying to pull SAP audit files into Splunk. We are running into a couple of issues:

1.) Parsing the log file for the datetime/transaction/etc. is unbelievably hard to decipher. Has anyone had any luck determining how to break the stamp down (see the sketch after this list)? An example would be this:

20130515153145001237100032D2whqwbtspZ (I know the first part is YYYYMMDDHHMMSS; I have heard it is then followed by a 6-digit microsecond field, but I am not positive)

2.) The event starts with 2AUW20130515153145001237100032D2. We want to start a new event every time we see a 2AU.
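Here is a minimal Python sketch of how we are currently guessing the stamp breaks down. The 14-digit timestamp is the only part we are sure of; the 6-digit sub-second counter and everything after it are assumptions, not a confirmed SAP field layout:

from datetime import datetime

stamp = "20130515153145001237100032D2whqwbtspZ"

# First 14 characters are the timestamp: YYYYMMDDHHMMSS
ts = datetime.strptime(stamp[:14], "%Y%m%d%H%M%S")
print(ts)                    # 2013-05-15 15:31:45

# The next 6 digits *might* be a sub-second counter (unconfirmed)
subsecond = stamp[14:20]     # "001237"

# Everything after that (client/terminal/transaction data?) is unknown to us
rest = stamp[20:]
print(subsecond, rest)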

Here is the props file we have so far:

CHARSET=UTF-16 BE
NO_BINARY_CHECK=true

LINE_BREAKER = (?=2AU)

SHOULD_LINEMERGE = false
LEARN_MODEL = false
BREAK_ONLY_BEFORE_DATE=false
BREAK_ONLY_BEFORE=\s2AU[^\s]+\s+
TIME_PREFIX=2AU.
TIME_FORMAT=%Y%m%d%H%M%S
MAX_TIMESTAMP_LOOKAHEAD=14

Anyone run into this before by chance?

Cheers!


pjdmfi
New Member

How are you gathering your SAP audit logs? I had my dev guys write a program to export the audit log, so they could reformat the fields and log data into a more Splunk-friendly layout. It gets saved to a text file, and then I monitor that text file.

But I'm trying to find a better method than this.


jmallorquin
Builder

SAP events are 200 characters in size; the delimiter is not 2AU. I think the best way is to set the TRUNCATE property.
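If the records really are a fixed 200 characters (a figure I have not verified), you could in principle cut the stream without any delimiter at all. A rough Python sketch of that fixed-record-length idea, outside of Splunk:

RECORD_LEN = 200  # assumed fixed SAP audit record size (unverified)

def split_fixed(raw: str, record_len: int = RECORD_LEN):
    # Slice the undelimited stream into fixed-width records
    return [raw[i:i + record_len] for i in range(0, len(raw), record_len)]

In Splunk terms that would mean tuning TRUNCATE rather than relying on a 2AU delimiter.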

Marinus
Communicator

You can use Armadillo to Splunk the SAP audit log http://bit.ly/15r1vq5.


hannanp
Path Finder

Here is the props.conf file that we had to put on the server and the client. I'm not quite sure which one did the most good, but we ended up getting all the events broken out individually.

LINE_BREAKER=.()2AU
CHARSET=UTF-16BE
TIME_PREFIX=2AU.
TIME_FORMAT=%Y%m%d%H%M%S
SHOULD_LINEMERGE=false
NO_BINARY_CHECK=1

Hope this helps out someone who is trying to do the same thing.
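If you want to sanity-check the breaking and timestamp logic outside Splunk, here is a rough Python sketch of what the settings above amount to: decode the file as UTF-16BE, start a new event at every 2AU, and read the 14 digits after the one-character record type as the timestamp. The record layout beyond the timestamp is an assumption based on the samples in this thread:

import re
from datetime import datetime

def parse_events(raw_bytes: bytes):
    text = raw_bytes.decode("utf-16-be")              # CHARSET=UTF-16BE
    # Each event starts at "2AU" and runs to the next "2AU" (or end of data)
    for event in re.findall(r"2AU.*?(?=2AU|$)", text, flags=re.DOTALL):
        # TIME_PREFIX=2AU. -> skip "2AU" plus one record-type character,
        # then TIME_FORMAT=%Y%m%d%H%M%S covers the next 14 digits
        ts = datetime.strptime(event[4:18], "%Y%m%d%H%M%S")
        yield ts, event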

sshres5
Communicator

Yes, this worked for me too. It just needs to be in the forwarder's props.conf.

However, in my case I had to use
CHARSET=UTF-16LE


hannanp
Path Finder

We tried both and neither would break the line into events. Here is what we have in our props.conf file.

LEARN_MODEL = false
CHARSET=UTF-16BE
NO_BINARY_CHECK = 1
LINE_BREAKER = (b)(?=\x002\x00A\x00U)
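For what it's worth, the \x00 bytes in that pattern are just the UTF-16 encoding showing through: in UTF-16BE every ASCII character is preceded by a null byte, and in UTF-16LE it is followed by one, which is why a plain 2AU never matches the raw bytes before the charset is applied. A quick Python check of the byte sequences:

# How the marker "2AU" looks at the byte level in each UTF-16 byte order
print("2AU".encode("utf-16-be"))   # b'\x002\x00A\x00U'  -> \x002\x00A\x00U
print("2AU".encode("utf-16-le"))   # b'2\x00A\x00U\x00'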


kristian_kolb
Ultra Champion

No line breaks at all; that's an odd kind of log. Have you tried jkat54's suggestion below? Did it work, or is your log fundamentally different?

I don't know, but perhaps you can use a LINE_BREAKER regex like this (not tested):

LINE_BREAKER = ()(?=2AU)

or, if you need to match something:

LINE_BREAKER = (\b)(?=2AU)
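As a rough check outside Splunk (not a guarantee of how the indexer applies LINE_BREAKER), a zero-width split on the same lookahead behaves like this in Python 3.7+; the sample string is just a shortened copy of the log excerpt posted in this thread:

import re

# Shortened copy of the undelimited sample data
sample = ("2AUW20130516080853001208400044D2WORKSTATION1USERNAME ZLSDU032 "
          "2AU320130516080853001208400044D2WORKSTATION2USERNAME VL71")

# Splitting on the zero-width lookahead keeps "2AU" at the start of each event
events = [e for e in re.split(r"(?=2AU)", sample) if e]
for e in events:
    print(e)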

/k

hannanp
Path Finder

Unfortunately there are no carriage returns in the data. Here is a sample of the information we are receiving:
2AUW20130516080853001208400044D2WORKSTATION1USERNAME ZLSDU032 ZLSDU032 1002ZLSDU032& WORKSTATION1 2AU320130516080853001208400044D2WORKSTATION2USERNAME VL71 ZLSDU032 1002VL71 WORKSTATION2


jkat54
SplunkTrust

Assuming your logs look like these examples I found on the net:

2AUJ20091008153028000447200000D0a01-testDDIC        SM19   00011  gtva01-test
2AUE20091008153028000447200000D0a01-testDDIC        SM19   0001   gtvra01-test  

You will need this:

NO_BINARY_CHECK=1

You won't need anything else.


hannanp
Path Finder

We tried changing this as suggested and it still brings them in as one event. See the sample above for an example of the log.


kristian_kolb
Ultra Champion

It's hard to tell from this information alone, but:

You don't need the BREAK_ONLY_* settings when SHOULD_LINEMERGE=false; then only LINE_BREAKER counts.

Please post two events, but I think your LINE_BREAKER regex might be written like this:

LINE_BREAKER = ([\r\n]+)(?=2AU)

/K
