Getting Data In

Subsecond not getting parsed by Splunk

Explorer

I have log events in the following format:

 2016-10-29 13:24:43.310_394 [145-xxxxxxxxxxxxxxxxxx:49] [XXXX_XXXX_MASTER] (DEBUG) Mapping Instrument Code Type
 2016-10-29 13:24:43.310_805 [145-xxxxxxxxxxxxxxxxxx:49] [EnrichmentManager] (INFO) &CC_CURRENCY.process(): table lookup failed - no enrichment performed

where '_394' and '_805' are the microsecond values.

So to parse the input, I've configured my props.conf as follows:

[efx_gfixappia_ulbridgelog_pfseq]
SEDCMD-timefix=s/_//
TIME_PREFIX=^
TIME_FORMAT=%Y-%m-%d %H:%M:%S.%6N
SHOULD_LINEMERGE = false
TZ_ALIAS = HKT=GMT+08:00

Now when I try sorting the events according to _time, they get sorted down to the millisecond, but remain unsorted beyond that. When I strptime the microsecond value, it gets displayed as 13:24:43.310000.

I tried removing the SEDCMD from props.conf, and also removed the underscores in the logfile itself (in case the timestamp was being extracted before sed ran), to no avail.

Any suggestions on what I need to fix/should look at to fix this issue?


Motivator

Unless someone else has a better way to fix it at index time, can you try working around it at search time like this:

your query to return the events
| rex "^(?<dateTimeString>[^_]+)_(?<microsecond>[\S]+)"
| eval epoch1=strptime( dateTimeString, "%Y-%m-%d %H:%M:%S.%6N")
| eval newTime=epoch1+(microsecond/1000000)
| fieldformat newTime=strftime(newTime, "%Y-%m-%d %H:%M:%S.%6N")
| table _time, newTime | sort newTime
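The arithmetic behind that search can be checked outside Splunk. Here is a minimal Python sketch of the same idea (the sample line is hypothetical, and Python's %f plays the role of Splunk's %6N):

```python
from datetime import datetime, timezone

# Hypothetical sample line in the thread's format: '_394' holds the
# sub-millisecond digits sitting past the underscore.
line = "2016-10-29 13:24:43.310_394 [145-xxx:49] [EnrichmentManager] (INFO) ..."

# Split off the parseable prefix and the trailing microsecond digits,
# mirroring the rex in the search above.
stamp, _, rest = line.partition("_")      # "2016-10-29 13:24:43.310"
micro = rest.split(" ", 1)[0]             # "394"

# Parse the prefix, then add the leftover digits back as fractional seconds,
# mirroring the eval of newTime above.
base = datetime.strptime(stamp, "%Y-%m-%d %H:%M:%S.%f")
epoch = base.replace(tzinfo=timezone.utc).timestamp() + int(micro) / 1_000_000
print(f"{epoch:.6f}")
```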



Explorer

Unfortunately, that wouldn't work. I'm just modifying my search query instead of the indexed data.


Revered Legend

The timestamp parsing happens before the SEDCMD attribute is executed, hence it has no impact. Your options would be to pre-process the log file (before Splunk reads it) to remove the underscore, OR (a workaround which is not exactly what you want but may be acceptable) use only the first 3 digits as milliseconds: set TIME_FORMAT=%Y-%m-%d %H:%M:%S.%3N and add MAX_TIMESTAMP_LOOKAHEAD=23 to your props.conf.
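If pre-processing is an option, stripping the underscore before the forwarder reads the file is a one-line transform. A minimal sketch of that idea (the regex assumes the three-digits_three-digits fractional shape shown above; this is not part of any Splunk config):

```python
import re

# Matches a ".310_394"-style fractional timestamp and drops the underscore,
# leaving one six-digit microsecond field (".310394") that %6N can parse.
TS_UNDERSCORE = re.compile(r"(\.\d{3})_(\d{3})")

def strip_ts_underscore(line: str) -> str:
    # count=1 so only the timestamp is touched, not underscores
    # elsewhere in the event (e.g. XXXX_XXXX_MASTER).
    return TS_UNDERSCORE.sub(r"\1\2", line, count=1)
```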


Legend

Since there is an underscore between the milliseconds and microseconds, and strptime parses the fractional part with a single time format variable (%N), it cannot read past the underscore even with %6N, and hence skips the microseconds.

Would there be any possibility of change of logging event timestamp from 2016-10-29 13:24:43.310_394 to 2016-10-29 13:24:43.310394?

If that is done, then your existing props.conf should work:
TIME_FORMAT=%Y-%m-%d %H:%M:%S.%6N
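That change can be sanity-checked with strptime semantics (Python's %f standing in for Splunk's %6N here):

```python
from datetime import datetime

# With the underscore gone, all six fractional digits parse as one field.
dt = datetime.strptime("2016-10-29 13:24:43.310394", "%Y-%m-%d %H:%M:%S.%f")
print(dt.microsecond)  # → 310394
```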



Explorer

Yep, you pointed me in the right direction. Wrapping the search query in a <![CDATA[ ... ]]> section works. Thanks for the help!
For your ref: https://answers.splunk.com/answers/3435/escape-and-in-the-xml-of-dashboards.html
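For anyone landing here later, the wrapped query looks something like this in the dashboard XML (index name is a placeholder):

```xml
<search>
  <query><![CDATA[
    index=your_index sourcetype=efx_gfixappia_ulbridgelog_pfseq
    | rex "^(?<dateTimeString>[^_]+)_(?<microsecond>[\S]+)"
    | eval epoch1=strptime(dateTimeString, "%Y-%m-%d %H:%M:%S.%6N")
    | eval newTime=epoch1+(microsecond/1000000)
    | fieldformat newTime=strftime(newTime, "%Y-%m-%d %H:%M:%S.%6N")
    | table _time, newTime | sort newTime
  ]]></query>
</search>
```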

Motivator

From the error it looks like your XML is treating <microsecond> as an XML tag and trying to find its closing tag. Can you check:

1) whether you are missing the closing " at the end of this line: | rex "^(?<dateTimeString>[^_]+)_(?<microsecond>[\S]+)"
2) whether microsecond already collides with a tag for you — if so, try renaming it to something else, say microsecond1, in | rex "^(?<dateTimeString>[^_]+)_(?<microsecond1>[\S]+)" and then using that later in | eval newTime=epoch1+(microsecond1/1000000)

Explorer

Minor follow-up: it works in a normal search, but how should I use this query in a dashboard XML file? I'm getting an XML syntax error: "Opening and ending tag mismatch: microsecond line 9 and query, line 9, column 346". The rest of the config is fine; it only runs into the error at microsecond.
