Getting Data In

Is it possible to index a different timestamp based on a field value in the same log?

matthewe25
Engager

I'm a new Splunk user who needs to perform some analysis on a set of logs in the following format:

Status, Starttime, Endtime, JobID, ReturnCode

START,2021-03-15 10:56:15, ,123,

END,2021-03-15 10:56:15,2021-03-15 10:56:27,123,0

...

For a single job, there are separate START and END log entries, which can be paired together by matching the start time and job ID using the 'transaction' command.
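For reference, the pairing search we use looks something like this (the index and sourcetype names here are just placeholders, and it assumes the comma-delimited columns are already extracted as fields):

index=job_index sourcetype=job_logs
| transaction JobID Starttime
| table JobID, Starttime, Endtime, ReturnCode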

When indexing these logs in Splunk, we've been parsing the Starttime field (the second comma-delimited value) as our timestamp, but this means that all log entries with the 'END' status get a _time value corresponding to their start time rather than their end time. This creates extra work when filtering entries by time or calculating job durations, since we have to parse the Endtime field at search time.
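Concretely, the search-time parsing we end up doing to get a usable end time and duration looks roughly like this (again, index/sourcetype names are placeholders):

index=job_index sourcetype=job_logs Status=END
| eval end_epoch=strptime(Endtime, "%Y-%m-%d %H:%M:%S")
| eval start_epoch=strptime(Starttime, "%Y-%m-%d %H:%M:%S")
| eval duration=end_epoch - start_epoch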

Is there any way to add some sort of condition which allows us to index a different timestamp field depending on the 'Status' field of an individual log? (i.e. if Status=START, take _time=Starttime but if Status=END, take _time=Endtime)

1 Solution

manjunathmeti
Champion

hi @matthewe25,

You can try the configuration below in props.conf. Remove the TIMESTAMP_FIELDS attribute if you are using it.

[sourcetype]
# For START events the prefix matches "START,", so the timestamp is read from Starttime;
# for END events it matches "END,<Starttime>,", so the timestamp is read from Endtime.
TIME_PREFIX = (START|END,\d+-\d+-\d+\s\d+:\d+:\d+),
TIME_FORMAT = %Y-%m-%d %H:%M:%S
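
Note that timestamp settings in props.conf only affect newly indexed events. After reindexing some sample data, you can verify that END events now take _time from Endtime with a quick check like this (index/sourcetype names are placeholders; it assumes Status and Endtime are extracted at search time):

index=job_index sourcetype=job_logs Status=END
| eval endtime_epoch=strptime(Endtime, "%Y-%m-%d %H:%M:%S")
| eval matches=if(_time == endtime_epoch, "yes", "no")
| stats count by matches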

 

If this reply helps you, an upvote/like would be appreciated.

