Splunk Search

Time Stamp in Stats table - event relevance

Explorer

I'm generating a stats table that counts the occurrences of errors in our production app logs and presents the top 10 errors to our engineering team daily. They would like a timestamp included in the table so they can judge relevance. The timestamp needs to be that of the last error message seen for each count. I'm not sure how to present this time in a stats table, or if it's even possible.

The idea is that if a high-volume error occurred but it isn't one that occurs continuously, they would like to discount it when viewing the daily report.

Hope this makes sense. If you have a better idea on how to present this data I'm all ears.

1 Solution

Legend

This should do the trick:

<yourbasesearch>
| stats count, first(_time) as "Most recent event" by errortype
| convert ctime("Most recent event")
| sort -count
| head 10
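Note that first(_time) works here because Splunk returns events newest-first by default, so the first value seen per group is the most recent; max(_time) would give the same result independent of event order. The aggregation logic can be sketched outside Splunk in plain Python (the sample events and error-type names below are hypothetical, just to illustrate what each SPL pipe stage does):

```python
from collections import defaultdict
from datetime import datetime, timezone

# Hypothetical parsed log events: (epoch time, error type)
events = [
    (1700000100, "NullPointerException"),
    (1700000200, "TimeoutError"),
    (1700000050, "NullPointerException"),
    (1700000300, "TimeoutError"),
    (1700000400, "DiskFull"),
]

# Equivalent of: | stats count, max(_time) by errortype
counts = defaultdict(int)
latest = {}
for ts, etype in events:
    counts[etype] += 1
    latest[etype] = max(latest.get(etype, ts), ts)

# Equivalent of: | sort -count | head 10
top10 = sorted(counts, key=counts.get, reverse=True)[:10]

for etype in top10:
    # Equivalent of: | convert ctime(...) -- render epoch as readable time
    when = datetime.fromtimestamp(latest[etype], tz=timezone.utc)
    print(f"{etype}\t{counts[etype]}\t{when.strftime('%m/%d/%Y %H:%M:%S')}")
```

Each error type ends up with its total count and the timestamp of its most recent occurrence, sorted by frequency, which is exactly the daily report shape described above.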



Legend

True. Edited my answer.


Splunk Employee

You may want to add | head 10 to show only the 10 most common uri_host values (or errors, or whatever else you're counting by).
