Splunk Search

Time Stamp in Stats table - event relevance

nowplaying
Explorer

I'm generating a stats table to count the occurrence of errors in our production app logs and presenting the top 10 errors to our engineering team daily. They would like a timestamp included in the table so they can judge relevance. The timestamp needs to be that of the last error message seen for each count. I'm not sure how to present this time in a stats table, or if it's even possible.

The idea is that if a high-volume error occurred but it's not an error that occurs continuously, they would like to discount it when viewing the daily report.

Hope this makes sense. If you have a better idea on how to present this data I'm all ears.

1 Solution

Ayn
Legend

This should do the trick:

<yourbasesearch>
| stats count,first(_time) as "Most recent event" by errortype 
| convert ctime("Most recent event")
| sort -count
| head 10
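
One caveat: `first(_time)` returns the first value stats sees, which is the most recent event only because a default Splunk search returns events in reverse time order. If your search pipeline re-sorts events before the stats command, a sketch using `max(_time)` (which is order-independent) may be safer:

<yourbasesearch>
| stats count, max(_time) as "Most recent event" by errortype
| convert ctime("Most recent event")
| sort -count
| head 10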


Ayn
Legend

True. Edited my answer.


gkanapathy
Splunk Employee

You may want to add `| head 10` to show only the 10 most common errors (or whatever else you're counting by).
