Looking for some help with this one.
I'm building a few charts for vulnerability trending. Our data is uploaded to Splunk on a daily basis. What I did not account for, however, is the occasional manual push that happens during troubleshooting or when the data is changing rapidly.
My search counts the number of times severity=critical appears in the uploaded data, grouped by _time. The problem is that a manual push can leave a day with extra data, which inflates the count.
In the table below, 2022/02/18 shows 86 records when it should be 60.
index="foobar"
| where severity="Critical"
| bucket _time span=1d as day
| eventstats latest(_time) as Last
| stats count(severity) by day, Last
| eval First=strftime(First,"%H:%M:%S")
| eval Last=strftime(Last,"%Y/%m/%d:%H:%M:%S")
| eval day=strftime(day,"%Y/%m/%d")
day        | Last                | count(severity)
2022/02/16 | 2022/03/18:05:34:27 | 57
2022/02/17 | 2022/03/18:05:34:27 | 60
2022/02/18 | 2022/03/18:05:34:27 | 86
How can I change my search so that, for each day, it only counts the entries from the latest push (the latest h:m:s)?
Give this a try
index="foobar"
| where severity="Critical"
| bucket _time span=1d as day
| eventstats latest(_time) as Last by day
| where _time=Last
| stats count(severity) by day, Last
| eval First=strftime(First,"%H:%M:%S")
| eval Last=strftime(Last,"%Y/%m/%d:%H:%M:%S")
| eval day=strftime(day,"%Y/%m/%d")
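The eventstats latest(_time) as Last by day tags every event with the most recent _time seen for its day, and the where _time=Last then keeps only the events whose timestamp matches that final push, so each day is counted once no matter how many manual pushes landed in it.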
Thanks, this worked perfectly!
How do you identify the extra data?
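One rough way to surface it, building on the same search: count the distinct push timestamps that land in each day and flag any day with more than one. This is only a sketch and assumes every record in a push carries the same _time, which is also what the accepted answer relies on.
index="foobar"
| where severity="Critical"
| bucket _time span=1d as day
| stats dc(_time) as pushes, count as total_events by day ```how many separate push timestamps fell into each day```
| where pushes > 1 ```keep only days that received more than one push```
| eval day=strftime(day,"%Y/%m/%d")
Any day this returns contains data from more than one push, which is where the inflated counts come from.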