I am new to Splunk and am working to create an alert that triggers if it has been more than 4 hours since the last event. I am using the following query, which I have tested and which comes back with a valid result:
index=my_index
| stats max(_time) as latest_event_time
| eval time_difference_hours = (now() - latest_event_time) / 3600
| table time_difference_hours
Result: 20.646666667
When I go in and enable the alert, I set it to run on a schedule. Additionally, I choose a custom condition as the trigger and use the following:
eval time_difference_hours > 4
But the alert does not trigger. As you can see based on the result, it has been 20 hours since the last event was received in Splunk.
Not sure what I am missing. I have also modified the query to include a time span with earliest=-24H and latest=now, but that did not work either.
I believe the correct syntax is
search time_difference_hours > 4
but you can also put the condition in the search itself rather than in the alert with
| where time_difference_hours > 4
and just trigger on number of results.
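For example, folding the condition into your original search would look something like this (same index as above):

index=my_index
| stats max(_time) as latest_event_time
| eval time_difference_hours = (now() - latest_event_time) / 3600
| where time_difference_hours > 4

With that, the search only returns a row when the gap exceeds 4 hours, so you can set the trigger condition to "Number of Results" greater than 0.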
Thanks. That resolved the issue.
As a side note - instead of
index=my_index | stats max(_time) as latest_event_time
You can use
| tstats max(_time) as latest_event_time where index=my_index
You will notice a _huge_ performance difference.
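Putting that together with the rest of the search, the tstats version would look something like:

| tstats max(_time) as latest_event_time where index=my_index
| eval time_difference_hours = (now() - latest_event_time) / 3600
| where time_difference_hours > 4

The speedup comes from tstats reading the index-time summary (tsidx) files instead of scanning raw events.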