Alerting

Alert Not Triggering if no events in more than 4 hours

pitt93
New Member

I am a new user to Splunk and am working to create an alert that triggers if it has been more than 4 hours since the last event. I am using the following query, which I have tested and which returns a valid result:

index=my_index
| stats max(_time) as latest_event_time
| eval time_difference_hours = (now() - latest_event_time) / 3600
| table time_difference_hours

Result: 20.646666667

When I go in and enable the alert, I set it to run on a schedule. Additionally, I choose a custom condition as the trigger and use the following:

eval time_difference_hours > 4

But the alert does not trigger. As you can see from the result, it has been about 20 hours since the last event was received in Splunk.

Not sure what I am missing. I have also modified the query to include a time range with earliest=-24H and latest=now, but that did not work either.
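For reference, the modified search looked roughly like this (a sketch; Splunk's relative-time modifier for hours is lowercase h, so -24h is used here):

index=my_index earliest=-24h latest=now
| stats max(_time) as latest_event_time
| eval time_difference_hours = (now() - latest_event_time) / 3600
| table time_difference_hours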


bowesmana
SplunkTrust

I believe the correct syntax is

search time_difference_hours > 4

but you can also put that in the search rather than in the alert with

| where time_difference_hours > 4

and just trigger on the number of results.
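Putting it together, the full alert search could look something like this (a sketch based on your original query, with the threshold moved into the search):

index=my_index
| stats max(_time) as latest_event_time
| eval time_difference_hours = (now() - latest_event_time) / 3600
| where time_difference_hours > 4

Then set the trigger condition to Number of Results greater than 0, so the alert fires whenever the search returns a row.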


pitt93
New Member

Thanks. That resolved the issue.


PickleRick
SplunkTrust

As a side note - instead of

index=my_index
| stats max(_time) as latest_event_time

You can use

| tstats max(_time) as latest_event_time where index=my_index

You will notice a _huge_ performance difference.
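For example, the whole alert search could be rewritten as something like this (a sketch combining tstats with the existing eval/where logic):

| tstats max(_time) as latest_event_time where index=my_index
| eval time_difference_hours = (now() - latest_event_time) / 3600
| where time_difference_hours > 4

tstats reads the indexed metadata (tsidx files) rather than scanning raw events, which is where the speed-up comes from.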
