Alerting

Alert Not Triggering if no events in more than 4 hours

pitt93
New Member

I am a new Splunk user working to create an alert that triggers if it has been more than 4 hours since the last event. I am using the following query, which I have tested and which returns a valid result:

index=my_index
| stats max(_time) as latest_event_time
| eval time_difference_hours = (now() - latest_event_time) / 3600
| table time_difference_hours

Result: 20.646666667

When I go in and enable the alert, I set it to run on a schedule. Additionally, I choose a custom condition as the trigger and use the following:

eval time_difference_hours > 4

But the alert does not trigger. As you can see from the result, it has been over 20 hours since the last event was received in Splunk.

Not sure what I am missing. I have also modified the query to include a time span with earliest=-24H and latest=now, but that did not work either.


bowesmana
SplunkTrust

I believe the correct syntax is

search time_difference_hours > 4

but you can also put that in the search rather than in the alert with

| where time_difference_hours > 4

and just trigger on number of results.
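
For example, the whole search could look something like this (just a sketch reusing the index and field names from your post), with the alert set to trigger when the number of results is greater than 0:

index=my_index
| stats max(_time) as latest_event_time
| eval time_difference_hours = (now() - latest_event_time) / 3600
| where time_difference_hours > 4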


pitt93
New Member

Thanks. That resolved the issue.


PickleRick
SplunkTrust

As a side note - instead of

index=my_index
| stats max(_time) as latest_event_time

You can use

| tstats max(_time) as latest_event_time where index=my_index

You will notice a _huge_ performance difference.
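
For example, the full alert search could be rewritten roughly like this (a sketch, assuming the same index and field names as above):

| tstats max(_time) as latest_event_time where index=my_index
| eval time_difference_hours = (now() - latest_event_time) / 3600
| where time_difference_hours > 4

The speedup comes from tstats reading the indexed metadata directly instead of retrieving and scanning the raw events.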
