How much of a delay is enough delay for an alert?

vanderaj2
Path Finder

I've read that a best practice for setting up a (non-real-time) scheduled alert in Splunk is to build at least one minute of delay into the time range, to account for forwarding and indexing delays.

Well, I've got an alert set up to email me whenever a splunkd crash log shows up anywhere in my environment. The alert runs every 5 minutes, with a 1-minute delay, like so:

Time range: earliest=-6m@m, latest=-1m@m
Cron schedule: */5 * * * *
Trigger condition: number of results > 0
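
For reference, this is roughly what that schedule and trigger look like in savedsearches.conf; the stanza name, the crash-log search string, and the email address are placeholders for illustration, not my actual config:

# savedsearches.conf -- stanza name, search, and address are illustrative
[Splunkd crash log alert]
search = index=_internal source=*crash*.log*
dispatch.earliest_time = -6m@m
dispatch.latest_time = -1m@m
cron_schedule = */5 * * * *
enableSched = 1
counttype = number of events
relation = greater than
quantity = 0
action.email = 1
action.email.to = admin@example.com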

However, I never get an email alert, even when Splunk finds results. Am I just not building enough delay into my alert? Is getting the right amount of delay just a matter of tweaking things until it works?

Thanks!

1 Solution

woodcock
Esteemed Legend

You almost certainly have bad timestamps in your data that are mis-labeling events, so that events which really occurred "nowish" are being thrown hours into the future or the past and therefore never fall inside your search window. Install the Data Curator and Meta Woot apps and fix your _time problems. This is a deep topic and we do a ton of professional services work fixing this for clients. Those apps are by no means the whole story, but they are a great first step. This is probably the single biggest (and most important) problem in the wild for Splunk; it is not a problem with the product, it is carelessness and confusion during the onboarding process.
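
A quick way to check whether this is happening is to compare each event's _time with its _indextime over a wide window. The source filter below is only a guess at where your crash logs live, so substitute whatever your alert actually searches:

index=_internal source=*crash*.log* earliest=-7d latest=+7d
| eval lag_seconds = _indextime - _time
| stats count, min(lag_seconds) AS min_lag, max(lag_seconds) AS max_lag by host

If min_lag is a large negative number (timestamps in the future) or max_lag runs into hours, those events will never land inside a 5-minute alert window, no matter how much extra delay you build in.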
