
Splunk alert did not trigger

abhi04
Communicator

Hi All,

 

I have an alert that returned more than 20 results for the window 7:00 AM to 7:01 AM.

The cron schedule for the alert is: * 6-15 * * 1-5 (see the breakdown just below)

Trigger condition: more than 4 results
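
(For clarity, reading that as a standard 5-field crontab, which is how Splunk's cron_schedule is interpreted:)

* 6-15 * * 1-5
│ │    │ │ └── day of week: 1-5 (Monday-Friday)
│ │    │ └──── month: every month
│ │    └────── day of month: every day
│ └─────────── hour: 06-15 (6:00 AM through 3:59 PM)
└───────────── minute: every minute

So the search is scheduled to run once a minute on weekdays between 6 AM and 4 PM, which covers both the 7 AM and 8 AM runs.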

 

I checked and found there were more than 4 results in the timeframe 7:00 AM to 7:01 AM, but the alert did not send an email.

 

However, the same alert did trigger at 8 AM.

 

On checking the internal logs, I can see that at 7 AM alert_actions="" (empty), but at 8 AM alert_actions="email", which confirms that no email action fired at 7 AM.
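
(For reference, a search along these lines against the scheduler logs shows those fields - the savedsearch_name value is a placeholder for the actual alert name:)

index=_internal sourcetype=scheduler savedsearch_name="<your alert name>"
| table _time scheduled_time status result_count alert_actions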

 

What else can I check to confirm the cause?

1 Solution

bowesmana
SplunkTrust

If your search runs every minute, it will run at some point AFTER 7:01 - what search window are you using?

If you are searching from -1m to now, the window covers the one minute immediately before the moment the search actually runs.

If your search window is -1m@m to @m, it will search from 7:00 to 7:01. If there is zero lag between your data being created at its source and being indexed in Splunk, you will get your 20 results. But imagine all the events generated between 7:00 and 7:01 actually arrive in Splunk and get indexed between 7:01 and 7:02 - then there will be 0 events in the index when your search runs over the 7:00 to 7:01 window.

Then at 7:02, when the search runs again and looks for events from 7:01 to 7:02, there are also 0 events, because the timestamps of the 20 events fall between 7:00 and 7:01.

This is important when you create alerts - Splunk can never be totally realtime, so you need to understand the lag in your event ingestion and set your search window accordingly.

This often means the search window should sit a little in the past, e.g. from -3m@m to -2m@m, so you are looking from 3 minutes behind to 2 minutes behind - this gives events time to be created at the source, sent to Splunk, and indexed.
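
As a minimal sketch of what that looks like in savedsearches.conf (the stanza name is a placeholder, and the 3-minute offset is just an example - tune it to your measured lag):

[Your Alert Name]
cron_schedule = * 6-15 * * 1-5
dispatch.earliest_time = -3m@m
dispatch.latest_time = -2m@m

The same earliest/latest values can also be set from the alert's time range picker in the UI.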

You can check this by comparing the _time and _indextime fields in Splunk, e.g. by appending this to your search:

| eval ixt=strftime(_indextime, "%F %T") ``` index time, human readable ```
| eval lag=_indextime - _time ``` lag in seconds between event time and index time ```
| table _time ixt lag

to see what your data lag is.
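
If you want to summarise the lag over a larger sample rather than eyeball individual events, you can extend that with stats (standard SPL functions):

| eval lag=_indextime - _time
| stats min(lag) avg(lag) max(lag) perc95(lag)

If perc95(lag) is more than about 60 seconds, a -1m@m to @m window will regularly miss events.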

 
