Alerting

Real-time alert triggered several days after the event

vikasphonsa
New Member

I have an alert with the settings shown below. Somehow I'm getting alert emails for events that happened many days ago, whereas I was expecting it to trigger ONLY for real-time events. I have verified that the alert triggers fine for real-time events, but why did it trigger for older events? Any ideas?

On April 21, I received an alert email for an event that happened on Sun Apr 15

Settings (the ones I think are relevant):

Time Range: Start time=rt, Finish Time=rt
Schedule and alert: Schedule this alert checked
Alert: Condition=always, Alert mode=Once per result, Throttling=enabled.

Thanks,
Vikas


woodcock
Esteemed Legend

If you have timestamped your events incorrectly such that Splunk interprets them as having "happened in the future", then they will not age through your real-time window until well after they actually occurred. This is almost always a problem with time zones, and you can get a good idea of what is happening (good or bad) with this search:

index=* | dedup date_zone splunk_server index host source | eval lagSecs=_time-_indextime | convert ctime(_indextime) as indextime | table _time indextime lagSecs date_zone splunk_server index host source

You are looking for sources where lagSecs<0, which indicates the kind of events you are describing.
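If that search surfaces sources with negative lag, the usual remedy is to declare the correct time zone for the affected source in props.conf on the indexer or heavy forwarder that parses the data. A sketch only — the stanza name and zone below are placeholders you would replace with your actual sourcetype and its real time zone:

# props.conf (example only — substitute your sourcetype and time zone)
[your_sourcetype]
TZ = America/Los_Angeles

After a restart, newly indexed events from that source should get timestamps in the correct zone; already-indexed events are not rewritten.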


Ayn
Legend

It's available in the internal field _indextime. It won't show up in search results because it's internal, so you'll have to use eval first to get it to show:

... | eval indextime=_indextime
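Since _indextime is a Unix epoch value, you may also want it rendered human-readable so you can compare it against the event timestamp at a glance. One possible variant, using the standard strftime eval function:

... | eval indextime=strftime(_indextime, "%Y-%m-%d %H:%M:%S") | table _time indextime

A large gap between _time and indextime in that table is exactly the "old event, arrived recently" situation being discussed here.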

vikasphonsa
New Member

Thanks for the response.

How do I find out when Splunk received the event? The only timestamp I see on that entry is from a week ago.


Ayn
Legend

When did Splunk actually receive the event? Might it be an event with an older timestamp that only just now arrived in Splunk?
