I am running the following search query manually to obtain a history of triggered alerts (time, name, severity):
```
index=_audit action=alert_fired ss_app=*
| eval ttl=expiration-now()
| search ttl>0
| table trigger_time ss_name severity
| rename trigger_time as "Alert Time" ss_name as "Alert Name" severity as "Severity"
```
I am observing the following strange behavior: even though I run this over "All time", I can never match timestamps between two runs of the search.
What I mean is: if I note the value of `trigger_time` for an alert that fired in May, and then search for that exact value in an all-time export generated in June, I cannot find it. This is not an isolated occurrence; whichever record I pick at random, I am unable to find it in other exports of seemingly the same data.
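For example, a direct lookup like the one below (the epoch value here is made up purely for illustration) finds the event in the May export but returns nothing against the June export:

```
index=_audit action=alert_fired trigger_time=1620000000
```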
In case it is important: I have 3 search heads in a cluster.
Is this a bug? (v8.0.2.1)
If this is not a bug, how do I make Splunk provide the precise time at which alerts were triggered?
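For reference, my assumption is that `trigger_time` is an epoch value, so I would expect to be able to render it into a readable timestamp like this (the `strftime` format string is just an example), if only the underlying value were stable across searches:

```
index=_audit action=alert_fired ss_app=*
| eval ttl=expiration-now()
| search ttl>0
| eval "Alert Time"=strftime(trigger_time, "%Y-%m-%d %H:%M:%S")
| table "Alert Time" ss_name severity
| rename ss_name as "Alert Name", severity as "Severity"
```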