Alerting

Discrepancies in the number of alerts triggered in Alert Manager over time

sib1274
Engager

I use the Alert Manager data model to keep track of all invoked alerts month over month, using the following search:

(index=* OR index=_*) (eventtype="alert_metadata")
| where label != "NULL"
| chart count over label by Time
| rename label AS Alert
| addcoltotals labelfield=Alert label=TOTAL
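
For reference, an equivalent sketch that derives the month bucket directly from _time (assuming _time reflects when each alert fired) would be:

(index=* OR index=_*) (eventtype="alert_metadata")
| where label != "NULL"
| eval Month=strftime(_time, "%Y-%m")
| chart count over label by Month
| rename label AS Alert
| addcoltotals labelfield=Alert label=TOTAL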

 

The search was working great. I was able to see trends of how many times each alert was triggered per month for the past three to five months. Then the numbers I saw last month no longer matched what I see this month. For example:

When the SPL above ran on Aug. 3rd, it showed 171 alerts fired in July.

When the SPL above ran on Sept. 3rd, it showed 145 alerts fired in July ... 26 missing.

When the SPL above ran on Sept. 9th, it showed 123 alerts fired in July ... 22 more missing.

It seems the alerts keep disappearing from the Alert Manager data model. Has anybody seen this type of behavior before? Are there any suggestions for fixing this problem? Thanks.
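
One way to narrow this down is to check how far back alert_metadata events are still searchable; a minimal sketch, assuming the same eventtype applies, is:

(index=* OR index=_*) (eventtype="alert_metadata") earliest=0
| stats count min(_time) AS earliest max(_time) AS latest BY index
| eval earliest=strftime(earliest, "%Y-%m-%d"), latest=strftime(latest, "%Y-%m-%d")

If the earliest timestamp keeps moving forward each time this runs, the underlying events are being removed rather than just miscounted.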



 

Solution

richgalloway
SplunkTrust

Data could be expiring.
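
One common cause is index retention. A quick check, sketched here with a placeholder index name (look up the actual index Alert Manager writes its alert_metadata events to in your alert_manager settings), is to compare the index's retention limits against your reporting window:

| rest /services/data/indexes
| search title="your_alert_manager_index"
| table title frozenTimePeriodInSecs currentDBSizeMB maxTotalDataSizeMB

If frozenTimePeriodInSecs or the size limits are tighter than the months you report on, older events are frozen and removed, which would make earlier months shrink over time.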

---
If this reply helps you, Karma would be appreciated.



sib1274
Engager

Thanks. That is what is happening.
