Alerting

Discrepancies in the number of alerts triggered in Alert Manager over time

sib1274
Engager

I use the Alert Manager data model to keep track of all invoked alerts month over month, using the following search:

(index=* OR index=_*) (eventtype="alert_metadata")
| where label != "NULL"
| chart count over label by Time
| rename label AS Alert
| addcoltotals labelfield=Alert label=TOTAL

 

It was working great. I was able to see trends of how many times each alert was triggered per month for the past 3 to 5 months. Then the numbers I saw last month did not match what I see this month. For example:

When the SPL above ran on Aug. 3rd, it showed the number of alerts fired in July as 171.

When the SPL above ran on Sept. 3rd, it showed the number of alerts fired in July as 145 ... 26 missing.

When the SPL above ran on Sept. 9th, it showed the number of alerts fired in July as 123 ... another 22 missing.

It seems the alerts keep disappearing from the Alert Manager data model. Has anybody seen this type of behavior before? Is there any suggestion to fix this problem? Thanks.
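For anyone troubleshooting a similar drop: one quick check is whether the raw July events are still searchable at all, independent of the data model. The following is only a sketch reusing the same eventtype; the relative time window (-3mon@mon to -2mon@mon) is an assumption and should be adjusted to the month being compared:

(index=* OR index=_*) (eventtype="alert_metadata") earliest=-3mon@mon latest=-2mon@mon
| stats count AS events_in_month min(_time) AS first_event max(_time) AS last_event
| convert ctime(first_event) ctime(last_event)

If this raw count also shrinks between runs, the events themselves are leaving the index rather than being dropped by the data model.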



 

1 Solution

richgalloway
SplunkTrust

Data could be expiring.
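One way to confirm this is to compare each index's retention setting against the age of the missing events. A minimal sketch follows; it assumes permission to run the rest command, and frozenTimePeriodInSecs / maxTotalDataSizeMB are the standard index settings that control when data is frozen or rolled:

| rest /services/data/indexes
| fields title frozenTimePeriodInSecs maxTotalDataSizeMB currentDBSizeMB
| eval retention_days = round(frozenTimePeriodInSecs / 86400, 1)
| sort retention_days

If the index holding the alert_metadata events has a retention shorter than the span being charted, or is near its size limit, older events will be frozen and drop out of the counts.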

---
If this reply helps you, Karma would be appreciated.



sib1274
Engager

Thanks. This is what was happening.
