I have configured an alert to send an email per result, and set the throttling based on a field value.
If I understand it correctly, Splunk must somehow track these field values so that it knows to suppress the email next time.
Where does it cache these values? Is there a limit to the number of values tracked? When are they reset, and how can we reset them manually? (Modifying the alert seems to reset the throttling.)
I searched Answers and the documentation but couldn't find anything about the internal mechanism. Does anybody know in which direction to search?
OK, I took a quick look at the documentation. Have you checked the savedsearches.conf configuration file? It holds the triggering conditions and other configurations, including suppression settings, for your alerts.
Try this documentation topic to learn more about the configuration file:
If that doesn't help, let me know.
Thank you for the pointer; however, it describes how I can configure alert suppression, not how it works.
#*******
# alert suppression/severity/expiration/tracking/viewing settings
#*******
alert.suppress = 0 | 1
* Specifies whether alert suppression is enabled for this scheduled search.
* Defaults to 0.

alert.suppress.period = <time-specifier>
* Sets the suppression period. Use [number][time-unit] to specify a time.
* For example: 60 = 60 seconds, 1m = 1 minute, 1h = 60 minutes = 1 hour etc
* Honored if and only if alert.suppress = 1
* Defaults to empty string.

alert.suppress.fields = <comma-delimited-field-list>
* List of fields to use when suppressing per-result alerts. This field *must* be specified
  if the digest mode is disabled and suppression is enabled.
* Defaults to empty string.
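To make the settings above concrete, a throttled per-result alert could look like the following stanza in savedsearches.conf. This is only an illustration: the stanza name, the search string, and the chosen field and period are made-up examples, not something from my actual configuration.

```ini
# Hypothetical example stanza -- names and values are illustrative only
[Example per-result error alert]
search = index=main sourcetype=syslog ERROR
# send one alert per matching result
alert.digest_mode = 0
# enable throttling for 7 days, keyed on the value of the "host" field
alert.suppress = 1
alert.suppress.period = 7d
alert.suppress.fields = host
```

With a stanza like this, a result for a given host triggers an email, and further results with the same host value are suppressed for the next seven days.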
What I'm looking for:
Any description of the internal mechanism would help a lot in understanding how it works, its limitations, and its behaviour.
Thanks a lot in advance for your investigations!
Let me see if I can provide further details to answer your follow-up questions. I pointed to the .conf file because the settings you're asking about are stored there. So, if you make certain settings in the UI, you can check which settings (field values, suppression period, and so on) have been written to savedsearches.conf for a particular alert. You can also adjust these settings (such as the values in alert conditions) manually in the conf file.
I'll report back with any further details I can provide. I'll write again soon.
Could you give me a bit more information to clarify some of your questions? I'd like to give you the information that will best help with what you're trying to accomplish.
What are you trying to do with your alert setup?
Are you concerned about exceeding a limit on the number of unique field/value combos in your events?
I have set up such an alert: it sends an email per event and then pauses for one week.
I've seen that when I change the alert's search definition, all events retrigger. So I was wondering how the internals work: where are the values cached, and when are they reset? Out of curiosity, I'd also like to know more about the mechanism and whether there are ways to inspect its internal state. It's also about knowing the impact of changes that reset that state. If I knew how it works, maybe I could prepopulate the state with a query to avoid a flood of alerts when setting the alert up...
Do these questions make sense?
Your questions make sense. If you change the search that defines conditions for an alert, then the triggering no longer works in the same way. It's essentially a new search, which requires a new alerting configuration.
My suggestion is that when you change a search but want the same alert triggering behaviour, you check the configurations carefully in savedsearches.conf or in Splunk Web (from the home page, go to Settings > Searches, Reports, and Alerts).
I don't have more information about the internal alerting mechanism, but I can say that the configurations you make for alerts, such as conditions and suppression, are saved to the .conf files. You can adjust settings in Splunk Web or in savedsearches.conf and thereby make sure you have the search conditions and alert settings you want.
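For intuition only: per-result throttling with `alert.suppress.fields` behaves as if Splunk kept a map from each throttle-field value to the time that value last triggered, firing only when the suppression period has elapsed. The sketch below is a conceptual model of that behaviour, not Splunk's actual implementation, and the class and method names are my own invention.

```python
import time


class Throttler:
    """Conceptual model of per-field alert throttling.

    This is NOT Splunk's real code -- just a way to reason about
    alert.suppress.period / alert.suppress.fields semantics.
    """

    def __init__(self, period_seconds):
        self.period = period_seconds
        # field value -> timestamp when that value last triggered an alert
        self.last_fired = {}

    def should_fire(self, key, now=None):
        """Return True if an alert for `key` should fire, and record it."""
        now = time.time() if now is None else now
        last = self.last_fired.get(key)
        if last is not None and now - last < self.period:
            return False  # still inside the suppression window for this value
        self.last_fired[key] = now
        return True
```

Under this model, your observations fall out naturally: each distinct field value is tracked separately, and anything that discards the map (such as redefining the search, in effect creating a new alert) makes every value trigger again.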
Hope this helps!
I'm a technical writer here at Splunk. I work on documenting alerts and want to help with your question. I'll look into this and report back with some information.