Why is Splunk Alert not triggering for every result?
I have configured a Splunk alert with the alert condition set to "Trigger for each result", but every time I only get the alert for one of those results. Any idea why?
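For reference, this is roughly how I understand the alert translates into savedsearches.conf (I can only see the UI, so the stanza below is a best guess, and the PagerDuty action key in particular depends on the add-on):
[My PagerDuty alert]
search = <your alert search>
dispatch.earliest_time = -1d@d
dispatch.latest_time = @d
cron_schedule = 0 10 * * *
counttype = number of events
quantity = 0
relation = greater than
# 0 = trigger for each result, 1 = trigger once per search (digest mode)
alert.digest_mode = 0
alert.track = 1
# the action key name depends on the PagerDuty add-on, so this line is a guess
action.pagerduty = 1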
Below is the screenshot of the alert:
And below is a sample result from the alert query


Hi @nytins,
first of all, using "Yesterday" as the Time Range, if you schedule your alert at 10:00 and again at 19:00, you will get the same results in both runs.
About the issue: what happens if you use "Once"?
Also, are you sure that the trigger action you configured can handle more than one result? I don't know PagerDuty.
Ciao.
Giuseppe

Both "Once" and "For each result" behave the same way for me. In both cases, I get the alert with only one event from the results. I am assuming PagerDuty doesn't support multiple results.


Hi @nytins,
as I said, I don't know PagerDuty, and the issue is probably that it doesn't accept multiple values.
If you don't have many results, you could create a workaround like the following (there is a rough sketch after the list):
- create a lookup (called e.g. PageDuty_temp.csv),
- save your results in this lookup,
- create a new alert that:
  - searches on this lookup,
  - takes only the first value,
  - sends a message to PagerDuty,
  - removes the used value from the lookup.
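Only as a rough sketch (the lookup name is the one above; the field names _time, host and message are just examples to adapt to your data):
Append your alert results to the lookup at the end of your existing search:
<your existing alert search>
| table _time host message
| outputlookup append=true PageDuty_temp.csv
Then create the new alert (scheduled e.g. every 5 minutes) that reads only the oldest row, and attach the PagerDuty action to it:
| inputlookup PageDuty_temp.csv
| sort 0 _time
| head 1
Finally, a scheduled search that runs right after the alert and removes the row that was already sent:
| inputlookup PageDuty_temp.csv
| sort 0 _time
| streamstats count as row
| where row > 1
| fields - row
| outputlookup PageDuty_temp.csv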
Ciao.
Giuseppe

Hi
Have you already looked at the internal logs to see what happened? There should be entries about the firing of this alert.
r. Ismo

I don't have access to the Splunk servers; they are managed by a central team. Are these logs available to search within Splunk? If yes, how can I search for them?

Those are stored in the _internal index. If you are not part of the Splunk admin team, you probably don't have access to it. You could try
index=_internal
to see whether you can see events in that index. If you can, then try this:
index="_internal" component=SavedSplunker sourcetype="scheduler" thread_id="AlertNotifier*" NOT (alert_actions="summary_index" OR alert_actions="") app!=splunk_instrumentation
| fields _time app result_count status alert_actions user savedsearch_name splunk_server_group
| stats earliest(_time) as _time count as run_cnt sum(result_count) as result_count values(alert_actions) as alert_actions values(splunk_server_group) as splunk_server_group by app, savedsearch_name user status
| table _time, run_cnt, app, savedsearch_name user status result_count alert_actions splunk_server_group
It shows the alerts that have previously run and what happened with them.
If you don't have access to the internal logs, then you should ask your Splunk admin team to check what has happened.
