My goal is to create CloudWatch metrics from Splunk reports that run periodically. So I've created a report, scheduled it to run every N minutes, and configured it to send me an e-mail and run a script that pushes the values to CloudWatch.
The problem: if the report yields no results, neither the e-mail notification nor the script execution is triggered. Is this behaviour by design, or can it be changed with some option?
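For reference, a minimal sketch of the kind of push script involved, assuming `boto3` is available on the Splunk host. The namespace, metric name, and the `Total` CSV column are hypothetical placeholders, not anything from my actual setup:

```python
# Hypothetical push script a Splunk report could invoke: parse a one-row
# CSV result and push the value to CloudWatch. All names are assumptions.
import csv
import io


def build_metric_data(csv_text, metric_name="ReportCount", unit="Count"):
    """Turn a one-row Splunk CSV result into a CloudWatch MetricData list."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    # If the report returned no rows, fall back to 0.
    value = float(rows[0]["Total"]) if rows else 0.0
    return [{"MetricName": metric_name, "Value": value, "Unit": unit}]


def push_to_cloudwatch(metric_data, namespace="SplunkReports"):
    # Requires boto3 and AWS credentials; shown for illustration only.
    import boto3
    boto3.client("cloudwatch").put_metric_data(
        Namespace=namespace, MetricData=metric_data
    )
```

The catch, of course, is that this script never runs when the report is empty, which is exactly the problem described above.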
Update: Please see the comments on the accepted answer for the workarounds.
This is by design. Both the e-mail and the script execution are forms of alert triggering, so you don't want to be alerted when there is nothing wrong, correct?
It all depends on the alert condition, which I believe you've set to "if number of events > 0". If you want an e-mail sent and the script executed regardless of the search result, choose "always".
Yep, for alerts this logic seems completely valid, but I'm talking about reports, not alerts.
Another thing: in my Splunk installation, I have only the following options for alerts:
There is no "Always" option.
Conceptually, both Reports and Alerts are just scheduled saved searches. Splunk has simply categorized them to separate informative reports from actionable ones.
You should be able to see all alert options by going to Settings -> Searches, reports, and alerts from the top-right menu bar.
Let me know which version you're using in case you don't see the options.
Also, another option that would resolve this is to use the `sendemail` command at the end of the search, instead of selecting the send-email option from the Report menu. That command always sends an e-mail, even when there are no results. See more details here.
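For example, something along these lines (the index, address, and subject are placeholders, not from the original setup):

```
index=my_app ERROR
| stats count AS error_count
| sendemail to="me@example.com" subject="Error count report" sendresults=true
```

With `sendresults=true` the results table is included in the e-mail body, and the e-mail goes out on every scheduled run, empty or not.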
Thanks for pointing me to Settings -> Searches, reports, and alerts; I found the option there to always trigger the alert. Can you make an answer from this comment, so I can flag it as resolved?
There is another possible workaround: append `| stats count as Total` at the end of the search, so there will always be at least one row of data.
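This works because `stats count` emits a single row even over zero events, so the search is never "empty" and the trigger condition fires. A full example (the base search is a placeholder):

```
index=my_app ERROR
| stats count as Total
```

When no events match, the result is one row with `Total=0`, which the push script can forward to CloudWatch as a zero data point instead of a gap in the metric.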