Alerting

Is it possible to send mail only when the search returns more than 0 results?

garujoey
Engager
...
| where count>10
| sendemail to=xxx from=xxx 

I am using where count > 10 to filter for the counts that are larger than 10 sessions, and I want to send mail to those users.
However, the problem is that sendemail runs no matter whether the search returns any results.

Is there a way to send mail only when "where count>10" actually returns results, and to skip the sendemail part when there are no results?


niketn
Legend

@garujoey, you have two options:

1) Use Alerts with a Trigger Condition to send out email only if a specific condition on a search field is met. Refer to the documentation: http://docs.splunk.com/Documentation/Splunk/latest/Alert/AlertTriggerConditions#Example

2) Use map in the Splunk search with sendemail so that the search fails when no results are found, and hence sendemail is not invoked. The following is a run-anywhere search based on Splunk's _internal index:

index=_internal sourcetype=splunkd log_level!="INFO" earliest=-1h@h latest=now 
| stats count max(_time) as _time by component 
| where count>5 
| map search="| makeresults 
              | eval _time=$_time$, component=\"$component$\", count=$count$ 
              | sendemail to=\"abc@123.com\" format=\"html\" server=smtp.abc.com:123 use_tls=1 subject=\"Alert for Data\" message=\"This is an alert for some data\" sendpdf=true"

In the above example, if there are no Splunk components with more than 5 errors in the last hour, the map command will fail with the following error and sendemail will not be executed:

Error in 'map': Did not find value for required attribute 'component'.
The search job has failed due to an error. You may be able to view the job in the Job Inspector.
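
Applied to your original search, the same pattern would look roughly like this (the leading ... and the xxx addresses are the placeholders from your question, and the subject/message text is just an example):

...
| where count>10 
| map search="| makeresults 
              | eval count=$count$ 
              | sendemail to=\"xxx\" from=\"xxx\" subject=\"Session count alert\" message=\"Count is $count$, which is greater than 10\""

Note that map runs its inner search once per input row (by default only for the first 10 rows, per its maxsearches setting), so this sends one mail per row where count is above 10.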
____________________________________________
| makeresults | eval message= "Happy Splunking!!!"

garujoey
Engager

Thanks niketnilay. I was using option 1 before, but I need to send mail to the users who show up in the search results, and it looks like option 1 only supports predefined recipients.

So I used the approach below in the search and ran it as an hourly alert:

| eventstats values(username) as _recipients values(FULL_NAME) as _FULL_NAME
| eval _recipients=mvjoin(_recipients, ",")
| sendemail to=$result._recipients$
Using your option 2 suggestion, there actually are some entries with counts over 5, but it doesn't send out any mail; I just get:

No results found.
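
For reference, the shape of what I am trying to end up with is roughly this, combining my recipient list with your map example (field names are from my data; the subject/message text is just a placeholder):

...
| where count>10
| eventstats values(username) as _recipients
| eval _recipients=mvjoin(_recipients, ",")
| map search="| makeresults 
              | sendemail to=\"$_recipients$\" subject=\"Session count alert\" message=\"Some users have more than 10 sessions\""

(Since map fires once per remaining row, I would probably also need to reduce the results to a single row before the map.)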

niketn
Legend

Option 2 is supposed to work.

Your query should not contain the filter condition; instead, it should be set as the Trigger Condition.

Search Query

 index=_internal sourcetype=splunkd log_level!="INFO" earliest=-1h@h latest=now 
 | stats count max(_time) as _time by component 

Trigger Condition - Trigger type should be custom

where count>5

PS: I have tested the following alert on a cron schedule, running every minute over the last hour of Splunk's _internal error data by component. Since I was getting only 1 or 2 errors per component, I reset the trigger condition to search count>1 and the trigger type to custom.

[Sample Alert with Trigger Condition]
alert.suppress = 0
alert.track = 1
alert_condition = search count>1
counttype = custom
cron_schedule = */1 * * * *
dispatch.earliest_time = -1w
dispatch.latest_time = now
display.events.fields = ["host","source","sourcetype","log_level","info"]
display.general.type = statistics
display.page.search.mode = verbose
display.page.search.tab = statistics
display.visualizations.chartHeight = 550
display.visualizations.custom.type = aplura_viz_donut.donut
enableSched = 1
request.ui_dispatch_app = splunk_answers
request.ui_dispatch_view = search
search = index=_internal sourcetype=splunkd log_level!="INFO" earliest=-1h@h latest=now \
 | stats count max(_time) as _time by component
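
Regarding sending mail to recipients that come from the search results: one approach you could try (I have not tested it with your data, so treat it as a sketch with assumed field names) is to build a comma-separated recipients field inside the scheduled search itself and reference it from the email alert action with a result token, added to the same alert stanza:

# assumes the search produces a field named "recipients",
# e.g. via eventstats values(username) as recipients | eval recipients=mvjoin(recipients, ",")
action.email = 1
action.email.to = $result.recipients$
action.email.subject = Alert for sessions over threshold
action.email.sendresults = 1

The $result.recipients$ token takes its value from the first result row, so the search should put the full comma-separated list on every row (which the eventstats + mvjoin approach already does).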

Please try it out and confirm!

____________________________________________
| makeresults | eval message= "Happy Splunking!!!"