I have an alert for log sources that have stopped sending logs for a while. The alert's search string looks like this:
| metadata type=sources | eval age=now()-lastTime | search age>600
I am planning to clear the alerts for log sources that start sending logs again within 300 seconds. Is there any way to do that? I thought the log sources could be written into a lookup (with outputlookup) and read back with inputlookup, but I cannot find a way to clear the values once a source starts sending logs again.
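For context, this is roughly what I had in mind: a scheduled search that writes the currently silent sources to a lookup (silent_sources.csv is just a placeholder name):
| metadata type=sources
| eval age=now()-lastTime
| where age>600
| table source age
| outputlookup silent_sources.csv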
By clearing the alert, do you mean you want to clear an already triggered alert from the Triggered Alerts history, or do you want to change your alert condition from sources not sending data in the last 10 minutes to the last 5 minutes (300 sec)?
| metadata type=sources | eval age=now()-lastTime | search age>300
I want to clear the alert if the source has sent logs in the last 5 minutes. Is there any method to clear alerts? Actually, it doesn't have to be an alert; we could just save it as a report. The important part is clearing the values after 5 minutes once a source starts sending logs again.
@cemiam, you can create a dashboard with the source details that refreshes every minute.
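A minimal Simple XML sketch (the label and panel layout are just illustrative) that reruns the search every 60 seconds could look like this:
<dashboard>
  <label>Silent Log Sources</label>
  <row>
    <panel>
      <table>
        <search>
          <query>| metadata type=sources | eval age=now()-lastTime | where age&gt;300 | table source age</query>
          <refresh>60s</refresh>
          <refreshType>delay</refreshType>
        </search>
      </table>
    </panel>
  </row>
</dashboard>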
Then I won't see a problem that occurs and resolves within a few minutes: the dashboard will constantly refresh and immediately drop every source that starts sending logs again. We need to remove a source only after it has sent logs within the last 5 minutes. Do you know any way to do this?
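In other words, with the silent_sources.csv lookup idea from above, I would want to list the previously silent sources that have sent logs within the last 5 minutes, roughly like this, and clear only those:
| inputlookup silent_sources.csv
| join type=inner source
    [| metadata type=sources
    | eval age=now()-lastTime
    | where age<=300
    | fields source age]
| table source age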
I have updated the Splunk-TA_webtools app on Splunkbase to support this scenario.
Deleting 10 fired alerts for a saved search named "Test Alert":
| rest /servicesNS/admin/search/alerts/fired_alerts/Test%20Alert
| fields title
| head 10
| map search="| curl method=delete ssl=true uri=\"localhost:8089/servicesNS/admin/search/alerts/fired_alerts/$title$\" user=admin pass=changeme | table *"
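For clarity on how this works: the rest command lists the fired-alert records for the saved search, head 10 keeps the first 10 rows, and map runs the app's curl command once per row, substituting each record's title into the DELETE request. Note that the credentials (admin/changeme here) are passed in clear text in the search string, so swap in an appropriate service account before running this against a real system.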
@cemiam this is working for me; please test on your end after downloading and installing the latest Splunk-TA_webtools (v1.00).
It worked like a charm. Thank you for your efforts. It seems this will resolve our problem.
@cemiam if you don't mind, can you please take a moment to rate the app on splunkbase? Thanks again!