Hello,
I'm new to Splunk. I know there are a few posts on this already, but I'm not able to find a solution for my specific problem. I want to make an alert for when indexing stops. I am using the following: | tstats latest(_time) as latest where index=* by host | where latest < relative_time(now(), "-1m")
Normally I want the "-1m" to be "-1d", but I changed it to test the alert. When I run the search, I get a result from an IP address that is not indexing events every minute, so I know the search is working. When I save it as an alert, however, I get no alerts. I have tried both real-time and scheduled alerts to attempt a trigger.
Does anyone know why the alert doesn't work, or if there is something off with the search I am trying to use?
Thanks!
Hi. I don't use real-time alerting.
Try scheduled. For real-time, I can't see the conditions under which it would alert.
I will attach two typical kinds of alerting I do (oh, only allowed one attachment per reply)..
search myfield > 3
Hi @gba8912,
tstats searches cannot run in real time. You should change your alert type to "Scheduled".
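As a sketch of one way to set this up (the hourly schedule and trigger condition here are assumptions for illustration, not from your post): schedule the search to run periodically, keep the threshold at a day, and trigger when the result count is greater than zero:

| tstats latest(_time) as latest where index=* by host | where latest < relative_time(now(), "-1d")

With the alert type set to "Scheduled" (e.g. running every hour) and the trigger condition set to "Number of Results" greater than 0, any host whose latest indexed event is more than a day old should fire the alert.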
Hi, do you want to share exactly how you are setting up the alert condition and what you are using to alert? Can you check the _internal index for errors?
Hi Burwell,
Thanks for your reply. I have searched _internal and have 1,136,041 events. I don't really see any errors, but maybe I don't know what to look for. Also, I am using the search I posted and saving it as an alert. I have attached a screenshot of my current settings. I have also tried the scheduled time feature. Neither works.
Thanks! I was able to get the alert to work as a scheduled alert.