We are using Splunk for monitoring. We pull data from multiple applications, and I want to fire an alert when a particular statement does not appear in a certain log file. Normally that event arrives once every two hours, and right now I get an alert whenever it appears. My requirement is the opposite: if the event does not appear within two hours, I should get an alert. Can anybody help me with how to get this done?
You could have your alert trigger whenever there are zero results. For example, run this in real time with a two-hour window:
your search terms | stats count
Configure the alert's trigger condition to fire when count is zero. Make sure to also set throttling so the alert doesn't fire repeatedly while the event is still missing.
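As a concrete sketch, the index, sourcetype, and search string below are hypothetical; substitute the terms that actually match your two-hourly event:

```
index=app_logs sourcetype=my_app "Heartbeat OK"
| stats count
```

With this search, `count` is 0 whenever no matching event fell inside the window, which is exactly the condition the alert should trigger on.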
When I schedule the alert, the time-range options are things like 1) every hour, 2) every day, but there is no every-two-hours option. In which conf files do I have to set this up?
Ah. Select cron schedule and enter this:
0 */2 * * *
That'll run at 00:00, 02:00, 04:00, etc.
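If you'd rather configure this in the conf files than in the UI, a sketch of an equivalent savedsearches.conf stanza might look like this (the stanza name and the base search are placeholders for your own):

```ini
[missing_heartbeat_alert]
enableSched = 1
cron_schedule = 0 */2 * * *
dispatch.earliest_time = -2h
dispatch.latest_time = now
search = your search terms | stats count
counttype = custom
alert_condition = search count=0
alert.suppress = 1
alert.suppress.period = 2h
actions = email
```

The `alert.suppress` settings provide the throttling mentioned above, so you're not re-alerted every run while the event is still missing.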
If there's an event at 00:30 and the next one at 03:00, the gap between them is more than two hours. Should your alert fire in that case? Scheduling this search over a two-hour time range every two hours would not fire, because each two-hour window contained an event even though the gap between the events was greater than two hours.
Alternatively, you could have the search calculate a "gap" and alert when that gap is too long.
This is how I did it:
your search terms | stats max(_time) As LatestTime by appserver | eval gap=(now()-LatestTime) | eval appserver="WebSphere" | sort num(gap) D | head 1 | rangemap field=gap low=0-300 default=severe
Note that you need to use a search time range longer than two hours for this to work: long enough that the most recent event is still within the window even when it is older than two hours at the moment the alert runs.
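Reading the pipeline step by step: `stats max(_time)` finds the timestamp of the most recent event per appserver, `eval gap=now()-LatestTime` turns that into an age in seconds, the sort and `head 1` keep the server with the largest gap, and `rangemap` labels a gap of 0-300 seconds as "low" and anything larger as "severe". A sketch with a placeholder base search (the thresholds are just examples, tune them to your two-hour expectation):

```
your search terms
| stats max(_time) AS LatestTime by appserver
| eval gap = now() - LatestTime
| sort -num(gap)
| head 1
| rangemap field=gap low=0-7200 default=severe
```

You could then set the alert to trigger whenever the resulting `range` field is "severe".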
Thanks Renems. Can you elaborate on this? I am not able to follow the part | eval appserver="WebSphere" | sort num(gap) D | head 1 | rangemap field=gap low=0-300 default=severe