Alerting

Alert based on number of events after charting the results

ilyashussain
Engager

I am a newbie on Splunk and have been trying to create an alert using the following search:

index=prodenvs source=prodlogs "transaction timeout"

I had set it up on a cron schedule to run every 5 minutes, search the past 5 minutes, and alert if the count exceeded 10. The alert was working fine, but I wanted to summarize the results by host.

index=prodenvs source=prodlogs "transaction timeout" | chart count by host

The alert stopped working after I changed it, and I noticed that the alert criteria now use the number of hosts as the "count". Can I still alert based on the number of events and still chart my results? Any help is greatly appreciated.


martin_mueller
SplunkTrust

You can set this as a custom alert condition:

stats sum(count) as sum | where sum > 10
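
For example (a sketch assuming the search from the question), keep the scheduled search as

index=prodenvs source=prodlogs "transaction timeout" | chart count by host

then set the alert's trigger condition to Custom and enter

stats sum(count) as sum | where sum > 10

The custom condition runs against the chart results, so the alert fires when the total event count across all hosts exceeds 10, while the results you receive still show the per-host breakdown.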

strive
Influencer

Now you can set the alert using sum(count).

We will be able to give a clearer answer if you can share some more details.


strive
Influencer

Where did you set up the alerts? Was it in Splunk Web, a custom script, or something else?
Can you provide some more details?
