Alerting

How to create a Splunk alert for a spike in log events?

testingtena
Loves-to-Learn Everything

I'm trying to implement an alert that detects spikes in logged events in our Splunk deployment, and I'm not sure how to go about it.

For example: we have 15 hosts, each with a varying number of sources. One source on a host has averaged about 5-6k events per day over the past 30 days; then, out of the blue, we're hit with 1.3 million events in a single day.

I assume the alert would need to be tailored to each host (or source, I'm not sure which) and would need an average event count over a "normal" week to compare against when there's a spike?

Any help would be greatly appreciated.

 


ITWhisperer
SplunkTrust

Something like this:

your search earliest=-7d@d latest=@d
| bin _time span=1d
| stats count by _time host
| eventstats avg(count) as average by host
| where _time>relative_time(now(),"-1d@d")
| where count > average
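To turn a search like this into the alert the original question asked about, it can be saved as a scheduled search with a trigger condition. A minimal savedsearches.conf sketch, assuming a daily schedule just after midnight; the stanza name and index are placeholders to adjust for your environment (the same thing can be configured in the UI via Save As > Alert):

```ini
[spike_in_daily_events]
enableSched = 1
cron_schedule = 5 0 * * *
dispatch.earliest_time = -7d@d
dispatch.latest_time = @d
search = index=your_index | bin _time span=1d | stats count by _time host \
| eventstats avg(count) as average by host \
| where _time>relative_time(now(),"-1d@d") | where count > average
counttype = number of events
relation = greater than
quantity = 0
alert.track = 1
```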

Taruchit
Contributor

Thank you for sharing your inputs and code logic. 

I tried the following approach:
In the time range picker: earliest = -7d@d, latest = now.

index=xxx sourcetype IN ("A","B")
| bin _time span=1d
| stats count by _time sourcetype
| eventstats avg(count) as average by sourcetype
| eval rise_percent=((count-average)*100)/count
| where rise_percent>=25

I get results for each sourcetype whenever the count for a given date exceeds the average count by 25% or more.

I need your assistance to build SPL that takes the average of the past 15 days and compares it with today's results, excluding today's date from the average. For example: if today is 11 May 2022, the past 15 days should be 26 April 2022 to 10 May 2022.


ITWhisperer
SplunkTrust

You could do something like this (with the time picker set to earliest = -15d@d):

index=xxx sourcetype IN ("A","B")
| bin _time span=1d
| stats count by _time sourcetype
| eval previous=if(_time<relative_time(now(),"@d"),count,null())
| eventstats avg(previous) as average by sourcetype
| eval rise_percent=((count-average)*100)/count
| where rise_percent>=25
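One detail worth flagging in the rise_percent formula: dividing by count expresses the rise as a fraction of today's total rather than of the baseline. If the intent is "today's count is at least 25% above the 15-day average", a sketch dividing by average and restricting the output to today's bucket would look like this (same placeholder index and sourcetypes as above; a variation to consider, not a drop-in replacement):

```spl
index=xxx sourcetype IN ("A","B")
| bin _time span=1d
| stats count by _time sourcetype
| eval previous=if(_time<relative_time(now(),"@d"),count,null())
| eventstats avg(previous) as average by sourcetype
| where _time>=relative_time(now(),"@d")
| eval rise_percent=((count-average)*100)/average
| where rise_percent>=25
```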

Taruchit
Contributor

Thank you for your prompt inputs.

I tried the following:

index=xxx sourcetype IN ("A")
| bin _time span=1d
| stats count by _time sourcetype
| eval previous=if(_time<relative_time(now(),"@d"),count,null())

Due to the large volume of data, for testing purposes I kept only one sourcetype in the SPL and set the time range to Last 7 days.

In the output, I get a table with the following columns:
_time
sourcetype
count
previous

I get results for each date in the past 7 days; however, the values in the count and previous columns are the same.
Sample output:

_time        sourcetype   count        previous
2022-05-04   A            1004558705   1004558705
2022-05-05   A            2450936208   2450936208
2022-05-06   A            3074060943   3074060943

Can you please help me identify where I am going wrong?
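For what it's worth, that output may actually be expected: the condition _time<relative_time(now(),"@d") is true for every completed day, so previous simply mirrors count on those rows and only becomes null for today's partial bucket. The difference only matters once the averaging and comparison steps from the suggested search are appended, so the baseline excludes today, e.g. (a sketch continuing the truncated search above):

```spl
| eventstats avg(previous) as average by sourcetype
| where _time>=relative_time(now(),"@d")
```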


testingtena
Loves-to-Learn Everything

Thanks for the quick reply @ITWhisperer, but when I try running it I get the following. [screenshot: search error]


ITWhisperer
SplunkTrust

Well, it was just an example. You probably want to add an index or more restrictions to the search, depending on your actual data. Similarly, it looks like you don't have a host field extracted, so change this to a field you do have that you want to group your data by; only you will know what that is, since you didn't provide that information in your original post.
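Concretely, if source is a field present in the events, the original example can be adapted by swapping the grouping field and adding an index (the index name here is a placeholder):

```spl
index=your_index earliest=-7d@d latest=@d
| bin _time span=1d
| stats count by _time source
| eventstats avg(count) as average by source
| where _time>relative_time(now(),"-1d@d")
| where count > average
```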
