Splunk Search

How can I alert when a deviation in volume is detected between two time periods?

New Member

I currently use the following query to compare volume counts between current day and a week ago:

sourcetype=abc index=xyz source=foo earliest=-0d@d latest=now |
bucket _time span=30m |
stats count by _time |
eval ReportLabel="Today" | 
append [search sourcetype=abc index=xyz source=foo earliest=-7d@d latest=-6d@d | 
bucket _time span=30m |
stats count by _time |
eval ReportLabel="PreviousWeek" | 
eval _time=_time+(60*60*24*7)] | 
chart max(count) as count over _time by ReportLabel

I'm interested in leveraging this query (if possible) to alert me if volume counts between the two time periods deviate by a certain percentage. Since the alert would run every 30 minutes, I'd have to adjust the timeframes accordingly.

  • How would I capture a specific half hour period from the previous week to reference against current day?
  • How could a deviation calculation be applied?
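One way to sketch both pieces (untested, and the 20% threshold is just a placeholder): use chained relative time modifiers to pull the matching half hour from a week ago, put the two counts side by side with appendcols, and compute a percentage deviation to alert on.

```
sourcetype=abc index=xyz source=foo earliest=-30m@m latest=@m
| stats count AS curr_count
| appendcols
    [ search sourcetype=abc index=xyz source=foo earliest=-7d-30m@m latest=-7d@m
    | stats count AS prev_count ]
| eval deviation=abs(curr_count-prev_count)/prev_count*100
| where deviation > 20
```

Scheduled every 30 minutes with "alert if number of results > 0", this would fire whenever the current half hour deviates from the same half hour last week by more than the threshold.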

Communicator

This search will report a drop in events per index:

| tstats count AS curr_count WHERE index=* earliest=-30m latest=now() BY index 
| join type=left index [| tstats count AS old_count WHERE index=* earliest=-60m latest=-30m BY index] 
| fillnull value=1 
| eval ratio=curr_count/old_count 
| where ratio<0.20 

I'm not sure of the best way to capture a 30m period from a week ago, so I've left it comparing the last 30 minutes to the previous 30 minutes. Maybe someone else can assist with fine-tuning.
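If the week-ago window is what's wanted, one untested tweak is to shift the subsearch back seven days with chained relative time modifiers, leaving everything else the same:

```
| tstats count AS curr_count WHERE index=* earliest=-30m latest=now() BY index
| join type=left index
    [| tstats count AS old_count WHERE index=* earliest=-7d-30m@m latest=-7d@m BY index]
| fillnull value=1
| eval ratio=curr_count/old_count
| where ratio<0.20
```

Only the subsearch's earliest/latest move; the join and ratio logic are unchanged.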

The | where ratio<0.20 line can be changed to whatever threshold you need.


Communicator

Super difficult to define the logic for a reduction in volume. I struggle with this myself. Volume decreases after hours (totally normal) and on weekends (again, normal), so a drop in volume alone doesn't really mean much; it just causes false alerts all the time. Now, if I get zero ingestion for a time period within business hours... that means something.

It's hard to work around change control windows, maintenance windows, etc. as well. If anyone has some good ideas, I'm certainly all ears.
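One partial workaround for the business-hours problem is to keep the SPL simple and push the restriction into the alert's cron schedule instead, for example (the hours here are just an illustration):

```
*/30 9-16 * * 1-5
```

That runs the check every 30 minutes from 09:00 to 16:59, Monday through Friday, so after-hours and weekend drops never get a chance to fire.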


Explorer

There are many related blog posts and answers around here.

I recommend you start by reading the epic Maintaining the State of the Union post.

Once you've done this, alerting will be as easy as scheduling the search and setting an alert action.
