Splunk Search

Query to analyze whether log volume from hosts has decreased over time

sudhir_gandhe
Explorer

We use Splunk as a central logging server for both security and IT operations. I would like to know if there is a way to write an alert that triggers when someone changes the application log level and the volume of log data coming into Splunk decreases. I want to be able to monitor that per host.

1 Solution

lguinn2
Legend

This won't check the log level, but it will tell you every source that has supplied less than its average data. The calculation is based on hourly rates, and the average is the hourly average over the last week. Note that this takes into account the fact that many sources have normal periodic variations in volume.

source=* earliest=-7d@h latest=@h
|  eval currentData = if(_time > relative_time(now(),"-1h@h"),1,0)
|  eval currentHour = strftime(relative_time(now(),"-1h@h"),"%H")
|  bucket _time span=1h
|  eval hour = strftime(_time,"%H") 
|  where hour = currentHour
|  stats count(eval(currentData=0)) as histCount count(eval(currentData=1)) as currentCount by host source _time
|  stats avg(histCount) as AvgEvents max(currentCount) as currentEvents by host source
|  where currentEvents < AvgEvents
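Outside of SPL, the hour-over-hour baseline idea in the search above can be sketched in plain Python. This is an illustrative sketch only; the function name and the `(timestamp, host, source)` event tuples are hypothetical, not anything Splunk provides:

```python
from collections import defaultdict
from datetime import timedelta

def low_volume_sources(events, now):
    """Flag (host, source) pairs whose event count in the most recent full
    hour is below their average count for that same hour of day over the
    preceding days. `events` is an iterable of (timestamp, host, source)."""
    last_hour_start = now.replace(minute=0, second=0, microsecond=0) - timedelta(hours=1)
    target_hour = last_hour_start.hour
    current = defaultdict(int)                    # (host, source) -> last-hour count
    hist = defaultdict(lambda: defaultdict(int))  # (host, source) -> day -> count
    for ts, host, source in events:
        if ts.hour != target_hour:
            continue  # compare like-for-like hours of the day only
        key = (host, source)
        if last_hour_start <= ts < last_hour_start + timedelta(hours=1):
            current[key] += 1
        elif ts < last_hour_start:
            hist[key][ts.date()] += 1
    flagged = []
    for key, per_day in hist.items():
        avg = sum(per_day.values()) / len(per_day)
        if current.get(key, 0) < avg:
            flagged.append(key)
    return flagged
```

As in the SPL, only events falling in the same hour of day as the most recent full hour contribute to the baseline, which is what absorbs normal daily volume cycles.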

If you have many indexes, hosts, or sources, you might want to break this search down so that it isn't running over everything at once. You might also want to look at the Deployment Monitor app; it does something very similar, but uses summary indexing, so it is worth considering as well.
Also, I am not sure whether the Splunk-on-Splunk (SOS) app has anything quite like this, but it has a lot of useful tools for managing your Splunk environment. It's free, so you should definitely have it installed on your Splunk systems.

Finally, this search could also be used to find sources that are sending more than the usual amount of data: just flip the comparison in the final where clause. And you could use a percentile instead of an average to make the baseline more robust, and so on.
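For the percentile variant mentioned above, here is a small sketch using Python's standard `statistics` module; the hourly counts are made-up numbers for illustration:

```python
import statistics

# Hypothetical hourly event counts for one (host, source) over the past week.
hist_counts = [100, 95, 110, 40, 105, 98, 102]

# Use the 25th percentile as the baseline instead of the mean; it is more
# robust to a single unusually quiet day (the 40 would drag a mean down).
p25 = statistics.quantiles(hist_counts, n=4)[0]

current_count = 60
alert = current_count < p25  # fire only when clearly below the typical range
```

A mean of these counts is about 93, but one quiet outlier can pull it low enough to mask a real drop; a low percentile keeps the threshold anchored to typical behavior.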


sudhir_gandhe
Explorer

Perfect! Thank you very much.


