Query to analyze whether log size from hosts has decreased over time

sudhir_gandhe
Explorer

We use Splunk as a central logging server for both security and IT operations. I would like to know if there is a way to write an alert that triggers when someone changes the application log level and the size of the log files coming into Splunk decreases. I want to be able to monitor this by host.


lguinn2
Legend

This won't check the log level, but it will tell you every host/source pair that has supplied less data than its average. The calculation is based on hourly rates: the search flags events from the most recent complete hour as "current," buckets the last week of events into one-hour spans, and keeps only the buckets that fall in the same hour of day as the current one. The current hour's event count is then compared against the average count for that same hour over the past week, which takes into account the fact that many sources have normal periodic variations in volume.

source=* earliest=-7d@h latest=@h
|  eval currentData = if(_time > relative_time(now(),"-1h@h"),1,0)
|  eval currentHour = strftime(relative_time(now(),"-1h@h"),"%H")
|  bucket _time span=1h
|  eval hour = strftime(_time,"%H") 
|  where hour = currentHour
|  stats count(eval(currentData=0)) as histCount count(eval(currentData=1)) as currentCount by host source _time
|  stats avg(histCount) as AvgEvents max(currentCount) as currentEvents by host source
|  where currentEvents < AvgEvents
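To turn this into the alert you asked about, you would schedule the search and have it trigger whenever it returns any rows. A minimal savedsearches.conf sketch, assuming email as the alert action (the stanza name, schedule, and recipient are placeholders):

# Sketch only: stanza name, schedule, and recipient are placeholders
[Log volume dropped below hourly average]
# Paste the full search from above here, on one line
search = source=* earliest=-7d@h latest=@h | ...
enableSched = 1
# Run a few minutes past the hour, so the last complete hour has been indexed
cron_schedule = 5 * * * *
# Trigger when the search returns at least one result row
counttype = number of events
relation = greater than
quantity = 0
action.email = 1
action.email.to = ops@example.com

You can set up the same thing from the UI by saving the search as an alert with the condition "number of results greater than 0."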

If you have many indexes, hosts, or sources, you might want to break this search down so that it isn't running over everything at once. You might also want to look at the Deployment Monitor; it does something very similar, but uses summary indexing. You should probably consider that option as well (a minimal sketch of the summary-indexing approach follows).
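If you go the summary-indexing route yourself, the idea is to schedule a small hourly search that records the per-host, per-source counts, and then run the weekly comparison against the summary instead of the raw events. A minimal sketch of the populating search, assuming it is saved with summary indexing enabled and scheduled to run every hour:

source=* earliest=-1h@h latest=@h
|  sistats count by host, source

You would then read it back with the plain stats equivalent, e.g. index=summary source="<your saved search name>" | stats count by host, source (the saved-search name here is whatever you called the populating search).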
Also, I am not sure if the Splunk-on-Splunk (SOS) app has anything like this, but it has a lot of useful tools for managing your Splunk environment. You should definitely have this free app installed on your Splunk systems.

Finally, this search could also be used to find sources that are sending more than the usual amount of data. You could also use a percentile or median instead of the average, adjust the threshold, and so on.
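For example, to use a median instead of the mean and to catch both unusually low and unusually high volume, you could swap the last two lines of the search for something like this (the factor-of-two thresholds are arbitrary; tune them for your environment):

|  stats median(histCount) as typicalEvents max(currentCount) as currentEvents by host source
|  where currentEvents < typicalEvents/2 OR currentEvents > typicalEvents*2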


sudhir_gandhe
Explorer

Perfect! Thank you very much.


