Splunk Search

How to edit my search to track event counts by Index/Sourcetype to see when data is no longer being received?

joesrepsol
Path Finder

Looking to build a report that lists all the indexes/sourcetypes in use and lets me monitor event counts as they go up/down. Specifically, I am trying to catch this scenario...

Scenario:
All week we've been getting data events in Index/Sourcetype, then today it seems that ingestion has stopped.

Started out with this search, which shows me counts (but never any zeros) by timestamp, but I'm still not getting where I need to be. HELP?

| tstats count by _time span=24h index sourcetype
| sort index sourcetype
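
Since tstats only returns rows for index/sourcetype combinations that actually have events, it never shows a zero. One way to make silent feeds visible is to compare the latest event time per combination against the current time, roughly along these lines (field names here are just illustrative):

| tstats latest(_time) as lastSeen where index=* by index sourcetype
| eval secondsSinceLastEvent=now()-lastSeen
| eval lastSeen=strftime(lastSeen, "%Y-%m-%d %H:%M:%S")
| sort - secondsSinceLastEvent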
0 Karma
1 Solution

mattymo
Splunk Employee

Hey joesrepsol!

I like building on top of the Meta Woot! app:

https://splunkbase.splunk.com/app/2949/
https://discoveredintelligence.ca/meta-woot-update/

I pair the app's approach to summary indexing and its trend and compliance views with the Machine Learning Toolkit ( https://splunkbase.splunk.com/app/2890/ ) to apply algorithms that alarm when a sourcetype or host deviates from its normal trend. It populates a summary index at a chosen interval (you choose 5, 15, or 30 minutes) using tstats, similar to your example.
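
To give a rough idea of the pattern, here is a sketch of a generic 5-minute summary-populating search (not the app's actual SPL; the orig_* field names and the meta_woot sourcetype follow the example search further down, and the summary index name is whatever you create for it):

| tstats count where index=* by _time span=5m index sourcetype host
| rename index as orig_index, sourcetype as orig_sourcetype, host as orig_host
| collect index=meta_woot_summary sourcetype=meta_woot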

This provides a highly customizable and advanced analytical view of your data while providing some really sweet tools for your Splunk arsenal!

Eventually you can not only build out better views of what data is in Splunk for your users, but you can also provide data feed integrity at a granular level.

For example, here is a look at a sourcetype trend over the last 4 hours to find deviations in the 5-minute sum of events:

[screenshot: sourcetype event trend over the last 4 hours with upper/lower bound deviations]

Whether you are monitoring for spikes or dips in ingest, it is useful to alert on the search that counts deviations beyond the upper or lower bounds:

index=`meta_woot_summary` sourcetype=meta_woot orig_sourcetype!=stash orig_sourcetype=juniper:junos:firewall orig_host=* orig_index=n00blab 
| timechart limit=20 span=5m sum(count) as totalEvents by orig_sourcetype 
| streamstats window=12 current=true median("juniper:junos:firewall") as median 
| eval absDev=(abs('juniper:junos:firewall'-median)) 
| streamstats window=12 current=true median(absDev) as medianAbsDev 
| eval lowerBound=(median-medianAbsDev*3), upperBound=(median+medianAbsDev*3) 
| eval isOutlierLower=if('juniper:junos:firewall' < lowerBound, 1, 0), isOutlierUpper=if('juniper:junos:firewall' > upperBound, 1, 0) 
| timechart sum(isOutlierUpper), sum(isOutlierLower)
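
In that search the bounds are the rolling median of the 5-minute totals plus or minus three times the rolling median absolute deviation, both computed over the last 12 buckets (one hour of history), so isOutlierLower flags buckets where the count drops abnormally low, which is the "feed went quiet" case. If you schedule it, one way to turn it into an alert (a sketch, not part of the app) is to append something like this and trigger whenever any rows come back:

| rename "sum(isOutlierLower)" as lowerDeviations, "sum(isOutlierUpper)" as upperDeviations
| tail 1
| where lowerDeviations > 0 OR upperDeviations > 0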

[screenshot: timechart of upper and lower bound outlier counts]

As you can see, the toolkit lets you spit out the SPL you can use and provides some cool visualizations that you can leverage.

Then, once you get comfortable, you could even timewrap your event trends and run the deviations against the wrapped time series, which gives a really nice view of "are we normal?"
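
A minimal sketch of that idea, wrapping a 5-minute trend day over day (substitute your own index and sourcetype):

index=n00blab sourcetype=juniper:junos:firewall earliest=-7d@d
| timechart span=5m count
| timewrap 1day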

[screenshot: timewrapped event trend compared day over day]

Lots of options! Maybe start with simply using the trends to set static lower-bound thresholds and work your way up to a more dynamic trending analysis!
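
For instance, a static version can be as simple as a scheduled search that checks the last 15 minutes against a threshold you pick from the trend (the 100 here is just a placeholder) and alerts whenever a result comes back:

| tstats count where index=n00blab sourcetype=juniper:junos:firewall earliest=-15m
| where count < 100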

- MattyMo

joesrepsol
Path Finder

Meta Woot is looking like a great app. Thanks.

0 Karma

aaraneta_splunk
Splunk Employee

@joesrepsol - Did the answer provided by mmodestino help provide a working solution to your question? If yes, please don't forget to resolve this post by clicking "Accept". If no, please leave a comment with more feedback. Thanks!

0 Karma

ebaileytu
Communicator

This is awesome - Thanks!

0 Karma

a212830
Champion

Wow. This sort of functionality should be within the product... tracking when feeds stop/start is something that we all deal with.

0 Karma

mattymo
Splunk Employee

I am rather vocal on that topic! Hope to see something similar in the monitoring console one day soon!

The good news is that it is rather simple to achieve with these really nice free apps!

- MattyMo
0 Karma

sloshburch
Splunk Employee

@mmodestino - Upvote SPL-126828 which is a feature request for this exact functionality!

0 Karma

mattymo
Splunk Employee

On it! Thanks Burch!

upvoted!

- MattyMo
0 Karma