
Alert if data not received to an index for 1 hour

denose
Explorer

I am currently using this search:

index=_audit OR index=_internal OR index=_introspection OR index=a OR index=b OR index=c OR index=d earliest=-1h latest=-59m | stats count by index | where count=0

I tried using an inputlookup to list all the indexes but didn't get very far. If I remove the where command, the search shows an event count only for the indexes that have events, so when I filter with where count=0 there are no indexes to be seen.

How can I make it search those indexes and tell me if they have in fact had no events?
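For reference, the lookup-based version I was attempting looked roughly like this (expected_indexes.csv would be a hypothetical lookup with a single index column listing the indexes I want to monitor), but I couldn't get it working:

| inputlookup expected_indexes.csv
| join type=left index
    [ search index=_audit OR index=_internal OR index=_introspection OR index=a OR index=b OR index=c OR index=d earliest=-1h latest=-59m
    | stats count by index ]
| fillnull value=0 count
| where count=0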

1 Solution

mayurr98
Super Champion

You can use the following queries to check whether you're receiving data from a particular source or not.

Since your threshold for not reporting is 1 hour (3600 seconds), run the searches below over a period longer than 1 hour and set up an alert that fires when records are returned (see the savedsearches.conf sketch further down in this post).

Using the metadata command
If you want to check based on host:

| metadata type=hosts index=yourindexNameHere | where host="yourHostNameHere" | eval age=now()-recentTime | where age>3600 | table host recentTime age | convert ctime(recentTime)

For sourcetype, use

| metadata type=sourcetypes index=yourindexNameHere | where sourcetype="yourSourcetypeNameHere" | eval age=now()-recentTime | where age>3600 | table sourcetype recentTime age | convert ctime(recentTime)

For source, use

| metadata type=sources index=yourindexNameHere | where source="yourSourceNameHere" | eval age=now()-recentTime | where age>3600 | table source recentTime age | convert ctime(recentTime)

Using tstats
Just replace sourcetype with any other metadata field that you want to use.

| tstats max(_time) as recentTime WHERE index=yourindexNameHere by sourcetype | where sourcetype="yourSourcetypeNameHere" | eval age=now()-recentTime | where age>3600 | table sourcetype recentTime age | convert ctime(recentTime)
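To set up the alert part, one option is a scheduled saved search that triggers when the search above returns any rows. A minimal savedsearches.conf sketch (the stanza name, schedule, and e-mail address are just placeholders you would adjust):

[Alert - no data received in last hour]
search = | tstats max(_time) as recentTime WHERE index=yourindexNameHere by sourcetype | where sourcetype="yourSourcetypeNameHere" | eval age=now()-recentTime | where age>3600 | table sourcetype recentTime age
dispatch.earliest_time = -24h@h
dispatch.latest_time = now
enableSched = 1
cron_schedule = 0 * * * *
counttype = number of events
relation = greater than
quantity = 0
action.email = 1
action.email.to = you@example.com

You can of course create the same alert from Splunk Web (Save As > Alert) instead of editing the conf file directly.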

Let me know if this helps!

denose
Explorer

Thanks mayurr98, though I'm not sure how this works on a per-index basis.
I want to know which index hasn't received events, rather than which host, source, or sourcetype.


mayurr98
Super Champion

Try this

| tstats max(_time) as recentTime WHERE index=* by index | eval age=now()-recentTime | where age>3600 | table index recentTime age | convert ctime(recentTime)

What I have done is, for each index, take the most recent event time, subtract it from the current time to calculate the age, and then apply a condition on the age to return only the stale indexes. Run this over All Time to get accurate results.
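Note that tstats only returns rows for indexes that have at least one event on disk, so an index that has never received any data (or whose data has all aged out) will not show up at all. If you need to catch that case too, you could combine the tstats search with a lookup of expected index names; expected_indexes.csv here is just a hypothetical lookup with one index column that you would maintain yourself:

| inputlookup expected_indexes.csv
| join type=left index
    [| tstats max(_time) as recentTime WHERE index=* by index ]
| eval age=now()-coalesce(recentTime,0)
| where age>3600
| table index recentTime age
| convert ctime(recentTime)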

denose
Explorer

Thanks for that. I think that's enough to do what I wanted for now.

I appreciate your assistance.


skoelpin
SplunkTrust

This won't work, since you're looking for a count equal to zero rather than null values. You may also want to change your time fields, since your alerts will arrive an hour late.

Try this

| makeresults | eval count="" 
| append [ | search index=_audit OR index=_internal OR index=_introspection OR index=a OR index=b OR index=c OR index=d earliest=-1h latest=-59m | stats count by index | fillnull value=0 | where count=0]

denose
Explorer

Thanks, but I still don't get any indexes listed with 0 events.
It just returns, under the Statistics tab, the current timestamp in the _time column and a blank count column.
