Getting Data In

indexes not passing data alerts

splunkyboy
Observer

I'm trying to set up an alert that will email me when one of my indexes hasn't received any data for the last 3 hours, and make it part of a dashboard.

Does anyone have a search string that will do this, please?


alonsocaio
Contributor

Hi,

You should take a look at the 'metasearch' command.

The query below may help you check whether your indexes have received any data:

 

| metasearch index=INDEX 
| stats count 
| appendpipe 
    [ stats count 
    | where count=0]
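
For the 3-hour window in the original question, the same idea can be scoped to a single index over a fixed time range (the index name here is just a placeholder); the alert can then trigger whenever the search returns a result:

| metasearch index=my_index earliest=-3h
| stats count
| where count=0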


splunkyboy
Observer

Thanks.

That search still gives me a count of all the indexes with data, even with "where count=0", and I can't work out why.


gcusello
SplunkTrust

Hi @splunkyboy ,

To get an alert when an index doesn't receive data, you first have to create a lookup containing the list of indexes to monitor (called e.g. indexes.csv), with at least one column (index).
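
For example, indexes.csv (the index names here are just placeholders) could look like:

index
firewall
wineventlog
linux_syslog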

Then you have to run a search like this:

| metasearch index=*
| stats count BY index
| append [ | inputlookup indexes.csv | eval count=0 | fields index count ]
| stats sum(count) AS total BY index

In this way you get the list of all your indexes:

  • if total>0 they received logs,
  • if total=0 they didn't receive logs.

Now you can create an alert for the missing indexes by adding this at the end of the search:

| where total=0

If instead you add this at the end:

| eval Status=if(total=0,"Missing","Present")
| table index Status

you have a search to put in a dashboard panel that can display the situation (also graphically, or as a time distribution).
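
Putting it together for the 3-hour requirement in the question, a sketch of the full alert search (scheduled e.g. every hour over the last 3 hours, triggering when the number of results is greater than zero):

| metasearch index=* earliest=-3h
| stats count BY index
| append [ | inputlookup indexes.csv | eval count=0 | fields index count ]
| stats sum(count) AS total BY index
| where total=0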

Ciao.

Giuseppe

BrandonKeep
Explorer

I know this is a bit of an old thread, but your search helped me come up with something similar. I had been looking for what felt like days for a solution. I ended up going with tstats, using the date of the last log found within the lookback window.

| tstats latest(_time) as lastTime where index=* by index
| append [ 
    | rest /services/data/indexes 
    | dedup title 
    | rename title as index 
    | eval lastTime=0 
    | fields index lastTime
  ]
| search NOT index IN ("_*", test)
| stats max(lastTime) as lastTime by index
| convert ctime(lastTime)

I used | rest /services/data/indexes, but one can easily swap that for a lookup like the one in your example. (Note that the max should be taken on the epoch value before converting to a readable time; otherwise you'd be comparing strings.)

It lists all the indexes and provides a timestamp of the last known log entry. The caveat is that the last known log is only known if an event falls within the lookback period. However, this is still good, because we can see all the indexes except those omitted by the filter.
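
As a variation (a sketch reusing the field names above), you could compute the age of the last event per index and alert when it exceeds three hours:

| tstats latest(_time) as lastTime where index=* by index
| eval hoursSinceLast=round((now()-lastTime)/3600,1)
| where hoursSinceLast>3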

Hopefully this is also helpful to someone.

gcusello
SplunkTrust

Hi @splunkyboy and all,

Yes, the solution with rest requires less maintenance.

If one of the answers to this question solves your need, please accept it for the benefit of other Community members; otherwise, please tell us how we can help you.

Ciao and happy splunking.

Giuseppe

P.S.: Karma points are appreciated by all the Contributors 😉


alonsocaio
Contributor

Actually, in this case, 'appendpipe' is used to set count=0 when no results are returned by your query.

Also, this query will probably work better when you specify only one index.


splunkyboy
Observer

Would this search need to reside on the indexer?


alonsocaio
Contributor

No, you can run the search on your Search Head.

But I totally agree with the solution provided by @gcusello; the lookup helps when checking more than one index.
