Splunk Search

Alert if there are no logs (by host and by sourcetype)

woodentree
Communicator

Hello,

We scheduled a search that alerts us if we have not received logs from one of our hosts for more than 5 minutes. It looks like this:

| metadata type=hosts index=* | eval age=now()-lastTime | where age>3600

However, there is an issue: it does not catch the case where we only partially receive logs from a host (say, only 1 sourcetype out of 2 keeps arriving).

Do you know a way to create the same alert by host AND by sourcetype at the same time?
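As far as I understand, metadata only accepts a single type at a time (hosts, sources, or sourcetypes), so a per-sourcetype variant like the one below would lose the split by host:

| metadata type=sourcetypes index=*
| eval age=now()-lastTime
| where age>3600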

Thanks for the help.

1 Solution

sumanssah
Communicator

Try this

| tstats latest(_time) as lastSeen where index IN(*) by host sourcetype
| eval delay=round((now() - lastSeen)/60/60/24, 2)
| eval lastSeen=strftime(lastSeen, "%Y/%m/%d %H:%M:%S")
| search delay>1
| eval HostName=lower(host)
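Note that delay above is measured in days (seconds divided by 60, 60 and 24), so delay>1 means no events for more than a day. If you want a threshold in minutes instead, an untested variant along these lines should work (the field name minutesSinceLast and the 60-minute cut-off are just examples):

| tstats latest(_time) as lastSeen where index IN(*) by host sourcetype
| eval minutesSinceLast=round((now() - lastSeen)/60, 2)
| eval lastSeen=strftime(lastSeen, "%Y/%m/%d %H:%M:%S")
| where minutesSinceLast > 60
| eval HostName=lower(host)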

woodentree
Communicator

Gorgeous! That's exactly what we were searching for.
Thanks for the help!

sumanssah
Communicator

Thanks for confirming 🙂

gcusello
SplunkTrust

Hi @woodentree,
you have to create a lookup containing the list of hosts to monitor (called e.g. perimeter.csv), with at least one field (host) and optionally more information.
Then you can schedule a search, e.g. every five minutes, like this:

| metasearch index=_internal
| eval host=lower(host)
| stats count BY host
| append [ | inputlookup perimeter.csv | eval host=lower(host), count=0 | fields host count ]
| stats sum(count) AS total BY host
| where total=0
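Here perimeter.csv is simply the list of hosts you expect to see, one per row; for example (the hostnames below are only illustrative):

host
webserver01
dbserver02
fw-edge-01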

In this case you can be sure that a host is sending logs to Splunk.
If instead you want to monitor that logs are arriving in a given index with a given sourcetype, you can modify the main search in this way, using the same approach:

| metasearch index=your_index sourcetype=your_sourcetype
| eval host=lower(host)
| stats count BY host
| append [ | inputlookup perimeter.csv | eval host=lower(host), count=0 | fields host count ]
| stats sum(count) AS total BY host
| where total=0
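And if you really need the check by host and sourcetype at the same time, the same pattern should work, provided perimeter.csv also contains a sourcetype column (an untested sketch):

| metasearch index=*
| eval host=lower(host)
| stats count BY host sourcetype
| append [ | inputlookup perimeter.csv | eval host=lower(host), count=0 | fields host sourcetype count ]
| stats sum(count) AS total BY host sourcetype
| where total=0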

Ciao.
Giuseppe

woodentree
Communicator

Hi @gcusello,

Appreciate your help!

However, our goal is slightly different: we do not want to monitor one particular sourcetype, but all of them. The goal is to know if any of our hosts stops sending logs for any of its sourcetypes.

Thanks.

gcusello
SplunkTrust

your use case is simpler than I thought: you need to know if there are hosts that aren't sending anything at all, so try something like this:

| metasearch index=*
| eval host=lower(host)
| stats count BY host
| append [ | inputlookup perimeter.csv | eval host=lower(host), count=0 | fields host count ]
| stats sum(count) AS total BY host
| where total=0
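If you prefer to hard-code the five-minute window instead of relying on the scheduled search's time range, the same search with explicit time modifiers (again, just a sketch) would be:

| metasearch index=* earliest=-5m@m latest=now
| eval host=lower(host)
| stats count BY host
| append [ | inputlookup perimeter.csv | eval host=lower(host), count=0 | fields host count ]
| stats sum(count) AS total BY host
| where total=0

Then let the alert trigger when the number of results is greater than zero.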

Ciao.
Giuseppe

woodentree
Communicator

Got it.
Thank you, @gcusello!
