Alerting

When monitoring events coming into Splunk, how do I alert on new devices or on devices that have stopped sending events?

davidwaugh
Path Finder

Hello

I would like to be able to detect:
- When a device has stopped sending logs to Splunk within a given timeframe
- When a new device has started sending logs

The way I am thinking of doing this is to run a search every hour that populates a lookup CSV with entries like the following:

Hostname : DeviceIP : SourceType : Index : Event First Seen : Event Last Seen

I'm afraid I've used other SIEMs but am still a bit new to Splunk.
I would then query this lookup to alert when a device has stopped sending data or when a new device is seen.
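
A rough, untested sketch of the kind of hourly search I have in mind (device_tracker.csv is just a placeholder lookup name):

| tstats min(_time) as firstSeen max(_time) as lastSeen where index=* by host, sourcetype, index
| inputlookup append=true device_tracker.csv
| stats min(firstSeen) as firstSeen max(lastSeen) as lastSeen by host, sourcetype, index
| outputlookup device_tracker.csv

Each run would merge the latest hour into the lookup, so the first-seen and last-seen times accumulate over time. (The lookup would need to be created by an initial run of just the tstats and outputlookup lines, since inputlookup errors if the file does not exist yet, and I have left the device IP out of the sketch because that field is not available to tstats unless it is indexed.)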

What would be the best way to achieve this?

Many thanks for your help.

1 Solution

nickhills
Ultra Champion

Hi @davidwaugh

Try using the metadata command:

| metadata type=hosts index=_internal 
| eval status=case(lastTime<(now()-(86400*3)), "missing", firstTime>(now()-(86400*3)), "new", 1=1, "normal") 
| where status!="normal"

This will show you devices which have not sent data in the last 3 days, or which have only recently (within 3 days) started sending data.
Run the search over All Time, so that firstTime and lastTime reflect the full history of each host.
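
If you want this to fire automatically rather than running it by hand, you can save it as a scheduled alert (Save As > Alert in Splunk Web). As a rough sketch only, a savedsearches.conf stanza along these lines would run it hourly over all time and trigger whenever anything comes back as "missing" or "new" (the stanza name is just a placeholder):

[Missing or new devices]
search = | metadata type=hosts index=_internal | eval status=case(lastTime<(now()-(86400*3)), "missing", firstTime>(now()-(86400*3)), "new", 1=1, "normal") | where status!="normal"
dispatch.earliest_time = 0
enableSched = 1
cron_schedule = 0 * * * *
counttype = number of events
relation = greater than
quantity = 0
alert.track = 1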

Note: my example above uses the internal index. If your retention on internal data is not very long, you can use index=* to look at your data indexes instead.
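
For example, the same logic against your data indexes, with the epoch times formatted into something readable (just a variation on the search above; the table at the end is optional):

| metadata type=hosts index=*
| eval status=case(lastTime<(now()-(86400*3)), "missing", firstTime>(now()-(86400*3)), "new", 1=1, "normal")
| where status!="normal"
| eval firstSeen=strftime(firstTime, "%Y-%m-%d %H:%M:%S"), lastSeen=strftime(lastTime, "%Y-%m-%d %H:%M:%S")
| table host status firstSeen lastSeen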

If my comment helps, please give it a thumbs up!


mlmcadams
Engager

Excellent solution, thanks for sharing it @nickhills!

