Monitoring Splunk

Figuring out last host reported data based on lookup

oleg106
Explorer

Hello,

Looking for some advice on a popular topic: non-reporting hosts. Perhaps someone has already come across something like this, or has a better way of doing it.

We have device pairs that report differently, and I am looking for a way to alert when a device stops reporting, based on the expected reporting cadence for that particular device. For example, a CSV with the device name/IP and a column for the expected reporting threshold, which can be used to generate an alert if the threshold is exceeded.

Example:

FW1-primary, 2m
FW1-secondary, 4h

So the search can look at the second column, and if it has been more than 4 hours since FW1-secondary sent an event, an alert can be generated. TIA!

 

 

1 Solution

ericjorgensenjr
Path Finder

I would recommend doing it like this:

Lookup (demolookup):

 

host,seconds
somehostname,60
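Note that this lookup keys thresholds in seconds, while the question's CSV uses human-readable values like `2m` and `4h`. When building the lookup file, those strings need to be converted; a minimal sketch of such a conversion (the `to_seconds` helper is illustrative, not part of the thread, and assumes simple `s`/`m`/`h`/`d` suffixes):

```python
def to_seconds(threshold: str) -> int:
    """Convert a human-readable threshold like '2m' or '4h' to seconds."""
    units = {"s": 1, "m": 60, "h": 3600, "d": 86400}
    value, unit = threshold[:-1], threshold[-1].lower()
    return int(value) * units[unit]

# Rewrite the question's CSV rows into the seconds-based lookup format
rows = [("FW1-primary", "2m"), ("FW1-secondary", "4h")]
lookup_rows = [(host, to_seconds(t)) for host, t in rows]
print(lookup_rows)  # [('FW1-primary', 120), ('FW1-secondary', 14400)]
```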

 

Your alert search:

 

| metadata type=hosts
| table host recentTime
| lookup demolookup host
| eval threshold=now()-seconds
| where recentTime<threshold
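For anyone following the comparison: `recentTime` is the last time an event was seen for the host, and the search flags hosts whose last event is older than `now() - seconds`. A minimal Python sketch of that same logic, using hypothetical host data (not from the thread):

```python
import time

def stale_hosts(metadata, lookup, now=None):
    """Return hosts whose last event (recentTime) is older than their
    per-host threshold in seconds, mirroring the SPL:
        eval threshold=now()-seconds | where recentTime<threshold
    metadata: list of (host, recentTime) pairs; lookup: {host: seconds}.
    """
    now = now if now is not None else time.time()
    return [host for host, recent in metadata
            if host in lookup and recent < now - lookup[host]]

# Hypothetical data: FW1-secondary last reported 5 hours ago (4h threshold)
now = 1_700_000_000
metadata = [("FW1-primary", now - 60), ("FW1-secondary", now - 5 * 3600)]
lookup = {"FW1-primary": 120, "FW1-secondary": 14400}
print(stale_hosts(metadata, lookup, now))  # ['FW1-secondary']
```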

 

 


