Monitoring Splunk

Figuring out when a host last reported data, based on a lookup

oleg106
Explorer

Hello,

Looking for some advice on a popular topic: non-reporting hosts. Perhaps someone has already come across something like this, or has a better way of doing it.

We have device pairs that report at different rates, and I am looking for a way to alert when a device stops reporting based on the expected reporting cadence for that particular device. For example, keep a CSV with the device name/IP and a column for the expected reporting threshold, and generate an alert whenever that threshold is exceeded.

Example:

FW1-primary, 2m
FW1-secondary, 4h

So the search can look at the second column, and if it's been more than 4 hours since FW1-secondary sent an event, an alert can be generated.  TIA!
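
For reference, SPL's relative_time() function accepts relative-time modifiers in exactly this "2m" / "4h" form, so the threshold column could stay human-readable. Something along these lines is what I'm imagining (a rough sketch; the lookup name device_thresholds and its columns device/threshold are placeholder names, not anything defined yet):

| metadata type=hosts
| lookup device_thresholds device AS host OUTPUT threshold
| eval cutoff=relative_time(now(), "-".threshold)
| where recentTime < cutoff

One wrinkle with this approach: hosts missing from the lookup end up with a null threshold and are silently dropped by the where clause, so they would never alert.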


ericjorgensenjr
Path Finder

I would recommend doing it like this:

Lookup (demolookup):

host,seconds
somehostname,60
Your alert search:

| metadata type=hosts
| table host recentTime
| lookup demolookup host
| eval threshold=now()-seconds
| where recentTime<threshold
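
One caveat: hosts that show up in the metadata but are missing from demolookup get a null seconds, so the where clause silently drops them instead of alerting on them. A defensive variant (a sketch; the 86400-second fallback is an arbitrary one-day default, and last_seen is just a readability field for the alert output):

| metadata type=hosts
| table host recentTime
| lookup demolookup host OUTPUT seconds
| eval seconds=coalesce(seconds, 86400)
| eval threshold=now()-seconds
| where recentTime<threshold
| eval last_seen=strftime(recentTime, "%Y-%m-%d %H:%M:%S")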
