Monitoring Splunk

Figuring out last host reported data based on lookup

oleg106
Explorer

Hello,

Looking for some advice on a popular topic: non-reporting hosts. Perhaps someone has already come across something like this, or has a better way of doing it.

We have device pairs that report differently, and I am looking for a way to alert if a device stops reporting based on the expected reporting cadence for that particular device. For example, a CSV with device name/IP and a column for the expected reporting threshold that can be used to generate an alert if the threshold is exceeded.

Example:

FW1-primary, 2m
FW1-secondary, 4h

So the search can look at the second column, and if it's been more than 4 hours since FW1-secondary sent an event, an alert can be generated.  TIA!
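To illustrate what I have in mind, the search could read the threshold column and convert it to seconds before comparing against the last reported time. This is only a rough sketch: the lookup name (device_thresholds), the field names (host, threshold), and the assumption that values always end in s, m, h, or d are placeholders, not something I have working yet.

| metadata type=hosts
| lookup device_thresholds host OUTPUT threshold ```lookup and field names are placeholders```
| eval unit_multiplier=case(like(threshold,"%s"),1, like(threshold,"%m"),60, like(threshold,"%h"),3600, like(threshold,"%d"),86400)
| eval threshold_seconds=tonumber(replace(threshold,"[^0-9]",""))*unit_multiplier
| where recentTime < now() - threshold_seconds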

 

 


ericjorgensenjr
Path Finder

I would recommend doing it like this:

Lookup (demolookup):

 

host,seconds
somehostname,60

 

Your alert search:

 

| metadata type=hosts
| table host recentTime
| lookup demolookup host
| eval threshold=now()-seconds
| where recentTime<threshold

 

 
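For what it's worth, recentTime from the metadata command is the index time of the most recent event for each host, in epoch seconds, so the comparison above works directly. If you want the alert results to be easier to read, something along these lines could be appended; the silent_for field name is just illustrative:

| metadata type=hosts
| lookup demolookup host OUTPUT seconds
| eval threshold=now()-seconds
| where recentTime<threshold
| eval silent_for=tostring(now()-recentTime,"duration")
| convert ctime(recentTime)
| table host recentTime silent_for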

