Alerting

outputlookup with no results

MickSheppard
Path Finder

I'm using an outputlookup to generate a list of services for which alerts have been raised in the last 60 minutes. I'm then using this list to suppress alerts for the same service within that hour. This works as long as I have a steady stream of different alerts, but if I have a lengthy period without any alerts, the last alert I had is suppressed indefinitely.

The problem is that if the search feeding outputlookup doesn't generate any results, the outputlookup doesn't overwrite the file. This means I never end up with an empty file, and consequently I suppress results I shouldn't be suppressing.

I'm using this mechanism because I want to suppress duplicates of specific events within a given time period, and I don't want to define separate searches for each of the 100+ events I want to trigger a specific alert on. If I could suppress results based on a field value in the result rather than on the name of the alert, I'd do that. A rough sketch of the pattern is below.
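For reference, a minimal sketch of the pattern described above (the index, lookup name recent_alerts.csv, and field name service are illustrative, not the actual names in use):

The scheduled search that records which services alerted in the last hour:

index=alerts earliest=-60m | stats latest(_time) as last_alert by service | outputlookup recent_alerts.csv

The alert search that drops anything already recorded in the lookup:

index=alerts | lookup recent_alerts.csv service OUTPUT last_alert | where isnull(last_alert)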


ziegfried
Influencer

One option that I can think of is to append a result with synthetic values that are never matched by the lookup to the search that generates the information for outputlookup, e.g.:

... | append [ | stats count | fields - count | eval field1="__INVALID__" | eval field2="__INVALID__" | eval ... ] | outputlookup mylookup

Where field1, field2, etc. are the columns in your lookup file. This forces the lookup file to be written every time, even when the search itself returns no results. The downside is the unnecessary entry in the lookup file.
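As a usage note, the placeholder row should never match real events, but if it ever gets in the way when reading the lookup back (for example when reviewing its contents), it can be filtered out explicitly; a sketch, assuming field1 carries the sentinel value:

| inputlookup mylookup | where field1!="__INVALID__"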
