Extensible Alerts

paddy3883
Path Finder

I've created an alert in Splunk which essentially checks for any occurrence of an event with a certain attribute, EventType=SOMETHING. If no events have been recorded for a specific time frame, e.g. 1 hour, it sends an email notification so we are aware that the service sending the messages is having problems.
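The search is something like the following (simplified - EventType is the field name in our events, and the alert runs hourly):

EventType=SOMETHING earliest=-1h@h latest=@h
| stats count
| where count = 0

with the alert set to trigger when the number of results is greater than 0.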

We have several EventTypes that are sent, and ideally we'd like to know if any of them were failing to come through, without having to write an alert for each or hardcode the EventType values within the alert. Is it possible to do this by having a list of values stored in a CSV file and having Splunk monitor each one? For example, if we have a file containing:

Type1, Type2, Type3

then Splunk would monitor events for each of these from a single alert, and fire an alert whenever it can't detect events for any one of them. So if there were no events for Type1 and Type2, we would get two emails.


lguinn2
Legend

[My first answer was backwards. I deleted it. Try this one instead.]

I think that a CSV file could be part of your solution, but here is my idea for how to accomplish this.

First, create a CSV file for the eventtypes that you want to monitor - one eventtype per line. The first line must be the header "eventtype":

eventtype
Type1
Type2
Type3

Upload the CSV file into Splunk and create a lookup. (There is a tutorial on how to set up lookups.) I'll call the lookup alert_lookup in this example.
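If you prefer configuration files to Splunk Web, the lookup definition is just a stanza in transforms.conf - a minimal sketch, assuming the uploaded file is named alert_lookup.csv:

[alert_lookup]
filename = alert_lookup.csv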

Then tag each eventtype that you want to alert on with a tag named "alert".
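In configuration terms, the tags would look something like this in tags.conf (a sketch, assuming your eventtypes are literally named Type1, Type2, and Type3):

[eventtype=Type1]
alert = enabled

[eventtype=Type2]
alert = enabled

[eventtype=Type3]
alert = enabled

With the eventtypes tagged, the search for your alert would be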

| inputlookup alert_lookup
| join type=outer eventtype [ search tag=alert earliest=-24h latest=@h | stats count by eventtype ]
| fillnull value=0 count
| where count = 0

in the search box. Notice that this search looks back over the past 24 hours; adjust earliest= to match the window you want to monitor, e.g. earliest=-1h@h for the past hour. (Yes, the first thing in the search box is the vertical bar; that's not a typo.) The criteria for the alert would simply be

number of results > 0

You could set the alert to send one email for each result, or one email for all.
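Putting it all together, the saved search could look roughly like this in savedsearches.conf (a sketch only - the stanza name, schedule, and email address are illustrative):

[Missing EventType Alert]
enableSched = 1
cron_schedule = 0 * * * *
counttype = number of events
relation = greater than
quantity = 0
action.email = 1
action.email.to = ops@example.com
search = | inputlookup alert_lookup | join type=outer eventtype [ search tag=alert earliest=-24h latest=@h | stats count by eventtype ] | fillnull value=0 count | where count = 0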

I was unable to test this, so the search may be off, but the idea will definitely work. If you get any errors, please post back and I will help debug!


paddy3883
Path Finder

After my last comment I actually went back and made some modifications based on your original answer and some other documentation on lookups from the Splunk site, and I managed to more or less replicate the new answer you provided! Many thanks for your assistance - I now have it fully working.
