All Apps and Add-ons

How to set up an alert for data from any input that stops being received

spluzer
Communicator

Hey splunksters,

Noob here. Title says it all.

(For example, if a DB Connect input stops sending data because of a password expiration, we would get an alert.)

Unfortunately, I can't just go download an app for this; I have to do it with an SPL query. Does anybody have any cool suggestions?

Thanks splunksters!

1 Solution

spluzer
Communicator

Here is what I ended up doing.

| tstats count
         latest(_time) as _time
  where index=* earliest=-48h latest=-24h
  by host index sourcetype
| join type=left max=0 host index sourcetype
    [| tstats count as currentcount
       where index=* earliest=-24h latest=now
       by host index sourcetype ]
| where isnull(currentcount)
| eval host=upper(host)
| convert ctime(_time) as "Data Last Received"
| table host index sourcetype count "Data Last Received"
| sort host index sourcetype
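
(Not from the original thread, just a sketch of the same idea without the join: since tstats can return the most recent event time per host/index/sourcetype directly, you can flag any combination whose last event is older than 24 hours in a single search. The 48h lookback and 24h threshold are the same assumptions as above; adjust to taste.)

    | tstats max(_time) as last_seen
      where index=* earliest=-48h
      by host index sourcetype
    | where last_seen < relative_time(now(), "-24h")
    | convert ctime(last_seen) as "Data Last Received"
    | table host index sourcetype "Data Last Received"

This avoids the subsearch limits that come with join, which matters in environments with many host/sourcetype combinations.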


nahfam
Path Finder

This is what I ended up doing. (Obviously, you will have to create your own lookup like the one shown below this paragraph.) You may or may not be referencing a host and sourcetype blacklist like mine; if not, just remove those lines. As you can see, I'm filtering on percent change, which is a threshold you can adjust, or you can remove the where command altogether. I'm still working on the math, but for the most part I think it's right.

The lookup (sourcetype_interval.csv), with the expected interval in seconds:

    sourcetype  interval
    blah        300
    blah1       86400
| tstats latest(_indextime) as Latest where index=* by host sourcetype index
| `remove_blacklisted_servers`
| search NOT
    [ inputlookup sourcetype_blacklist.csv
    | table sourcetype ]
| lookup sourcetype_interval.csv sourcetype OUTPUT interval as intervals
| eval intervals=round(intervals/60/60,2)
| eval intervals=coalesce(intervals,0)
| eval current=now()
| eval Minimum_Age=round(((current-Latest)/60)/60,2)
| eval perc_change=((Minimum_Age-intervals)/Minimum_Age*100)
| where perc_change > 90
| rangemap field=Minimum_Age default=Critical Normal=0-0.5 Elevated=0.5-2 Warning=2-3
| eval stIDX=tostring(index) + " -- " + tostring(sourcetype)
| eval stINT=tostring(sourcetype) + " -- " + tostring(intervals)
| eval stLast=tostring(sourcetype) + " -- " + tostring(Minimum_Age)
| eval pcChange=tostring(sourcetype) + " -- " + tostring(perc_change)
| stats values(stIDX) as Index--Sourcetype list(Latest) as "Latest Event" list(Minimum_Age) as Minimum_Age list(range) as Threshold list(stINT) as Sourcetype--Interval list(stLast) as Sourcetype--HoursSinceLast list(pcChange) as Sourcetype--PercChange by host
| convert ctime("Latest Event") timeformat="%Y/%m/%d %H:%M"
| eventstats avg(Minimum_Age) as average by host
| eval average=round(average,2)
| rename Minimum_Age as "Hours Since Last Seen" average as "Avg Hours Since Last Seen"
| sort "Latest Event"
| fields - "Avg Hours Since Last Seen"
| table host "Latest Event" Threshold Sourcetype--Interval Sourcetype--HoursSinceLast Sourcetype--PercChange

(Note: remove_blacklisted_servers is a search macro and must be invoked with backticks; the original post was missing them. A rename of the nonexistent field "lintervals" was also dropped.)
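
To sanity-check the percent-change math above with a concrete example (my numbers, not from the original post): if a sourcetype's lookup interval is 86400 seconds, it becomes intervals = 24 hours; if its last event was indexed 48 hours ago, Minimum_Age = 48, so perc_change = (48 - 24) / 48 * 100 = 50, which falls under the 90 threshold and is filtered out. You can play with the values using makeresults:

    | makeresults
    | eval intervals=1, Minimum_Age=24
    | eval perc_change=((Minimum_Age-intervals)/Minimum_Age*100)

Here perc_change comes out to roughly 95.8, so a source expected hourly that has been silent for a day would survive the where clause and trigger the alert.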



adonio
Ultra Champion

There are many answers here on how to report when a data source / host stops sending data.
Here are a couple:
https://answers.splunk.com/answers/151532/how-to-create-an-alert-if-no-data-is-generated-from-a-host...
https://answers.splunk.com/answers/9860/email-alert-when-a-data-source-dont-sends-events-to-splunk.h...
https://answers.splunk.com/answers/626214/alert-if-data-not-received-to-an-index-for-1-hour.html
On top of that, there are other options as well; use your preferred method.

Hope it helps.
