Alerting

How to capture a missing timeframe when reports are run every hour

bipin12
New Member

We have a file that is created at the 5th minute of every hour of the day. For example, the file is created at 6:05am, 7:05am, and then at 9:05am. I want to capture that missing 8:05am timestamp so I can create an alert.

My query looks something like:
index=* sourcetype=* /temp/........*.zip
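
To make that concrete, what I have in mind is bucketing the file events by hour so the missing hour stands out, roughly like the sketch below (YOUR_INDEX, YOUR_SOURCETYPE, and YOUR_ZIP_PATH are placeholders, not my real values):

index=YOUR_INDEX sourcetype=YOUR_SOURCETYPE source=YOUR_ZIP_PATH*.zip
| timechart span=1h count

How do I turn the hour where the count drops to 0 into an alert?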

1 Solution

adonio
Ultra Champion

hello there,
how often do you want the alert to run? if every hour, you can configure it to inform you when the event count for this sourcetype / source is 0 (a minimal sketch of that hourly check follows the search below).
if you want it to run over a larger interval, you can capture the timestamp of each hourly bucket for that particular source and check whether there is another one exactly one hour before; if not, that is the missing hour. maybe something like this:

index = YOUR_INDEX sourcetype = YOUR_SOURCETYPE
| bin span=60m _time
| stats count as event_count by _time
| eval epoch = _time
| delta epoch as previous_hour p=1
| eval maybe_missing_data = if(previous_hour>3600,epoch-3600,"good")
| eval maybe_missing_data_human = strftime(maybe_missing_data,"%Y-%m-%d %H:%M:%S")
| fillnull value=good maybe_missing_data_human
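
for the first option, a minimal sketch of the hourly check could be a scheduled search like this (YOUR_INDEX, YOUR_SOURCETYPE, and the 60-minute window are placeholders to adjust to your file schedule):

index=YOUR_INDEX sourcetype=YOUR_SOURCETYPE earliest=-60m@m latest=now
| stats count

schedule it to run hourly (for example at minute 10, shortly after the file normally lands at :05) and set the alert trigger to the custom condition search count=0 (or drop the stats line and trigger on zero results).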

if there is a unique path with a timestamp in each of the log file names, you can use the same idea by rexing the time out of the source and checking whether the gap from the previous file is greater than 3600 seconds, for example:
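
for instance, if the file names follow a pattern like report_2019-05-20_08-05.zip (purely hypothetical, adjust the rex to your actual naming), the time could be pulled out of the source path and checked for gaps roughly like this:

index=YOUR_INDEX sourcetype=YOUR_SOURCETYPE
| dedup source
| rex field=source "(?<file_time>\d{4}-\d{2}-\d{2}_\d{2}-\d{2})\.zip$"
| eval file_epoch = strptime(file_time, "%Y-%m-%d_%H-%M")
| sort 0 file_epoch
| delta file_epoch as gap_seconds p=1
| where gap_seconds > 3600
| eval missing_hour = strftime(file_epoch - 3600, "%Y-%m-%d %H:%M:%S")

any row that survives the where points at the hour before it as the one with no file.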

hope it helps

