Alerting

How to capture a missing timeframe when reports run every hour

bipin12
New Member

We have a file that is created at the 5th minute of every hour throughout the day, e.g. at 6:05am, 7:05am, and then at 9:05am. I want to capture that missing 8:05am timestamp and create an alert.

My query looks like:
index=* sourcetype=* /temp/........*.zip

1 Solution

adonio
Ultra Champion

Hello there,
How often do you want the alert to run? If every hour, you can configure it to notify you when the event count for this sourcetype / source is 0.
If you want it to run at a larger interval, you can capture the first timestamp of that particular source and check whether there is another one exactly one hour before; if not, that is the missing hour. Maybe something like this:

index = YOUR_INDEX sourcetype = YOUR_SOURCETYPE
| bin span=60m _time
| stats count as event_count by _time
| eval epoch = _time
| delta epoch as previous_hour p=1
| eval maybe_missing_data = if(previous_hour>3600,epoch-3600,"good")
| eval maybe_missing_data_human = strftime(maybe_missing_data,"%Y-%m-%d %H:%M:%S")
| fillnull value=good maybe_missing_data_human
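For the first option above (alerting when nothing arrives in a given hour), a minimal sketch, with YOUR_INDEX / YOUR_SOURCETYPE as placeholders just like in the search above:

```
index=YOUR_INDEX sourcetype=YOUR_SOURCETYPE earliest=-1h@h latest=@h
| stats count
| where count=0
```

Schedule it hourly (a cron like 15 * * * * gives the 5th-minute file time to land) and trigger the alert when the number of results is greater than zero.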

If there is a unique path with a time in each of the log file names, you can use the above approach by rexing the time out of the source field and checking whether the gap from the previous time is greater than 3600 seconds.
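As a sketch of that rex idea — assuming, hypothetically, that the file path ends in something like report_2024-01-15_08-05.zip (the actual naming is not stated in the question, so adjust the rex to match yours) — the hour can be parsed from the source path instead of relying on _time:

```
index=YOUR_INDEX sourcetype=YOUR_SOURCETYPE
| rex field=source "_(?<file_ts>\d{4}-\d{2}-\d{2}_\d{2}-\d{2})\.zip$"
| eval file_epoch = strptime(file_ts, "%Y-%m-%d_%H-%M")
| dedup file_epoch
| sort 0 file_epoch
| delta file_epoch as gap p=1
| where gap > 3600
```

Any row this returns is a file whose predecessor arrived more than an hour earlier, i.e. at least one hourly file in between is missing.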

hope it helps

