Alerting

How to capture a missing timeframe when reports run every hour

bipin12
New Member

We have a file that is created at the 5th minute of every hour of the day. For example, the file is created at 6:05am, 7:05am, and then at 9:05am. I want to capture that missing 8:05am timestamp so I can create an alert.

My query looks like this:
index=* sourcetype=* /temp/........*.zip

1 Solution

adonio
Ultra Champion

hello there,
how often do you want the alert to run? If it runs every hour, you can simply configure it to notify you when the count of events for this sourcetype / source is 0 (see the sketch right after the search below).
If you want it to run over a larger interval, you can try to capture the first timestamp of that particular source in each hour and check whether there is another one exactly one hour earlier; if not, that is the missing hour. Maybe something like this:

index=YOUR_INDEX sourcetype=YOUR_SOURCETYPE
| bin span=60m _time
| stats count as event_count by _time
| eval epoch = _time
| delta epoch as previous_hour p=1
| eval maybe_missing_data = if(previous_hour>3600,epoch-3600,"good")
| eval maybe_missing_data_human = strftime(maybe_missing_data,"%Y-%m-%d %H:%M:%S")
| fillnull value=good maybe_missing_data_human
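
For the first option, here is a minimal sketch, assuming the zip files land under /temp/ (adjust the source filter to your actual path) and the alert is scheduled to run every hour over the previous hour:

index=YOUR_INDEX sourcetype=YOUR_SOURCETYPE source="/temp/*.zip" earliest=-1h@h latest=@h
| stats count

Set a custom trigger condition such as search count=0, or drop the stats line and trigger the alert when the number of results is zero.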

if there is a unique path with a time in each of the log file names, you can use the same approach as the search above by rexing the time out of the source and checking whether the gap from the previous time is greater than 3600 seconds; a rough sketch follows.
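
As a rough illustration of that last idea (hypothetical, since the actual file-name pattern isn't shown in the question; this assumes something like report_YYYYMMDD_HHMM.zip):

index=YOUR_INDEX sourcetype=YOUR_SOURCETYPE source="/temp/*.zip"
| rex field=source "_(?<file_time>\d{8}_\d{4})\.zip$"
| eval file_epoch = strptime(file_time, "%Y%m%d_%H%M")
| dedup file_epoch
| sort 0 file_epoch
| delta file_epoch as gap_seconds p=1
| eval missing_hour = if(gap_seconds>3600, strftime(file_epoch-3600, "%Y-%m-%d %H:%M:%S"), null())
| where isnotnull(missing_hour)

Any row returned is an hour for which no file arrived, based on the timestamps embedded in the file names rather than on _time.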

hope it helps
