Splunk Enterprise Security

In Splunk Enterprise Security, why am I missing alerts due to time gaps?

CodyQ
Explorer

Question: is there a way to take the index time of an event into account, in addition to the event time, for alerting purposes?

My system failed to catch an alert because the reporting system went down, and when it started forwarding logs again, I missed several potential alerts because the alert search only looks back one hour ("now minus 1 hour") for performance reasons. I realize the obvious fix is to just widen the search window, but I was wondering if anyone has any other solutions.

Has anyone ever created a retrospective search that looks for alerts that should have fired but didn't?

1 Solution

spayneort
Contributor

You can change your search to have a larger time range, then limit it based on the index time by adding something like _index_earliest=-5min@min to your search. Here is an article that covers this:

https://spl.ninja/2017/06/01/its-about-time-to-change-your-correlation-searches-timing-settings/
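For example, a minimal sketch of what that looks like in a scheduled correlation search. The index, sourcetype, field names, and threshold below are placeholders, and it assumes the search runs every 5 minutes:

index=security sourcetype=firewall earliest=-24h@h latest=now _index_earliest=-5min@min _index_latest=now
| stats count BY src_ip
| where count > 10 ``` placeholder index, sourcetype, and threshold; earliest/latest widen the _time window so late-forwarded events are still searched, while _index_earliest/_index_latest restrict results to events indexed since the last scheduled run ```

Assuming a 5-minute cron schedule, each run scans 24 hours of event time but only alerts on data that actually arrived in the last five minutes, so events delayed by a forwarder outage are still evaluated once they are indexed, without re-alerting on data already processed.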

