Splunk Enterprise Security

In Splunk Enterprise Security, why am I missing alerts due to time gaps?

CodyQ
Explorer

Question: Is there a way to use an event's index time, in addition to its event time, for alerting purposes?

My system failed to catch an alert because the reporting system went down. When it came back up and started forwarding logs again, I missed several potential alerts: the correlation search only looked at a "now minus 1 hour" window of event time for performance reasons, so the late-arriving events fell outside it. I realize the obvious fix is to just expand the search window, but I was wondering if anyone has other solutions.

Has anyone ever created a retrospective search that looks for alerts that should have fired but didn't?


spayneort
Contributor

You can widen your search's event-time range, then limit it by index time instead, by adding something like _index_earliest=-5min@min to your search. That way, events that were indexed late (but carry an old event timestamp) are still picked up on the next scheduled run. Here is an article that covers this:

https://spl.ninja/2017/06/01/its-about-time-to-change-your-correlation-searches-timing-settings/
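A minimal sketch of what that can look like, assuming a hypothetical index, sourcetype, and threshold (none of these come from the original post). The pattern pairs a wide event-time window (earliest/latest) with a narrow index-time window (_index_earliest/_index_latest) that matches the search's scheduled cadence, here every 5 minutes:

```
index=security sourcetype=firewall earliest=-24h latest=now _index_earliest=-5min@min _index_latest=@min
| stats count by src_ip
| where count > 100
```

With this shape, an event whose timestamp is hours old is still evaluated on the run after it is actually indexed, while each run only scans roughly five minutes' worth of newly indexed data, so performance stays close to the original one-hour search. Snapping both index-time boundaries to @min keeps consecutive runs from overlapping or leaving gaps.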

