Splunk Enterprise Security

Splunk correlation search with throttling generating duplicates on an ES clustered environment

mjones414
Contributor

We recently moved from a stand-alone Splunk ES search head to a clustered Splunk ES search head, and we've started to see doubling, and in some cases tripling, of some of our correlation search results where we've configured throttling. We never saw this on the stand-alone machine.

Scenario:

Correlation search scheduled to run 23 minutes after the hour, every 6 hours. The search looks back 24 hours from now(). Throttling is set to 1 day.

The search runs and generates notable events. 12 hours later, the search generates notables for the same events it found in the first run, implying that it ran on one search head the first time and on a different search head the second time.

 

Is there a way to confirm that all search heads have the same criteria for what should be throttled and for how long?  
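One way to sanity-check this is to dump the saved-search configuration from each cluster member (for example with `splunk btool savedsearches list "<search name>" --debug` on each box) and diff the `alert.suppress*` keys, which are where throttling lives in savedsearches.conf. A minimal sketch of the comparison step, assuming you already have each member's conf text; the stanza name and values below are hypothetical:

```python
import configparser

def throttle_settings(conf_text, stanza):
    """Extract the alert.suppress* keys for one saved-search stanza."""
    cp = configparser.ConfigParser(interpolation=None, strict=False)
    cp.read_string(conf_text)
    return {k: v for k, v in cp[stanza].items() if k.startswith("alert.suppress")}

# Hypothetical savedsearches.conf dumps from two SH cluster members
sh1 = """
[My Correlation Search]
alert.suppress = 1
alert.suppress.period = 86400s
alert.suppress.fields = dest
"""
sh2 = """
[My Correlation Search]
alert.suppress = 1
alert.suppress.period = 43200s
alert.suppress.fields = dest
"""

a = throttle_settings(sh1, "My Correlation Search")
b = throttle_settings(sh2, "My Correlation Search")

# Any key listed here means the members disagree on throttling
drift = sorted(k for k in a if a.get(k) != b.get(k))
print(drift)
```

If the members agree, `drift` is empty; here it flags `alert.suppress.period`, which would explain duplicate notables after the shorter window expires on one member.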

 

DanielSp
Explorer

Did you find the issue? Do alerts with throttling work correctly in a distributed environment?


mjones414
Contributor

It took a case opened with Splunk Enterprise support, but ultimately a setting on the SH cluster solved this issue. As I'm no longer in a Splunk admin role, I can't tell you exactly what was changed, but it was something in how SH cluster peers reconcile correlation searches where throttling is used.
