Splunk Enterprise Security

Correlation search looking for at least x events within y seconds

Sven
Engager

Thanks in advance for your time and assistance. 

I have a Splunk Enterprise Security correlation search intended to trigger when there are at least 20 events having the same values of FieldA and FieldB within 60 seconds. 

Consistent with multiple resources in the Splunk docs, I am using the search below. The correlation search runs once every 30 minutes.

(Main search) 
| bin _time span=60s 
| stats count by FieldA FieldB 
| where count > 19

This should be straightforward, but the search has fired when there were 20 or more log entries with the same FieldA and FieldB values spread across the entire 30 minutes since the last run of the correlation search, not 20 events within a single 60-second bin.

Is there some caveat to the bin command that I am unaware of? Is there a more reliable way to achieve the same objective?

Thanks


richgalloway
SplunkTrust

Tell stats to group results by time as well as the other fields.

(Main search) 
| bin _time span=60s 
| stats count by _time FieldA FieldB 
| where count > 19
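
One caveat worth noting with the bin approach, even with _time in the by clause: the 60-second buckets are fixed, so a burst of 20 events that straddles a bucket boundary can be split across two bins and go undetected. If you need a true sliding 60-second window, streamstats with time_window is one option to test. A minimal sketch, assuming the events come back in the default time order and that FieldA and FieldB are extracted at search time:

(Main search) 
| streamstats time_window=60s count by FieldA FieldB 
| where count >= 20 
| dedup FieldA FieldB

Here streamstats keeps a running count per FieldA/FieldB pair over the trailing 60 seconds, the where clause keeps only events where that count reaches 20 (equivalent to count > 19), and dedup collapses the matches to one result per pair. Treat it as a sketch to validate against known data rather than a drop-in replacement.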

 

---
If this reply helps you, Karma would be appreciated.

Sven
Engager

Of course. Thank you.
