
Why are we getting an excessive number of alerts?

danielbb
Motivator

We have an "All time (real-time)" alert that produced 315 alerts in the first eight hours of the day.

When we run the alert's search query over those same eight hours, we get only six events.

The alert itself is as simple as it gets:

index=<index name> AND (category="Web Attack" NOT src IN (<set of IPs>))
| table <set of fields>

What's going on here?

1 Solution

Sukisen1981
Champion

We perhaps need 1-2 more iterations, but I believe we are making progress 🙂
_index_earliest=-15m _index_latest=now index=your index | rest of the stuff...

Now, this should pick up only events that were indexed from 15 minutes ago until now... a bit closer?
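
Putting it together with your original search, a rough sketch might look like this (keeping your placeholders, and using 15 minutes only as an example window):

_index_earliest=-15m _index_latest=now index=<index name> AND (category="Web Attack" NOT src IN (<set of IPs>))
| table <set of fields>

If the alert is then scheduled to run every 15 minutes instead of in real time, each run should only see events whose index time falls within the last 15 minutes, regardless of their event timestamps.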


danielbb
Motivator

Right.
I found these six events with a query similar to yours.

The real question for me at the moment is this:

Is there a way to schedule these "regular" alerts based on _indextime? Meaning, the alert would fire for all events that got indexed in the past 15 minutes, for example.
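
Something like this rough sketch is what I have in mind, keeping the placeholders from my original search and filtering on _indextime explicitly (a 15-minute window, just as an example):

index=<index name> AND (category="Web Attack" NOT src IN (<set of IPs>))
| eval index_time=_indextime
| where index_time >= relative_time(now(), "-15m")
| table <set of fields>

The idea would be to run it as a scheduled alert every 15 minutes instead of as a real-time alert.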


Sukisen1981
Champion

Another option you could perhaps try is to throttle the alerts: https://docs.splunk.com/Documentation/Splunk/7.3.1/Alert/Alertexamples

So if you throttle the alert for 1 hour from the UI, does it reduce the number of alerts you receive?
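
If the UI option helps, the equivalent settings can also go in savedsearches.conf; a rough sketch, with the stanza name and search kept as placeholders from your post:

[Web Attack alert]
search = index=<index name> AND (category="Web Attack" NOT src IN (<set of IPs>)) | table <set of fields>
alert.suppress = 1
alert.suppress.period = 1h
# optionally throttle per source IP rather than globally:
# alert.suppress.fields = src

alert.suppress and alert.suppress.period correspond to the Throttle checkbox and suppression period in the alert editing UI.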
