Splunk Search

Why are we getting excessive number of alerts?

Motivator

We have an all-time (real-time) alert that produced 315 alerts in the first eight hours of the day.

When we run the alert's search query over those eight hours, we get only six events.

The alert itself is as simple as it gets -

index=<index name>
AND (category="Web Attack"
NOT src IN (<set of IPs>)
)

| table <set of fields>

What's going on here?


Re: Why are we getting excessive number of alerts?

Champion

hi @danielbb - Can you please post the alert configuration? I'm particularly interested in the real-time look-back window.


Re: Why are we getting excessive number of alerts?

Motivator

Is this the right view @Sukisen1981?

[screenshot of the alert configuration]


Re: Why are we getting excessive number of alerts?

Champion

hi @danielbb - see this: https://docs.splunk.com/Documentation/Splunk/7.3.1/Search/Specifyrealtimewindowsinyoursearch

Try setting default_backfill to false and see if that helps:

[realtime]

default_backfill = <boolean>
* Specifies if windowed real-time searches should backfill events.
* Defaults to true.
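
For example, a minimal sketch of the override (assuming the change goes in $SPLUNK_HOME/etc/system/local/limits.conf and that a restart is needed for it to take effect):

[realtime]
# disable backfilling of historical events into windowed real-time searches
default_backfill = false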


Re: Why are we getting excessive number of alerts?

Motivator

The doc says that for windowed real-time searches you can backfill, but we don't use windowed real-time searches.

From the UI, the only relevant option seems to be the "Expires" setting of 10 hours. Could that have anything to do with it?

By the way, where do we set "windowed" real-time searches versus "all-time" real-time searches?

[screenshot of the alert settings showing the 10-hour expiration]
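
As a side note on that last question: as far as I know, the "windowed" versus "all-time" distinction comes from the saved search's dispatch time range rather than a separate toggle. An illustrative savedsearches.conf sketch (the setting names are real, the values are examples):

# all-time real-time search (no window)
dispatch.earliest_time = rt
dispatch.latest_time = rt

# windowed real-time search with a sliding 5-minute window
dispatch.earliest_time = rt-5m
dispatch.latest_time = rt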


Re: Why are we getting excessive number of alerts?

Champion

Hi @danielbb
May I ask why you need a real-time alert in the first place? As a rule of thumb, it is better to avoid real-time alerts.
Going by the frequency of hits you mentioned earlier (6 events in 8 hours), can you not make it a scheduled alert running, say, hourly, or even on a 3-minute schedule? See the sketch below.
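
A minimal sketch of what the scheduled version might look like in savedsearches.conf (the stanza name, index, IP list, fields, and recipients are placeholders; the cron runs hourly over the previous hour):

[Web Attack - scheduled]
search = index=<index name> category="Web Attack" NOT src IN (<set of IPs>) | table <set of fields>
enableSched = 1
cron_schedule = 0 * * * *
dispatch.earliest_time = -1h@h
dispatch.latest_time = @h
counttype = number of events
relation = greater than
quantity = 0
action.email = 1
action.email.to = <recipients>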


Re: Why are we getting excessive number of alerts?

Champion

Another option you could perhaps try is to throttle the alerts: https://docs.splunk.com/Documentation/Splunk/7.3.1/Alert/Alertexamples

If you throttle the alert for 1 hour from the UI, does that reduce the number of alerts you receive? The equivalent settings are sketched below.
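
For reference, the same throttling can also be expressed directly in savedsearches.conf; a minimal sketch (the suppression field is just an example):

alert.suppress = 1
alert.suppress.period = 1h
# optionally suppress per value of a field, e.g. per source IP
alert.suppress.fields = src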


Re: Why are we getting excessive number of alerts?

Motivator

OK, that makes perfect sense; however, these events have an indexing delay that we can't avoid. For these six events the delay varies between 1.7 and 12.32 minutes.

So, is there a way to schedule these "regular" alerts based on _indextime? Meaning, the alert would fire for all events that were indexed in the past 15 minutes, for example.
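
One possible approach, offered here only as a sketch: scope the scheduled search by index time with the _index_earliest/_index_latest modifiers, keeping the event-time window wide enough to cover the indexing lag, and run the alert on a matching 15-minute schedule:

index=<index name> category="Web Attack" NOT src IN (<set of IPs>)
    earliest=-24h latest=now
    _index_earliest=-15m@m _index_latest=@m
| table <set of fields>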


Re: Why are we getting excessive number of alerts?

Champion

Interesting. Try this in search:

index=yourindex | your search
| eval indextime=strftime(_indextime,"%Y-%m-%d %H:%M:%S")
| table indextime, _time
| eval time=strptime(indextime,"%Y-%m-%d %H:%M:%S")
| eval _time=time
| stats count by indextime, _time

Is there a 'proper' capture based on _indextime or _time?


Re: Why are we getting excessive number of alerts?

Motivator

It shows -

[screenshot of the stats results by indextime and _time]
