
Alerting on multiple emails grouped by client IP


I'm running into an issue where one of my alerts is sending me a flood of notification emails.

The alert works as expected when I trigger on EmailCount > 1; however, raising that threshold breaks the alert.

Search:
sourcetype="backend"
| regex "User with email .* used an invalid password."
| rex "User with email (?<email>.*) used an invalid password."
| rex "client_ip=(?<client_ip>\b\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}\b)"
| transaction client_ip maxspan=1000s
| search eventcount > 2
| stats values(email), count(eval(email)) as EmailCount by client_ip
| where EmailCount > 1
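
For context, the raw events the two rex extractions are matching contain text along these lines (the email address and IP here are just illustrative values, and the ... stands for the rest of the event):

... User with email a@b.com used an invalid password. ... client_ip=123.123.123.123 ...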

Example email output (attached csv):
client_ip | values(email) | EmailCount
123.123.123.123 | a@gmail.com b@gmail.com | 2

When I raise the threshold to EmailCount > 2, I get about 50 emails in the span of 5 minutes, with CSVs that look like this:

_time | _raw | client_ip | values(email) | EmailCount | email | index | ...
... | ... | x.x.x.x | [BLANK] | [BLANK] | a@b.com | ... | ...

I get more fields (like _raw and _time), but values(email) and EmailCount are left blank. A new email field appears, and it contains only a single email address. When I run the same search as a plain real-time search, no stats results are reported at all.

Why is this happening? Why does raising the threshold by one break the alerting, when leaving it at EmailCount > 1 produces the expected result?

Notes:
- I am using maxspan=1000s temporarily; it will drop to under 5s once this search is debugged
- I tried using dc() instead of count() (roughly as in the sketch below), but that also didn't work
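
For reference, the dc() variant only changed the stats aggregation, roughly like this (the leading and trailing ... stand for the unchanged parts of the search above):

... | stats values(email), dc(email) as EmailCount by client_ip | ...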
