Splunk Search

Search Chain

phillipmadm
Explorer

Hopefully this is an easy one.
We have an alert set up that notifies us if a specific error occurs more than 30 times in 1 minute. It works and gives us a nice little report of the hit count. Since this is a mass disconnect alert, it is based on the quantity of termination messages. We now need to notify the users associated with the mass disconnect. The username field is available in the source messages, but I'm having an issue chaining the logic together.

Base alert below.
index=security source="application.log" application_message=termination | bucket _time span=1m | stats count by _time | where count > 30

1 Solution

DalJeanis
Legend
index=security source="application.log" application_message=termination
| bucket _time span=1m
| stats count as termCount, values(username) as username by _time
| where termCount > 30

The result will be a multivalue field with each username in it.

If you want to break the field back out for individual notification, then use

| mvexpand username
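
Putting the pieces together, a sketch of the full chained search (assuming the field holding the user really is named username in your events):

index=security source="application.log" application_message=termination
| bucket _time span=1m
| stats count as termCount, values(username) as username by _time
| where termCount > 30
| mvexpand username

After the mvexpand, each result row carries a single username for a minute bucket that exceeded the threshold, which is what a per-user notification action can work from.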


phillipmadm
Explorer

Worked like a charm.
Thanks
