Alerting

Trigger alert based on unique values, and throttle so the same unique value doesn't get sent again

Kohtea16
Explorer

Hello,

Background:

I am generating alerts around our Office 365 environment using the Content Pack for Microsoft 365. I have limited search query experience, but I am willing to put in the time to learn more as I go.

About the Content Pack for Microsoft 365 - Splunk Documentation

Trying to accomplish:

Runs every 10 minutes > Trigger single alert if "id"/"Ticket" is unique for every result > Throttle for 24 hours

This is just an example of my search query:

 

(index=Office365) sourcetype="o365:service:healthIssue" service="Exchange Online" (classification=incident OR classification=advisory) (status=serviceDegradation OR status=investigating)
| eventstats max(_time) AS maxtime by id
| where _time = maxtime
| mvexpand posts{}.description.content
| mvexpand posts{}.createdDateTime
| rename posts{}.description.content AS content, posts{}.createdDateTime AS postUpdateTime
| stats latest(content) AS Content, latest(status) AS Status, earliest(_time) AS _time, latest(postUpdateTime) AS postUpdateTime by service, classification, id, isResolved
| fields _time, service, classification, id, Content, postUpdateTime, Status, isResolved
| sort +isResolved, -postUpdateTime
| rename isResolved AS "Resolved?", service AS Workload, id AS Ticket, classification AS Classification, postUpdateTime AS "Last Update"

 

Would I need a custom trigger? And what result would be required for suppressing?

 


What is happening:

There could technically be 3 events based on the search query, but the alert only sends me 1 email (containing a single event) instead of 3 individual alert emails with 3 separate events.

I am trying to prevent the same alert from being generated for the same "Ticket"/"id", so that only a new event triggers the alert. Should I be using a custom trigger? And if so, what result would I suppress to prevent multiple alerts for the same "Ticket"/"id"?

Any help would be greatly appreciated!

 

Thank you! 

 


marnall
Motivator

If I understand correctly, you want an alert for every unique Ticket (id) value, but every unique Ticket (id) value will be throttled for 24 hours after it triggers an alert.

You can accomplish this by setting the trigger conditions:

Trigger alert when: Number of Results
is greater than 0

Trigger: For each result

Throttle: (checked)

Suppress results containing field value: Ticket

Suppress triggering for: 24 hours
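For reference, these UI settings correspond roughly to the following savedsearches.conf stanza (a sketch only; the stanza name is illustrative, and the search line would be the query posted above):

```ini
# Illustrative stanza name; substitute your saved alert's actual name.
[O365 Exchange Health Alert]
cron_schedule = */10 * * * *
enable_sched = 1
# Trigger when the number of results is greater than 0
counttype = number of events
relation = greater than
quantity = 0
# "For each result": fire per result rather than one digest alert
alert.digest_mode = 0
# Throttle: suppress each Ticket value for 24 hours after it alerts
alert.suppress = 1
alert.suppress.fields = Ticket
alert.suppress.period = 24h
```

With alert.digest_mode = 0 and alert.suppress.fields = Ticket, each distinct Ticket value produces its own email, and a Ticket that has already alerted stays quiet for the suppression period.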

Kohtea16
Explorer

I have set it up based on your suggested settings (this is actually what I was using at first); however, it only captures 1 event instead of the 3 that are available:

I uploaded some more screenshots below showing what I am experiencing, and I hope this makes more sense now.

 

[Screenshots: trigger config; sample email alert that gets generated; search query showing three events]


marnall
Motivator

Is there a reason you are using "$result.title$" instead of "Ticket" in the "Suppress results containing field value" field?

Kohtea16
Explorer

That was my mistake; I was testing out other possibilities with "$result.title$", thinking that would help.

I changed it to just "Ticket" and I received three separate email alerts. Thank you!
