Hi all,
I have a strange issue that I can't seem to find any info on, and I'm hoping someone can help me.
I have a few alerts that are sent to Slack using the slack_alerts addon from Splunkbase.
Recently, the results of these alerts have been duplicated within the alert itself (I am not receiving multiple copies of the same alert).
This is the alert config:
AWS Authentication Failed
*Time:* $result.eventTime$
*Event:* $result.eventName$
*Account:* $result.recipientAccountId$
*User:* $result.userName$
*Source IP:* $result.sourceIPAddress$
*EventID:* $result.eventID$
And the result:
AWS Authentication Failed
Time: 2021-01-18T10:40:16Z
2021-01-18T10:40:16Z
Event: AssumeRoleWithSAML
AssumeRoleWithSAML
Account: 22xxxxxxxxxx
22xxxxxxxxxx
User: xxxxxxxxx
Source IP: 54.89.xxxxxxx
54.89.xxxxxxxx
EventID: 862b7079-e38b-4b1e-xxxxxxxxx
862b7079-e38b-4b1e-xxxxxxxx
I have tried recreating the alerts and removing and re-adding the Slack add-on, but the behaviour is the same.
Hoping someone can help!
Hi @poiromaniax,
If there is no typo, I can see the User field is not duplicated. Is it possible that your alert result has multivalue fields grouped by userName? If not, can you please share your alert search?
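One quick way to check for multivalue fields is to count the values behind each token field before the alert fires. A sketch, assuming the field names from your message template (substitute your own base search for the placeholder):

```
<your alert search>
| eval time_ct=mvcount(eventTime), event_ct=mvcount(eventName), ip_ct=mvcount(sourceIPAddress)
| where time_ct>1 OR event_ct>1 OR ip_ct>1
```

If any rows come back, those fields are multivalue, and the $result.field$ tokens will render every value, which could explain the doubled lines.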
If this reply helps you an upvote is appreciated.
@scelikok thanks for the response.
I expanded all fields and I cannot see any duplicates.
I also sent the alert via email in case it was specific to Slack, but same result.
Search is:
index=aws tag=authentication NOT (action=success user=*$)
| search (action="failure") src_user!=unknown
| search "requestParameters.roleArn"!="arn:aws:iam:xxxxxxxxxxxxx"
| search "requestParameters.roleSessionName"!=xxxxxxxxxxxxx