Alerting

How to alert when a second event does not occur within 20 minutes of the first event with a matching field value?

mabelmora
Observer

Hello, I am very new to Splunk. I want to trigger an alert when a second event does not occur within 20 minutes of the first event. The index and sourcetype are the same for both events. Both events have the same value for the requestId field but different values for the message field.

For example,

First Event:

Event A
level: INFO
logger_name: filename1
message: First event
requestId: 12345
thread_name: http-t2
timestamp: 2022-12-19T05:44:51.757Z

Event B
level: INFO
logger_name: filename1
message: First event
requestId: 67890
thread_name: http-t2
timestamp: 2022-12-19T05:44:51.757Z

Second Event:

Event C
level: INFO
logger_name: filename2
message: Second Event
requestId: 12345
thread_name: http-t1
timestamp: 2022-12-19T05:44:51.926Z

Since Event B with requestId 67890 does not have a matching second event, I want Event B as my output:

level: INFO
logger_name: filename1
message: First event
requestId: 67890
thread_name: http-t2
timestamp: 2022-12-19T05:44:51.757Z

Any help is appreciated.


gcusello
SplunkTrust

Hi @mabelmora,

please try something like this:

index=your_index (message="first message" OR message="second message")
| stats 
   earliest(_time) AS earliest 
   latest(_time) AS latest 
   dc(message) AS message_count 
   values(message) AS message 
   BY requestId
| where (message_count=1 AND message="first message") OR latest-earliest>1200
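
One caveat, as an assumption about how you would schedule this rather than something stated above: if the alert's search window includes first events that are less than 20 minutes old, it can fire before the second event has had a chance to arrive. A minimal sketch of one way to guard against that is to add a time condition to the final where:

| where (message_count=1 AND message="first message" AND earliest < relative_time(now(), "-20m"))
    OR latest-earliest>1200

The search could then be saved as an alert that runs on a schedule (for example every 20 minutes over roughly the last 40 minutes) and triggers when the number of results is greater than zero.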

Ciao.

Giuseppe
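
Not part of the answer above, but a sketch of an alternative way to express the same idea with the transaction command, assuming the literal message values from the question and placeholder index/sourcetype names (your_index, your_sourcetype):

index=your_index sourcetype=your_sourcetype (message="First event" OR message="Second Event")
| transaction requestId maxspan=20m
| where eventcount=1 AND message="First event"

transaction groups events that share a requestId within a 20-minute span and adds an eventcount field, so groups containing only the first event are the unmatched requests. It is usually heavier than the stats approach on large data volumes, so the stats-based search is generally preferable.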
