Alerting

How can I account for delay in response

tkwaller1
Path Finder

Hello


I have two searches that return message IDs given certain field values.

The first search

index=messages* MSG_src="AAAAA" MSG_DOMAIN="BBBBBB" MSG_TYPE="CC *"
| rename MSGID AS MSGID1


The second search

index=messages* MSG_src="CCCCCC", MSG_DOMAIN="DDDDDDD", MSG_TYPE="Workflow Start"
| rex field=_raw "<pmt>(?<pmt>.*)</pmt>"
| rex field=_raw "<EventId>(?<MSGID1>.*)</EventId>"
| search pmt=EEEEEEE

The results from the second search can come in up to an hour after the results from the first search. That delay is not an issue unless it exceeds an hour.

How can I account for this time delay so I can accurately alert if the span is longer than an hour?

Thanks for the help!

1 Solution

dtburrows3
Builder

Maybe something like this?

| multisearch
    [
        | search index=messages* MSG_src="AAAAA" MSG_DOMAIN="BBBBBB" MSG_TYPE="CC *"
            | rename
                MSGID as MSGID1
        ]
    [
        | search index=messages* MSG_src="CCCCCC", MSG_DOMAIN="DDDDDDD", MSG_TYPE="Workflow Start"
            | rex field=_raw "<pmt>(?<pmt>.*)<\/pmt>"
            | rex field=_raw "<EventId>(?<MSGID1>.*)<\/EventId>"
            | search pmt="EEEEEEE"
        ]
    | stats
        ``` first occurrence timestamp of msg_id in search_1 ```
        earliest(eval(case(match(MSG_TYPE, "^C{2}\s+"), _time))) as first_event_epoch,
        ``` first occurrence timestamp of msg_id in search_2 ```
        earliest(eval(case('MSG_TYPE'=="Workflow Start", _time))) as second_event_epoch
            by MSGID1
    ``` calculate the time difference between the msg_id showing up in each search ```
    | eval
        diff_seconds=if(
            ``` if the msg_id didn't show up in the second search but did show up in the first ```
            isnull(second_event_epoch) AND isnotnull(first_event_epoch),
                ``` calculate how long ago from now the msg_id was seen in search_1 ```
                now()-'first_event_epoch',
                ``` msg_id exists in both searches, calculate the time difference between them in seconds ```
                'second_event_epoch'-'first_event_epoch'
            ),
        ``` convert time difference to hours ```
        diff_hours='diff_seconds'/(60*60),
        ``` human readable format ```
        duration_seconds=tostring(diff_seconds, "duration")
    ``` filter off everything that has less than a 1 hour difference ```
    | where 'diff_hours'>1
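
For anyone unsure how the tail end of that search behaves before wiring it into an alert, here is a tiny standalone check (the 4500-second value is just a made-up example) that can be pasted into a search bar on its own:

| makeresults
``` made-up example: 4500 seconds between the two events ```
| eval diff_seconds=4500
| eval diff_hours=diff_seconds/(60*60)
| eval duration_seconds=tostring(diff_seconds, "duration")
| where diff_hours>1

It should return one row with diff_hours=1.25 and duration_seconds=01:15:00; dropping diff_seconds below 3600 makes the row disappear, which is the same threshold the alert relies on.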

tkwaller1
Path Finder

From what I can tell from testing this over the last few hours, this solution works really well. I'm still testing it out and validating accuracy, but so far it's great. I was actually working on adding duration, but you definitely beat me to it.

Thanks!


dtburrows3
Builder

Awesome! Glad it's working out so far. 
Feel free to leave a reply if you run into any issues and we can try to resolve them.


ITWhisperer
SplunkTrust

Will MSGID1 always appear in the first search if it is found in the second search?

If so, then the time range of the first search should be at least 1 hour longer than that of the second search, and if MSGID1 is found in the second search but not in the first, then it has taken longer than an hour.
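
A minimal sketch of that idea, assuming the same field names and filters as in the searches above, with a 2-hour window for the first search and a 1-hour window for the second (the window sizes are placeholders to adjust):

index=messages* earliest=-2h MSG_src="AAAAA" MSG_DOMAIN="BBBBBB" MSG_TYPE="CC *"
| rename MSGID AS MSGID1
| append
    [ search index=messages* earliest=-1h MSG_src="CCCCCC" MSG_DOMAIN="DDDDDDD" MSG_TYPE="Workflow Start"
    | rex field=_raw "<pmt>(?<pmt>.*)<\/pmt>"
    | rex field=_raw "<EventId>(?<MSGID1>.*)<\/EventId>"
    | search pmt="EEEEEEE" ]
``` count how many times each ID appears in each branch ```
| stats count(eval(MSG_TYPE!="Workflow Start")) as first_count, count(eval(MSG_TYPE=="Workflow Start")) as second_count by MSGID1
``` the ID reached the second search but its first event is older than the 2-hour window, so (assuming the first event always exists) the gap exceeded an hour ```
| where second_count>0 AND first_count=0

Unlike the accepted multisearch answer above, this only flags the over-an-hour case and does not measure the actual gap.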


tkwaller1
Path Finder

I was thinking something like this would work, but it's probably not the best way?

 

index=messages* earliest=-2h MSG_src="AAAAA" MSG_DOMAIN="BBBBBB" MSG_TYPE="CC *"
| rename MSGID AS MSGID1
| append [search index=messages* MSG_src="CCCCCC", MSG_DOMAIN="DDDDDDD", MSG_TYPE="Workflow Start"
| rex field=_raw "<pmt>(?<pmt>.*)</pmt>"
| rex field=_raw "<EventId>(?<MSGID1>.*)</EventId>"
| search pmt=EEEEEEE]
| stats count by MSGID1
| search count<2

 

The problem I see in testing is that this triggers on new IDs that have come in but are still within the hour timeframe.
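
One way that limitation is sometimes handled (a sketch under the assumption that the first event always precedes the second, not a tested fix): restrict the first search to events that are already at least an hour old, so every ID in that branch has had its full hour to produce a second event, then alert on IDs whose second event still has not shown up.

index=messages* earliest=-2h latest=-1h MSG_src="AAAAA" MSG_DOMAIN="BBBBBB" MSG_TYPE="CC *"
| rename MSGID AS MSGID1
| append
    [ search index=messages* earliest=-2h MSG_src="CCCCCC" MSG_DOMAIN="DDDDDDD" MSG_TYPE="Workflow Start"
    | rex field=_raw "<pmt>(?<pmt>.*)<\/pmt>"
    | rex field=_raw "<EventId>(?<MSGID1>.*)<\/EventId>"
    | search pmt="EEEEEEE" ]
| stats count(eval(MSG_TYPE!="Workflow Start")) as first_count, count(eval(MSG_TYPE=="Workflow Start")) as second_count by MSGID1
``` the first event is old enough to have had its hour, but no second event has arrived yet ```
| where first_count>0 AND second_count=0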

 


richgalloway
SplunkTrust

What links the results of the first search to the results of the second search?  Without that, there is no solution to the problem.

---
If this reply helps you, Karma would be appreciated.

tkwaller1
Path Finder

The records are linked via ID. In the first search it's MSGID; in the second search it's extracted with

| rex field=_raw "<EventId>(?<MSGID1>.*)</EventId>"