Splunk ITSI

CREATE ALERT IF EITHER OF THE TWO KEYWORDS IS MISSING

dhiraj
Loves-to-Learn Lots

Screenshot 2024-09-18 124304.png

Please help me with SPL for the following: whenever the "Error occurred during message exchange" keyword occurs and REQ=INI does not occur within a few minutes, raise an alert.


PickleRick
SplunkTrust

Splunk doesn't look "backwards" so you have to think backwards 🙂

1. Since Splunk by default returns events in reverse chronological order, you first have to

| reverse

them to get them in straight chronological order.

2. Assuming that you already have the REQ field extracted, keep track of its values for a 7-minute long window

| streamstats time_window=7m values(REQ) AS reqvals

3. Now you can find those events matching your search string and not having the value of REQ copied over from earlier events

| search "Error occurred during message exchange" AND NOT reqvals="INI"

Two caveats:

1. The search might be slow. Depending on your actual data, you might make it faster by searching only for

"Error occurred during message exchange" OR REQ

2. Remember that a!=b is not the same as NOT a=b. Especially when dealing with multivalued fields.
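
Putting the three steps and the first caveat together, a minimal end-to-end sketch could look like the following (the index name is a placeholder, and the REQ field is assumed to be extracted already, as stated above):

index=your_index ("Error occurred during message exchange" OR REQ) ``` your_index is a placeholder ```
| reverse
| streamstats time_window=7m values(REQ) AS reqvals
| search "Error occurred during message exchange" AND NOT reqvals="INI"

Scheduled every few minutes with an alert condition of "number of results greater than 0", this would fire whenever an error event has no REQ=INI in the preceding 7 minutes.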


jawahir007
Communicator

Try This:

index="yourindex"  latest=-7m | transaction startswith="Error occurred during message exchange" 
endswith="REQ\=INI" keepevicted=true | search closed_txn=0

dhiraj
Loves-to-Learn Lots

Can you please elaborate a bit?

It's not working.

 


jawahir007
Communicator

The transaction command will group events based on the event content and will generate some extra fields like closed_txn, eventcount, etc.

In this case we have selected the starting event with content "Error occurred during message exchange" and the ending event with content "REQ\=INI". If both events are present, the generated field closed_txn is set to 1; otherwise closed_txn is set to 0.

jawahir007_0-1726655290701.png

 

Adding the condition below will show only the events that don't have a paired (REQ=INI) event. In the screenshot above you can see that the second event is actually a group of 2 events (closed_txn=1) and the first event stands alone (closed_txn=0).

Adding the line below to the search will keep only the events for which REQ=INI has not yet been received in the last 7 minutes (please note: 'latest=-7m' is added as an early filter).

| search closed_txn=0

The result will look like the screenshot below; from that you can create an alert as you wish.

jawahir007_1-1726656046930.png

 

I hope this is what you are looking for.
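
For reference, a variant of this approach (my own sketch, not part of the answer above) bounds the pairing window explicitly with maxspan, so an error only counts as closed if REQ=INI follows it within 7 minutes:

index="yourindex"
| transaction startswith="Error occurred during message exchange" endswith="REQ\=INI" maxspan=7m keepevicted=true ``` maxspan bounds the pairing window to 7 minutes ```
| search closed_txn=0

Depending on the search's time range, errors from the last few minutes may still show up as open simply because their 7-minute window has not elapsed yet.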

 

 


gcusello
SplunkTrust

Hi @dhiraj ,

I suppose that you already extracted the REQ field, so you could try something like this:

index=your_index ("Error occurred during message exchange" OR REQ="INI") earliest=-3600s
| eval type=if(REQ="INI","INI","Message")
| stats dc(type) AS type_count values(type) As type
| where type_count=1 AND type="Message"

You can define the time period for the search (e.g. last hour).

If you happen to have more servers, you can group results by host in the stats command, as in the sketch below.
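
A per-host variant might look like this (same placeholder index as above; host is Splunk's default host field):

index=your_index ("Error occurred during message exchange" OR REQ="INI") earliest=-3600s
| eval type=if(REQ="INI","INI","Message")
| stats dc(type) AS type_count values(type) AS type BY host
| where type_count=1 AND type="Message"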

Ciao.

Giuseppe

 


dhiraj
Loves-to-Learn Lots

Hi @gcusello 

It's not working. We are monitoring a log, and whenever the line "Error occurred during message exchange" appears and the REQ=INI line didn't occur in the last 7 minutes, it should trigger an alert.
With the above search I am getting type_count=1 in both conditions: when "REQ=INI" is present and when it is not.


gcusello
SplunkTrust

Hi @dhiraj ,

you only have to change the time period (7 minutes); then the search should be correct:

index=your_index ("Error occurred during message exchange" OR REQ="INI") earliest=-420s
| eval type=if(REQ="INI","INI","Message")
| stats dc(type) AS type_count values(type) As type
| where type_count=1 AND type="Message"

Using this search you select only the events matching your two conditions, and using the eval and the stats you identify the presence of one or both conditions.

In your use case you want to fire the alert if the error message is present but the REQ=INI condition is not; the other combinations are excluded.

Ciao.

Giuseppe


dhiraj
Loves-to-Learn Lots

Hi @gcusello 

I am checking the whole day for testing, and it's giving me a count of 1 and the only type is Message.

dhiraj_0-1726648469894.png

But we actually have both keywords in the data, which means no alert is required.

dhiraj_1-1726648695916.png

 


gcusello
SplunkTrust

Hi @dhiraj ,

are you sure that the REQ field is already extracted?

otherwise you have to search for a different condition:

index=your_index ("Error occurred during message exchange" OR "REQ=INI") earliest=-420s
| eval type=if(searchmatch("REQ=INI"),"INI","Message")
| stats dc(type) AS type_count values(type) As type
| where type_count=1 AND type="Message"
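
If the REQ field is not extracted at all, an alternative (my own sketch; the rex pattern assumes the text REQ=<value> appears literally in the raw event) is to extract it at search time and keep the same logic:

index=your_index ("Error occurred during message exchange" OR "REQ=INI") earliest=-420s
| rex "REQ=(?<REQ>\w+)" ``` extracts REQ at search time; the pattern is an assumption about the log format ```
| eval type=if(REQ="INI","INI","Message")
| stats dc(type) AS type_count values(type) AS type
| where type_count=1 AND type="Message"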

Ciao.

Giuseppe


dhiraj
Loves-to-Learn Lots

Hi @gcusello 
Still I am getting the same output in both situations. If I have only "Error occurred during message exchange" I get type_count=1 and type=Message, and when I have both keywords "Error occurred during message exchange" and "REQ=INI" I also get type_count=1 and type=Message.

FYI, I have not extracted any fields; I am just monitoring the logs.

 


gcusello
SplunkTrust

Hi @dhiraj ,

let me understand:

If you have both events (the message and REQ=INI), then running the first two lines of my search you should see two types of events (check this in the interesting fields).

So the following stats command should give you type_count=2 if both are present and type_count=1 if there's only one.

If you have both search strings ("Error occurred during message exchange" and "REQ=INI") in two different events (as in your screenshots), you should see both types; if not, check the search strings and the eval condition; a quick way to check is sketched below.
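
A quick way to run that check (same placeholder index and time range as above, my own sketch) is to drop the last two lines and count events per type:

index=your_index ("Error occurred during message exchange" OR "REQ=INI") earliest=-420s
| eval type=if(searchmatch("REQ=INI"),"INI","Message")
| stats count BY type ``` both types should appear if both kinds of events exist ```

If only Message shows up here even though REQ=INI events exist in the raw data, the string passed to searchmatch probably doesn't match the events exactly (for example extra spaces or slightly different wording).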

Ciao.

Giuseppe
