Splunk Search

How to show only events that are not Closed

tdavison76
Path Finder

Hello,

I'd really appreciate any help on this one; I can't figure it out.  I am using the following to show only the "Create" events that don't have a corresponding "Close" event:

| transaction "alert.id", alert.message startswith=Create endswith=Close keepevicted=true
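``` keepevicted=true keeps transactions that never hit the endswith condition, and closed_txn=0 flags those still-open transactions ```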
| where closed_txn=0

This works, but the search runs over "All Time" and we only keep events for up to 1 year.  I've run into an issue where, once a "Create" event reaches that 1-year mark and is deleted, its "Close" event still makes the alert appear in the search results.

I'm not sure why a "Close" event without a corresponding "Create" event would be counted, or how I can prevent a lone "Create" or "Close" event from being returned once its partner has been deleted or falls outside the selected search time frame.

Any ideas on this one? 🙂

Thanks for any help, you will save me some sleepless nights.

Tom

 

0 Karma

richgalloway
SplunkTrust
SplunkTrust

Try to avoid the transaction command because it performs very poorly.  Try this instead: search for all Create and Close events, then keep only the most recent event for each alert.id/alert.message pair.  Throw out the Close events, and what's left will be Creates without a Close.

index=foo ("Create" OR "Close")
```Select the most recent event for each id/message ```
| dedup alert.id alert.message
```Discard the Close events```
| where NOT "Close"
---
If this reply helps you, Karma would be appreciated.
0 Karma

tdavison76
Path Finder

Thank you so much for the details.  I gave it a shot, but it produced the following error:

Error in 'where' command: Type checking failed. 'XOR' only takes boolean arguments.

 

Here's the full search I am doing:

index=healthcheck ("Create" OR "Close") integrationName="Opsgenie Edge Connector - Splunk" alert.message = "STORE*"
| dedup alert.id alert.message
| where NOT "Close"
| table alert.message

 

Any ideas what I am doing wrong? 🙂

Thanks again,

Tom

0 Karma

richgalloway
SplunkTrust
SplunkTrust

I didn't make it clear, but my example code is more pseudo-code than pure SPL, since I don't know exactly what to look for to locate "Close" or "Create" messages.  However, you should be able to fix your problem by replacing where with search.
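
In other words, keeping the same structure but swapping the command (still pseudo-code with a placeholder index):

index=foo ("Create" OR "Close")
``` Select the most recent event for each id/message ```
| dedup alert.id alert.message
``` Discard the Close events ```
| search NOT "Close"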

---
If this reply helps you, Karma would be appreciated.
0 Karma

tdavison76
Path Finder

Thanks again Rich,

Changing it to "search" got me past the error. 🙂  Sorry, I didn't give all the details; I found out that "Create" and "Close" are in the "action" field.  Here's an example event:

{"actionType": "custom", "customerId": "3a1f4387-b87b-4a3a-a568-cc372a86d8e4", "ownerDomain": "integration", "ownerId": "2196f43b-7e43-49dd-b8b7-8243aa391ad9", "discardScriptResponse": true, "sendCallbackToStreamHub": false, "requestId": "dc4c0970-e1fa-492a-999b-10979478d980", "action": "Create", "productSource": "Opsgenie", "customerDomain": "siteone", "integrationName": "Opsgenie Edge Connector - Splunk", "integrationId": "2196f43b-7e43-49dd-b8b7-8243aa391ad9", "customerTransitioningOrConsolidated": false, "source": {"name": "Meraki", "type": "Zapier"}, "type": "oec", "receivedAt": 1739802456801, "params": {"type": "oec", "alertId": "af912c6d-fabd-4df5-ab5b-1669d0908518-1739802456706", "customerId": "3a1f4387-b87b-4a3a-a568-cc372a86d8e4", "action": "Create", "integrationId": "2196f43b-7e43-49dd-b8b7-8243aa391ad9", "integrationName": "Opsgenie Edge Connector - Splunk", "integrationType": "OEC", "customerDomain": "siteone", "alertDetails": {}, "alertAlias": "STORE_674_BOXONE_MX_674", "receivedAt": 1739802456801, "customerConsolidated": false, "customerTransitioningOrConsolidated": false, "productSource": "Opsgenie", "source": {"name": "Meraki", "type": "Zapier"}, "alert": {"alertId": "af912c6d-fabd-4df5-ab5b-1669d0908518-1739802456706", "id": "af912c6d-fabd-4df5-ab5b-1669d0908518-1739802456706", "type": "alert", "message": "STORE_674_BOXONE - MX_674 - WAN Packet Loss", "tags": [], "tinyId": "52615", "entity": "{\"alertConfigId\":636696397319904332,\"configType\":\"AlertConfigs::MiWanPacketLossConfig\",\"condition\":{\"type\":\"wanPacketLoss\",\"window\":600,\"duration\":300,\"interface\":\"wan1\",\"lossRatio\":0.3},\"networkId\":636696397319556753,\"nodeId\":48649290476856,\"status\":\"on\",\"recipients\":{\"emails\":[],\"httpServerIds\":[\"aHR0cHM6Ly9wcm9kLTkxLndlc3R1cy5sb2dpYy5henVyZS5jb206NDQzL3dvcmtmbG93cy9iOTM1ZjU5ODZkMmQ0Njg0YTVjYzUxNGQ2NmNmYmU0OS90cmlnZ2Vycy9tYW51YWwvcGF0aHMvaW52b2tlP2FwaS12ZXJzaW9uPTIwMTYtMDYtMDEmc3A9L3RyaWdnZXJzL21hbnVhbC9y", "alias": "STORE_674_BOXONE_MX_674", "createdAt": 1739802456706, "updatedAt": 1739802457456000000, "username": "Alert API", "team": "Network Support", "responders": [{"id": "830235c6-2402-4c11-9e10-eca616e83acf", "type": "team", "name": "Network Support"}], "teams": ["830235c6-2402-4c11-9e10-eca616e83acf"], "actions": [], "priority": "P2", "source": "Meraki"}, "entity": {"alertId": "af912c6d-fabd-4df5-ab5b-1669d0908518-1739802456706", "id": "af912c6d-fabd-4df5-ab5b-1669d0908518-1739802456706", "type": "alert", "message": "STORE_674_BOXONE - MX_674 - WAN Packet Loss", "tags": [], "tinyId": "52615", "entity": "{\"alertConfigId\":636696397319904332,\"configType\":\"AlertConfigs::MiWanPacketLossConfig\",\"condition\":{\"type\":\"wanPacketLoss\",\"window\":600,\"duration\":300,\"interface\":\"wan1\",\"lossRatio\":0.3},\"networkId\":636696397319556753,\"nodeId\":48649290476856,\"status\":\"on\",\"recipients\":{\"emails\":[],\"httpServerIds\":[\"aHR0cHM6Ly9wcm9kLTkxLndlc3R1cy5sb2dpYy5henVyZS5jb206NDQzL3dvcmtmbG93cy9iOTM1ZjU5ODZkMmQ0Njg0YTVjYzUxNGQ2NmNmYmU0OS90cmlnZ2Vycy9tYW51YWwvcGF0aHMvaW52b2tlP2FwaS12ZXJzaW9uPTIwMTYtMDYtMDEmc3A9L3RyaWdnZXJzL21hbnVhbC9y", "alias": "STORE_674_BOXONE_MX_674", "createdAt": 1739802456706, "updatedAt": 1739802457456000000, "username": "Alert API", "team": "Network Support", "responders": [{"id": "830235c6-2402-4c11-9e10-eca616e83acf", "type": "team", "name": "Network Support"}], "teams": ["830235c6-2402-4c11-9e10-eca616e83acf"], "actions": [], "priority": "P2", "source": "Meraki"}, "mappedActionDto": {"mappedAction": 
"postActionToOEC", "extraField": ""}, "ownerId": "2196f43b-7e43-49dd-b8b7-8243aa391ad9"}, "integrationType": "OEC", "alert": {"alertId": "af912c6d-fabd-4df5-ab5b-1669d0908518-1739802456706", "id": "af912c6d-fabd-4df5-ab5b-1669d0908518-1739802456706", "type": "alert", "message": "STORE_674_BOXONE - MX_674 - WAN Packet Loss", "tags": [], "tinyId": "52615", "entity": "{\"alertConfigId\":636696397319904332,\"configType\":\"AlertConfigs::MiWanPacketLossConfig\",\"condition\":{\"type\":\"wanPacketLoss\",\"window\":600,\"duration\":300,\"interface\":\"wan1\",\"lossRatio\":0.3},\"networkId\":636696397319556753,\"nodeId\":48649290476856,\"status\":\"on\",\"recipients\":{\"emails\":[],\"httpServerIds\":[\"aHR0cHM6Ly9wcm9kLTkxLndlc3R1cy5sb2dpYy5henVyZS5jb206NDQzL3dvcmtmbG93cy9iOTM1ZjU5ODZkMmQ0Njg0YTVjYzUxNGQ2NmNmYmU0OS90cmlnZ2Vycy9tYW51YWwvcGF0aHMvaW52b2tlP2FwaS12ZXJzaW9uPTIwMTYtMDYtMDEmc3A9L3RyaWdnZXJzL21hbnVhbC9y", "alias": "STORE_674_BOXONE_MX_674", "createdAt": 1739802456706, "updatedAt": 1739802457456000000, "username": "Alert API", "team": "Network Support", "responders": [{"id": "830235c6-2402-4c11-9e10-eca616e83acf", "type": "team", "name": "Network Support"}], "teams": ["830235c6-2402-4c11-9e10-eca616e83acf"], "actions": [], "priority": "P2", "source": "Meraki"}, "customerConsolidated": false, "mappedActionDto": {"mappedAction": "postActionToOEC", "extraField": ""}, "alertId": "af912c6d-fabd-4df5-ab5b-1669d0908518-1739802456706", "alertAlias": "STORE_674_BOXONE_MX_674", "alertDetails": {}, "entity": {"alertId": "af912c6d-fabd-4df5-ab5b-1669d0908518-1739802456706", "id": "af912c6d-fabd-4df5-ab5b-1669d0908518-1739802456706", "type": "alert", "message": "STORE_674_BOXONE - MX_674 - WAN Packet Loss", "tags": [], "tinyId": "52615", "entity": "{\"alertConfigId\":636696397319904332,\"configType\":\"AlertConfigs::MiWanPacketLossConfig\",\"condition\":{\"type\":\"wanPacketLoss\",\"window\":600,\"duration\":300,\"interface\":\"wan1\",\"lossRatio\":0.3},\"networkId\":636696397319556753,\"nodeId\":48649290476856,\"status\":\"on\",\"recipients\":{\"emails\":[],\"httpServerIds\":[\"aHR0cHM6Ly9wcm9kLTkxLndlc3R1cy5sb2dpYy5henVyZS5jb206NDQzL3dvcmtmbG93cy9iOTM1ZjU5ODZkMmQ0Njg0YTVjYzUxNGQ2NmNmYmU0OS90cmlnZ2Vycy9tYW51YWwvcGF0aHMvaW52b2tlP2FwaS12ZXJzaW9uPTIwMTYtMDYtMDEmc3A9L3RyaWdnZXJzL21hbnVhbC9y", "alias": "STORE_674_BOXONE_MX_674", "createdAt": 1739802456706, "updatedAt": 1739802457456000000, "username": "Alert API", "team": "Network Support", "responders": [{"id": "830235c6-2402-4c11-9e10-eca616e83acf", "type": "team", "name": "Network Support"}], "teams": ["830235c6-2402-4c11-9e10-eca616e83acf"], "actions": [], "priority": "P2", "source": "Meraki"}}

 

When I run the following search, it gives me every event that has an action of "Create", but I need it to return only the "Create" events that don't have a corresponding "Close".  Each alert.id is unique to a given alert, so a Create and its matching Close share the same alert.id.

index=healthcheck ("Create","Close") integrationName="Opsgenie Edge Connector - Splunk" alert.message = "STORE*" 
| dedup alert.id, action
| search NOT "Close"
| table alert.message

Really appreciate the help; I'm going crazy trying to figure this one out 🙂

Thanks,

Tom

0 Karma

livehybrid
SplunkTrust
SplunkTrust

Hi @tdavison76 

I think you might be able to achieve this by adding an 'AND _time <= relative_time(now(), "-1y@y")' to your search (adjusting the date range accordingly), so that you ignore old events where the Create event is missing because it has aged out.

I would also look at changing your search so it doesn't use the transaction command, which is very resource-intensive and has limitations.  Instead, you could use or adapt the following to get similar output:

index=YourIndex earliest=-1y latest=now alert.message IN ("Create","Close") 
| eval {alert.message}=1
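``` {alert.message} uses the value of alert.message as the new field name, so an event with alert.message="Create" gets Create=1 and one with "Close" gets Close=1 ```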
``` or use | eval Create=IF(alert.message=="Create",1,0) Close=IF(alert.message=="Close",1,0) ```
| stats earliest(_time) as start_time, latest(_time) as end_time, sum(Create) as isCreate, sum(Close) as isClose
| where isClose=0

Please let me know how you get on, and consider accepting this answer or adding karma to it if it has helped.
Regards

Will

tdavison76
Path Finder

Thanks, I figured it out with your help.  Very much appreciated, and I hope you have a great day.

0 Karma

tdavison76
Path Finder

Thank you very much for your help,

I gave it a shot with:

eval {alert.message}=1

But I didn't get any results back.  I then tried:

| eval Create=IF(alert.message=="Create",1,0) Close=IF(alert.message=="Close",1,0)  | stats earliest(_time) as start_time, latest(_time) as end_time, sum(Create) as isCreate, sum(Close) as isClose | where isClose=0

and got back:

Error in 'EvalCommand': The expression is malformed.
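
I think that error might just be a missing comma, since eval expects multiple assignments to be separated by commas, so presumably it should be:

| eval Create=IF(alert.message=="Create",1,0), Close=IF(alert.message=="Close",1,0)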

I really suck at this  😞

Thank you for the help,

Tom

 

0 Karma

tdavison76
Path Finder

Hello,

Sorry, I found out that "Create" and "Close" are in the "action" field.  I ran the following search and, for some reason, I get 0 results in the table even though the base search returns all of the Create and Close events.

index=healthcheck integrationName="Opsgenie Edge Connector - Splunk" alert.message = "STORE*" "entity.source"=Meraki, action IN ("Create","Close") 
| eval Create=IF(action=="Create",1,0)
| eval Close=IF(action=="Close",1,0)
| stats earliest(_time) as start_time, latest(_time) as end_time, sum(Create) as isCreate, sum(Close) as isClose
| where isClose=0
| table alert.message

 

Sorry for the confusion, and thank you very much for the help.  🙂

Thanks,

Tom

0 Karma

livehybrid
SplunkTrust
SplunkTrust

What happens if you remove the SPL after and including the "where"?
I also realised I'm missing the BY clause on the end of the stats command (presumably by alert.id).

Hope this helps

0 Karma

tdavison76
Path Finder

Hello,

That is awesome.  By removing:

table alert.message

and adding the "by alert.id", only the alerts that were created with no Close appear, as expected.  Thank you for that.  The last piece of the puzzle is: how can I create a table that contains other fields that aren't in the "stats" command?

If I add a field from the source events, nothing is returned for it.  Here's the full working search you helped me with; it includes the field entity.source, for which nothing is returned:

index=healthcheck integrationName="Opsgenie Edge Connector - Splunk" alert.message = "STORE*" "entity.source"=Meraki, action IN ("Create","Close") 
| eval Create=IF(action=="Create",1,0)
| eval Close=IF(action=="Close",1,0)
| stats earliest(_time) as start_time, latest(_time) as end_time, sum(Create) as isCreate, sum(Close) as isClose by alert.id, alert.message
| where isClose=0
| table entity.source, alert.id, alert.message
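
To also get entity.source into the table, one option is to carry it through the stats command as an extra aggregation, for example with values() (or latest() if only one value per alert is wanted).  A sketch along those lines, assuming entity.source is extracted on the raw events:

index=healthcheck integrationName="Opsgenie Edge Connector - Splunk" alert.message="STORE*" "entity.source"=Meraki action IN ("Create","Close")
| eval Create=IF(action=="Create",1,0)
| eval Close=IF(action=="Close",1,0)
``` values(entity.source) keeps the field available after the aggregation ```
| stats earliest(_time) as start_time, latest(_time) as end_time, sum(Create) as isCreate, sum(Close) as isClose, values(entity.source) as entity.source by alert.id, alert.message
| where isClose=0
| table entity.source, alert.id, alert.message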

 

Splunk-Search.png

I wish I could give you 20 kudos.

Thanks again,

Tom

 

0 Karma