Splunk Search

Searching for 2 different events on the same order number

Flaxamax
Engager

Hello Splunk Community,

I'm fairly new to Splunk and am using it to search and alert on test failures in my manufacturing environment.

I have a search in which I would like to match up two different events and get a search hit ONLY when both failures occurred on the same order number. I'll be using three primary fields: OrderNum, adviseText, and testName. I want my search result to return the order number when all criteria are met. To me, logically this looks like

((adviseText = "Diagnostic Error" AND testName = "Test 1") AND (adviseText = "Diagnostic error" AND testName = "Test 2")).

I've tested this and got no results, and I understand that's because no single event matches both criteria. Many OrderNums fail one test or the other, but I need the search to single out the OrderNums that fail both. Can anyone help me with this? Much appreciated.


richgalloway
SplunkTrust

We should be able to count the number of distinct failed tests for each order number and display only those where the count is 2.

index=foo OrderNum=* adviseText="Diagnostic Error" testName=*
| stats dc(testName) as testCount by OrderNum
| where testCount=2
| table OrderNum
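
If only Test 1 and Test 2 should count toward the threshold, the base search can also be narrowed to those two names. A sketch, assuming the testName values are exactly "Test 1" and "Test 2" and the index is foo:

index=foo OrderNum=* adviseText="Diagnostic Error" (testName="Test 1" OR testName="Test 2")
| stats dc(testName) as testCount by OrderNum
| where testCount=2
| table OrderNum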

 

---
If this reply helps you, Karma would be appreciated.

Flaxamax
Engager

This is going in the right direction but isn't doing what I need. I should have been more specific: I may get 1-5 errors from one test, and this is triggering when an order fails one test multiple times. I need it to trigger only when an order fails both tests, not just one test multiple times.


richgalloway
SplunkTrust

The query counts distinct test names, so it should not count the same name twice. Can you share the exact query you're using and the results?
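
In the meantime, a diagnostic variation of the earlier search may help narrow this down. This is a sketch with count and values() added so you can see, per OrderNum, how many events matched and exactly which test names are being counted:

index=foo OrderNum=* adviseText="Diagnostic Error" testName=*
| stats count as eventCount dc(testName) as testCount values(testName) as tests by OrderNum
| sort - testCount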

---
If this reply helps you, Karma would be appreciated.