Splunk Search

How to find duplicates on multiple fields?

sejiweji
New Member

I have logs with the following three fields:

-category 

-price 

-requestID (unique per entry)

I want to find all requestIDs for entries that have BOTH the same category and price within a 1-hour time span.

I started off with this query: 

index=foo component="shop-service" | streamstats count as dupes by category, price
| search dupes > 1

But I cannot seem to count the duplicate entries correctly, nor tie them back to their requestIDs.

 


bowesmana
SplunkTrust

I assume you are searching a time range longer than 1 hour, since you are using streamstats. If you are only searching the last 60 minutes, then stats will work.
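
For example, a minimal sketch of that stats variant, assuming the search time range itself is limited to the last hour (earliest=-1h here is only an illustration):

index=foo component="shop-service" earliest=-1h
| stats values(requestID) as requestIDs by category price
| where mvcount(requestIDs) > 1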

To collect the requestIDs, use values(requestID) in the streamstats command

index=foo component="shop-service" 
| streamstats time_window=1h values(requestID) as requestIDs by category price
| where mvcount(requestIDs) > 1

This collects all unique requestIDs that share the same category and price within the trailing 1-hour window, and mvcount() performs the > 1 test.

Note that streamstats has event-count limitations when used with long time windows, so check the docs and be aware of them.

 


ITWhisperer
SplunkTrust

Try eventstats instead of streamstats:

index=foo component="shop-service" | eventstats count as dupes by category, price
| search dupes > 1
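
If you also need the requestIDs attached to the surviving events, a possible extension of this (my addition, not part of the reply above) is to collect them with eventstats as well:

index=foo component="shop-service"
| eventstats count as dupes values(requestID) as requestIDs by category, price
| search dupes > 1

Each remaining event then carries both its own requestID and the full set of requestIDs that share its category and price.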

PaulPanther
Motivator
index=foo component="shop-service"
| stats list(requestID) as requestIDs count as dupes by category, price
| where dupes > 1
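
Note that this counts duplicates across the whole search range. One way to approximate the 1-hour requirement (a sketch of my own, not part of the reply above) is to bucket events into fixed 1-hour spans first:

index=foo component="shop-service"
| bin _time span=1h
| stats values(requestID) as requestIDs count as dupes by _time, category, price
| where dupes > 1

Fixed spans can split a pair that straddles an hour boundary, so the streamstats time_window approach above is closer to a true rolling window.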