Splunk Search

Group and count events occurring within 100ms

svukov
Loves-to-Learn

Hello, 

I have the following data. I want to return a table of events that occurred within 100ms of each other and that share the same hashCode and the same thirdPartyId. So essentially the search has to sort by each combination of thirdPartyId and hashCode and then compare events line by line to see whether the previous line and the current one happened within 100ms. What should the query look like?

| makeresults format=csv data="startTS,thirdPartyId,hashCode,accountNumber
2024-04-16 21:53:02.455-04:00,AAAAAAAA,00000001,11111111
2024-04-16 21:53:02.550-04:00,AAAAAAAA,00000001,11112222
2024-04-16 21:53:02.650-04:00,BBBBBBBB,00001230,22222222
2024-04-16 21:53:02.650-04:00,CCCCCCCC,00000002,12121212
2024-04-16 21:53:02.730-04:00,DDDDDDDD,00000005,33333333
2024-04-16 21:53:02.830-04:00,DDDDDDDD,00000005,33334444
2024-04-16 21:53:02.670-04:00,BBBBBBBB,00000002,12121212
2024-04-16 21:53:02.700-04:00,CCCCCCCC,00000002,21212121"
| sort 0 startTS, thirdPartyId


ITWhisperer
SplunkTrust
| eval startTS=strptime(startTS, "%F %T.%3N%z")
| sort 0 startTS, thirdPartyId
| fieldformat startTS=strftime(startTS, "%F %T.%3N")
| streamstats window=2 global=f range(startTS) as difference by hashCode thirdPartyId
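To return only the matching rows, a final filter on difference should do it (a sketch building on the search above; 0.1 is 100ms, since strptime yields epoch seconds):

| where difference > 0 AND difference <= 0.1 ```the range over a single event is 0, so difference > 0 drops the first event of each group```

One caveat with that guard: two events with genuinely identical timestamps would also be dropped; if that case matters, add count to the same streamstats and filter on the window's event count instead.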

bharathkumarnec
Contributor

@svukov Please have a look at the transaction command (after a sort on time, if needed): https://docs.splunk.com/Documentation/Splunk/latest/SearchReference/Transaction

| transaction thirdPartyId, hashCode maxspan=?
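For instance, something like this might work (a sketch; the field name is case-sensitive hashCode, and as far as I can tell maxspan only accepts whole units like s/m/h/d, so 1s is the closest coarse cutoff, with the actual 100ms check done on the duration field that transaction produces):

| transaction thirdPartyId, hashCode maxspan=1s
| where eventcount > 1 AND duration <= 0.1 ```duration = seconds between the first and last event in each transaction```

One caveat: transaction groups whole runs of events, so three events spread over 150ms would form one transaction whose duration fails the filter, even though two of them were within 100ms of each other.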
