Splunk Search

How to properly calculate delay for multiple events with the same reference

DPOIRE
Path Finder

Hi,

Here is a scenario:

Step 1
9h30 TradeNumber 13400101 gets created in system
9h32 TradeNumber 13400101 gets sent to market

Step 2
9h45 TradeNumber 13400101 gets modified in system
9h50 TradeNumber 13400101 gets sent to market with modification

Step 3
9h55 TradeNumber 13400101 gets cancelled in system
9h56 TradeNumber 13400101 gets sent to market as cancelled

I need to monitor the Delay for sending the order to market.
In the above scenario we have 3 steps for the same TradeNumber and each needs to be calculated separately.

  1. Delay for sending new trade
  2. Delay for modifying
  3. Delay for cancelling

The log does not allow me to differentiate the steps, but the sequence is always in the right order.

If I use
| stats range(_time) as Delay by TradeNumber
| stats max(Delay)
for TradeNumber 13400101, it will return 26 mins (9h30 to 9h56).

I am looking to have a result of 5 mins (the modification step, 9h45 to 9h50).

Is there any way Splunk can match by sequence (or something else) and TradeNumber to calculate 3 values for the same TradeNumber?


gcusello
SplunkTrust
SplunkTrust

Hi @DPOIRE ,

you have to extract the correct delays and then use them as you like:

<your_search>
| stats 
     earliest(eval(if(searchmatch("gets created in system"),_time,null()))) AS gets_created_in_system
     latest(eval(if(searchmatch("gets sent to market"),_time,null()))) AS gets_sent_to_market
     earliest(eval(if(searchmatch("gets modified in system"),_time,null()))) AS gets_modified_in_system
     latest(eval(if(searchmatch("gets sent to market with modification"),_time,null()))) AS gets_sent_to_market_with_modification
     earliest(eval(if(searchmatch("gets cancelled in system"),_time,null()))) AS gets_cancelled_in_system
     latest(eval(if(searchmatch("gets sent to market as cancelled"),_time,null()))) AS gets_sent_to_market_as_cancelled
BY TradeNumber

In this way you'll have the epoch time of each event in the same row, and you can calculate all the diffs you need.
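For example, a minimal continuation sketch of those diffs, assuming the field names defined above (delays come out in seconds, since the fields hold epoch times):

```
| eval new_order_delay    = gets_sent_to_market - gets_created_in_system
| eval modification_delay = gets_sent_to_market_with_modification - gets_modified_in_system
| eval cancellation_delay = gets_sent_to_market_as_cancelled - gets_cancelled_in_system
```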

Ciao.

Giuseppe

DPOIRE
Path Finder

Unfortunately, the logs do not contain strings like "gets created in system", "gets modified in system", and so on.

The only information we see in the logs is:

_time tradeNumber received
_time tradeNumber sent
_time tradeNumber received
_time tradeNumber sent
_time tradeNumber received
_time tradeNumber sent


PickleRick
SplunkTrust
SplunkTrust

OK.

1. Instead of this ping-pong we need an _example_ of the data (filter out/obfuscate the sensitive parts if needed, but leave the important things in place so we have something to work with)

2. Can you guarantee that

2a) It's always "received" and then "sent"?

2b) There's never an overlap between two different "transactions". So that you don't have something like received,received,sent,sent.


DPOIRE
Path Finder

1) Real example from today, where the trade was created and sent to market (7h46) and another action occurred on the trade (7h57):
INFO | xxxxx | 2025/03/07 07:57:49 |226123425|Trade sent to market|
INFO | xxxxx | 2025/03/07 07:57:48 |226123425|Trade received
INFO | xxxxx | 2025/03/07 07:46:43 |226123425|Trade sent to market|
INFO | xxxxx | 2025/03/07 07:46:42 |226123425|Trade received

2) Yes I guarantee they always arrive in the same order and do not overlap.


PickleRick
SplunkTrust
SplunkTrust

OK. So, assuming you have your TradeNumber and _time extracted, you can do something like this:

<your basic search>
| eval timesent=if(searchmatch("Trade sent"),_time,null())
| streamstats current=f window=1 last(timesent) as lastsent
| eval delay=lastsent-_time
| stats sum(delay) by TradeNumber

This creates an additional field and fills it with the event's time only when the event is a "sent" event. Then streamstats copies the value of that field over to the next event, so the "received" event carries both the "sent" time and the "received" time (the current event's _time). All that's left is to calculate the difference between those two timestamps and sum up the delays.
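Applied to the sample log lines posted above, an end-to-end sketch could look like the following; the rex pattern is an assumption derived from the posted format (not a known extraction), and the final stats line adds the per-trade max/avg that the alerting relies on:

```
<your basic search>
| rex "\|(?<TradeNumber>\d+)\|Trade (?<action>received|sent)"
| eval timesent=if(action="sent", _time, null())
| streamstats current=f window=1 last(timesent) AS lastsent
| eval delay=lastsent-_time
| stats max(delay) AS max_delay avg(delay) AS avg_delay by TradeNumber
```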

gcusello
SplunkTrust
SplunkTrust

Hi @DPOIRE ,

how can you identify the group of the received and sent jobs?

are they always present, and in the predefined order?

if one or more are missing, how can you tell which group is missing?

maybe the missing ones are always the last ones?

Ciao.

Giuseppe


DPOIRE
Path Finder

I cannot differentiate them except by the order in which they arrive.
And yes, I can assume they are always present in the same order.
The only available info is _time, received/sent, and tradeNumber.

Now this is the problematic scenario. In most cases, the trade only gets created in the system and is sent to the market.

Most of the time, trades are not modified or cancelled.

The problem is that when another action occurs on a trade (modified or cancelled), it screws up the time range and delays for that trade, and it also screws up the alerts triggered on max(Delay) and avg(Delay).

 


kiran_panchavat
Champion

@DPOIRE 

This simulates trade events using makeresults, assigns timestamps, and labels each step (New Order, Modification, Cancellation). It uses streamstats to track the event sequence, capture previous timestamps, and calculate the time delay for each step.

 

[Screenshot: sample search and results]
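The screenshot's search is not reproduced in text, but based on that description it might be sketched roughly like this (the makeresults format=csv option requires a recent Splunk version, and the timestamps and field names are illustrative, taken from the original scenario):

```
| makeresults format=csv data="ts,TradeNumber,action
2025-03-07 09:30:00,13400101,received
2025-03-07 09:32:00,13400101,sent
2025-03-07 09:45:00,13400101,received
2025-03-07 09:50:00,13400101,sent
2025-03-07 09:55:00,13400101,received
2025-03-07 09:56:00,13400101,sent"
| eval _time=strptime(ts, "%Y-%m-%d %H:%M:%S")
| sort 0 _time
| streamstats count(eval(action="received")) AS step by TradeNumber
| eval step_label=case(step=1, "New Order", step=2, "Modification", step=3, "Cancellation")
| streamstats current=f window=1 last(_time) AS prev_time by TradeNumber
| eval delay=if(action="sent", _time - prev_time, null())
| stats max(delay) AS delay_secs by TradeNumber step_label
```

With the timestamps from the original scenario this yields 120 s for the new order, 300 s for the modification, and 60 s for the cancellation.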

 

Did this help? If yes, please consider giving kudos, marking it as the solution, or commenting for clarification — your feedback keeps the community going!