Splunk Search

transaction command for same type of multiple events

sunny_871
Observer

Hi, I have a scenario where I want to calculate the duration between the 1st and the last event. The thing is, these events can happen multiple times for the same session.

The 1st event can happen multiple times, and every time it is the exact same thing, but I only want the transaction to start from the very first event so that we know the exact duration.

Sample events below - see the last 2 events, where one says MatchPending and another says MatchCompleted.

What I want is to calculate the duration between the 1st event and the last event, where it says MatchCompleted.

2024-08-16 13:43:34,232|catalina-exec-192|INFO|LoggingClientHttpRequestInterceptor|Sending GET request to https://myapi.com/test
2024-08-16 13:43:38,630|catalina-exec-192|INFO|LoggingClientHttpRequestInterceptor|Response Received in 114 milliseconds "200 OK" response for GET request to https://myapi.com/test: "status":"MatchPending"
2024-08-16 13:43:50,516|catalina-exec-192|INFO|LoggingClientHttpRequestInterceptor|Sending GET request to https://myapi.com/test
2024-08-16 13:43:57,630|catalina-exec-192|INFO|LoggingClientHttpRequestInterceptor|Response Received in 114 milliseconds "200 OK" response for GET request to https://myapi.com/test: "status":"MatchPending"
2024-08-16 13:44:15,516|catalina-exec-192|INFO|LoggingClientHttpRequestInterceptor|Sending GET request to https://myapi.com/test
2024-08-16 13:43:50,510|catalina-exec-192|INFO|LoggingClientHttpRequestInterceptor|Response Received in 114 milliseconds "200 OK" response for GET request to https://myapi.com/test: "status":"MatchCompleted"

Any help is appreciated. 

Best Regards,
Shashanlk


renjith_nair
Legend

If you have an identifier for each transaction, such as a transaction id, use stats to get the earliest and latest times.

For example,

your search | stats earliest(_time) as starttime, latest(_time) as endtime by transactionID | eval duration=endtime-starttime
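For instance, against the sample events above, it could look roughly like this (sessionId is just a placeholder for whatever field ties your requests and responses together, and the rex extraction is an assumption about how the status appears in your raw events):

your search
| rex "\"status\":\"(?<status>[^\"]+)\""
| eval completedtime=if(status=="MatchCompleted", _time, null())
| stats earliest(_time) as starttime, max(completedtime) as endtime by sessionId
| eval duration=endtime-starttime

That way the start is the very first event of the session and the end is the last MatchCompleted response.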


---
What goes around comes around. If it helps, hit it with Karma 🙂

sunny_871
Observer

@renjith_nair Thanks for the response, but I don't think your solution fully works.

I tried it like below, but then _time is not available for me to plot the graph. I need to plot that duration on a graph. Is there a way to do that?

| stats earliest(_time) as starttime,latest(_time) as endtime by uniqueId 
| eval duration=endtime-starttime 
| timechart span=15m p95(duration) as p95Responsetime

PickleRick
SplunkTrust

Since you're aggregating a relatively long-spanned set of events into a single data point, you have to make a conscious decision about which point in time to use as the timestamp for the result. You can easily assign a value to the _time field just by doing

| eval _time=something

But you have to decide which timestamp to use.

Is it the start time of your transaction? Is it the end time? Maybe it's the middle of the transaction... It's up to you to make that decision.

Anyway, when dealing with _time in stats, there's not much point in using latest() and earliest(). min() and max() suffice 🙂
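
For example, if you pick the start of the transaction as the timestamp, a minimal continuation of the search above could be (uniqueId being whatever identifier you already group on, with the span and percentile taken from your own search):

| stats min(_time) as starttime, max(_time) as endtime by uniqueId
| eval duration=endtime-starttime
| eval _time=starttime
| timechart span=15m p95(duration) as p95Responsetime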


sunny_871
Observer

Hey @PickleRick, apologies, I don't think I have fully understood what you are trying to say here.

My objective is to calculate the duration between 2 sets of events, but one of those 2 events can happen multiple times. It is like sending a request to an API and then validating the response. If the response is not what was expected, then send the same request again and keep sending until you get the expected response.

So my objective is to calculate the duration between when the 1st request was sent and when the last expected response was received.

2024-08-16 13:43:34,232|catalina-exec-192|INFO|LoggingClientHttpRequestInterceptor|Sending GET request to https://myapi.com/test
2024-08-16 13:43:50,232|catalina-exec-192|INFO|LoggingClientHttpRequestInterceptor|Sending GET request to https://myapi.com/test
2024-08-16 13:44:14,232|catalina-exec-192|INFO|LoggingClientHttpRequestInterceptor|Sending GET request to https://myapi.com/test
2024-08-16 13:43:44,232|catalina-exec-192|INFO|LoggingClientHttpRequestInterceptor|Sending GET request to https://myapi.com/test

2024-08-16 13:43:57,510|catalina-exec-192|INFO|LoggingClientHttpRequestInterceptor|Response Received in 114 milliseconds "200 OK" response for GET request to https://myapi.com/test: "status":"MatchCompleted"

Please find the set of events again above.


PickleRick
SplunkTrust

OK. From the top.

You have a set of events. Each event has the _time field describing when the event happened.

You're using the stats command to find the earliest and latest (or min and max, which in this case boils down to the same thing) values of this field for each uniqueId.

As an output you have three fields - starttime, endtime and uniqueId.

You no longer have the _time field.

Timechart must have the _time field since it, well, charts over time.

So you have to assign some value to the _time field manually. You can do it either by using eval as I showed previously or simply by adding another aggregation to your stats. For example

| stats earliest(_time) as starttime, latest(_time) as endtime, avg(_time) as _time by uniqueId

That's just one of the possible ways of doing it (of course, you can use avg(_time), min(_time), max(_time), or any other aggregation function which makes sense in this context).
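So, put together with your timechart, the whole thing could look something like this (here each transaction is charted at its average event time; use min(_time) or max(_time) instead if you'd rather chart it at the start or the end):

| stats min(_time) as starttime, max(_time) as endtime, avg(_time) as _time by uniqueId
| eval duration=endtime-starttime
| timechart span=15m p95(duration) as p95Responsetime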
