
Finding duration of a transaction

msarro
Builder

Hey everyone. First let me start by saying I don't think that the "duration" field generated by a transaction will work here.
I am joining together transactions by a particular field. Let's call that field FieldX.
Inside each record there is FieldX, a start time, and an end time. The _time field is the UTC time at which the event occurred; the end time is some time after that.

Currently my search basically does the following:
index=myrecords
| transaction FieldX keepevicted=t mvlist=t

At this point, I'll have a transaction that will look like this:

_time     start_time   end_time   FieldX
1:01:00   1:01:00      1:02:01    XYZ
1:02:01   1:02:01      1:50:00    XYZ

Now when I use the duration field, it shows that the duration was 1 minute. However, what I really want is the time from the first event's start_time to the last event's end_time in the transaction. Is this doable?
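
(For concreteness, with the rows above: duration only spans the _time values, 1:01:00 to 1:02:01, roughly a minute, whereas the span I'm after runs from the first start_time, 1:01:00, to the last end_time, 1:50:00, about 49 minutes.)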


kristian_kolb
Ultra Champion

Hi, I believe you could do something like this

... | eval end_epoch = strptime(END_TIME,"%H:%M:%S") 
| stats max(end_epoch) AS high min(_time) AS low by FIELD_X 
| eval dur = high - low

but it depends a little on how your timestamps for START_TIME and END_TIME are formatted, and also on whether events might span several days. In the example above, START_TIME is not converted because, as you say, it's the same as _time, which is already in epoch form.

The key, however, is that you can use the min() and max() stats functions to find the high/low values inside each transaction.

Have a look at the convert command and the strptime function for eval.
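
If START_TIME and END_TIME happen to carry a date as well, you can fold it into the parse format so spans across midnight still come out right. A rough sketch, where the %Y-%m-%d part is just a guess at how your raw field looks:

... | eval end_epoch = strptime(END_TIME,"%Y-%m-%d %H:%M:%S")
| stats max(end_epoch) AS high min(_time) AS low by FIELD_X
| eval dur = high - low
| fieldformat dur = tostring(dur,"duration")

The fieldformat line is optional; it just renders the seconds as a readable duration for display while keeping the numeric value available for further math.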

Hope this helps,

Kristian

msarro
Builder

This method worked like a charm, thank you!


lguinn2
Legend

Updating to match Kristian's correct answer -

Yes, and I don't think you even need the transaction command to do it!

index=myrecords OR othersearch
| eval stime=strptime(start_time,"%H:%M:%S") | eval etime=strptime(end_time,"%H:%M:%S")
| stats min(stime) as begin max(etime) as end by FieldX
| eval duration = end - begin
| fieldformat duration=tostring(duration,"duration")
| fields - stime etime
| eval begin=strftime(begin,"%H:%M:%S") | eval end=strftime(end,"%H:%M:%S")

This should run much faster than the version with the transaction command, and work better too.


lguinn2
Legend

Kristian - you are right! My brain took a mini-vacation there. Your answer is the correct one.


_gkollias
Builder

What about doing a sum of the duration across all transactions? Does it need to be formatted after the sum is complete? I'd like to sum in order to find avg/median response times across the total number of transactions.
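
I was picturing something along these lines on top of lguinn2's search, though I'm not sure about the formatting step (hence the question):

... | stats min(stime) as begin max(etime) as end by FieldX
| eval duration = end - begin
| stats sum(duration) as total avg(duration) as avg_resp median(duration) as med_resp
| fieldformat total = tostring(total,"duration")

i.e. keep duration as plain seconds for the math and only format it at display time.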

Thanks!


kristian_kolb
Ultra Champion

Oops, I realize that I spent too much time editing and testing my answer... 🙂

...but would earliest() and latest() work, i.e. look at the field values for start_time and end_time and use them instead of _time?

Wouldn't latest(end_time) produce the _time for that event, i.e. 1:02:01?

I agree though that you may not need to use transaction (depending on the uniqueness of Field_X over time).
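
For what it's worth, a minimal sketch of the earliest()/latest() route, assuming start_time and end_time look like %H:%M:%S and a session doesn't cross midnight:

... | stats earliest(start_time) AS first_start latest(end_time) AS last_end by FieldX
| eval dur = strptime(last_end,"%H:%M:%S") - strptime(first_start,"%H:%M:%S")

earliest() and latest() pick values by the _time order of the events, while min()/max() over the strptime'd values compare the values themselves, so the two approaches only agree when the chronologically last event also carries the largest end_time.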

/k
