Splunk Search

Total duration of multiple events

zoebanning
Path Finder

Hello Splunk Community,

Can anyone help me build a query for the following?

I have a batch job whose steps are logged as separate events. How can I calculate the total duration of the batch job (Step 1 start to Step 5 end)? An example of my output format (dummy data used):

Step  Start_Time           End_Time             Duration (Hours)
1     2021-09-11 22:45:00  2021-09-11 22:45:01  00:00:01
2     2021-09-11 22:45:01  2021-09-11 22:45:20  00:00:19
3     2021-09-11 22:45:20  2021-09-11 22:58:15  00:12:55
4     2021-09-11 22:58:15  2021-09-11 22:58:39  00:00:24
5     2021-09-11 22:58:39  2021-09-12 00:20:31  01:21:52

 

THANK YOU!


kamlesh_vaghela
SplunkTrust
SplunkTrust

@zoebanning 

I hope this will help you

YOUR_SEARCH
| table Step	Start_Time	End_Time	Duration*
| eval start_epoch=strptime(Start_Time,"%Y-%m-%d %H:%M:%S"),end_epoch=strptime(End_Time,"%Y-%m-%d %H:%M:%S")
| stats min(start_epoch) as start_epoch max(end_epoch) as end_epoch
| eval diff_in_sec=end_epoch-start_epoch,duration=tostring(diff_in_sec,"duration")

 

My Sample Search :

| makeresults | eval _raw="Step	Start_Time	End_Time	Duration (Hours)
1	2021-09-11 22:45:00	2021-09-11 22:45:01	00:00:01
2	2021-09-11 22:45:01	2021-09-11 22:45:20	00:00:19
3	2021-09-11 22:45:20	2021-09-11 22:58:15	00:12:55
4	2021-09-11 22:58:15	2021-09-11 22:58:39	00:00:24
5	2021-09-11 22:58:39	2021-09-12 00:20:31	01:21:52" | multikv forceheader=1
| table Step	Start_Time	End_Time	Duration*
| eval start_epoch=strptime(Start_Time,"%Y-%m-%d %H:%M:%S"),end_epoch=strptime(End_Time,"%Y-%m-%d %H:%M:%S")
| stats min(start_epoch) as start_epoch max(end_epoch) as end_epoch
| eval diff_in_sec=end_epoch-start_epoch,duration=tostring(diff_in_sec,"duration")
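For reference, the core of the search above is: parse the timestamps to epoch values, take the minimum start and maximum end, and subtract. The same logic can be sketched in Python to sanity-check the result against the sample data:

```python
from datetime import datetime

# Rows mirroring the sample data above: (Step, Start_Time, End_Time)
rows = [
    ("1", "2021-09-11 22:45:00", "2021-09-11 22:45:01"),
    ("2", "2021-09-11 22:45:01", "2021-09-11 22:45:20"),
    ("3", "2021-09-11 22:45:20", "2021-09-11 22:58:15"),
    ("4", "2021-09-11 22:58:15", "2021-09-11 22:58:39"),
    ("5", "2021-09-11 22:58:39", "2021-09-12 00:20:31"),
]

fmt = "%Y-%m-%d %H:%M:%S"

# strptime -> timestamp, then min(start) and max(end), as in the SPL
start = min(datetime.strptime(s, fmt) for _, s, _ in rows)
end = max(datetime.strptime(e, fmt) for _, _, e in rows)
total = end - start
print(total)  # 1:35:31
```

This matches the SPL output: `tostring(diff_in_sec, "duration")` renders the same 1h 35m 31s span.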

 

Thanks
KV
▄︻̷̿┻̿═━一   😉

If any of my replies helps you solve the problem or gain knowledge, an upvote would be appreciated.
 


zoebanning
Path Finder

Hi @kamlesh_vaghela

Thank you, this is exactly what I was trying to achieve!

The example above only takes into account the steps of a single batch job, and you helped calculate the duration for that one run. Would you happen to know how to create a timechart showing the duration of the batch jobs over a period of time? (The batch usually runs overnight every day.)
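One hedged way to approach this (not an SPL answer, just a sketch of the grouping logic): key each run by the date its first step started, so an overnight run that crosses midnight stays in one bucket, then take min(start)/max(end) per day exactly as in the accepted answer. The run data below is illustrative:

```python
from datetime import datetime
from collections import defaultdict

fmt = "%Y-%m-%d %H:%M:%S"

# Hypothetical (start, end) spans from two nightly runs -- illustrative data
events = [
    ("2021-09-11 22:45:00", "2021-09-12 00:20:31"),
    ("2021-09-12 22:46:10", "2021-09-13 00:05:02"),
]

# Group by the date the run started; min(start)/max(end) per group mirrors
# what an SPL "stats min(start_epoch) max(end_epoch) by day" would produce
# before charting.
runs = defaultdict(list)
for s, e in events:
    start, end = datetime.strptime(s, fmt), datetime.strptime(e, fmt)
    runs[start.date().isoformat()].append((start, end))

per_day = {
    day: max(e for _, e in spans) - min(s for s, _ in spans)
    for day, spans in runs.items()
}
for day in sorted(per_day):
    print(day, per_day[day])
```

In SPL terms, the equivalent day key could come from something like `eval day=strftime(start_epoch, "%Y-%m-%d")` followed by `stats min(start_epoch) as s max(end_epoch) as e by day`, feeding the resulting duration into a chart.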

Let me know if you require additional information. 

Thanks again for your outstanding help!!!

Zoe
