Getting Data In

Calculate duration per day as a percentage

ssaenger
Communicator

Hi Splunkers,

I would like to calculate the duration of an event as a percentage of the day.

I have data in a database that is being extracted; one of the fields is a duration:
DURATION="01:00:00"

As this is already in a human-readable format, I thought I would convert it to epoch so I could sum it, and I got the returned value:
06:00:00

So far so good, or so I thought, but looking at the percentages, things were not quite right. So I included the epoch value in the results, and it showed me this:

20412540000  (Wed Nov 06 2616 06:00:00 GMT+0000)

| eval DURATION=strptime(DURATION,"%H:%M:%S")
| stats sum(DURATION) as event_duration by NAME
| eventstats sum(event_duration) as total_time
| eval percentage_time=(event_duration/total_time)*100
| eval event_duration1=strftime(event_duration,"%H:%M:%S")
| eval total_time1=strftime(total_time,"%H:%M:%S")
| eval av_time_hrs=(event_duration1/total_time1)
Based on the data, is it possible to get a percentage?


PickleRick
SplunkTrust

You incorrectly assumed that just because you didn't provide the year, month and day parts, your strptime() would use zero values for them. It doesn't. strptime() uses zero values for missing time parts but the "current" values for date parts you don't provide in your time spec. So if you strptime() just hours, minutes and seconds, it will parse them combined with today's date, and your numerical timestamp will be way, way more than you wanted.

You can either, as @ITWhisperer suggested, parse the duration manually, or do a very ugly hack (all time-manipulation hacks are dirty and ugly) by supplying the epoch date as your "base" for strptime() and, of course, telling strptime() that it's in UTC (otherwise the timestamp will be parsed as being in your local timezone).

The more you look, the uglier it gets 😉
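A sketch of that hack, assuming DURATION is always in HH:MM:SS form (the makeresults value is just sample data): prepend the epoch date and a zero UTC offset, so the "missing" date parts no longer default to today:

```
| makeresults
| eval DURATION="01:00:00"
| eval duration_secs=strptime("1970/01/01 ".DURATION." +0000", "%Y/%m/%d %H:%M:%S %z")
```

With 1970-01-01 UTC as the base, duration_secs should come out as plain seconds (3600 here) rather than an epoch time centuries in the future.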


ITWhisperer
SplunkTrust

strptime() and strftime() are for handling epoch date/times, which is why you are getting strange numbers. You might be better off doing something like this:

| rex field=DURATION "(?<hours>\d+):(?<minutes>\d+):(?<seconds>\d+)"
| eval DURATION=((hours*60)+minutes)*60+seconds
| stats sum(DURATION) as event_duration by NAME
| eventstats sum(event_duration) as total_time
| eval percentage_time=(event_duration/total_time)*100
| eval event_duration1=tostring(event_duration,"duration")
| eval total_time1=tostring(total_time,"duration")
| eval av_time_hrs=(event_duration/total_time)

Having said that, I am not sure what the final calculation is supposed to show.
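As a quick sanity check of the rex/eval conversion on a throwaway event (NAME and the duration value here are invented sample data):

```
| makeresults
| eval NAME="event_a", DURATION="01:00:00"
| rex field=DURATION "(?<hours>\d+):(?<minutes>\d+):(?<seconds>\d+)"
| eval DURATION=((hours*60)+minutes)*60+seconds
| table NAME DURATION
```

This should show DURATION as 3600 seconds for the one-hour sample value.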

isoutamo
SplunkTrust

As @ITWhisperer said, you cannot use str*time functions to convert those correctly.

Here is another example that converts those correctly, calculates the avg and sum, and then converts the results back to durations. Note that it does not handle durations greater than 23:59:59.

| makeresults
| eval duration="01:00:01,00:15:00,10:10:10,05:04:03"
| eval duration = split(duration,",")
| mvexpand duration
``` the above creates test data ```
| eval d1 = split(duration,":"), d=tonumber(mvindex(d1,2)) + 60 * tonumber(mvindex(d1,1)) + 3600 * tonumber(mvindex(d1,0))
| stats sum(d) as tD1 avg(d) as aD1
| eval sum_duH = floor(tD1/3600), sum_duM = floor((tD1%3600) / 60), sum_duS = floor(tD1 % 3600 % 60)
| eval avg_duH = floor(aD1/3600), avg_duM = floor((aD1%3600) / 60), avg_duS = floor(aD1 % 3600 % 60)
| eval avg_D = printf("%02d:%02d:%02d", avg_duH, avg_duM, avg_duS)
| eval sum_D = printf("%02d:%02d:%02d", sum_duH, sum_duM, sum_duS)
| table avg_D sum_D

r. Ismo 
