Dashboards & Visualizations

How to calculate the average of time in Splunk

aditsss
Motivator

Hi Everyone,

I have a requirement.

I have a field, TimeTaken, which I calculate as BuildDuration/1000 from the logs.

Suppose on Jan 7th the time taken for the builds looks like this:

BuildStartDate                                  TimeTaken

Thu Jan 7 2021                                  00:08:20

Thu Jan 7 2021                                  00:05:13

so I want the average TimeTaken for Jan 7.

Similarly, for Jan 8:

BuildStartDate                               TimeTaken

Fri Jan 8 2021                                 00:11:50

Fri Jan 8 2021                                 00:16:10

Fri Jan 8 2021                                 00:10:59

so I want the average TimeTaken for Jan 8th.

Similarly, if I select the last 7 days I want the average TimeTaken for each day. Can someone guide me on how to write a query for the average TimeTaken?

Currently my query is:

index="abc" sourcetype="xyz" BuildName!="g*" (BuildResult="*")
| eval TimeTaken=round('BuildDuration'/1000)
| fieldformat TimeTaken=tostring(TimeTaken, "duration")
| rex mode=sed field=BuildStartDate "s/\d{2}:\d{2}:\d{2}\s[A-Z]{3}\s//g"
| table ORG BuildResult BuildStartDate TimeTaken
| where ORG="gc"

Thanks in advance


bowesmana
SplunkTrust
SplunkTrust

There are a number of ways to do this. One simple way is to add

| eventstats avg(TimeTaken) as AvgTimeTaken by BuildStartDate

which would calculate the average time taken by date and add it as an additional column. If you also want to split by the org and result, you could add those fields to the 'by' clause.

However, the position of your where ORG="gc" is important - unless you want the stats to be calculated across all orgs, you must run the eventstats after the where clause.

However, it is always good to remove data you don't need as early as possible in the pipeline - so unless there's a good reason to filter on gc only at the end of the search, you should move that clause up.
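For example, a sketch of your search with the org filter moved into the base search and the eventstats added (field names taken from your query, untested against your data):

index="abc" sourcetype="xyz" BuildName!="g*" (BuildResult="*") ORG="gc"
| eval TimeTaken=round('BuildDuration'/1000)
| rex mode=sed field=BuildStartDate "s/\d{2}:\d{2}:\d{2}\s[A-Z]{3}\s//g"
| eventstats avg(TimeTaken) as AvgTimeTaken by BuildStartDate
| fieldformat TimeTaken=tostring(TimeTaken, "duration")
| table ORG BuildResult BuildStartDate TimeTaken AvgTimeTaken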

You could also add an | appendpipe [ subsearch ] to the query if you wanted to calculate the average and insert it as an extra row between dates.
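A sketch of the appendpipe approach - the subsearch computes one average row per date, and the "AVERAGE" label is purely illustrative:

| appendpipe
    [ stats avg(TimeTaken) as TimeTaken by BuildStartDate
      | eval BuildResult="AVERAGE" ]
| sort 0 BuildStartDate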

Alternatively, if you just need a single split by date, a simple timechart would work too.
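For example, something like this gives one average per day (assuming the events' _time reflects the build start):

index="abc" sourcetype="xyz" BuildName!="g*" ORG="gc"
| eval TimeTaken=round('BuildDuration'/1000)
| timechart span=1d avg(TimeTaken) as AvgTimeTaken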

Lots of options - it really depends on the rendering you plan/prefer.

Hope this helps

 

aditsss
Motivator

@bowesmana 

I have written the query as suggested, but I am getting the time in seconds and I want it in minutes.

Below is my current query

index="abc" sourcetype="xyz" BuildName!="o*" (BuildResult="*")
| lookup Org_Alias.csv OrgFolderName OUTPUT OrgName
| search OrgName=OneForce
| fillnull value=ABORTED DeploymentStatus
| rex field=DeploymentStatus mode=sed "s/SUCCEEDED FAILED*/FAILED/g"
| eval TimeTaken=round('BuildDuration'/1000)
| search TimeTaken="*"
| fieldformat TimeTaken=tostring(TimeTaken, "duration")
| dedup BuildName
| rex mode=sed field=BuildStartDate "s/\d{2}:\d{2}:\d{2}\s[A-Z]{3}\s//g"
| eventstats avg(TimeTaken) as AvgTimeTaken by BuildStartDate
| table BuildStartDate TimeTaken AvgTimeTaken

 

so for Jan 13th:

BuildStartDate          TimeTaken    AvgTimeTaken

Wed Jan 13 2021         00:21:49     1031

Wed Jan 13 2021         00:06:59     1031

Wed Jan 13 2021         00:15:23     1031

Wed Jan 13 2021         00:24:33     1031

1031 is coming through as whole seconds. How can I convert it into minutes?


cnuguri_ncc
Path Finder

@aditsss you can use tostring() with the "duration" option.

example: 
| eval readableAvgTime=tostring(averageTimeInSecs, "duration")

Usage from the docs:

tostring(X,"duration") - Converts seconds X to the readable time format HH:MM:SS.

aditsss
Motivator

@cnuguri_ncc 

I used the below query:

| eval readableAvgTime=tostring(averageTimeInSecs, "duration")

I want AverageBuildDurationInMinutes.

As of now I am getting data like this:

BuildStartDate         readableAvgTime

Fri Jan 8 2021         00:14:00

Wed Jan 13 2021        00:10:39

I want the average in minutes.

Can you guide me on that?


cnuguri_ncc
Path Finder

@aditsss 

1. Did you use the correct field names with the example I sent?

2. 00:14:00 means 14 mins; 00:10:39 means 10 mins 39 secs.

Is this not what you were expecting?

 


cnuguri_ncc
Path Finder

@aditsss for your search in the earlier comment, it should be as below.

| eval AverageBuildDurationInMinutes=tostring(AvgTimeTaken, "duration")

 

 

aditsss
Motivator

@bowesmana 

Please guide me.


bowesmana
SplunkTrust
SplunkTrust

@aditsss 

I suggest you have a look at the eval command and the evaluation functions it supports. There are a number of ways to solve the problem, each using an eval function and some simple maths.

Consider a number of seconds. How do you convert seconds to minutes? Divide by 60, right?

So any of these commands will give you some form of value in minutes:

| eval minutes=round(seconds/60)
| eval minutes=round(seconds/60,2)
| eval minutes=floor(seconds/60)

What do you want your minute value to look like - a whole number, rounded up or down?

Do you want 10 minutes and 45 seconds represented as 10.75 or as 10:45?

All of these can be done with eval and its associated functions and simple maths.
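For instance, assuming AvgTimeTaken holds whole seconds, these two evals produce the decimal form and the MM:SS form respectively (the field names are illustrative):

| eval decimalMinutes=round(AvgTimeTaken/60, 2)
| eval minsSecs=floor(AvgTimeTaken/60).":".substr("0".floor(AvgTimeTaken%60), -2)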

Hope this helps

 
