Splunk Search

Average of Seconds and Milliseconds using extracted values

tjsnow
Explorer

I am trying to put together an average duration (calculated and logged by the product) as well as a count. However, the logs show an "s" or "ms" at the end of the value to reflect how long processing took. I need to convert the results into an average duration but have been unable to figure it out. In this example it shows "ms", but the next record could be in seconds, with the value ending in "s".

Example Record:

 

request.status='completed'; status.message=''; request.start='2021-01-29 15:50:25.402471006 +0000 UTC m=+501.139572300'; request.end='2021-01-29 15:50:26.193830852 +0000 UTC m=+501.930932145'; request.duration='791.359845ms'" 

 

Here is my current query (note: requestduration is an extracted field containing the value of request.duration, 791.359845ms in this case):

 

| stats count(requestduration) as count avg(requestduration) as Average by source

 

I get that I cannot average the values with the alpha characters in there, but I don't know how to convert the seconds into milliseconds, then remove the characters and average them. Any help would be appreciated!

1 Solution

tjsnow
Explorer

@scelikok Thank you, this got me on the right path. There were a couple of changes, though, so I will post them in case anyone else runs into something similar.

To go from milliseconds to seconds we need to divide, not multiply, and in this case I still had to strip the "s" out of the records that were displayed in seconds, because the average wouldn't work if half the records had an alpha character in them. So I added a second if statement with no calculation. Here is the final solution. Thanks @scelikok

| eval requestduration=if(match(requestduration,"ms"),tonumber(replace(requestduration,"ms",""))/1000,requestduration) 
| eval requestduration=if(match(requestduration,"s"),tonumber(replace(requestduration,"s","")),requestduration)
| stats count(requestduration) as count avg(requestduration) as "Average Seconds" by source
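
In case it helps anyone else: the two if() evals above can likely be collapsed into a single case() expression. Anchoring the regexes with "$" guarantees the "ms" branch is checked before the plain "s" branch. This is a sketch based on the thread's field names, not something verified against live Splunk data:

| eval requestduration=case(match(requestduration,"ms$"), tonumber(replace(requestduration,"ms$",""))/1000, match(requestduration,"s$"), tonumber(replace(requestduration,"s$","")), true(), tonumber(requestduration))
| stats count(requestduration) as count avg(requestduration) as "Average Seconds" by source

Note that in the two-eval version the order also matters: the "ms" check must run first, because match(requestduration,"s") would also match values ending in "ms".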



scelikok
SplunkTrust

Thank you @tjsnow,

I confused the conversion and missed the "s" option. I am glad you found the right way.



scelikok
SplunkTrust

Hi @tjsnow,

Please try below;

| eval requestduration=if(match(requestduration,"ms"),tonumber(replace(requestduration,"ms",""))*1000,requestduration) 
| stats count(requestduration) as count avg(requestduration) as Average by source