Splunk Search

## Arithmetic on time fields

I have a record that has a field with a processing stat on it.

myField=00:00:12.12456

i.e. it took 12 and a bit seconds.
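For reference, a duration string like that can be parsed into plain seconds outside Splunk; a quick Python sketch with the value hard-coded for illustration:

```python
from datetime import datetime

# Hypothetical field value from the question.
my_field = "00:00:12.12456"

# Parse as a time-of-day and measure the offset from midnight.
t = datetime.strptime(my_field, "%H:%M:%S.%f")
seconds = t.hour * 3600 + t.minute * 60 + t.second + t.microsecond / 1_000_000
print(round(seconds, 5))  # 12.12456
```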

I want to report on the average time for a particular operation over a time bucket, something like this:

``````
source="MySource" | timechart avg(myField) by Activity
``````

The problem is that `avg(myField)` always yields 0, so I am assuming the default conversion is producing rubbish. I tried explicitly using `convert` to coerce it into a computable format, something like:

``````
| convert dur2sec(myField) as fieldAsSecs
``````

But this always comes out as zero as well! So I tried being explicit about the format:

``````
convert timeformat="%H:%M:%S:" dur2sec(myField) as convertTime
``````

But that didn't create the field at all! And even if it did, how do I handle milliseconds?

Any clues?

---

**Reply from the original poster (Path Finder):**

Thanks for the other answers, guys; useful techniques.

In the end I figured out a way to do it, with millisecond support, as follows:

``````
| eval ProcessingTime = strptime("2000 01 01 " + myField, "%Y %m %d %H:%M:%S.%q") - strptime("2000 01 01 00:00:00.000000", "%Y %m %d %H:%M:%S.%q")
``````

Basically, strptime my time, then strptime the base time, and take the diff. Works a treat, but it's a bit ugly!
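The same diff-of-two-strptime trick can be sanity-checked in Python; a sketch with a hypothetical field value hard-coded:

```python
from datetime import datetime

my_field = "00:00:12.12456"  # hypothetical value

# Pin both timestamps to the same arbitrary date, then subtract.
fmt = "%Y %m %d %H:%M:%S.%f"
base = datetime.strptime("2000 01 01 00:00:00.000000", fmt)
value = datetime.strptime("2000 01 01 " + my_field, fmt)

processing_time = (value - base).total_seconds()
print(round(processing_time, 5))  # 12.12456
```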

---

**Answer (Super Champion):**

You could try using a `sed`-based `rex` command to convert your time format into the format `dur2sec()` expects, or into an expression usable by `relative_time()`.

I think you may run into issues with the sub-seconds; I would strip off the milliseconds first, then add them back in once you have a working solution.

Here's one possible approach:

Convert `00:00:12.12456` into `+0h+0m+12s`
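That substitution can be sanity-checked outside Splunk; a quick Python sketch of the same regex (note the second group maps to `m` for minutes):

```python
import re

my_field = "00:00:12.12456"  # hypothetical value

# Same pattern as the sed expression: strip one leading zero per part,
# drop the sub-seconds, and rewrite as a relative-time expression.
converted = re.sub(r"^0?(\d+):0?(\d+):0?(\d+)(?:\.\d+)?$",
                   r"+\1h+\2m+\3s", my_field)
print(converted)  # +0h+0m+12s
```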

Here is a search I used to test this out on my system:

``````
* | head 1 | eval myField="00:00:12.12456" | eval myField_orig=myField
| rex mode=sed field=myField "s/^0?(\d+):0?(\d+):0?(\d+)(?:\.\d+)?$/+\1h+\2m+\3s/"
| eval output=relative_time(0, myField)
| table myField_orig myField output
``````

So I guess you could simplify this down to a single `eval`, like this:

``````
... | eval output=relative_time(0, replace(myField, "^0?(\d+):0?(\d+):0?(\d+)(?:\.\d+)?$", "+\1h+\2m+\3s"))
``````

Unfortunately, `relative_time()` doesn't accept sub-seconds (as far as I can tell), but you can do some simple string trickery to get around that. Here's an updated example that handles sub-seconds:

``````
... | eval output=tostring(round(relative_time(0, replace(myField, "^0?(\d+):0?(\d+):0?(\d+)(?:\.\d+)?$", "+\1h+\2m+\3s")), 0)) + replace(myField, "^.+:\d+(\.\d+)?$", "\1")
``````
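The string trickery can be sanity-checked in Python; this sketch stands in for `relative_time()` with plain arithmetic, using a hypothetical field value:

```python
import re

my_field = "00:00:12.12456"  # hypothetical value

# Whole seconds via the duration-expression route (h/m/s arithmetic here,
# standing in for Splunk's relative_time()).
m = re.match(r"^0?(\d+):0?(\d+):0?(\d+)(?:\.\d+)?$", my_field)
whole = int(m.group(1)) * 3600 + int(m.group(2)) * 60 + int(m.group(3))

# Re-attach the fractional part as a string, as the second replace() does.
frac = re.sub(r"^.+:\d+(\.\d+)?$", r"\1", my_field)
output = str(whole) + frac
print(output)  # 12.12456
```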

This seems like a good candidate for a macro, so you don't have to keep copy-and-pasting this beast (especially if you find some corner case that I missed; it's nice to only have to fix it in one location).

I've run into some other weird time-manipulation situations before that may interest you as well. It would be nice if there were more direct ways to handle some of these cases, but for now there are at least workarounds...

---

**Answer (Influencer):**

You could calculate the seconds by hand:

``````
... | rex field=myField "(?<myFieldHrs>\d+):(?<myFieldMins>\d+):(?<myFieldSecs>\d+)(?<myFieldSub>\.\d+)?"
| eval myField = tonumber(myFieldHrs)*3600 + tonumber(myFieldMins)*60 + tonumber(myFieldSecs) + tonumber(coalesce(myFieldSub, "0"))
``````
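The same by-hand arithmetic, sketched in Python as a sanity check (field value hard-coded for illustration):

```python
import re

my_field = "00:00:12.12456"  # hypothetical value

# Mirror the rex: capture hours, minutes, seconds, optional sub-seconds.
m = re.match(r"(?P<hrs>\d+):(?P<mins>\d+):(?P<secs>\d+)(?P<sub>\.\d+)?", my_field)
seconds = (int(m["hrs"]) * 3600 + int(m["mins"]) * 60
           + int(m["secs"]) + float(m["sub"] or "0"))
print(round(seconds, 5))  # 12.12456
```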

---

**Reply from the original poster (Path Finder):**

OK, I have part of the answer from other postings. If I spoof the date portion of the time format, it at least converts:

``````
| eval convertTime = strptime("0000 01 01 " + myField, "%Y %m %d %H:%M:%S.%q")
``````

This actually works, but produces a large seconds component (presumably because I guessed the base year wrong). So I tried the other obvious one:

``````
| eval convertTime = strptime("1970 01 01 " + myField, "%Y %m %d %H:%M:%S.%q")
``````

This yields an even bigger offset.