Splunk seems to be ignoring numbers less than 1.0, regardless of incoming precision. If my tField value is 1.000 or greater, the math works just fine. But if it is less than 1.000, the eval function treats it as if it were 0.0 (or NULL?).
(REMOVED MY ATTEMPT TO SHOW SPLUNK OUTPUT AS A TABLE). I am trying to run streamstats on a set of log records with millisecond-accurate entries for the completion of a task. Each entry includes the task run time expressed as seconds, to up to 4 decimal places (my "tField"). When I try to compute the accurate start time of a task, any task that lasts more than a second computes accurately. Every task that lasts less than a second results in no values for the computed fields. For example:
Timestamp = 11:34:08.707
if the tField = 1.001, I can subtract it from the _time value (1401982448.707)
and get the correct result (1401982447.706)
Same timestamp, next entry in the log:
the tField value = .2426 (the log file does not include a leading 0); no computed fields
are produced and I only have the _time value (1401982448.707) for that record.
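For reference, the arithmetic in the working case can be sketched in Python (values taken from the example above; this only illustrates the expected subtraction, not Splunk's internals):

```python
# Values from the example: completion time (_time) and task run time (tField).
event_time = 1401982448.707   # epoch seconds, millisecond precision
t_field = 1.001               # task run time in seconds

# Expected start time: subtract the duration from the completion time,
# rounding back to millisecond precision.
start_time = round(event_time - t_field, 3)   # 1401982447.706
```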
I have experimented with all kinds of permutations of the exact function and nothing seems to work. The search that produced this output is:
eval t_ms=exact(_time)*1000.0 | eval tX_ms=exact(tField)*1000.0 | eval t0_ms=exact(t_ms)-exact(tX_ms) | sort 0 _time | table _time tField tX_ms t0_ms t_ms
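To make the intended computation explicit, here is a hypothetical Python equivalent of the eval steps (field names mirror the query; this shows only the arithmetic, not Splunk's string-to-number conversion):

```python
def compute_fields(_time, t_field):
    """Mirror of the eval chain: everything converted to milliseconds."""
    t_ms = _time * 1000.0      # event completion time in ms
    tX_ms = t_field * 1000.0   # task run time in ms
    t0_ms = t_ms - tX_ms       # computed task start time in ms
    return t_ms, tX_ms, t0_ms

t_ms, tX_ms, t0_ms = compute_fields(1401982448.707, 1.001)
```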
I have an image of the output but I don't have enough Karma to upload it.
Sorry about the weird attempt to show the output as a table. Meanwhile, I have a workaround. (I tried answering my own question, but the site threw a 500 error.)
The real problem turns out to be that Splunk is failing to convert a real number that starts with the decimal point. I was able to get past this by adding a leading "0" to every instance of tField.
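The workaround amounts to normalizing the field before conversion. A minimal sketch in Python (the helper name is mine; in Splunk the same idea would be a string-concatenation eval on tField before the arithmetic):

```python
def normalize_t_field(raw):
    """Prefix a leading '0' when the value starts with a bare decimal point,
    so '.2426' becomes '0.2426' and converts cleanly to a number."""
    s = raw.strip()
    return "0" + s if s.startswith(".") else s

fixed = float(normalize_t_field(".2426"))   # 0.2426
```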