Splunk Search

_time, calculations in transactions and mvlist

ErikaE
Communicator

When I run a transaction command to group events together, I lose the _time information originally associated with those events.

I have PLC sensor data in the form unit/unit time that I am trying to totalize over time. A previous question: http://answers.splunk.com/answers/261503/totalize-a-rate-over-time.html#answer-266956 solved the problem of how to implement a total, but I found that it didn't work for discontinuous data.

I am now trying something like this:

sensor Value># | transaction maxpause=2m maxevents=-1  

which will successfully filter the events and group them into continuous runs of good data. In the set I'm working with this returns 5 transactions.

Next, I need to be able to do a trapezoidal sum on the values grouped by those transactions, with _time as the x unit. What I can't figure out is how to get the solution I was given to work with transaction data. I found some examples, but none that needed to use the _time field after applying a transaction.
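For reference, the trapezoidal sum itself is straightforward once each event's timestamp and value are available as plain pairs. A minimal sketch in Python (not SPL; the function name and sample numbers are illustrative, not from the thread):

```python
# Trapezoidal sum over irregularly spaced samples: for each pair of
# consecutive events, multiply the elapsed time by the average of the
# two rates, then add everything up.

def trapezoid_total(times, values):
    """times: timestamps in seconds (like _time), values: rates."""
    total = 0.0
    for i in range(1, len(times)):
        dt = times[i] - times[i - 1]          # elapsed time for this segment
        avg = (values[i] + values[i - 1]) / 2  # average rate over the segment
        total += dt * avg
    return total

# A constant rate of 2 units/sec held for 10 seconds totals 20 units:
print(trapezoid_total([0, 5, 10], [2, 2, 2]))  # 20.0
```

The same arithmetic is what the SPL answer below builds with `streamstats` and `eval`, done once per transaction-worth of continuous data.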

If I ask mvlist to return the _time value, the search breaks down and does not return the correct number of events from the transaction command.

sensor Value># | transaction maxpause=2m maxevents=-1 mvlist=_time, Value 

The above does not return the correct number of events.

I am new to Splunk, coming from a manufacturing rather than a coding background, and am trying to figure out how to get Splunk to do routine tasks I already do in other programs. Any help would be much appreciated!


woodcock
Esteemed Legend

Given the following clarifications, I do not believe there can be a perfect solution:

There is a field which identifies the sensor, like SensorName="Sensor A Units/Unit Time". By internal convention the units are in the name of the sensor.

For now I am looking at a single sensor -- still getting used to Splunk.

I have rate data and am trying to calculate a total. For example, if my sensor were an endurance runner, Value would be a speed in m/sec that updates while he is running. I am trying to take this speed and calculate how far he ran.

The events are recorded when the PLC samples the sensor data, and they are not evenly spaced. So I might have one event, then another 5 sec later, 4 sec later, 4 sec later, 10 sec later, etc.

I only want to count miles he actually ran, so I will insist that the speed is above a certain threshold before I consider it for totaling.

The method provided (in the other thread, linked in my question) for a trapezoidal sum works great if the runner is always running during the time period of interest. It breaks down if the runner stops: the distance total jumps up across the gap. I think I understand why that is -- the search just iterates through the table, so from the computer's perspective the time in between running is multiplied by whatever the rate was when he stopped and then added to the total.

However, this should work pretty well, I think (note that I reversed the events so that last does the correct thing); also be sure to change the thresholdSpeed value from 10 to whatever is proper:

 sensor Value># | reverse | eval thresholdSpeed=10 | streamstats current=f last(_time) AS prevTime last(Value) AS prevValue BY SensorName | eval duration=_time-prevTime | eval avgSpeed=(Value+prevValue)/2 | eval avgSpeed=if(avgSpeed>=thresholdSpeed, avgSpeed, 0) | eval distance=duration*avgSpeed | stats sum(distance) AS total BY SensorName
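The logic of that pipeline can be mirrored outside Splunk to check the arithmetic. A sketch in Python (field names and the threshold of 10 are illustrative, matching the search above): track the previous event per sensor, average the two rates, zero out any segment whose average rate falls below the threshold (this is what prevents the total from jumping across gaps), and sum the segments.

```python
# Mirror of the streamstats/eval/stats pipeline: per-sensor running
# totals of duration * average rate, with sub-threshold segments zeroed.

def totalize(events, threshold=10.0):
    """events: list of (sensor, time, value), in chronological order."""
    prev = {}    # sensor -> (time, value) of the previous event
    totals = {}  # sensor -> accumulated total (e.g. distance)
    for sensor, t, v in events:
        totals.setdefault(sensor, 0.0)
        if sensor in prev:
            pt, pv = prev[sensor]
            avg = (v + pv) / 2.0
            if avg >= threshold:          # gate out gaps / slow segments
                totals[sensor] += (t - pt) * avg
        prev[sensor] = (t, v)
    return totals
```

With a run, a stop, and another run, only the above-threshold segments contribute: `totalize([("A", 0, 12), ("A", 5, 12), ("A", 10, 0), ("A", 100, 0), ("A", 105, 12), ("A", 110, 12)])` counts the two 5-second segments at 12 units/sec and ignores the gap.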



ErikaE
Communicator

Hm. So when I expanded the time it assumed the same value for every row, instead of keeping individual values for the times throughout the transaction.

I was able to get this to work:

sensorA Value>10 | eval time=_time |  transaction maxpause=2m maxevents=-1 | table _time time Value | stats avg(Value) by _time

But I couldn't find a way to turn the transaction data into a table with each row containing _time, time, and Value.

I'm essentially trying to use the _time field from transaction as the grouping field. If there's a better way to accomplish that then I'm all ears!

Thank you for your help in answering this question! I really appreciate it!


aljohnson_splun
Splunk Employee

If it really has to be trapezoidal, though, maybe the answers you got here might help?

https://answers.splunk.com/answers/261503/totalize-a-rate-over-time.html


ErikaE
Communicator

I can see them as a list after the transaction command, but I'm not sure how to proceed from there. For example, the first transaction (331 events) has lines like the one below:

2015-06-30 21:02:10.859 +0000 Sensor="SensorA" Value="5" Quality="good"

Below that is listed:

Sensor=SensorA Value = 5 Value = 6 Value = 7 Value = 8 Value = 9 Value = 10 Value = 11 Value = 12

For example.

The command that produced that output was:

 sensorA Value># | eval time = _time | transaction maxpause=2m maxevents=-1