Splunk Search

Take an average of values in a cell from every four rows

harshal_chakran
Builder

Hi,

I have a CSV file, which looks as follows:

ID time value parameter
1 0000-0015 12 param1
1 0015-0030 3 param1
1 0030-0045 3 param1
1 0045-0100 45 param1
1 0100-0115 4 param1
1 0115-0130 6 param1
1 0130-0145 65 param1
1 0145-0200 75 param1

I am trying to get the average of the "value" column for every 4 rows, with the combined time range displayed as "0000-0100", to get the output below:

ID time value parameter
1 0000-0100 15.75 param1
1 0100-0200 37.5 param1

I have tried many search commands but couldn't get this to work.

Please Help...!!!


somesoni2
Revered Legend

Try this

source=yourcsvfile.csv
| table ID, time, value, parameter
| streamstats count as sno
| eval sno=ceil(sno/4)
| rex field=time "(?<start>.*)-(?<end>.*)"
| stats first(start) as start, last(end) as end, avg(value) as value by ID, parameter, sno
| eval time=start."-".end
| table ID, time, value, parameter
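To see why this works: `streamstats count` numbers the rows 1..8, `ceil(sno/4)` maps each run of 4 rows to one bucket number, and `stats` then takes the first start time, last end time, and average value per bucket. A minimal Python sketch (outside Splunk, using the sample data from the question) of the same bucketing logic:

```python
import math
from collections import defaultdict

# Sample rows from the question: (ID, time, value, parameter)
rows = [
    ("1", "0000-0015", 12, "param1"),
    ("1", "0015-0030", 3,  "param1"),
    ("1", "0030-0045", 3,  "param1"),
    ("1", "0045-0100", 45, "param1"),
    ("1", "0100-0115", 4,  "param1"),
    ("1", "0115-0130", 6,  "param1"),
    ("1", "0130-0145", 65, "param1"),
    ("1", "0145-0200", 75, "param1"),
]

# streamstats count as sno | eval sno=ceil(sno/4):
# every run of 4 consecutive rows shares one bucket key.
buckets = defaultdict(list)
for sno, (rid, time, value, param) in enumerate(rows, start=1):
    buckets[(rid, param, math.ceil(sno / 4))].append((time, value))

# stats first(start) as start, last(end) as end, avg(value) by ID, parameter, sno
results = []
for (rid, param, _), group in sorted(buckets.items()):
    start = group[0][0].split("-")[0]   # first(start)
    end = group[-1][0].split("-")[1]    # last(end)
    avg = sum(v for _, v in group) / len(group)
    results.append((rid, f"{start}-{end}", avg, param))

for r in results:
    print(r)
# ('1', '0000-0100', 15.75, 'param1')
# ('1', '0100-0200', 37.5, 'param1')
```

Note that the averages come out to 15.75 and 37.5 for the sample data.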

harshal_chakran
Builder

Thanks a ton somesoni2...!!!
