I would like something like a stats command that groups events only when they form a contiguous run of a particular field value. Suppose I have a sequence of events with a field called 'name' that appears in the following time-ordered sequence: A,A,A,A,B,C,B,Y,Y,Y,A,A,A,B,B,B. I would like to do something like:
| stats-like-command count, min(_time), max(_time) by name
This should produce something like:
A, 4, time1, time2
B, 1, time3, time4
C, 1, time5, time6
B, 1, time7, time8
Y, 3, time9, time10
A, 3, time11, time12
B, 3, time13, time14
It shouldn't matter how much time passes between the events; I just want to collapse based on "runs" within the sequence of events.
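For contrast, a plain stats over the same sequence merges the non-contiguous runs, which is exactly what I don't want:

| stats count, min(_time) as min_time, max(_time) as max_time by name

That would return A with a count of 7, B with 5, C with 1, and Y with 3, instead of one row per run.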
I think I figured it out.
| streamstats count, min(_time) as min_time, max(_time) as max_time BY name reset_on_change=true
| eventstats max(count) as max_count by max_time, name
| where count=max_count
| convert ctime(*_time)
| table name, count, min_time, max_time
For the karma, feel free to paste this into an answer or provide a better way. Thanks.
Actually, this occasionally returns multiple consecutive results with the same name. Could it be that the eventstats is grabbing partial results from the streamstats command before "max_time" is fully resolved?
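If the duplicates do come from joining on max_time, one way to avoid the eventstats step entirely is to label each contiguous run with its own id and then run a plain stats over that id. This is an untested sketch; prev_name, run_change, and run_id are just illustrative field names, and the leading sort assumes you want the events processed in ascending time order:

| sort 0 _time
| streamstats current=f last(name) as prev_name
| eval run_change=if(isnull(prev_name) OR name!=prev_name, 1, 0)
| streamstats sum(run_change) as run_id
| stats count, min(_time) as min_time, max(_time) as max_time by run_id, name
| convert ctime(*_time)
| table name, count, min_time, max_time

Because run_id only increments when name changes between adjacent events, two separate runs of the same name get different ids, so the final stats can never merge them or split a single run.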