Splunk Search

How to find the time difference between the current and previous event per field?

rodrigorenie
Explorer

Hello everyone.

I'm using "eventstats" to add the average of a certain field to every event that Splunk collects, something like this:

host=host1 index=itcam app=* | eventstats avg(answer) as answer_avg by app

The result is something like this:

11/26/14 12:00:00   host='host1';app='app1';answer=90.00;answer_avg=80
11/26/14 11:50:00   host='host1';app='app2';answer=5.00;answer_avg=5.6
11/26/14 11:40:00   host='host1';app='app3';answer=10.00;answer_avg=11
11/26/14 11:30:00   host='host1';app='app1';answer=80.00;answer_avg=80
11/26/14 11:20:00   host='host1';app='app2';answer=4.00;answer_avg=5.6
11/26/14 11:10:00   host='host1';app='app3';answer=12.00;answer_avg=11
11/26/14 11:00:00   host='host1';app='app1';answer=70.00;answer_avg=80
11/26/14 10:50:00   host='host1';app='app2';answer=8.00;answer_avg=5.6
11/26/14 10:40:00   host='host1';app='app3';answer=11.00;answer_avg=11

Which works great, because I have the average answer per "app" field in every event.

What I need to do now is calculate the time elapsed between each event and the previous event, again separated by the "app" field.

For example, in the above result, the latest event is from "app1" and occurred at "12:00". The previous "app1" event occurred at "11:30", which means the latest "app1" event (at 12:00) came 30 minutes after the last one (at 11:30).

I would like to create a field, called "delay" (for example), in every event, including the latest one, with the time difference in seconds (or minutes) between an event and its predecessor PER APP, resulting in something like this:

 11/26/14 12:00:00   host='host1';app='app1';answer=90.00;answer_avg=80;delay=1800
 11/26/14 11:50:00   host='host1';app='app2';answer=5.00;answer_avg=5.6;delay=1800
 11/26/14 11:40:00   host='host1';app='app3';answer=10.00;answer_avg=11;delay=1800
 11/26/14 11:30:00   host='host1';app='app1';answer=80.00;answer_avg=80;delay=1800
 11/26/14 11:20:00   host='host1';app='app2';answer=4.00;answer_avg=5.6;delay=1800
 11/26/14 11:10:00   host='host1';app='app3';answer=12.00;answer_avg=11;delay=1800
 11/26/14 11:00:00   host='host1';app='app1';answer=70.00;answer_avg=80;delay=1800
 11/26/14 10:50:00   host='host1';app='app2';answer=8.00;answer_avg=5.6;delay=1800
 11/26/14 10:40:00   host='host1';app='app3';answer=11.00;answer_avg=11;delay=1800

In this simplified example, every app's events are exactly 30 minutes (1800 seconds) apart.

Thanks in advance!!

1 Solution

aweitzman
Motivator

Well, either the earliest or the latest one has to have no value, right? If you want the latest one to have values (but obviously, the earliest ones will not), sort and resort the data and flip the calculation:

host=host1 index=itcam app=*
| eventstats avg(answer) as answer_avg by app
| sort 0 _time
| streamstats current=f last(_time) as LastTime by app
| eval delay=_time-LastTime
| sort 0 -_time
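If it helps to see the mechanics outside of Splunk, here is a rough Python sketch (with made-up sample data) of the logic this query implements: sort oldest-first, track the previous timestamp per app, subtract, then restore the newest-first order. This is only an illustration of the streaming calculation, not Splunk code:

```python
from datetime import datetime

# Made-up sample events (order doesn't matter; we sort below).
events = [
    {"app": "app1", "_time": datetime(2014, 11, 26, 12, 0)},
    {"app": "app2", "_time": datetime(2014, 11, 26, 11, 50)},
    {"app": "app1", "_time": datetime(2014, 11, 26, 11, 30)},
    {"app": "app2", "_time": datetime(2014, 11, 26, 11, 20)},
]

# | sort 0 _time : process oldest-first
events.sort(key=lambda e: e["_time"])

# | streamstats current=f last(_time) as LastTime by app
# | eval delay=_time-LastTime : seconds since the previous (older) event per app
prev = {}
for e in events:
    last_time = prev.get(e["app"])
    e["delay"] = (e["_time"] - last_time).total_seconds() if last_time else None
    prev[e["app"]] = e["_time"]

# | sort 0 -_time : back to newest-first
events.sort(key=lambda e: e["_time"], reverse=True)
print([e["delay"] for e in events])  # [1800.0, 1800.0, None, None]
```

Because the stream runs oldest-first, the latest event of each app ends up with a delay value and the earliest one has none.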


aweitzman
Motivator

There's a way to use streamstats where it works on the previous item, not the current one, by using current=f. If you add such an item to your results, you'll have what you need to do a time difference, like so:

host=host1 index=itcam app=*
| eventstats avg(answer) as answer_avg by app
| streamstats current=f last(_time) as LastTime by app
| eval delay=LastTime-_time
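To make the streaming behavior concrete, here is a rough Python sketch (with made-up sample data) of what streamstats with current=f does per group when events arrive newest-first, Splunk's default order. This is only an illustration, not Splunk code:

```python
from datetime import datetime

# Made-up sample events, newest first (Splunk's default ordering).
events = [
    {"app": "app1", "_time": datetime(2014, 11, 26, 12, 0)},
    {"app": "app2", "_time": datetime(2014, 11, 26, 11, 50)},
    {"app": "app1", "_time": datetime(2014, 11, 26, 11, 30)},
    {"app": "app2", "_time": datetime(2014, 11, 26, 11, 20)},
]

# streamstats current=f last(_time) as LastTime by app:
# LastTime is the _time of the previously streamed (i.e. more
# recent) event with the same app; the first event of each app
# in the stream has no LastTime.
prev = {}
for e in events:
    last_time = prev.get(e["app"])
    e["delay"] = (last_time - e["_time"]).total_seconds() if last_time else None
    prev[e["app"]] = e["_time"]

print([e["delay"] for e in events])  # [None, None, 1800.0, 1800.0]
```

Note that with newest-first input, the most recent event of each app gets no delay value, which is what the follow-up below observes.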

Hope this helps!

rodrigorenie
Explorer

Thanks for your answer, but I had tried that before, and that search gives me the following result:

 11/26/14 12:00:00   host='host1';app='app1';answer=90.00;answer_avg=80
 11/26/14 11:50:00   host='host1';app='app2';answer=5.00;answer_avg=5.6
 11/26/14 11:40:00   host='host1';app='app3';answer=10.00;answer_avg=11
 11/26/14 11:30:00   host='host1';app='app1';answer=80.00;answer_avg=80;delay=1800
 11/26/14 11:20:00   host='host1';app='app2';answer=4.00;answer_avg=5.6;delay=1800
 11/26/14 11:10:00   host='host1';app='app3';answer=12.00;answer_avg=11;delay=1800
 11/26/14 11:00:00   host='host1';app='app1';answer=70.00;answer_avg=80;delay=1800
 11/26/14 10:50:00   host='host1';app='app2';answer=8.00;answer_avg=5.6;delay=1800
 11/26/14 10:40:00   host='host1';app='app3';answer=11.00;answer_avg=11;delay=1800

That is, the latest event of each app gets no "delay" value; the field only appears starting from the second-latest event.


aweitzman
Motivator

I promoted the one that worked for you from comment to answer (somehow it lost your response to it).
