Hello everyone.
I'm using "eventstats" to generate the average of a certain field in every event that Splunk collects, something like this:
host=host1 index=itcam app=* | eventstats avg(answer) as answer_avg by app
The result is something like this:
11/26/14 12:00:00 host='host1';app='app1';answer=90.00;answer_avg=80
11/26/14 11:50:00 host='host1';app='app2';answer=5.00;answer_avg=5.6
11/26/14 11:40:00 host='host1';app='app3';answer=10.00;answer_avg=11
11/26/14 11:30:00 host='host1';app='app1';answer=80.00;answer_avg=80
11/26/14 11:20:00 host='host1';app='app2';answer=4.00;answer_avg=5.6
11/26/14 11:10:00 host='host1';app='app3';answer=12.00;answer_avg=11
11/26/14 11:00:00 host='host1';app='app1';answer=70.00;answer_avg=80
11/26/14 10:50:00 host='host1';app='app2';answer=8.00;answer_avg=5.6
11/26/14 10:40:00 host='host1';app='app3';answer=11.00;answer_avg=11
This works great, because I get the average answer per "app" field in every event.
What I need now is to calculate how much time elapsed between each event and the previous event of the same "app".
For example, in the result above, the latest event is from "app1", which occurred at "12:00". The previous "app1" event occurred at "11:30", which means the latest "app1" event (at 12:00) came 30 minutes after the previous one (at 11:30).
I would like to create a field, called "delay" (for example), in every event, including the latest one, with the time difference in seconds (or minutes) between an event and its predecessor PER APP, resulting in something like this:
11/26/14 12:00:00 host='host1';app='app1';answer=90.00;answer_avg=80;delay=1800
11/26/14 11:50:00 host='host1';app='app2';answer=5.00;answer_avg=5.6;delay=1800
11/26/14 11:40:00 host='host1';app='app3';answer=10.00;answer_avg=11;delay=1800
11/26/14 11:30:00 host='host1';app='app1';answer=80.00;answer_avg=80;delay=1800
11/26/14 11:20:00 host='host1';app='app2';answer=4.00;answer_avg=5.6;delay=1800
11/26/14 11:10:00 host='host1';app='app3';answer=12.00;answer_avg=11;delay=1800
11/26/14 11:00:00 host='host1';app='app1';answer=70.00;answer_avg=80;delay=1800
11/26/14 10:50:00 host='host1';app='app2';answer=8.00;answer_avg=5.6;delay=1800
11/26/14 10:40:00 host='host1';app='app3';answer=11.00;answer_avg=11;delay=1800
In this simplified example, every app happens to run exactly 30 minutes (1800 seconds) after its previous run.
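One approach that might work here (a sketch, not tested against this data) is streamstats, which, unlike eventstats, computes values cumulatively over the event stream. Sorting by _time ascending first so that each event's preceding stream event is its chronological predecessor, then pulling the previous _time per app:

host=host1 index=itcam app=*
| eventstats avg(answer) as answer_avg by app
| sort 0 _time
| streamstats current=f window=1 last(_time) as prev_time by app
| eval delay = _time - prev_time

Here current=f excludes the current event, window=1 keeps only the immediately preceding event per app, and delay comes out in seconds (divide by 60 for minutes). Note the first event of each app would have no predecessor, so its delay would be null rather than 1800.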
Thanks in advance!!