I have log lines that look like this:
mm-dd-yyyy hh:mm:ss Item counts: 1000 Process ID: 12345
...
mm-dd-yyyy hh:mm:ss Save time: 34.75 seconds Process ID: 12345
...
mm-dd-yyyy hh:mm:ss Item counts: 2500 Process ID: 7890
...
mm-dd-yyyy hh:mm:ss Save time: 30.16 seconds Process ID: 7890
My goal is to find that the item count increased by 1500 from the previous event to the current event, and that it took 30.16 seconds to save those 1500 new items. There are many of these log lines, and I can group them by Process ID. How can I accomplish this?
Appreciate any help.
For simple cases like this you can apply the delta command. However, delta does not work so well when you also need to split the calculation by a field, in this case Process ID.
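For reference, here is a minimal sketch of the delta approach when you only care about a single process. The field names process_id and item_count are assumptions on my part - they would need to be extracted from your events first (see the rex sketch at the end of this answer) - and the phrase filter just keeps the item-count events:

<your search> "Item counts:" process_id=12345
| sort 0 _time
| delta item_count AS delta_count

The sort puts the events in chronological order so the differences come out positive. Because delta always compares against the previous event in the result set and has no by clause, it cannot handle several Process IDs interleaved in one search, which is where streamstats comes in.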
One approach that can work is using the streamstats command. Following your example, and assuming your data is (fairly) regular - that is, your events always come in pairs with a Save time and an Item count - something like this should work:
<your search>
| streamstats current=true global=false window=3
first(save_time) as first_save_time,
first(item_count) as first_item_count,
last(save_time) as last_save_time,
last(item_count) as last_item_count
by process_id
| eval delta_time = first_save_time - last_save_time
| eval delta_count = first_item_count - last_item_count
There are some wrinkles here - for example, the save time values are logged as text like "34.75 seconds", so they may not be subtractable until the number is extracted on its own. But this should be a good start toward where you need to be.
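If item_count, save_time, and process_id are not already extracted as fields, here is a sketch of one way to pull them out with rex, assuming your events look exactly like the samples in the question (the field names and regular expressions are mine, not anything from your configuration):

<your search>
| rex field=_raw "Item counts:\s+(?<item_count>\d+)"
| rex field=_raw "Save time:\s+(?<save_time>[\d.]+)\s+seconds"
| rex field=_raw "Process ID:\s+(?<process_id>\d+)"

Placing those rex commands between <your search> and the streamstats command gives you numeric item_count and save_time values, so the two eval subtractions at the end produce clean numbers.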