Splunk Search

Can I optimize an mvzip, mvappend, and mvexpand combination for creating additional events?

andrewtrobec
Motivator

Hello,

I am looking for optimization advice for a use case in which I need to create new event data and then calculate time delta between two timestamps.

I have a set of tasks for which I need to calculate the lifetime, either from open to close, or from open to now(). Each task can go through multiple state changes: New, Waiting, In Progress, Complete, Closed. The objective is to calculate the time that a task spends in each state (easy), and for tasks that are not yet closed, the time it has spent in its last state up until now() (difficult). I would like some advice on how to handle the tasks that are not closed.

I will now describe what I have currently done using a simplified data structure. Firstly the data:

id,state,timestamp,prev_timestamp
id001,New,2018-11-01 14:32:50,
id001,In Progress,2018-11-01 16:54:43,2018-11-01 14:32:50
id001,Complete,2018-11-02 09:23:01,2018-11-01 16:54:43
id001,Closed,2018-11-02 15:41:53,2018-11-02 09:23:01
id002,New,2018-11-01 16:04:36,
id002,Waiting,2018-11-01 16:54:23,2018-11-01 16:04:36
id002,In Progress,2018-11-05 11:12:51,2018-11-01 16:54:23

id001 is the simple case because it is closed, so I just calculate all the deltas; id002 is the more difficult case because it is still open. It's difficult because I first have to create a new event that represents the time between the last state change and now(), and it is here that I would like some advice. After saving the above data as sample.csv and loading it into Splunk, I currently use the following search to get what I'm after:

| inputlookup sample.csv
| streamstats count as running by id
| eventstats count as final by id
| eval last = if(running=final,1,0)
| eval now = strftime(now(),"%Y-%m-%d %H:%M:%S")
| table id state timestamp prev_timestamp running final last now
| eval timestamp = if(last=1 AND NOT state="Closed",mvappend(timestamp,mvzip(timestamp,now,",")),timestamp)
| mvexpand timestamp
| eval timestamp = split(timestamp,",")
| eval prev_timestamp = if(mvcount(timestamp)>1,mvindex(timestamp,0),prev_timestamp)
| eval timestamp = if(mvcount(timestamp)>1,mvindex(timestamp,1),timestamp)
| eval hours_in_state = round((strptime(timestamp,"%Y-%m-%d %H:%M:%S") - strptime(prev_timestamp,"%Y-%m-%d %H:%M:%S"))/3600,2)

To summarize what the search is doing:

  1. Use streamstats and eventstats combination to identify the last event for each task (field called last)
  2. For each task's last event, determine whether it is Closed
  3. If a task's last event is not Closed (such as for id002), use mvzip and mvexpand to create a new event
  4. Calculate time in hours
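
To make step 3 concrete, here is roughly what the search does to id002's last row (the timestamps are from the sample data; <now> stands in for the formatted now() value):

```
before mvappend:  timestamp = "2018-11-05 11:12:51"
after mvappend:   timestamp = ["2018-11-05 11:12:51",
                               "2018-11-05 11:12:51,<now>"]
after mvexpand:   two separate events, one per value
after split + mvindex:
  event 1: timestamp = "2018-11-05 11:12:51"  prev_timestamp = "2018-11-01 16:54:23"
  event 2: timestamp = <now>                  prev_timestamp = "2018-11-05 11:12:51"
```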

Step 3 is the most expensive operation because I have to concatenate two timestamps, create a multivalue, expand that multivalue to get the additional event, then split the timestamps again into their appropriate fields so that I can compute the delta. With two tasks it's fast, but I have more than 20,000 tasks, which sometimes causes the search to fail.

What I'd like to ask is: is there a less expensive and more elegant way to accomplish the creation of an additional event for tasks that are not closed?

Thank you in advance and best regards,

Andrew

1 Solution

jeffland
SplunkTrust
SplunkTrust

You don't need to add any events nor do any multivalue calculations, you have everything you need to know in your existing events. The idea here is the following: number the events per id with streamstats, find the number of events per id with eventstats, and check if the state of the last event of each id is "Closed". If it is, calculate the duration as usual, and use the difference to now() if it isn't.

| inputlookup sample.csv
| streamstats count as count_by_id by id | eventstats max(count_by_id) as max_count_by_id by id
| eval duration = if(state!="Closed" AND max_count_by_id=count_by_id, now()-strptime(timestamp, "%F %T"), strptime(timestamp, "%F %T")-strptime(prev_timestamp, "%F %T"))

This will give you the seconds to reach the given stage. You could certainly use other ways to find the last event of an id, like using | eventstats count instead of max(count_by_id), or something with latest... this is just the first thing that came to my mind.
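
A sketch of the `| eventstats count` variant mentioned above (untested; it assumes the lookup rows arrive in file order so `count_by_id` numbers the events chronologically, and the field names `events_by_id` and `is_last` are just for illustration):

```
| inputlookup sample.csv
| streamstats count as count_by_id by id
| eventstats count as events_by_id by id
| eval is_last = if(count_by_id=events_by_id, 1, 0)
| eval duration = if(is_last=1 AND state!="Closed",
    now() - strptime(timestamp, "%F %T"),
    strptime(timestamp, "%F %T") - strptime(prev_timestamp, "%F %T"))
```

For rows that are the last event of an id and not Closed, duration is the seconds spent in the current state so far; for all other rows it is the seconds spent in the previous state.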


andrewtrobec
Motivator

How did I not figure this out?! Thank you so much for the suggestion!


renjith_nair
Legend

@andrewtrobec,

First of all, I appreciate the effort you have put in to provide sample data and explain your requirements clearly 🙂

Try this and see if it works for you,

|inputlookup sample.csv
|eval _now= strftime(now(),"%Y-%m-%d %H:%M:%S")
|appendpipe [stats last(timestamp) as prev_timestamp,last(state) as state,last(_now) as timestamp by id]
|eval hours_in_state=round((strptime(timestamp,"%Y-%m-%d %H:%M:%S") - strptime(prev_timestamp,"%Y-%m-%d %H:%M:%S"))/3600,2)
|where NOT ( isnull(_now) AND state="Closed")

This eliminates all the mv* commands. I'm not sure how you get prev_timestamp and timestamp in your real search, but since we are using them in calculations, try not to format them with strftime before the calculation. If they are really needed in the end result, we can convert them at the end.
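
A minimal sketch of that idea, converting to epoch seconds once and keeping the calculation numeric (the field names ts and prev_ts are just for illustration; the where clause relies on the appended stats rows not carrying the original string timestamp field):

```
| inputlookup sample.csv
| eval ts = strptime(timestamp, "%Y-%m-%d %H:%M:%S")
| eval prev_ts = strptime(prev_timestamp, "%Y-%m-%d %H:%M:%S")
| appendpipe [ stats last(ts) as prev_ts, last(state) as state by id
    | eval ts = now() ]
| eval hours_in_state = round((ts - prev_ts)/3600, 2)
| where NOT (isnull(timestamp) AND state="Closed")
```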

---
What goes around comes around. If it helps, hit it with Karma 🙂