Splunk Search

Create a backlog summary with the latest status

francoisternois
Path Finder

Hi there,

I'm trying to write a search that looks at the latest status of a given actionid each day, to build a kind of day-by-day backlog (even if no action was done on that day). To be clearer, let's say we have these events:

12/01/2021 actionid=actionid1 status=start
12/02/2021 actionid=actionid2 status=start
12/06/2021 actionid=actionid1 status=sent
12/08/2021 actionid=actionid2 status=sent
12/09/2021 actionid=actionid1 status=done
12/10/2021 actionid=actionid2 status=done

The status of a given action doesn't evolve until it changes, but I have no events in between changes. The result I want looks something like this (the backlog of actionids per status, day by day):

            start  sent  done
12/01/2021      1     0     0
12/02/2021      2     0     0
12/03/2021      2     0     0
12/04/2021      2     0     0
12/05/2021      2     0     0
12/06/2021      1     1     0
12/07/2021      1     1     0
12/08/2021      0     2     0
12/09/2021      0     1     1
12/10/2021      0     0     2
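
In case it helps with testing, the sample events above can be faked with a run-anywhere snippet along these lines (actionid and status are the real field names from my events; row and event_date are just helper fields for the sketch):

| makeresults count=6
| streamstats count AS row
| eval _raw=case(
    row=1, "12/01/2021 actionid=actionid1 status=start",
    row=2, "12/02/2021 actionid=actionid2 status=start",
    row=3, "12/06/2021 actionid=actionid1 status=sent",
    row=4, "12/08/2021 actionid=actionid2 status=sent",
    row=5, "12/09/2021 actionid=actionid1 status=done",
    row=6, "12/10/2021 actionid=actionid2 status=done")
| rex "^(?<event_date>\S+)\s+actionid=(?<actionid>\S+)\s+status=(?<status>\S+)"
| eval _time=strptime(event_date, "%m/%d/%Y")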

 

I hope my request is clear enough. Any help would be great 🙂

Best regards,

Francois


somesoni2
Revered Legend

Give this a try

Your base search
|  timechart span=1d count by status usenull=f 
|  eval start=if(sent>0, start-sent, start) 
|  accum start 
|  eval sent=if(done>0, sent-done, sent) 
|  accum sent 
|  accum done
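
To explain what it does: timechart counts how many actions entered each status on each day, the two evals turn those counts into net daily changes (an action counted under sent that day leaves start, and one counted under done leaves sent), and accum turns the daily changes into running backlog totals. Here is a rough run-anywhere check against the sample data from the question (field names as in the question; run it over a time range covering the sample dates, e.g. All time, so the daily buckets line up):

| makeresults count=6
| streamstats count AS row
| eval _raw=case(row=1, "12/01/2021 actionid=actionid1 status=start", row=2, "12/02/2021 actionid=actionid2 status=start", row=3, "12/06/2021 actionid=actionid1 status=sent", row=4, "12/08/2021 actionid=actionid2 status=sent", row=5, "12/09/2021 actionid=actionid1 status=done", row=6, "12/10/2021 actionid=actionid2 status=done")
| rex "^(?<event_date>\S+)\s+actionid=\S+\s+status=(?<status>\S+)"
| eval _time=strptime(event_date, "%m/%d/%Y")
| timechart span=1d count by status usenull=f
| eval start=if(sent>0, start-sent, start)
| accum start
| eval sent=if(done>0, sent-done, sent)
| accum sent
| accum done

If the buckets line up, the output should match the table in the question. This is only a sketch against the sample data; on real data the usual caveat is that each actionid should emit one event per status change.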

johnhuang
Motivator

There are probably better ways, here's my stab at it:

 

| makeresults 
| eval _raw="event_time=12/01/2021 actionid=actionid1 status=start,
event_time=12/02/2021 actionid=actionid2 status=start,
event_time=12/06/2021 actionid=actionid1 status=sent,
event_time=12/08/2021 actionid=actionid2 status=sent,
event_time=12/09/2021 actionid=actionid1 status=done,
event_time=12/10/2021 actionid=actionid2 status=done"
| eval events=SPLIT(_raw, ",") | mvexpand events
| rex field=events "event_time=(?<event_time>[^\s]*)\sactionid=(?<actionid>[^\s]*)\sstatus=(?<status>[^\s]*)"
| eval _time=strptime(event_time, "%m/%d/%Y")

| bucket _time span=1d | eval epoch_time=_time
| append [| makeresults | eval date_range=mvrange(relative_time(now(), "-365d"), relative_time(now(), "-0d"), 86400) | mvexpand date_range | eval _time=date_range | bucket _time span=1d | table _time]
| eventstats min(epoch_time) AS start_time max(epoch_time) AS end_time
| where _time>=start_time AND _time<=end_time
| eval {status}=1 
| eval start=IF(sent=1, sent * -1, start)  
| eval sent=IF(done=1, done * -1, sent)  
| sort 0 _time 
| fillnull value=0 start sent done
| streamstats min(_time) AS min_time max(_time) AS max_time sum(start) AS start sum(sent) AS sent sum(done) AS done
| dedup _time | table _time start sent done

 
