Hello all,
I have a set of data as below. Each column holds the value of one id over time:
_time | id = 12345 | id = 12347 | id = 12349 |
01-Feb | 10 | 20 | 5 |
02-Feb | 12 | 45 | 9 |
03-Feb | 15 | 53 | 12 |
04-Feb | 17 | | |
05-Feb | | | |
06-Feb | 120 | | |
07-Feb | 140 | 56 | |
08-Feb | 57 | 150 | 60 |
09-Feb | 60 | 155 | 75 |
10-Feb | 70 | 175 | 90 |
I would like to calculate the delta for each id and then fill in the null deltas.
With the search below I can calculate the delta for each id, but I am still looking for a way to fill the null deltas:
index="index" [| inputlookup test.csv
| search id=1234**]
| timechart latest(value) as valLast span=1d by id
| untable _time id valLast
| streamstats current=false window=1 global=false first(valLast) as p_valLast by id
| eval delta=valLast-p_valLast
| xyseries _time id delta
| streamstats count(eval(if(isnull(delta),0,null()))) as count by id
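Outside Splunk, the delta step in that search can be sketched in plain Python (a hypothetical illustration, not part of the search itself): for each id, the delta is the current value minus the last non-null value, which mirrors the `streamstats`/`eval` pair above.

```python
def deltas_by_id(series):
    """series: {id: [value, ...]} in time order, None for missing days.
    Returns {id: [delta, ...]}; the first delta per id is None, like
    the first row produced by streamstats current=false window=1."""
    out = {}
    for ident, values in series.items():
        prev = None
        row = []
        for v in values:
            # no delta until we have both a previous and a current value
            row.append(None if prev is None or v is None else v - prev)
            if v is not None:
                prev = v  # keep the last non-null value as "previous"
        out[ident] = row
    return out
```

For example, `deltas_by_id({"12345": [10, 12, 15]})` gives `{"12345": [None, 2, 3]}`, matching the first rows of the result table below.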
Result: the columns show the delta values for each id over time:
_time | id = 1 | id = 2 | id = 3 |
01-Feb | | | |
02-Feb | 2 | 25 | 4 |
03-Feb | 3 | 8 | 3 |
04-Feb | 2 | | |
05-Feb | | | |
06-Feb | 120 | | |
07-Feb | 20 | 56 | |
08-Feb | 57 | 10 | 4 |
09-Feb | 3 | 5 | 15 |
10-Feb | 10 | 20 | 15 |
Thanks in advance!
I am not 100% clear what it is you are trying to achieve but does this help?
index="index" [| inputlookup test.csv
| search id=1234**]
| timechart latest(value) as valLast span=1d by id
| untable _time id valLast
| streamstats current=false window=1 global=false first(valLast) as p_valLast by id
| eval delta=valLast-p_valLast
| fillnull value=0
| xyseries _time id delta
Hi @ITWhisperer ,
Thanks for your feedback and sorry it's not clear.
Here is the original data (based on the deltas).
And here is what I aim to do:
Take the value of 08 Feb and divide it by the total number of null bars plus 1 (for 08 Feb itself), then use the result to refill the nulls and 08 Feb.
The other ids should be handled the same way.
Thanks
Is valLast always the same as or higher than the previous value for each id?
Assuming ascending values and events in time order, try something like this:
``` Assuming your search gives events in time order ```
``` fill nulls with -1 (so they can be detected after untable) ```
| fillnull value=-1
``` untable so events can be processed by id ```
| untable _time id valLast
``` split off original null fields ```
| eval null=if(valLast=-1,1,0)
| eval valLast=if(valLast=-1,null(),valLast)
``` filldown using max (assumes valLast doesn't decrease) ```
| streamstats max(valLast) as valLast by id
``` find change in valLast and detect start and end of sequence of nulls ```
| streamstats range(valLast) as diff range(null) as nulls window=2 global=f by id
``` count nulls by id ```
| streamstats sum(null) as nullnumber global=f by id
``` calculate null number at start of sequence ```
| eval start=if(null=1 AND nulls=1,nullnumber,null())
``` calculate null number at end of sequence ```
| eval end=if(null=0 AND nulls=1,nullnumber,null())
``` filldown null number by id ```
| streamstats max(start) as start by id
``` calculate number of events to spread the difference over ```
| eval nullsplusone=end-start+1+1
``` spread the difference across nulls and end of sequence ```
| eval diffspread=diff/nullsplusone
``` reverse events ```
| reverse
``` filldown spread diff by id ```
| streamstats last(diffspread) as lastdiff by id
``` calculate new difference based on whether first non-null after a sequence or originally null ```
| eval newdiff=if(isnotnull(end) OR null=1, lastdiff, diff)
``` reverse to original order ```
| reverse
``` rechart by time and id ```
| xyseries _time id newdiff
I've added comments that will hopefully make it clear what's going on.
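End to end, the same pipeline can be sketched in Python for one id's raw values (a hypothetical illustration of the SPL above, under the same assumption that values never decrease): fill gaps down with the running max, take day-over-day diffs, then spread each diff that follows a null run over the run plus the next day.

```python
def spread_diffs(values):
    """values: one id's valLast series in time order; None = missing.
    Returns the per-day deltas with gap deltas spread evenly."""
    # remember which slots were originally null (the eval null=... step)
    was_null = [v is None for v in values]
    # filldown using the running max (streamstats max(valLast) by id)
    filled, running = [], None
    for v in values:
        running = v if running is None else (running if v is None else max(running, v))
        filled.append(running)
    # day-over-day diff (range(valLast) over a 2-row window in the SPL)
    diffs = [None] + [b - a for a, b in zip(filled, filled[1:])]
    # spread each diff that closes a run of nulls over the run + 1 days
    out = list(diffs)
    run_start = None
    for i, n in enumerate(was_null):
        if n and run_start is None:
            run_start = i
        elif not n and run_start is not None:
            share = out[i] / (i - run_start + 1)
            for j in range(run_start, i + 1):
                out[j] = share
            run_start = None
    return out
```

With the first id's data from the question, `spread_diffs([10, 12, 15, 17, None, 120, 140])` gives `[None, 2, 3, 2, 51.5, 51.5, 20]`: the 103 gained across the gap is split over the null day and 06 Feb.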