
Delta over Multiple SIDs

markschoonover
Explorer

In my raw data I have a lot of values for a field called "sid". For each of those values I want to calculate the delta of another field "Number" from the previous value for that sid. Then I need to do a little eval on it and then timechart the values for each sid.

Basically I want to do this:

index=foo sid="bar" | delta p=2 Number AS num | eval num=abs(num/300) | timechart avg(num)

BUT that only does the work for one sid, "bar". I want to do it all in one search for all values of "sid", i.e. get that sid="bar" filter out of there.

I've tried the foreach command, but I'm doing something wrong because I keep getting an error like "non-streaming commands not allowed".
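For reference, since delta doesn't take a by clause, I'm guessing the fix involves a different command such as streamstats. A rough single-sid equivalent of my delta search might look like this (sort 0 lifts sort's 10,000-event cap, and window=2 stands in for p=2 after an ascending sort; this is just a sketch, not tested):

index=foo sid="bar"
| sort 0 _time
| streamstats current=f window=2 earliest(Number) as prev
| eval num=abs((Number-prev)/300)
| timechart avg(num)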


niketn
Legend

@markschoonover, can you add some sample data (mocked) for foo and bar that should be correlated?

The following run-anywhere search mocks daily data with sids "foo" and "bar" and an incrementing Number field.

You can use the streamstats command to compare the current Number with the previous one for each sid. However, streamstats requires that your data is sorted by sid and _time. Since streamstats evaluates events in a streaming manner, you can use current=f and window=1 to get the previous value of Number. PS: If you don't sort by _time ascending (this example uses ascending data), you might get the next value instead of the previous one.

|  makeresults
|  eval _time=strptime("2017/11/01","%Y/%m/%d"), sid="bar", Number=12234
|  append
    [|  makeresults
    |  eval _time=strptime("2017/11/02","%Y/%m/%d"), sid="foo", Number=12212]
|  append
    [|  makeresults
    |  eval _time=strptime("2017/11/03","%Y/%m/%d"), sid="bar", Number=12546]
|  append
    [|  makeresults
    |  eval _time=strptime("2017/11/05","%Y/%m/%d"), sid="foo", Number=12241]
|  append
    [|  makeresults
    |  eval _time=strptime("2017/11/06","%Y/%m/%d"), sid="foo", Number=12371]
|  append
    [|  makeresults
    |  eval _time=strptime("2017/11/07","%Y/%m/%d"), sid="foo", Number=12497]
|  append
    [|  makeresults
    |  eval _time=strptime("2017/11/08","%Y/%m/%d"), sid="bar", Number=12876]
|  table _time sid Number
|  sort sid _time
|  streamstats current=f window=1 values(Number) as Prev by sid
|  eval num=round((Number-Prev)/300,1)
|  fillnull value=0 num
|  timechart span=1d avg(num) as num by sid

PS: I have used span=1d since I mocked daily data; you should change it to suit your data. The commands up to | table ... are only there to mock sample data. In case the provided query does not work, you would need to provide some sample data with Number field values and _time (based on your current query and explanation, it seems like Number always increases for each sid).
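Applied back to your original search, the same pattern would look roughly like the following (index=foo, Number, and the abs()/300 scaling are carried over from your question; this is a sketch, assuming your events carry sid and Number as described):

index=foo
| sort 0 sid _time
| streamstats current=f window=1 values(Number) as Prev by sid
| eval num=abs((Number-Prev)/300)
| timechart avg(num) by sid

The by sid in streamstats replaces the hard-coded sid="bar", and timechart ... by sid then charts each sid as its own series.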

Please refer to the documentation for more details on streamstats: http://docs.splunk.com/Documentation/Splunk/latest/SearchReference/Streamstats

____________________________________________
| makeresults | eval message= "Happy Splunking!!!"

markschoonover
Explorer

@niketnilay

_time   sid Number
2017-11-09T13:00:16.000-0500    d_1 1564882
2017-11-09T13:00:15.000-0500    d_2 40410
2017-11-09T13:00:13.000-0500    d_3 320889027
2017-11-09T13:00:13.000-0500    d_4 48572
2017-11-09T13:00:13.000-0500    d_5 8529413
2017-11-09T13:00:12.000-0500    d_6 0
2017-11-09T13:00:11.000-0500    d_7 1393886
2017-11-09T13:00:10.000-0500    d_8 0
2017-11-09T13:00:09.000-0500    d_9 4235
2017-11-09T13:00:09.000-0500    d_10    11236
2017-11-09T13:00:07.000-0500    d_11    3
2017-11-09T13:00:07.000-0500    d_12    152821
2017-11-09T13:00:05.000-0500    d_13    0
2017-11-09T13:00:05.000-0500    d_14    302444
2017-11-09T13:00:05.000-0500    d_15    848404
2017-11-09T13:00:04.000-0500    d_16    183481075
2017-11-09T13:00:04.000-0500    d_17    31425241
2017-11-09T13:00:04.000-0500    d_18    10141571
2017-11-09T13:00:04.000-0500    d_19    283724658
2017-11-09T13:00:03.000-0500    d_20    6550
2017-11-09T13:00:03.000-0500    d_21    1525803
2017-11-09T13:00:03.000-0500    d_22    40036
2017-11-09T12:55:11.000-0500    d_7 1385927
2017-11-09T12:55:10.000-0500    d_8 0
2017-11-09T12:55:09.000-0500    d_9 4222
2017-11-09T12:55:09.000-0500    d_10    11110
2017-11-09T12:55:07.000-0500    d_11    3
2017-11-09T12:55:07.000-0500    d_12    152420
2017-11-09T12:55:05.000-0500    d_13    0
2017-11-09T12:55:05.000-0500    d_14    301228
2017-11-09T12:55:05.000-0500    d_15    844113
2017-11-09T12:55:04.000-0500    d_16    182795189
2017-11-09T12:55:04.000-0500    d_17    31303348
2017-11-09T12:55:04.000-0500    d_18    10100182
2017-11-09T12:55:04.000-0500    d_19    282271624
2017-11-09T12:55:03.000-0500    d_20    6533
2017-11-09T12:55:03.000-0500    d_21    1521699
2017-11-09T12:55:03.000-0500    d_22    39939
2017-11-09T12:55:01.000-0500    d_23    11083128
2017-11-09T12:55:01.000-0500    d_24    72696
2017-11-09T12:55:00.000-0500    d_25    4371
2017-11-09T12:55:00.000-0500    d_26    8659590
2017-11-09T12:55:16.000-0500    d_1 1559015
2017-11-09T12:55:15.000-0500    d_2 40263
2017-11-09T12:55:13.000-0500    d_3 319619261
2017-11-09T12:55:13.000-0500    d_4 48525
2017-11-09T12:55:13.000-0500    d_5 8507144
2017-11-09T12:55:12.000-0500    d_6 0
2017-11-09T13:00:01.000-0500    d_23    11180304
2017-11-09T13:00:01.000-0500    d_24    72955
2017-11-09T13:00:00.000-0500    d_25    4389
2017-11-09T13:00:00.000-0500    d_26    8669888
2017-11-09T12:50:05.000-0500    d_13    0
2017-11-09T12:50:05.000-0500    d_14    300057
2017-11-09T12:50:05.000-0500    d_15    840077
2017-11-09T12:50:04.000-0500    d_16    182174385
2017-11-09T12:50:04.000-0500    d_17    31183270
2017-11-09T12:50:04.000-0500    d_18    10061877
2017-11-09T12:50:04.000-0500    d_19    280821692
2017-11-09T12:50:03.000-0500    d_20    6520
2017-11-09T12:50:03.000-0500    d_21    1517609
2017-11-09T12:50:03.000-0500    d_22    39863
2017-11-09T12:50:01.000-0500    d_23    10985881
2017-11-09T12:50:01.000-0500    d_24    72423
2017-11-09T12:50:00.000-0500    d_25    4357
2017-11-09T12:50:00.000-0500    d_26    8633600
2017-11-09T12:50:16.000-0500    d_1 1553673
2017-11-09T12:50:15.000-0500    d_2 40147
2017-11-09T12:50:13.000-0500    d_3 318310620
2017-11-09T12:50:13.000-0500    d_4 48456
2017-11-09T12:50:13.000-0500    d_5 8484623
2017-11-09T12:50:12.000-0500    d_6 0
2017-11-09T12:50:11.000-0500    d_7 1377540
2017-11-09T12:50:10.000-0500    d_8 0
2017-11-09T12:50:09.000-0500    d_9 4212
2017-11-09T12:50:09.000-0500    d_10    10993
2017-11-09T12:50:07.000-0500    d_11    3
2017-11-09T12:50:07.000-0500    d_12    151986
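Since these are polled roughly every five minutes per sid, I'm assuming your search would adapt to something like this (limit=0 so all sids chart rather than just the default top ten; again just my sketch):

index=foo
| sort 0 sid _time
| streamstats current=f window=1 values(Number) as Prev by sid
| eval num=abs((Number-Prev)/300)
| fillnull value=0 num
| timechart limit=0 span=5m avg(num) by sid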