Splunk Search

How do I calculate the next rows based on the first value generated in a new column called 12day?

w344423
Explorer

I need to calculate the rest of the rows based on the first value generated in a new column called 12day.

The only solution I can think of is using streamstats inside foreach, but I get an error that I cannot use a non-streaming command. Here is my attempted SPL; I would like to ask the gurus here for help on how I can do this differently.

| foreach 12day
    [ | streamstats window=1 current=f last(12day) AS temp
      | eval 12day = round((switch*0.5)+(temp*0.025),2) ]

Example of the calculation done in Excel:

1. Row 12 is the average of rows 1-12:

=AVERAGE(A1:A12)

2. From row 13 onwards, it takes the current value of switch and combines it with the previous value of 12day.

3. The objective is to generate each subsequent 12day value from the previous one:

=ROUND((A13*(0.5))+(B12*(0.025)),2)

count   switch (col A)   12day (col B)
1       1
2       2
3       1
4       2
5       1
6       2
7       0
8       1
9       2
10      1
11      0
12      0                1.083333333
13      1                0.53
14      2                1.01
15      1                0.53
16      0                0.01
17      0                0
18      0                0
19      1                0.5
20      2                1.01
21      0                0.03
22      0                0
23      1                0.5
24      2                1.01
25      1                0.53
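To make the recurrence concrete, using only the values already in the table: row 12 is (1+2+1+2+1+2+0+1+2+1+0+0)/12 = 13/12 ≈ 1.083333333, and row 13 is then ROUND((1*0.5)+(1.083333333*0.025),2) = ROUND(0.527083,2) = 0.53, matching column B.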

 

1 Solution

ITWhisperer
SplunkTrust

Splunk SPL commands generally work one event at a time. There are exceptions to this which allow you to introduce values from other events, such as streamstats. In order to perform the calculations you outlined with SPL, you need to combine values from earlier events.

Given your simple example, the following seems to work

| streamstats window=12 avg(switch) as halfswitch
| eval halfswitch=if(count=12,halfswitch,null())
| eval halfswitch=if(count>12,switch*0.5,halfswitch)
| streamstats window=2 current=f last(halfswitch) as previoushalfswitch
| eval previoushalfswitch=previoushalfswitch*0.025
| eval new12day=if(count>12,round(previoushalfswitch+halfswitch,2),halfswitch)

The caveat to this is that because your values are low, and you are multiplying by 0.025 and rounding to 2 decimal places, the contribution of the value before the previous one is small and lost in the rounding. With higher values, this is likely to give a different result from what Excel produces. Also bear in mind that, unlike Excel, the commands operate on a pipeline of events, i.e. each command makes a single pass through the events, whereas Excel will make multiple passes to resolve the calculations.
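If you want to double-check how close this single-pass approximation gets to the spreadsheet, the exact Excel recurrence from the question can be reproduced outside Splunk. Below is a minimal Python sketch (not part of the SPL answer) that assumes the switch values from the table above; note that Python's round() uses banker's rounding, so exact .5 ties could round differently from Excel's ROUND.

# Reproduce the Excel recurrence from the question:
# B12 = AVERAGE(A1:A12), then Bn = ROUND(An*0.5 + B(n-1)*0.025, 2) for n >= 13.
switch = [1, 2, 1, 2, 1, 2, 0, 1, 2, 1, 0, 0,      # rows 1-12
          1, 2, 1, 0, 0, 0, 1, 2, 0, 0, 1, 2, 1]   # rows 13-25

twelve_day = {12: sum(switch[:12]) / 12}            # row 12 = 1.0833...
for row in range(13, len(switch) + 1):
    twelve_day[row] = round(switch[row - 1] * 0.5 + twelve_day[row - 1] * 0.025, 2)

for row, value in sorted(twelve_day.items()):
    print(row, value)

Comparing this output with the new12day column from the search above shows where (if anywhere) the rounding behaviour diverges for your data.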


w344423
Explorer

Thank you @ITWhisperer for this artful way of doing it; I have learned another new way to do such a calculation via a temp column. Noted on your point about the 2 decimal places, and thanks for highlighting that.

The solution I am coding in Splunk will be performing batch processing, taking a snapshot every 5 minutes.


