Hi everyone,
I am new to Splunk. Could someone help me with a search for the query below? That would be great!
I have a search that compares last week with the current week and shows the count of each status on a weekly basis:
Date | Active | Inactive | Deleted | Added |
09/10/21 | 50 | 20 | 5 | 20 |
I send the above result to a lookup file every week to build the summary (| outputlookup append=true summary.csv).
The lookup file looks like this after 3 weeks:
Date | Active | Inactive | Deleted | Added |
09/10/2021 | 50 | 20 | 5 | 20 |
16/10/2021 | 55 | 15 | 10 | 30 |
23/10/2021 | 60 | 10 | 8 | 15 |
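For context, the tail of the weekly search that appends to this lookup might look like the sketch below. The status-counting part is not shown in the question, so everything before the eval is a placeholder; the lookup name summary.csv and the dd/mm/yyyy date format are taken from the samples above.

```
<your weekly search producing Active, Inactive, Deleted, Added counts>
| eval Date=strftime(now(), "%d/%m/%Y")
| table Date Active Inactive Deleted Added
| outputlookup append=true summary.csv
```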
The Date column keeps growing each week, and I always need to calculate the difference between the last two columns and add a new column showing that difference. The result I am expecting is below.
Status | 09/10/2021 | 16/10/2021 | 23/10/2021 | Difference b/w last 2 columns |
Active | 50 | 55 | 60 | 5 |
Inactive | 20 | 15 | 10 | -5 |
Deleted | 5 | 10 | 8 | -2 |
Added | 20 | 30 | 15 | -15 |
The part before the blank lines sets up dummy data and would be replaced by your search/inputlookup
The process: use streamstats to list the values from the previous and current row, then drop the extra values for all but the last row. Then mvexpand the last row, resetting the first copy to its original state. The last row now holds both values (note the "-" prepended to the second value), which can be summed to give the difference. Finally, transpose to get the required layout.
Note the use of leading underscores in some of the field names to hide them from the foreach.
| makeresults
| eval _raw="Date Active Inactive Deleted Added
09/10/2021 50 20 5 20
16/10/2021 55 15 10 30
23/10/2021 60 10 8 15"
| multikv forceheader=1
| table Date Active Inactive Deleted Added
| streamstats list(*) as * window=2
| streamstats count as _row
| eventstats max(_row) as _last
| foreach *
[| eval <<FIELD>>=if(_row=_last,mvappend(mvindex('<<FIELD>>',1),"-".mvindex('<<FIELD>>',0)),if(mvcount('<<FIELD>>')>1,mvindex('<<FIELD>>',1),'<<FIELD>>'))]
| mvexpand Date
| streamstats count as _copy by _row
| eval Date=if(_copy=2,"Difference",Date)
| rename Date as _Date
| foreach *
    [| eval <<FIELD>>=if(mvcount('<<FIELD>>')=1,'<<FIELD>>',if(_copy=1,mvindex('<<FIELD>>',0),'<<FIELD>>'))]
| eventstats sum(*) as * by _row _copy
| rename _Date as Date
| fields - _*
| transpose 0 header_field=Date column_name=Status
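To run this against the real lookup instead of the dummy data, the makeresults/eval/multikv block at the top would be replaced with the lookup itself (lookup name taken from the question), and the rest of the pipeline from streamstats onward stays unchanged:

```
| inputlookup summary.csv
| table Date Active Inactive Deleted Added
```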
Thanks @ITWhisperer.
Great! Awesome, it worked as expected.
I don't understand - the second table is transposed versus the first one. So what exactly do you want?
Hello,
From the lookup file I need to transpose first, then find the difference of the last two columns (last 2 weeks) in a new column called Difference.
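A minimal sketch of that transpose-first idea, assuming the lookup is summary.csv with the columns shown above and its rows are in date order (untested): after transpose, each week's date becomes a column, so foreach collects the date columns into a multivalue field in order, and the last two entries can be subtracted.

```
| inputlookup summary.csv
| transpose 0 header_field=Date column_name=Status
| foreach *
    [| eval _vals=if("<<FIELD>>"="Status", _vals, mvappend(_vals, '<<FIELD>>'))]
| eval Difference=mvindex(_vals, -1) - mvindex(_vals, -2)
| fields - _vals
```

This keeps every weekly column plus the Difference column, at the cost of relying on foreach visiting the date columns in their original order.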