Splunk Search

How do I get a subtraction of two fields when they share a common value?

pal_sumit1
Path Finder

I have three columns: primary_key, service_name, and timestamp.

I want to subtract the timestamp values of events whose service_name matches (a "before"/"after" pair).

And if there are more than two matching events, I want the average of the resulting differences.

Sample Data :

primary_key    service_name    timestamp
123            "before abc"    "2-2-2018"
123            "after abc"     "2-3-2018"
123            "before abc"    "2-4-2018"
123            "after abc"     "2-5-2018"

Desired result :

value=(("2-3-2018" - "2-2-2018")+("2-5-2018"-"2-4-2018" ))/2

Thanks in Advance

1 Solution

renjith_nair
Legend

@pal_sumit1,

Try

"your current search"|eval _time=strptime(timestamp,"%d-%m-%Y")
|sort primary_key,_time
|streamstats last(_time) as prev_time current=f window=1 by primary_key
|eval diff=_time-prev_time
|streamstats count as rowno by primary_key reset_on_change=true
|eventstats avg(eval(if(rowno%2==0,diff,null()))) as average by primary_key
|table primary_key,service_name,timestamp,diff,average
  • Replace eventstats with stats if you only want the final result (primary_key and average difference)
  • Unpaired events are ignored - e.g. a "before" event with no matching "after" event
  • The average is displayed on every event for that particular primary key

Sample data used :

primary_key,service_name,timestamp
123,"before abc",2-2-2018
123,"after  abc",2-3-2018
123,"before abc",2-4-2018
123,"after  abc",2-5-2018
456,"before xyz",2-2-2018
456,"after  xyz",2-3-2018
789,"before pqr",2-4-2018
789,"after  pqr",2-5-2018
789,"before pqr",2-6-2018    --> this event is ignored for the calculation since there is no pair for it.
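
Outside of Splunk, the same pairing-and-averaging logic can be sketched in Python to sanity-check the expected numbers (this is an illustration of the approach, not part of the SPL solution; `avg_pair_diffs` is a hypothetical helper name):

```python
from datetime import datetime
from collections import defaultdict

# the sample data above
rows = [
    ("123", "before abc", "2-2-2018"),
    ("123", "after abc",  "2-3-2018"),
    ("123", "before abc", "2-4-2018"),
    ("123", "after abc",  "2-5-2018"),
    ("456", "before xyz", "2-2-2018"),
    ("456", "after xyz",  "2-3-2018"),
    ("789", "before pqr", "2-4-2018"),
    ("789", "after pqr",  "2-5-2018"),
    ("789", "before pqr", "2-6-2018"),  # unpaired: ignored
]

def avg_pair_diffs(rows):
    # group parsed timestamps per primary_key (mirrors strptime + sort by key/_time)
    by_key = defaultdict(list)
    for key, _svc, ts in rows:
        by_key[key].append(datetime.strptime(ts, "%d-%m-%Y"))
    result = {}
    for key, times in by_key.items():
        times.sort()
        # diff of each consecutive (before, after) pair;
        # a trailing unpaired event is dropped, as in the SPL
        diffs = [(times[i + 1] - times[i]).total_seconds()
                 for i in range(0, len(times) - 1, 2)]
        if diffs:
            result[key] = sum(diffs) / len(diffs)
    return result

print(avg_pair_diffs(rows))
```

For key 123 this gives (28 days + 30 days) / 2 = 29 days, in epoch seconds, matching what the eventstats average produces.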

Let us know if you need any modification.
Only the diff values computed on the paired "after" rows (even row numbers) are used in the average calculation.

---
What goes around comes around. If it helps, hit it with Karma 🙂


renjith_nair
Legend

@pal_sumit1, did it work for you? If yes, accept it as the answer, or let us know what's missing.

---
What goes around comes around. If it helps, hit it with Karma 🙂