Splunk Search

How do I get a subtraction of two fields when they share a common value?

pal_sumit1
Path Finder

I have three columns: primary_key, service_name, and timestamp.

I want to subtract the timestamp values of rows whose service_name matches.

And if there are more than two matching rows, I want the average of the resulting differences.

Sample Data :

primary_key    service_name    timestamp
123            "before abc"    "2-2-2018"
123            "after abc"     "2-3-2018"
123            "before abc"    "2-4-2018"
123            "after abc"     "2-5-2018"

Desired result :

value=(("2-3-2018" - "2-2-2018")+("2-5-2018"-"2-4-2018" ))/2

Thanks in Advance

1 Solution

renjith_nair
Legend

@pal_sumit1,

Try

"your current search"
| eval _time=strptime(timestamp,"%d-%m-%Y")
| sort primary_key,_time
| streamstats last(_time) as prev_time current=f window=1 by primary_key
| eval diff=_time-prev_time
| streamstats count as rowno by primary_key reset_on_change=true
| eventstats avg(eval(if(rowno%2==0,diff,null()))) as average by primary_key
| table primary_key,service_name,timestamp,diff,average
  • Replace eventstats with stats if you only want the final result: primary_key and the average difference.
  • Events without a pair are ignored - e.g. a "before" event with no matching "after" event.
  • The average is displayed on every event for that primary_key.

Sample data used :

primary_key,service_name,timestamp
123,"before abc",2-2-2018
123,"after  abc",2-3-2018
123,"before abc",2-4-2018
123,"after  abc",2-5-2018
456,"before xyz",2-2-2018
456,"after  xyz",2-3-2018
789,"before pqr",2-4-2018
789,"after  pqr",2-5-2018
789,"before pqr",2-6-2018    --> this event is ignored for the calculation since there is no pair for it.
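As a sanity check on the logic above, here is a standalone Python sketch (not SPL) of what the search computes on this sample data: parse the timestamps as day-month-year, sort per primary_key, take the gap between each odd/even pair of rows (like rowno%2==0 in the search), drop a trailing unpaired event, and average the gaps in seconds. The function name avg_pair_diff is just for illustration.

```python
from datetime import datetime

# Rows mirroring the sample CSV above.
rows = [
    ("123", "before abc", "2-2-2018"),
    ("123", "after  abc", "2-3-2018"),
    ("123", "before abc", "2-4-2018"),
    ("123", "after  abc", "2-5-2018"),
    ("456", "before xyz", "2-2-2018"),
    ("456", "after  xyz", "2-3-2018"),
    ("789", "before pqr", "2-4-2018"),
    ("789", "after  pqr", "2-5-2018"),
    ("789", "before pqr", "2-6-2018"),  # unpaired: ignored, as in the SPL
]

def avg_pair_diff(rows):
    """Average the (after - before) gaps per primary_key, in seconds."""
    by_key = {}
    for key, _svc, ts in rows:
        # %d-%m-%Y matches the strptime format used in the search.
        by_key.setdefault(key, []).append(datetime.strptime(ts, "%d-%m-%Y"))
    result = {}
    for key, times in by_key.items():
        times.sort()
        # Pair rows 1&2, 3&4, ...; a trailing unpaired event is dropped
        # because the range stops before the last index.
        diffs = [
            (times[i + 1] - times[i]).total_seconds()
            for i in range(0, len(times) - 1, 2)
        ]
        if diffs:
            result[key] = sum(diffs) / len(diffs)
    return result

print(avg_pair_diff(rows))
```

With these dates (all day 2 of consecutive months), the gaps are 28 and 30 days for 123 (average 29 days), 28 days for 456, and 30 days for 789, expressed in seconds.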

Let us know if you need any modifications.
Only the highlighted diff values (the even-numbered rows, i.e. the "after" events) are used for the average calculation.


---
What goes around comes around. If it helps, hit it with Karma 🙂


renjith_nair
Legend

@pal_sumit1, did it work for you? If yes, please accept it as the answer, or let us know what's missing.
