Splunk Search

How to compare previous hour events with present hour

mahesh27
Communicator

I want to compare the previous hour's data with the present hour's data and get the percentage change, using the query below.

|mstats sum(transaction) as Trans where index=host-metrics service=login application IN(app1, app2, app3, app4) span=1h by application

yuanliu
SplunkTrust

Something like this?

|mstats sum(transaction) as Trans where index=host-metrics service=login application IN(app1, app2, app3, app4) span=1h by application
| delta Trans as delta_Trans
| eval pct_delta_Trans = delta_Trans / Trans * 100
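
For reference, a minimal sketch of what delta does, using makeresults to fabricate data: delta subtracts the value from the previous result row, with no notion of grouping.

``` synthetic example data; makeresults format=csv requires Splunk 9.0+ ```
| makeresults format=csv data="application,Trans
app1,100
app1,80
app2,50"
| eval Trans = tonumber(Trans)
| delta Trans as delta_Trans
| eval pct_delta_Trans = delta_Trans / Trans * 100

Here the app2 row gets delta_Trans = -30, computed against app1's last row.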

mahesh27
Communicator

Hi @yuanliu, when I run the above query, the Trans values are fine, but I am getting negative values and an empty row for the delta_Trans and pct_delta_Trans fields. The values are not correct.

_time             application  Trans        delta_Trans   pct_delta_Trans
2022-01-22 02:00  app1         3456.000000
2022-01-22 02:00  app2         5632.000000  -1839.000000  -5438.786543
2022-01-22 02:00  app3         5643.000000  36758.000000  99.76435678
2022-01-22 02:00  app4         16543.00000  -8796.908678  -8607.065438

yuanliu
SplunkTrust
Solution

My mistake - I neglected groupby.

I know this has come up before (because some veterans here helped me :-)), but I can't find the old answer. (In fact, this delta-with-groupby question comes up regularly because it's a common use case.) So, here is a shot:

 

|mstats sum(transaction) as Trans where index=host-metrics service=login application IN(app1, app2, app3, app4) span=1h by application
| streamstats window=2 global=false range(Trans) as delta max(Trans) as Trans_max max(_time) as _time by application
| sort application _time
| eval delta = if(Trans_max == Trans, delta, "-" . delta)
| eval pct_delta = delta / Trans * 100
| fields - Trans_max
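
To unpack the trick: over a window of 2 rows, range(Trans) is |current - previous| within each application, and comparing max(Trans) with the current Trans tells whether the series went up or down, which restores the sign that range() discards. A minimal sketch with fabricated data:

``` synthetic stand-in for the mstats output; makeresults format=csv requires Splunk 9.0+ ```
| makeresults format=csv data="time,application,Trans
2024-03-28 12:00,app1,100
2024-03-28 13:00,app1,80
2024-03-28 12:00,app2,50
2024-03-28 13:00,app2,75"
| eval Trans = tonumber(Trans), _time = strptime(time, "%Y-%m-%d %H:%M")
| fields - time
| streamstats window=2 global=false range(Trans) as delta max(Trans) as Trans_max max(_time) as _time by application
| sort application _time
| eval delta = if(Trans_max == Trans, delta, "-" . delta)
| eval pct_delta = delta / Trans * 100
| fields - Trans_max

The 13:00 rows should come out as delta = -20 (pct_delta = -25) for app1 and delta = 25 (pct_delta = about 33.3) for app2.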

 

Here is my full simulation

 

| mstats max(_value) as Trans
  where index=_metrics metric_name = spl.mlog.bucket_metrics.* earliest=-8h@h latest=-4h@h
  by metric_name span=1h
| rename metric_name as application
``` the above simulates
|mstats sum(transaction) as Trans where index=host-metrics service=login
  application IN(app1, app2, app3, app4) span=1h by application
```
| streamstats window=2 global=false range(Trans) as delta max(Trans) as Trans_max max(_time) as _time by application
| sort application _time
| eval delta = if(Trans_max == Trans, delta, "-" . delta)
| eval pct_delta = delta / Trans * 100
| fields - Trans_max

 

My output is

_time             application                                    Trans       delta      pct_delta
2024-03-28 12:00  spl.mlog.bucket_metrics.created                0.000000    0.000000
2024-03-28 13:00  spl.mlog.bucket_metrics.created                0.000000    0.000000
2024-03-28 14:00  spl.mlog.bucket_metrics.created                0.000000    0.000000
2024-03-28 15:00  spl.mlog.bucket_metrics.created                0.000000    0.000000
2024-03-28 12:00  spl.mlog.bucket_metrics.created_replicas       0.000000    0.000000
2024-03-28 13:00  spl.mlog.bucket_metrics.created_replicas       0.000000    0.000000
2024-03-28 14:00  spl.mlog.bucket_metrics.created_replicas       0.000000    0.000000
2024-03-28 15:00  spl.mlog.bucket_metrics.created_replicas       0.000000    0.000000
2024-03-28 12:00  spl.mlog.bucket_metrics.current_hot            12.000000   0.000000   0.000000
2024-03-28 13:00  spl.mlog.bucket_metrics.current_hot            12.000000   0.000000   0.000000
2024-03-28 14:00  spl.mlog.bucket_metrics.current_hot            12.000000   0.000000   0.000000
2024-03-28 15:00  spl.mlog.bucket_metrics.current_hot            12.000000   0.000000   0.000000
2024-03-28 12:00  spl.mlog.bucket_metrics.current_hot_replicas   0.000000    0.000000
2024-03-28 13:00  spl.mlog.bucket_metrics.current_hot_replicas   0.000000    0.000000
2024-03-28 14:00  spl.mlog.bucket_metrics.current_hot_replicas   0.000000    0.000000
2024-03-28 15:00  spl.mlog.bucket_metrics.current_hot_replicas   0.000000    0.000000
2024-03-28 12:00  spl.mlog.bucket_metrics.current_total          215.000000  0.000000   0.000000
2024-03-28 13:00  spl.mlog.bucket_metrics.current_total          215.000000  0.000000   0.000000
2024-03-28 14:00  spl.mlog.bucket_metrics.current_total          214.000000  -1.000000  -0.4672897
2024-03-28 15:00  spl.mlog.bucket_metrics.current_total          214.000000  0.000000   0.000000
2024-03-28 12:00  spl.mlog.bucket_metrics.frozen                 0.000000    0.000000
2024-03-28 13:00  spl.mlog.bucket_metrics.frozen                 0.000000    0.000000
2024-03-28 14:00  spl.mlog.bucket_metrics.frozen                 1.000000    1.000000   100.0000
2024-03-28 15:00  spl.mlog.bucket_metrics.frozen                 0.000000    -1.000000
2024-03-28 12:00  spl.mlog.bucket_metrics.rolled                 0.000000    0.000000
2024-03-28 13:00  spl.mlog.bucket_metrics.rolled                 0.000000    0.000000
2024-03-28 14:00  spl.mlog.bucket_metrics.rolled                 0.000000    0.000000
2024-03-28 15:00  spl.mlog.bucket_metrics.rolled                 0.000000    0.000000
2024-03-28 12:00  spl.mlog.bucket_metrics.total_removed          0.000000    0.000000
2024-03-28 13:00  spl.mlog.bucket_metrics.total_removed          0.000000    0.000000
2024-03-28 14:00  spl.mlog.bucket_metrics.total_removed          1.000000    1.000000   100.0000
2024-03-28 15:00  spl.mlog.bucket_metrics.total_removed          0.000000    -1.000000

Obviously my results have lots of nulls because many of my "Trans" values are zero, but you get the idea.
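
And if you only need the present hour against the one before it (rather than the whole hour-over-hour series), the same pattern works with the time range pinned to the last two buckets; a sketch, assuming the present, partial hour should be compared with the last complete hour:

|mstats sum(transaction) as Trans where index=host-metrics service=login application IN(app1, app2, app3, app4) earliest=-1h@h latest=now span=1h by application
| streamstats window=2 global=false range(Trans) as delta max(Trans) as Trans_max max(_time) as _time by application
| sort application _time
| eval delta = if(Trans_max == Trans, delta, "-" . delta)
| eval pct_delta = delta / Trans * 100
| fields - Trans_max
``` keep only the present hour's row per application ```
| where _time >= relative_time(now(), "@h")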


mahesh27
Communicator

Hi @yuanliu, thank you so much, it worked!


mahesh27
Communicator

Hi @yuanliu, as suggested, I tried the query below, but I am not getting the expected output. I mean I am not getting the previous hour's data under the delta column; all values are 0. Please see my output.

 

|mstats sum(transaction) as Trans where index=host-metrics service=login application IN(app1, app2, app3, app4) span=1h by application
|streamstats window=2 global=false range(Trans) as delta max(Trans) as Trans_max max(_time) as _time by application
|sort application _time
|eval delta = if(Trans_max == Trans, delta, "-" . delta)
|eval pct_delta = delta / Trans * 100
|fields - Trans_max

 

 

Output:

_time             application  Trans        delta_Trans  pct_delta_Trans
2022-01-22 02:00  app1         3456.000000  0.000000     0.000000
2022-01-22 02:00  app2         5632.000000  0.000000     0.000000
2022-01-22 02:00  app3         5643.000000  0.000000     0.000000
2022-01-22 02:00  app4         16543.00000  0.000000     0.000000