Splunk Search

Calculate the delta based on the time for 2 fields - end of day value minus start of day value

HattrickNZ
Motivator

I have the following table from my search:

index=core ... |
timechart span=5m
sum(deliverySucceeded) as deliverySucceeded
sum(sentMessage) as sentMessage |

gives this table:

        _time               deliverySucceeded   sentMessage
801     20/01/2018 18:40    262719              329507
…       …                   …                   …
863     20/01/2018 23:50    264437              331450
864     20/01/2018 23:55    264454              331467
865     21/01/2018 0:00     264464              331477
…       …                   …                   …
1151    21/01/2018 23:50    271316              339177
1152    21/01/2018 23:55    271328              339189
1153    22/01/2018 0:00     271339              339201
1154    22/01/2018 0:05     271354              339221
1155    22/01/2018 0:10     271365              339232
1156    22/01/2018 0:15     271373              339240

What I want is this (and I want it for every day in the time range):

_time         deliverySucceeded   sentMessage
21/01/2018    6875                7724

These are the midnight values subtracted from each other:

e.g. 271339 - 264464 = 6875 and 339201 - 331477 = 7724

1153    22/01/2018 0:00    271339    339201
 865    21/01/2018 0:00    264464    331477
                             6875      7724

Is there a way to do this? Do I have to use delta, or is there a better way?

Note: you would think that the max value would always be at midnight, but there are instances where this is not the case.

These are accumulated values, sampled every 5 minutes. What I want is the daily total of delivered and sent messages.

1 Solution

mayurr98
Super Champion

I think you want something like this

| makeresults 
| eval data = "
  801    20/01/2018 18:40    262719    329507;
  863    20/01/2018 23:50    264437    331450;
  864    20/01/2018 23:55    264454    331467;
  865    21/01/2018 0:00    264464    331477;
  1151    21/01/2018 23:50    271316    339177;
  1152    21/01/2018 23:55    271328    339189;
  1153    22/01/2018 0:00    271339    339201;
  1154    22/01/2018 0:05    271354    339221;
  1155    22/01/2018 0:10    271365    339232;
  1156    22/01/2018 0:15    271373    339240" 
| makemv delim=";" data 
| mvexpand data 
| rex field=data "(?<serial>[\d]*)\s+(?<_time>\d{2}\/\d{2}\/\d{4}\s+\d{1,2}\:\d{1,2})\s+(?<delivery>\d+)\s+(?<sent>\d+)" 
| rex field=_time "(?<condition>\d{1,2}:\d{1,2})" 
| where condition="0:00" OR condition="00:00" 
| delta sent 
| delta delivery

let me know if this helps!
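
Against your real data the same idea should translate to something roughly like this. It is only a sketch; the index=core search and the field names are taken from your question, so adjust as needed:

index=core ... 
| timechart span=5m sum(deliverySucceeded) as deliverySucceeded sum(sentMessage) as sentMessage 
| eval midnight=strftime(_time, "%H:%M") 
| where midnight="00:00" 
| delta deliverySucceeded as deliverySucceeded 
| delta sentMessage as sentMessage

Each difference shows up on the later of the two midnight rows.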


HattrickNZ
Motivator

Wow, thanks!

 | makeresults |   eval data = "
   801    20/01/2018 18:40    262719    329507;
   863    20/01/2018 23:50    264437    331450;
   864    20/01/2018 23:55    264454    331467;
   865    21/01/2018 0:00    264464    331477;
   1151    21/01/2018 23:50    271316    339177;
  1152    21/01/2018 23:55    271328    339189;
  1153    22/01/2018 0:00    271339    339201;
  1154    22/01/2018 0:05    271354    339221;
  1155    22/01/2018 0:10    271365    339232;
  1156    22/01/2018 0:15    271373    339240" | 
  makemv delim=";" data | 
  mvexpand data  | 
  rex field=data "(?<serial>[\d]*)\s+(?<_time>\d{2}\/\d{2}\/\d{4}\s+\d{1,2}\:\d{1,2})\s+(?<delivery>\d+)\s+(?<sent>\d+)" | 
  fields - data serial | 
  rex field=_time "(?<condition>\d{1,2}:\d{1,2})" |
  where condition="0:00" OR condition="00:00" | 
  delta delivery as delivery | 
  delta sent as sent | 
  fields - condition

I tidied it up a bit and got the following:

    _time              delivery    sent
1   21/01/2018 0:00    264464      331477
2   22/01/2018 0:00    6875        7724

But how do I get the values 6875 and 7724 to be in the same _time row as 21/01/2018 0:00??


mayurr98
Super Champion

Hey, I think it is pretty complex, but I found a workaround to do it!

| makeresults 
| eval data = "
   801    20/01/2018 18:40    262719    329507;
   863    20/01/2018 23:50    264437    331450;
   864    20/01/2018 23:55    264454    331467;
   865    21/01/2018 0:00    264464    331477;
   1151    21/01/2018 23:50    271316    339177;
  1152    21/01/2018 23:55    271328    339189;
  1153    22/01/2018 0:00    271339    339201;
  1154    22/01/2018 0:05    271354    339221;
  1155    22/01/2018 0:10    271365    339232;
  1156    22/01/2018 0:15    271373    339240" 
| makemv delim=";" data 
| mvexpand data 
| rex field=data "(?<serial>[\d]*)\s+(?<_time>\d{2}\/\d{2}\/\d{4}\s+\d{1,2}\:\d{1,2})\s+(?<delivery>\d+)\s+(?<sent>\d+)" 
| rex field=_time "(?<condition>\d{1,2}:\d{1,2})" 
| where condition="0:00" OR condition="00:00" 
| delta delivery as delivery_diff 
| delta sent as sent_diff 
| search delivery_diff=* 
| eval _time=strftime((strptime(_time,"%d/%m/%Y %H:%M")-86400),"%d/%m/%Y") 
| fields _time delivery_diff sent_diff

let me know if this helps!

HattrickNZ
Motivator

Thanks, great thinking. A beautiful mind!

Note for my reference: I changed the second-to-last line, since in your example _time is a string, whereas in my real data _time is an actual timestamp. Thanks again.

  | eval _time=relative_time(_time,"-1d")
  | fields + _time delivery_diff sent_diff
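
Putting it all together, the full search against my live data then looks roughly like this (a sketch for my own reference, using the index=core search and field names from my question):

index=core ... 
| timechart span=5m sum(deliverySucceeded) as deliverySucceeded sum(sentMessage) as sentMessage 
| eval midnight=strftime(_time, "%H:%M") 
| where midnight="00:00" 
| delta deliverySucceeded as delivery_diff 
| delta sentMessage as sent_diff 
| search delivery_diff=* 
| eval _time=relative_time(_time, "-1d") 
| fields + _time delivery_diff sent_diff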

nabeel652
Builder

Try something like this:

| stats count 
| eval data = "
 801    20/01/2018 18:40    262719    329507;
 863    20/01/2018 23:50    264437    331450;
 864    20/01/2018 23:55    264454    331467;
 865    21/01/2018 0:00    264464    331477;
 1151    21/01/2018 23:50    271316    339177;
 1152    21/01/2018 23:55    271328    339189;
 1153    22/01/2018 0:00    271339    339201;
 1154    22/01/2018 0:05    271354    339221;
 1155    22/01/2018 0:10    271365    339232;
 1156    22/01/2018 0:15    271373    339240" 
| makemv delim=";" data 
| mvexpand data 
| rex field=data "(?<serial>[\d]*)\s+(?<time>\d{2}\/\d{2}\/\d{4}\s+\d{1,2}\:\d{1,2})\s+(?<delivery>\d+)\s+(?<sent>\d+)" 
| eval _time=strptime(time,"%d/%m/%Y %H:%M") 
| table serial time _time delivery, sent 
| timechart span=1d min(serial) as serial min(delivery) as delivered min(sent) as sent 
| eval result=sent - delivered 
| where isnotnull(result)

Or something like:

... 
| bin span=1d _time 
| stats min(_time) as time min(delivery) as delivered min(sent) as sent by _time 
| eval result=sent - delivered 
| eval time=strftime(time,"%d/%m/%Y %H:%M")
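
Adapted to the field names in the question, and combined with delta (as in the accepted answer) to turn the per-day midnight values into day-over-day differences, that second approach might look roughly like this. It is only a sketch, and it assumes the counters never dip below the midnight reading during the day, so the per-day minimum is the midnight value:

index=core ... 
| timechart span=5m sum(deliverySucceeded) as deliverySucceeded sum(sentMessage) as sentMessage 
| bin span=1d _time 
| stats min(deliverySucceeded) as deliverySucceeded min(sentMessage) as sentMessage by _time 
| delta deliverySucceeded as deliverySucceeded 
| delta sentMessage as sentMessage 
| where isnotnull(deliverySucceeded) 
| eval _time=relative_time(_time, "-1d")

The relative_time shift at the end moves each difference back onto the day it covers, as in the workaround above.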