Splunk Search

I need to compare two values that are generated dynamically every hour and get the difference.

sandyIscream
Communicator

Basically, my search looks like this:

index=something | rex "(?<Filename>\w+), " | rex "(?<Count>\d+)" | eval _time=strftime(_time, "%d %H") | chart values(Count) over Filename by _time limit=0
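
(The eval _time=strftime(_time, "%d %H") turns _time into a day-and-hour string, so the chart columns come out named after whatever hours the search covers, e.g. "05 13" and "05 14" as hypothetical examples; that is why the column names change every run.)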

My query output looks like the below if I run it over two hours:

Filename   1st Hour(Dynamically Generated)   2nd Hour(Dynamically Generated)
ABC        144                               158
BDC        14                                20

I need to get the difference between the current hour and the previous hour, and display that difference for every hour.


Richfez
SplunkTrust

If you need a third field in each row, like this:

Filename   1st Hour(Dynamically Generated)   2nd Hour(Dynamically Generated)   Difference
ABC        144                               158                               14
BDC        14                                20                                6

Then add this to the end of your search (note that in eval, field names containing spaces or parentheses have to be wrapped in single quotes; double quotes create string literals):

... | eval Difference = '2nd Hour(Dynamically Generated)' - '1st Hour(Dynamically Generated)'

If instead you want the difference between the first and second lines you provided (e.g. 144-14 and 158-20), then there are a couple of ways to go about it. I'll provide my favorite.

My favorite uses streamstats with a window of 2 and an eval'd difference. Your field names are so big that I'm going to shorten them to make the logic easier to follow, and I'm breaking this into multiple lines for readability. So, add your field names back into this (or do your extractions and SPL with the shortened versions and do a rename at the end).

... 
| streamstats window=2 first("1stHour") as Hour1First, last("1stHour") as Hour1Last, 
    first("2ndHour") as Hour2First, last("2ndHour") as Hour2Last
| eval Difference1stHour = Hour1First - Hour1Last, Difference2ndHour = Hour2First - Hour2Last

Obviously the above is wrong, but it gives you a pattern to follow. The math is probably backwards - streamstats might see the events in reverse order, so its view of first and last is backwards. Maybe. But just switch the subtraction around if that's the case, no big deal.
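
For reference, another way to get there (a rough, untested sketch, assuming the same Filename and Count fields you extract above and one Count per Filename per hour - swap latest() for sum() or whatever aggregation fits your data) is to compute the hour-over-hour difference per Filename before charting:

... 
| bin _time span=1h 
| stats latest(Count) as Count by Filename _time 
| streamstats window=1 current=f last(Count) as PrevCount by Filename 
| eval Difference = Count - PrevCount

streamstats with current=f and window=1 copies the previous hour's Count onto each row for that Filename, so Difference is the current hour minus the previous hour. You can still pipe this into chart afterwards if you want the hours back as columns.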


sandyIscream
Communicator

Thanks rich7177 for your time, but this isn't what I was looking for.

We ended up doing it via a shell script instead; it looks roughly like the below.

#!/bin/sh

# Hour (two digits) from one hour ago
p=$(date -d "1 hour ago" | awk -F":" '{print $1}' | rev | cut -c1-2 | rev)

if [ -d /somepath..../data ]; then
    echo "Access Start"
    for filename in $access_list; do
        # Count files for this filename whose timestamp matches that hour
        count=$(ls -lrt /...../data /..../processed | awk '{print $8$9}' | grep "$p:" | grep "$filename" | wc -l)
        echo "$filename,$count,$p"
    done
    echo "Access End"
fi
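
Each pass of the loop prints one CSV-ish line per filename, so the script's output looks something like this (hypothetical values: filename, file count, and the hour the count belongs to):

Access Start
ABC,144,13
BDC,14,13
Access End

Those lines are what the rex extractions in the search below pull apart.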

Then we query the resulting files like this:

index= | rex "(?<Filename>\w+)," | rex "(?<Count>\d+)" | rex "Hour\s-\s(?<Hour>\d+)" | eval Time=strftime(_time,"%b %d %H") | mvexpand Count | eval Count = Count . " - (" . Time . ")" | mvcombine Count | chart values(Count) over Filename by Hour limit=0
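
With the eval above, each Count value gets its timestamp appended before the chart, so a cell ends up looking something like 144 - (Oct 05 13) (hypothetical values); that keeps the per-hour counts distinguishable once values(Count) collapses them into a single cell per Filename and Hour.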
