
Eval Time_Diff

cglowjr
New Member

I am having trouble getting a result to appear for the query below. I am trying to produce a time_diff column showing the difference between the current time and the latest timestamp for each LANE_RFID. The table doesn't show a result for time_diff, but everything else shows properly. Hopefully it is something easy. Thank you.

index=*"RFID Message received for:" | stats latest(date_time) by LANE_RFID | eval time_now=now() | eval time_now=strftime(time_now,"%Y/%m/%d %H:%M:%S") | eval time_diff=strftime(time_diff,"%M:%S") | eval time_diff=time_now-date_time| table LANE_RFID time_now latest(date_time) time_diff

Solution

to4kawa
Ultra Champion
index=*"RFID Message received for:" 
| stats latest(date_time) as date_time by LANE_RFID 
| eval time_now=strftime(now(),"%Y/%m/%d %H:%M:%S")
| eval time_diff=now() - strptime(date_time,"%Y/%m/%d %H:%M:%S") 
| table LANE_RFID time_now date_time time_diff
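
If you also want time_diff shown as minutes and seconds rather than raw seconds (which is what the "%M:%S" attempt in the question was aiming at), tostring with the "duration" option can render it. A minimal sketch building on the search above; the field name time_diff_readable is just chosen for illustration:

index=*"RFID Message received for:" 
| stats latest(date_time) as date_time by LANE_RFID 
| eval time_now=strftime(now(),"%Y/%m/%d %H:%M:%S")
| eval time_diff=now() - strptime(date_time,"%Y/%m/%d %H:%M:%S")
| eval time_diff_readable=tostring(time_diff,"duration")
| table LANE_RFID time_now date_time time_diff time_diff_readable

Here time_diff stays in seconds (handy for sorting or alert thresholds), while tostring(time_diff,"duration") formats it as a readable HH:MM:SS value for display.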

cglowjr
New Member

This works wonderfully! Thank you so much!

to4kawa
Ultra Champion

Is date_time epoch?

cglowjr
New Member

date_time is formatted as 2020/02/24 16:14:34
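
Since date_time is a string in that format rather than an epoch value, it has to be converted with strptime before it can be subtracted from now(), which is what the accepted answer does. A quick self-contained check of that conversion, using makeresults and the same format string:

| makeresults 
| eval date_time="2020/02/24 16:14:34" 
| eval epoch=strptime(date_time,"%Y/%m/%d %H:%M:%S") 
| eval time_diff=now() - epoch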
