Splunk Search

Dealing with logs that have different update times (while comparing them)

aubine
Explorer

(This is a continuation of https://community.splunk.com/t5/Splunk-Search/Creating-a-search-that-looks-up-values-from-one-logfil...)

So what I'm trying to do is compare values across two different logs that share a unique ID (see the link above for more background). The problem is that one log is written by a cron job every 5 minutes, while the other is only written when required. I'm using the search below:

index=foo (sourcetype=test1 OR sourcetype=test2) host=A* (source="/data/stuff/logfile1.log" OR source="/data/stuff/logfile2.log")
| eval ID=coalesce(lastupdate_direc, file1ID)
| stats values(lastupdate_time) as lastupdate_time, values(file1ID) as file1ID by host, ID
| eval int_time=strptime(lastupdate_time, "%F %H:%M")
| eval timenow=now()
| eval diff_new=timenow-int_time
| eval days_since=((diff_new-14400)/60/60/24)
| table lastupdate_time host name ID days_since

As I'm trying to be nice to my indexer, I'm only searching the past 15 minutes (because of time drift between the servers). As a result, I get multiple lastupdate_time entries in the table, and because of that the days_since field is left blank. I've tried taking the latest of the values with chart, but I get no results. Am I just not outputting to the correct visualization function, or would something other than table be better?
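
For reference, the "latest" attempt was along these lines. This is just a sketch (written with stats rather than chart, since I'm not sure I had the chart syntax right), so the exact aggregation may not match what I actually ran:

index=foo (sourcetype=test1 OR sourcetype=test2) host=A* (source="/data/stuff/logfile1.log" OR source="/data/stuff/logfile2.log")
| eval ID=coalesce(lastupdate_direc, file1ID)
| stats latest(lastupdate_time) as lastupdate_time, latest(file1ID) as file1ID by host, ID
| eval int_time=strptime(lastupdate_time, "%F %H:%M")
| eval days_since=((now()-int_time-14400)/60/60/24)
| table lastupdate_time host ID days_since

The idea was that latest() should keep only the most recent lastupdate_time per host/ID pair, instead of the multivalue result that values() produces, so strptime gets a single string to parse.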

Thanks!
