Splunk Search

Dealing with logs that have different update times (while comparing them)

aubine
Explorer

(This is a continuation of https://community.splunk.com/t5/Splunk-Search/Creating-a-search-that-looks-up-values-from-one-logfil...)

So what I'm trying to do is compare values across two different logs that share a unique ID (see the link above for more background). The problem is that one log is written by a cron job every 5 minutes, while the other is only written when required. I'm using the search below:

index=foo (sourcetype=test1 OR sourcetype=test2) host=A* (source="/data/stuff/logfile1.log" OR source="/data/stuff/logfile2.log")
| eval ID=coalesce(lastupdate_direc,file1ID)
| stats values(lastupdate_time) as lastupdate_time, values(file1ID) as file1ID by host, ID
| eval int_time=strptime(lastupdate_time, "%F %H:%M")
| eval timenow=now()
| eval diff_new=timenow-int_time
| eval days_since=((diff_new-14400)/60/60/24)
| table lastupdate_time host name ID days_since

As I'm trying to be nice to my indexer, I'm only searching the past 15 minutes (padded to allow for time drift between the servers). Because of that, I get multiple lastupdate_time entries per host/ID in the table, and the days_since field ends up blank. I've tried using chart with latest of the values, but I get no results. Am I just not outputting to the correct visualization function, or would something other than table be better?
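For reference, this is roughly the latest() variant I was attempting (a sketch only, folding latest() into the stats instead of a separate chart; same field names as above):

index=foo (sourcetype=test1 OR sourcetype=test2) host=A* (source="/data/stuff/logfile1.log" OR source="/data/stuff/logfile2.log")
| eval ID=coalesce(lastupdate_direc,file1ID)
| stats latest(lastupdate_time) as lastupdate_time, values(file1ID) as file1ID by host, ID
| eval days_since=((now()-strptime(lastupdate_time, "%F %H:%M")-14400)/60/60/24)
| table lastupdate_time host ID days_since

The idea was that latest() should collapse the multiple lastupdate_time values down to a single (most recent) one per host/ID, so days_since can actually be computed, but I'm not sure this is the right approach.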

Thanks!
