I have two log files, logfile1.log and logfile2.log, and I have created separate field extractions for each of them. Here is an example line from each log:
logfile1.log:
file1time, epoch, file1ID, name, flag, stat1, stat2, stat3
logfile2.log:
lastruntime, file2ID, epoch
What I need to do is compare the two log files against each other on their ID fields, make sure the IDs match, and output the "name" field in the search that has logfile2.log as its source.
There's probably a very easy way to do this, but I can't think of it. Any help would be greatly appreciated. Thanks!
If I understand the request correctly, the solution is pretty straightforward. Read events from both logs, create a common field with ID in it, then group events by ID.
index=foo (source=logfile1.log OR source=logfile2.log)
| eval ID = coalesce(file1ID, file2ID)
| stats values(*) as * by ID
| table lastruntime, file2ID, epoch, name
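For comparison, the same correlation could be written with a join, although stats is usually preferable because join subsearches are subject to result limits. This is only a sketch using the field names from the post:
index=foo source=logfile2.log
| join type=inner file2ID
    [ search index=foo source=logfile1.log
    | rename file1ID as file2ID
    | fields file2ID name ]
| table lastruntime, file2ID, epoch, name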
Hi Rich,
I don't think I explained myself properly, so I'll post the Splunk search I currently have and you can comment from there:
logfile1.log:
file1time, epoch, file1ID (this is the unique field), name (this is the field I would like added), flag, stat1, stat2, stat3
logfile2.log:
lastupdate_time, lastupdate_direc (this is the unique field), epoch
index=foo sourcetype=test host=* source="/data/stuff/logfile2.log"
| eval new_time=strptime(lastupdate_time, "%F %H:%M")
| eval diff_new=now()-new_time
| eval days_since=((diff_new-14400)/60/60/24)
| table host lastupdate_direc lastupdate_time days_since
| dedup lastupdate_direc
| sort days_since
| reverse
| rename host as "Server" lastupdate_direc as "Directory Name" lastupdate_time as "Date/Time Last updated" days_since "Days since last update"
The directory is a number field, and I have another log file, logfile1.log, that has the same number field. I want to associate the two so I can tie the "name" field (which comes from another piece of software) to the last time the directory was updated. The update is only logged when it happens, rather than constantly, and I don't want users to have to run an "All Time" search if I can help it. Currently the output looks something like:
Server | Directory Name | Date/Time Last Updated | Days since last update |
server1 | 1 | 2021-08-30 11:00 | 0 |
server1 | 2 | 2021-08-01 11:00 | 29 |
server2 | 1 | 2021-08-29 00:00 | 1 |
What I would like to do is add the "name" field from logfile1.log to that table, matched on the same unique ID as the Directory Name (lastupdate_direc).
I hope that explained it better. Let me know if it makes sense. Thanks for your quick response! 😄
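One way to sidestep the time-range concern would be to keep the ID-to-name mapping in a lookup instead of searching logfile1.log at report time. This is only a sketch; the lookup file name dir_names.csv and the logfile1.log path are assumptions, not details from the thread. A scheduled search could maintain the lookup:
index=foo source="/data/stuff/logfile1.log"
| stats latest(name) as name by file1ID
| rename file1ID as lastupdate_direc
| outputlookup dir_names.csv
and the existing report would then pick up the name for each row with:
| lookup dir_names.csv lastupdate_direc OUTPUT name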
I think the same answer still applies. In this query, the two sources are combined based on the common field, then days_since is calculated, and the data displayed.
index=foo sourcetype=test host=* (source="/data/stuff/logfile1.log" OR source="/data/stuff/logfile2.log")
| eval ID = coalesce(file1ID, lastupdate_direc)
| stats values(lastupdate_time) as lastupdate_time, values(name) as name by host, ID
| eval new_time=strptime(lastupdate_time, "%F %H:%M")
| eval diff_new=now()-new_time
| eval days_since=((diff_new-14400)/60/60/24)
| table host ID lastupdate_time days_since name
| sort - days_since
| rename host as "Server" ID as "Directory Name" lastupdate_time as "Date/Time Last updated" days_since "Days since last update"
Thanks for all your help @richgalloway . Unfortunately, it made a new problem and I will post a new question about that. Thanks again!