Getting Data In

Eval and Dedup date fields

davidmonaghan
Explorer

I have the following search:

sourcetype=dhcp | stats earliest(_time) as FirstSeen, latest(_time) as LastSeen by IP_Address | fieldformat FirstSeen=strftime(FirstSeen, "%b %d %Y %H:%M:%S") | fieldformat LastSeen=strftime(LastSeen, "%b %d %Y %H:%M:%S") | inputlookup append=true seen_dhcp_ip_addresses.csv | where IP_Address="192.168.1.1"

This outputs the following:

IP_Address    FirstSeen              LastSeen
192.168.1.1   Nov 22 2017 16:52:44   Nov 22 2017 16:52:44
192.168.1.1   Jul 25 2017 08:52:00   Nov 19 2017 01:02:16

I would like to combine these two rows into the following output instead:

IP_Address    FirstSeen              LastSeen
192.168.1.1   Jul 25 2017 08:52:00   Nov 22 2017 16:52:44

Thanks

1 Solution

kamlesh_vaghela
SplunkTrust

Hi,

Can you please try this search?

sourcetype=dhcp 
| stats earliest(_time) as FirstSeen, latest(_time) as LastSeen by IP_Address 
| inputlookup append=true seen_dhcp_ip_addresses.csv 
| where IP_Address="192.168.1.1" 
| stats min(FirstSeen) as FirstSeen max(LastSeen) as LastSeen by IP_Address
| fieldformat FirstSeen=strftime(FirstSeen, "%b %d %Y %H:%M:%S") 
| fieldformat LastSeen=strftime(LastSeen, "%b %d %Y %H:%M:%S")

Thanks

elliotproebstel
Champion

Hey @davidmonaghan - Based on the output you've shown in the post, it looks like you're storing FirstSeen and LastSeen in the csv file as strings, which means you'll need to convert those strings back into epoch timestamps in order to use the approach suggested by @kamlesh_vaghela. If you have the option, I'd recommend storing those timestamps in the csv file as epoch values and only converting them to strings right before you use them in a view. If you need to convert the contents of your csv file, you can do this:

| inputlookup seen_dhcp_ip_addresses.csv 
| eval FirstSeen=strptime(FirstSeen, "%b %d %Y %H:%M:%S"), LastSeen=strptime(LastSeen, "%b %d %Y %H:%M:%S")
| outputlookup seen_dhcp_ip_addresses.csv 

That will overwrite your lookup file in place with epoch timestamps instead of strings.
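
Once the lookup holds epoch values, you can format them only at display time. A quick sketch of what that might look like:

| inputlookup seen_dhcp_ip_addresses.csv 
| fieldformat FirstSeen=strftime(FirstSeen, "%b %d %Y %H:%M:%S") 
| fieldformat LastSeen=strftime(LastSeen, "%b %d %Y %H:%M:%S")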

If you aren't in a position to replace the CSV contents, then I think you'll need to convert the timestamps from the lookup file to epoch timestamps before using the | stats min(FirstSeen)... part of the search in @kamlesh_vaghela's answer.
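
For example, something along these lines (just a sketch, assuming the csv stores FirstSeen and LastSeen as strings in the same "%b %d %Y %H:%M:%S" format shown above):

sourcetype=dhcp 
| stats earliest(_time) as FirstSeen, latest(_time) as LastSeen by IP_Address 
| inputlookup append=true seen_dhcp_ip_addresses.csv 
| where IP_Address="192.168.1.1" 
| eval FirstSeen=if(isnum(FirstSeen), FirstSeen, strptime(FirstSeen, "%b %d %Y %H:%M:%S")), LastSeen=if(isnum(LastSeen), LastSeen, strptime(LastSeen, "%b %d %Y %H:%M:%S"))
| stats min(FirstSeen) as FirstSeen max(LastSeen) as LastSeen by IP_Address
| fieldformat FirstSeen=strftime(FirstSeen, "%b %d %Y %H:%M:%S") 
| fieldformat LastSeen=strftime(LastSeen, "%b %d %Y %H:%M:%S")

The if(isnum(...)) guard leaves the epoch values coming from the dhcp events alone and only runs strptime on the string values appended from the lookup.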

davidmonaghan
Explorer

Hi @elliotproebstel

Thanks for the tip...
