Splunk Search

Help with sorting a date

Explorer

Sorry for spamming this board (or so it feels like), but I have one more question before the weekend. This may not even be possible... I'm working on generating a list of IP addresses and the user names that use them so we can create a whitelist for our FTP server. I had this list completed and thought I could call it a weekend...

After I sent the list over to the security folks, they requested I include the most recent date that each combination of source IP and user name was used. When adding the date field to my counted list of results, I get (obviously) a line for every distinct combination of source IP, user name, and date. Is there a way I can keep the uniqueness of the source IP and user name while having the most recent date the two were used in conjunction added to the end?

Here is the search I'm using:

index="ftp" host="host" "bPasswordOK=1" | fields user, src_ip, log_date | stats count by user, src_ip, log_date | sort -count


Splunk Employee

This may not be the best route to go, but one option would be to use a join:

index="ftp" host="host" "bPasswordOK=1" 
| eval usersrc=user . src_ip 
| stats count by user, src_ip, usersrc 
| join type=outer usersrc 
    [search index="ftp" host="host" "bPasswordOK=1" 
    | eval usersrc=user . src_ip 
    | stats max(_time) as LatestTime by user, src_ip, usersrc] 
| convert ctime(LatestTime) 
| table src_ip user count LatestTime

That might not be exactly right, but it should get you pretty close. The model I used, based on my firewall logs, is:

sourcetype="firewall" action=Deny 
| eval srcdst=src . dst 
| stats count by src, dst, srcdst 
| join type=outer srcdst 
    [search sourcetype="firewall" action=Deny 
    | eval srcdst=src . dst 
    | stats max(_time) as LatestTime by src, dst, srcdst] 
| convert ctime(LatestTime) 
| table src dst count LatestTime
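For what it's worth, since both aggregates group by the same fields, the join can usually be avoided by computing them in a single stats call. A sketch against the same firewall model (untested against these logs):

```
sourcetype="firewall" action=Deny 
| stats count, max(_time) as LatestTime by src, dst 
| convert ctime(LatestTime) 
| table src dst count LatestTime
```

This avoids running the base search twice and sidesteps join's subsearch result limits.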


Splunk Employee

Just:

index="ftp" host="host" "bPasswordOK=1" 
| fields user, src_ip, log_date 
| stats count, max(log_date) as log_date by user, src_ip 
| sort - count

This assumes your log_date is in epoch time (or some other format that sorts chronologically). I'm not sure why you have a log_date field instead of using _time, but okay. If log_date is just a formatted version of _time, then I would run stats on _time and convert it to something readable at the end, after the stats, using eval strftime() or convert ctime(). You could, I suppose, do the reverse: use eval strptime() or convert mktime() to generate an epoch time from log_date before the stats, in which case:

index="ftp" host="host" "bPasswordOK=1" 
| fields user, src_ip, log_date 
| eval epochlogdate=strptime(log_date.":00:00:00","%Y-%m-%d:%H:%M:%S")
| stats count, max(epochlogdate) as log_date by user, src_ip 
| eval log_date=strftime(log_date,"%Y-%m-%d")
| sort - count
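For completeness, a sketch of the _time-based route described above, assuming the field names from the original search (untested):

```
index="ftp" host="host" "bPasswordOK=1" 
| stats count, max(_time) as LatestTime by user, src_ip 
| convert ctime(LatestTime) 
| sort - count
```

Since _time is already epoch time, no strptime() step is needed; convert ctime() at the end just makes the result readable.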



Explorer

Well, there is an 8-hour offset on these logs, and I don't want the offset applied to the results I'm sending over. If I use _time, will that include the timezone adjustment I have applied? I just assumed it would, so I went ahead and made a new field capturing the actual timestamp written to the logs.



Splunk Employee

Try this:

index="ftp" host="host" "bPasswordOK=1" 
| fields user, src_ip, log_date 
| stats count by user, src_ip, log_date 
| sort - count 
| dedup src_ip 
| dedup user

Edit: okay, thinking a bit more about this, I think the above might not be what you want. It dedups first by one field and then by the second. Reading your question again, it sounds like you want to dedup by a "concatenation" of the two: only when both A and B are the same should an event be deduped. To achieve that, I think this is the search you need:

index="ftp" host="host" "bPasswordOK=1" 
| fields user, src_ip, log_date 
| stats count by user, src_ip, log_date 
| sort - count 
| eval myfield=user.src_ip 
| dedup myfield

I'm no search expert, though, so you might want to hear from SS or GK for a better approach.
