Hi everyone, and thanks in advance. I'm trying to collate all the SOCKS traffic on our network over the last 90 days. Our IPs rotate, so I can't run this search across the whole window in one go; I have to run it one day at a time, which is where I got to:

index=*proxy* SOCKS earliest=-1d latest=-0d
| eval destination=coalesce(dest, dest_port), userid=coalesce(user, username)
| rex field=url mode=sed "s/^SOCKS:\/\/|:\d+$//g"
| eval network=case(match(src_ip,"<REDACTED>"),"user",1=1,"server")
| stats
min(_time) as First_Seen
max(_time) as Last_Seen
count as Event_Count
values(domain) as Domain
values(userid) as Users
values(destination) as Destinations
by url, src_ip, network
| convert ctime(First_Seen) ctime(Last_Seen)
| sort -Event_Count
| join type=left max=0 src_ip [
search index=triangulate earliest=-1d latest=-0d
| stats count by ip, username
| rename username as userid
| rename ip as src_ip
| fields - count
]
| join type=left max=0 src_ip [
search index=windows_events EventID=4624 NOT src_ip="-" NOT user="*$" earliest=-1d latest=-0d
| stats count by IpAddress, user
| rename IpAddress as src_ip
| rename user as win_userid
| fields - count
]
| eval userid=coalesce(userid, win_userid)
| join type=left max=0 userid [
search index="active_directory" earliest=-1d latest=-0d
| stats count by username, fullname, title, division, mail
| rename username as userid
| fields - count
]

Then a colleague suggested I do it slightly differently: run it over the full 90 days, but link the days together. This is where we got to:

index=*proxy* SOCKS
| eval destination=coalesce(dest, dest_port), userid=coalesce(user, username)
| rex field=url mode=sed "s/^SOCKS:\/\/|:\d+$//g"
| eval network=case(match(src_ip,"<Redacted>"),"user",1=1,"server")
| eval Proxy_day = strftime(_time, "%d-%m-%y")
| join type=left max=0 src_ip [
search index=windows_events EventID=4624 NOT src_ip="-" NOT user="*$"
| stats count by IpAddress, user
| rename IpAddress as src_ip
| rename user as win_userid
| fields - count
]
| eval userid=coalesce(userid, win_userid)
| join type=left max=0 userid [
search index="active_directory"
| stats count by username, fullname, title, division, mail
| rename username as userid
| fields - count
]
| rename src_ip as "Source IP"
| stats
values(mail) as "Email Address"
values(userid) as "User ID"
values(destination) as Destination
values(network) as Network
values(Proxy_day) as Day
values(url) as URL
by "Source IP" However the problem I'm running into now is in the data produced there could be 100's of URL's / Emails / Day associated with the source IP which makes the data unactionable and actually starts to break a .csv when exported. Would anyone be able to help? Ideally I'd just like the top for example 5 results, but I've had no luck with that or a few other methods I've tried. Even SplunkGPT is failing me - is it even possible?
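The simpler alternative I can see (also just a sketch) would be to keep the final stats as it is and trim each multivalue field afterwards with mvindex, though since values() returns results sorted lexicographically this keeps the first five alphabetically rather than the five most frequent:

| eval URL=mvindex(URL, 0, 4)
| eval "Email Address"=mvindex('Email Address', 0, 4)
| eval Day=mvindex(Day, 0, 4)

(and similarly for the other multivalue columns). If either of these is on the right track, or there's a better pattern entirely, I'd really appreciate a pointer.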