Splunk Search

Need to filter table results in Dashboard after stats and join commands

jspigler2010
Explorer

I am looking for a way to filter the results I am returning from an initial SPL search, followed by a join command keyed off of the host name, and finally a stats command run over all of the joined results. This will be an interactive dashboard panel where you can filter based on user, host, client_ip and client_mac. I have tried using "where", but it looks like that won't match all values when the dashboard first loads with the initial wildcard value of "*". The customer I am working with is requesting to be able to key off of the previously mentioned fields and values, but they only show up periodically in the pcoip server logs, which is why I am using join.

Any help on how I could tackle this would be awesome!

Here's my search. If there is any way I can do a post-search filter, meaning all stats are calculated, the results are returned, and then that output is filtered, that would be awesome. Maybe I'm overthinking this though...

index=vdi sourcetype="vmware_pcoip" host="*"
| join host [search index=vdi sourcetype="vmware_pcoip" (user="*" OR client_mac="*" OR  client_ip="*" OR nat_client_ip="*")]
| convert ctime(_time) as time 
| stats earliest(time) as "Earliest Time stamp" latest(time) as "Latest Time stamp" 
last(user) as user last(client_ip) as client_ip last(domain) as domain last(client_mac) as client_mac last(nat_client_ip) as nat_client_ip by host 
| fillnull value="0"
| table host "Earliest Time stamp" "Latest Time stamp" user client_ip domain client_mac nat_client_ip
1 Solution

elliotproebstel
Champion

If I'm understanding the requirements correctly, I don't think you need that join at all - because the stats call will effectively "join" the data together by host. So if your dashboard has text input boxes for tokens that are intuitively named $user$, $host$, $client_ip$, and $client_mac$ - then you could structure your query like this:

index=vdi sourcetype="vmware_pcoip" host="$host$"
| stats earliest(_time) as "Earliest Time stamp" latest(_time) as "Latest Time stamp" last(user) as user last(client_ip) as client_ip last(domain) as domain last(client_mac) as client_mac last(nat_client_ip) as nat_client_ip by host 
| fillnull value="0"
| search user="$user$" host="$host$" client_ip="$client_ip$" client_mac="$client_mac$"

Make sure the values for those tokens default to * on dashboard page load; that way the last search line will effectively do nothing until one of the inputs is populated.
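
For reference, a minimal Simple XML sketch of text inputs whose tokens default to * (the form label, input labels, and layout here are illustrative assumptions, not taken from your actual dashboard):

<form>
  <label>PCoIP Session Filter</label>
  <fieldset submitButton="false">
    <!-- Each token defaults to * so the trailing "| search" filter matches everything on first load -->
    <input type="text" token="host">
      <label>Host</label>
      <default>*</default>
    </input>
    <input type="text" token="user">
      <label>User</label>
      <default>*</default>
    </input>
    <input type="text" token="client_ip">
      <label>Client IP</label>
      <default>*</default>
    </input>
    <input type="text" token="client_mac">
      <label>Client MAC</label>
      <default>*</default>
    </input>
  </fieldset>
</form>

With those defaults in place, the final "| search user="$user$" host="$host$" ..." line is a no-op until someone types a more specific value into one of the inputs.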


jspigler2010
Explorer

Wow...
I was wayyyyyyy over complicating that. Thanks elliot!
