Splunk Search

search query stats multiple counts filtering

yasit
Explorer

I have a query where I am looking for multiple values with OR and then counting the occurrences with stats. The query is something like this:

index=****  ("value1") OR ("value2") OR ...  |  stats count(eval(searchmatch("value1"))) as value1,  count(eval(searchmatch("value2"))) as value2


Now I want to keep only those values that are actually found, meaning their count is greater than 0. How can I achieve this so that only the stats for the values found in the events are displayed?

 
Also, the search values are mostly IPs, URLs, domains, etc.
Note: I'm building this query for a dashboard.

1 Solution

yuanliu
SplunkTrust

Splunk (and most data query languages) treats columns as sacrosanct.  But for display purposes, you can fool the system by converting columns to rows and taking out the ones you don't want.  Of course, we are talking about transpose.

 

| transpose 0                        ``` turn the single stats row into one row per original column ```
| search "row 1" > 0                 ``` "row 1" now holds the counts; keep only those greater than 0 ```
| transpose 0 header_field=column    ``` flip back so each surviving value becomes a column again ```
| fields - column                    ``` drop the helper field left over from transpose ```

 

To demonstrate, run this search

 

index=_internal sourcetype!=splunkd_ui_access json OR python OR foobar
| stats count(eval(searchmatch("json"))) as json count(eval(searchmatch("python"))) as python count(eval(searchmatch("foobar"))) as foobar
``` data emulation above ```

 

It gives 0 for foobar.  

json    python    foobar
4051    135       0

But this search

 

index=_internal sourcetype!=splunkd_ui_access earliest=-5h json OR python OR foobar
| stats count(eval(searchmatch("json"))) as json count(eval(searchmatch("python"))) as python count(eval(searchmatch("foobar"))) as foobar
``` data simulation above ```
| transpose 0
| search "row 1" > 0
| transpose 0 header_field=column
| fields - column

 

eliminates foobar from the table:

json    python
4421    232

(The numbers changed because this is a live splunkd.)
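
Applied to your original search, the whole thing would look something like this (a minimal sketch; index=**** and the value1/value2/value3 terms are placeholders standing in for your actual IPs, URLs, or domains):

index=**** ("value1") OR ("value2") OR ("value3")
| stats count(eval(searchmatch("value1"))) as value1
        count(eval(searchmatch("value2"))) as value2
        count(eval(searchmatch("value3"))) as value3
| transpose 0
| search "row 1" > 0
| transpose 0 header_field=column
| fields - column

Only the terms that actually appear in the events survive as columns, which should work fine in a dashboard panel.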




PickleRick
SplunkTrust

Your question is a bit vague, so I'm not sure what you want; please be a little more descriptive. But from what you wrote, I assume you're doing some conditional aggregation and want to "go back" to the raw events fulfilling your conditions. You can't do that this way.

Splunk "loses" all information not being explicitly passed from the command. So when you're doing the stats command only results of the stats command are available for further processing - the original events are no longer known in your pipeline.

So you have to approach it differently, probably by adding an artificial "classifier" field or two, but I can't really say without knowing what exactly you want to achieve. A rough sketch of that idea is below.
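
To make that idea concrete, here is a minimal sketch of the classifier approach (the field name ioc_match and the value1/value2/value3 terms are assumptions, just mirroring the example in your question): tag each event with the term it matched, then group by that tag, so terms that never match simply don't produce a row.

index=**** ("value1") OR ("value2") OR ("value3")
``` ioc_match is an assumed field name; tag each event with the term it matched ```
| eval ioc_match=case(searchmatch("value1"), "value1",
    searchmatch("value2"), "value2",
    searchmatch("value3"), "value3")
``` grouping by the classifier yields rows only for terms that were actually found ```
| stats count by ioc_match

This gives you one row per matched term instead of one column per term, which sidesteps the empty-column problem entirely.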
