Splunk Search

Limiting Matching Events


Hi everyone,

I'm looking for a little advice. I'm currently searching Splunk against multiple sets of variables to see if there are any events in the past 90 days, but I'm running into an issue: my search is parsing through too many events. I don't need the total number of events that matched, only whether at least 10 events matched. Since there are 100+ sets of variables to check, doing it by hand one at a time seems tedious and lengthy. Could you help me limit the events parsed so that the search stops checking a set once it reaches a predetermined amount?

Here is an example of my search: 

index=blah sourcetype=blah (name=Name1 ip=IP1 id=id1) OR (name=Name2 ip=IP2 id=id2) OR (name=Name3 ip=IP3 id=id3) OR .... (name=Name105 ip=IP105 id=id105) | stats count by name, ip, id

Any and all help would be appreciated



@EPitch  Do you mean if the sum of count is > 10 or if the number of distinct name/ip/id combinations is more than 10?

If the former, then putting

| head 11

after your search should speed it up. Although it will probably still process the query data fully, it will retain at most 11 results, so if you then run stats count and the count comes back as 11, you know there were more than 10.
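Applied to the original search, the idea would look something like this (a sketch using the placeholder names from the question; head caps the overall result stream, not each combination):

index=blah sourcetype=blah (name=Name1 ip=IP1 id=id1) OR (name=Name2 ip=IP2 id=id2) OR ...
| head 11
| stats count

If the final count is 11, there were more than 10 matching events in total.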




Hi @bowesmana,

So I actually need it to be limited to a certain number per distinct name/ip/id combination, because some combinations have rarer matching events than others, and I didn't want to search through millions of events for one combo before getting any hits on another. Thank you!
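For reference, a per-combination cap can be expressed at the results layer with streamstats (a sketch using the placeholder names above; combo_count is an illustrative field name; note this still scans every raw event, so it caps output rather than search work):

index=blah sourcetype=blah (name=Name1 ip=IP1 id=id1) OR (name=Name2 ip=IP2 id=id2) OR ...
| streamstats count as combo_count by name ip id
| where combo_count <= 11
| stats count by name ip id

Any combination whose count reaches 11 here had more than 10 matching events.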



@EPitch I don't believe there is a break-on-condition function to abort the search, but what you could do is turn on event sampling at an appropriately large ratio so you run the search on a subset of the data. This will be quicker - if you get >10 then you don't need to re-run - but if you get <10, you will need to re-run at a lower sampling ratio.

I'm not sure this fully solves the problem, in that for any combination where you don't expect or want >10 hits, you will always end up re-running the search at a 1:1 ratio to confirm.
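Note that event sampling isn't set in SPL itself: you pick it from the Event Sampling selector below the search bar in Splunk Web, or, for a saved search, via dispatch.sample_ratio in savedsearches.conf (a sketch; the stanza name is made up):

[check_variable_sets]
search = index=blah sourcetype=blah (name=Name1 ip=IP1 id=id1) OR ... | stats count by name ip id
dispatch.sample_ratio = 100

At a 1:100 ratio Splunk reads roughly one event in a hundred, so any combination that still shows 10+ sampled hits clearly has well over 10 in total.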

The other alternative is to craft your search criteria to use the TERM() directive if possible; if these data fields can be reduced to TERM elements, then you can even use tstats.

See this .conf presentation 


So maybe you can do 

index=blah sourcetype=blah (TERM(name=Name1) TERM(ip=IP1) TERM(id=id1)) OR...

but you will have to know your data well to know if the terms exist as real terms in the data and you need to understand major and minor breakers in the data.

If all the search criteria can be converted to TERM then you could do

| tstats count where index=blah sourcetype=blah (TERM(name=Name1) TERM(ip=IP1) TERM(id=id1)) OR... by PREFIX(name=) PREFIX(ip=) PREFIX(id=)
| rename *= as *


Hi @EPitch,

you could try creating a lookup (called e.g. "conditions.csv") containing your three conditions in three columns (use the fields of your search as the column names: name, ip, id).

Then you can use the lookup in a subsearch, running a simple search like the following:

index=blah sourcetype=blah [ | inputlookup conditions.csv | fields name ip id ] | stats count by name ip id

Remember to also create the lookup definition.
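As a sketch, conditions.csv would contain one row per variable set (placeholder values from the question):

name,ip,id
Name1,IP1,id1
Name2,IP2,id2
Name3,IP3,id3

The subsearch returns these rows as ( (name=Name1 AND ip=IP1 AND id=id1) OR (name=Name2 AND ip=IP2 AND id=id2) OR ... ), which Splunk substitutes into the outer search, replacing the 100+ hand-written clauses.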


