Hi,
I was wondering: how harmful is it to get near that limit in a single search query?
If some of my searches take more than a few minutes to return, should I be questioning the way I built them?
By the way I am already using summary indexes and jobs in other places.
@yarsa, I'm not sure of your hardware configuration or how many concurrent searches/users you have. Reaching that limit isn't too harmful beyond slower searches, and possibly slower indexing if your hardware resources are already highly utilized.
I also recommend obtaining a copy of Exploring Splunk.
Search tips from "Exploring Splunk":
Other things to note: dense searches are faster than sparse searches, rare-term searches have a high I/O cost, and low-cardinality searches are also faster.
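To make that concrete, here is a rough sketch of the difference (the index, sourcetype, and field names are placeholders, not from any real environment):

A dense search reads a large share of the events it scans, so Splunk mostly streams data off disk:

    index=web sourcetype=access_combined | stats count by status

A sparse / rare-term search hunts for a needle in a haystack, so it pays a much higher I/O cost per matching event:

    index=web sourcetype=access_combined session_id="sess-12345"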
Additional Reading:
Optimize Splunk for peak performance
Types of searches - other types of searches include "Super-Sparse" and "Rare Term"
Hope this helps you.
Yarsa - I've never heard of issues with single-search limits; after all, that's what big data is all about. I would certainly check out your 'expensive searches', though, and see which ones are machine intensive (Search>Status>Search...etc). You can also set the default time period a search covers, changing it from 'All Time' to something more reasonable (quick example below). If you run a lot of ad hoc searches and spot some results of interest, you can also finalise the search early...unless you need the full result set. Hope this helps!
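Quick example of pinning the time range with explicit modifiers instead of All Time (the index and sourcetype here are just placeholders):

    index=web sourcetype=access_combined earliest=-24h latest=now | stats count by clientip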
Br
D