@yarsa, I'm not sure of your hardware configuration or how many concurrent searches/users you have. Reaching that limit isn't too harmful beyond slower searches, and possibly slower indexing if your hardware resources are already highly utilized.
Troubleshooting Search Quotas
I would install SOS (Splunk on Splunk), which will help determine which searches/users are eating up your resources.
Once you have identified your problem searches, use the Search Job Inspector to find where your highest execution costs are.
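If you'd rather find the expensive searches in bulk instead of inspecting jobs one at a time, you can query the `_audit` index. This is a sketch, not a definitive report; the index/field names (`action=search`, `info=completed`, `total_run_time`) are standard audit-event fields, but the thresholds are just examples:

```
index=_audit action=search info=completed
| stats avg(total_run_time) AS avg_runtime count BY user search
| sort - avg_runtime
| head 20
```

This ranks the 20 searches with the highest average runtime by user, which usually points you at the same culprits the Job Inspector would.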
You may also want to limit the time range and number of searches each user can run.
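Those limits can be set per role in authorize.conf. A minimal sketch, with an illustrative role name and example values (the setting names `srchJobsQuota`, `srchTimeWin`, and `srchDiskQuota` are real authorize.conf settings):

```
# authorize.conf -- example role stanza; values are illustrative
[role_power_user]
srchJobsQuota = 4        # max concurrent searches per user in this role
srchTimeWin = 86400      # cap the search time window at 24 hours (seconds)
srchDiskQuota = 500      # max MB of disk a user's search artifacts may use
```

Restricting the time window is often the biggest win, since all-time searches are a common source of runaway jobs.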
If you find that your dashboards are taking up most of your searches, you may want to invest time in combining your searches into post-process searches.
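As a sketch of the idea in Simple XML: one base search runs once, and each panel post-processes its results instead of dispatching its own search. The index, sourcetype, and field names here are made up for illustration:

```
<dashboard>
  <label>Post-process example</label>
  <!-- The base search runs once and is shared by both panels -->
  <search id="base">
    <query>index=web sourcetype=access_combined | fields status, uri</query>
    <earliest>-24h</earliest>
    <latest>now</latest>
  </search>
  <row>
    <panel>
      <chart>
        <!-- Post-process: reuses the base results, no extra dispatch -->
        <search base="base">
          <query>stats count BY status</query>
        </search>
      </chart>
    </panel>
    <panel>
      <table>
        <search base="base">
          <query>top limit=10 uri</query>
        </search>
      </table>
    </panel>
  </row>
</dashboard>
```

Two panels, one search job, so the dashboard counts once against your concurrent-search quota instead of twice.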
Also look at how you have your indexes broken out.
Evaluate your saved searches. Can some of them be run overnight or during off hours?
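Moving a saved search off peak hours is just a cron schedule in savedsearches.conf. A minimal sketch with an illustrative stanza name and search string (the settings themselves are standard):

```
# savedsearches.conf -- stanza name and search are illustrative
[Nightly Error Summary]
search = index=web status>=500 | stats count BY host
enableSched = 1
cron_schedule = 30 2 * * *    # run at 02:30, off peak hours
dispatch.earliest_time = -24h
dispatch.latest_time = now
```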
Review your disk metrics (disk queue length, reads/writes per second, etc.).
Consider using bloom filters.
I also recommend obtaining a copy of Exploring Splunk.
Search tips from "Exploring Splunk":
Filter out unneeded fields as soon as possible
Filter out results before calculations
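To illustrate those two tips with a hypothetical web-access search, this version does the calculation first and filters afterwards:

```
index=web sourcetype=access_combined
| stats count BY status, uri
| search status=404
```

Filtering in the initial search and trimming fields before the `stats` lets the indexers discard events and fields early, so much less data moves through the pipeline:

```
index=web sourcetype=access_combined status=404
| fields status, uri
| stats count BY uri
```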
Turn off Field Discovery.
Use the Advanced Charting view over the Timeline view; Timeline has higher costs.
Other things to note: dense searches are faster than sparse searches, rare-term searches have high I/O cost, and low-cardinality searches are also faster.
Additional Reading:
Bloom Filters
Search Job Inspector
Optimize Search Speed
Exploring Splunk
Optimize Splunk for Peak Performance
Types of Searches (covers "Super-sparse" and "Rare Term" searches)
Post-Process Searches
Hope this helps you.