Hi Splunk Community,

I'm new to Splunk and working on a deployment where we index large volumes of data (approximately 500 GB/day) across multiple sources, including server logs and application metrics. I've noticed that some of our searches run slowly, especially over longer time ranges (e.g., 7 days or more).

Here's what I've tried so far:

- Used summary indexing for some repetitive searches.
- Limited the fields returned by searches using the `fields` command.
- Ensured searches use indexed fields where possible.

However, performance is still not ideal, and I'm looking for advice on:

- Best practices for optimizing search performance in Splunk for large datasets.
- How to effectively use data models or accelerated reports to improve query speed.
- Any configuration settings (e.g., in limits.conf) that could help.

My setup:

- Splunk Enterprise 9.2.1
- Distributed deployment with 1 search head and 3 indexers
- Data is primarily structured logs in JSON format

Any tips, configuration recommendations, or resources would be greatly appreciated! Thanks in advance for your help.
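For context, here is the general shape of what I've been running. The index, sourcetype, and field names below are placeholders standing in for our real ones, and the `tstats` query is only a sketch of the accelerated-data-model approach I'm considering, not something we have in production yet:

```spl
# A typical raw search, trimmed with `fields` as described above
# (index/sourcetype/field names are illustrative):
index=app_logs sourcetype=json_events status=500 earliest=-7d
| fields _time, host, status, uri
| stats count BY host, status

# The kind of accelerated search I'm hoping to move toward --
# tstats against an accelerated data model instead of scanning raw events
# ("App_Logs" and its fields are hypothetical here):
| tstats summariesonly=true count
    FROM datamodel=App_Logs
    WHERE App_Logs.status=500 earliest=-7d
    BY _time span=1h, App_Logs.host
```

My understanding is that `summariesonly=true` restricts `tstats` to the pre-built acceleration summaries, which is where the speedup comes from, at the cost of missing events that haven't been summarized yet. Corrections welcome if I've got that wrong.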