I have an autogenerated dashboard with 160 panels. The good news: each panel's search uses an accelerated saved search. The bad news: with so many panels, I frequently see this message in about 40 of them:
Out of curiosity, what is the use case for 160 different searches kicking off on a single dashboard on load? As a general rule of thumb, that seems like either too much data to reason about at a glance on a single screen, poorly designed searches that each retrieve too little data from overlapping sources, or both.
Arguably, if your search head (SH) and indexers are currently sized for 50 concurrent searches, the brute-force fix would mean roughly tripling the number of CPU cores on the SH and each indexer (likely with some memory and disk upgrades too). Obviously that scale-up approach gets very expensive very quickly.
So how do we do better with limited resources?
1) Base searches with post-processing. (A single base search can drive results across multiple panels.)
2) Referencing already-executed scheduled searches as base searches, especially if up-to-the-minute results on dashboard load are not a requirement. (Accelerated searches are good for certain use cases, but if the time window includes very recent data, the acceleration summary may not have been built yet. Accelerated data models and summary indexing are also options for certain search patterns.)
3) Think about your users and design the dashboard's workflow. What are they looking for first? Why? Which panels are only needed under specific conditions and otherwise ignored? Would drilldowns make sense to offload panels to other views based on the expected user workflow?
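To make point 1 concrete, here's a minimal Simple XML sketch of the base-search pattern. One base search (given `id="base_perf"`) scans the index once; each panel references it with `base="base_perf"` and post-processes the shared results. The index, sourcetype, and field names (`response_time`, `uri_path`) are assumptions for illustration; note that a base search should be a transforming search (e.g. ending in `stats`) so panels post-process a compact result set rather than raw events.

```xml
<dashboard>
  <label>Base search with post-processing (sketch)</label>

  <!-- Runs once on dashboard load; all panels below share its results -->
  <search id="base_perf">
    <query>
      index=web sourcetype=access_combined
      | bin _time span=5m
      | stats perc90(response_time) AS p90 BY _time, uri_path
    </query>
    <earliest>-24h</earliest>
    <latest>now</latest>
  </search>

  <row>
    <panel>
      <chart>
        <!-- Post-process: filter the shared results, no new index scan -->
        <search base="base_perf">
          <query>search uri_path="/checkout" | timechart span=5m max(p90) AS p90</query>
        </search>
      </chart>
    </panel>
  </row>
</dashboard>
```

With this pattern, 160 panels backed by a handful of base searches cost only a handful of concurrent search slots on load, instead of 160.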
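And for point 2, a panel can reuse the most recent results of a scheduled search via the `loadjob` command, so the dashboard pays no search cost beyond reading cached results. A hedged sketch, where the owner, app, and saved-search name (`admin:search:daily_url_perf`) are placeholders for your own:

```
| loadjob savedsearch="admin:search:daily_url_perf"
| search uri_path="/checkout"
| stats max(p90) AS p90
```

The results are only as fresh as the search's schedule, which is exactly the trade-off mentioned above: fine when up-to-the-minute data isn't required on load.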
As for my use case: I have 80 metrics and two graphs per metric. One search is a timechart; the other is a stats search to get the overall P90. I can't find a way to combine the two and keep the search accelerated. Each metric corresponds to the performance of a single URL (so 80 URLs). The interested parties want to view all of this on one screen, concisely.
The URLs come from web server logs. Would a better-designed base search pull in all the URLs and then have the panel searches filter by URL? On its own, that doesn't seem any more efficient than querying the index directly.
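The win isn't in the filtering itself; it's that one scan of the logs can feed all 160 panels. A hedged sketch of what that could look like for this case (field names `response_time` and `uri_path` are assumptions): one transforming base search pre-aggregates per URL per time bucket, and each URL's two panels post-process it. At 80 URLs × 288 five-minute buckets per day, the base result set is roughly 23k rows — small enough to share cheaply.

```
<!-- base search: one pass over the web logs -->
index=web sourcetype=access_combined
| bin _time span=5m
| stats perc90(response_time) AS p90 count AS n BY _time, uri_path

<!-- per-URL timechart panel (post-process) -->
search uri_path="/checkout" | timechart span=5m max(p90) AS p90

<!-- per-URL single-value panel (post-process) -->
search uri_path="/checkout" | stats median(p90) AS approx_p90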
Although I am a Splunk novice (having last used Splunk about a decade ago), Splunk is the one tool that has made me change my behavior to fit how Splunk works. In my opinion, it's a glaring gap in the product (and bad customer experience) when the user needs to know so much about how the underlying system works. Though it is still better than a decade ago!