Hello everyone. This time I don't have a problem with my code or my dashboard; my question is about performance.
I have a dashboard with 15 panels, and I'd like to know if there are any tricks to optimize or speed up the dashboard's load time.
Reiterating what was said in previous posts, try:
index="YOUR_INDEX_NAME" feature="chat"
| stats values(MESSAGE) AS MESSAGE,
        values(chatId) AS chatId,
        values(chatOrigin) AS chatOrigin,
        values(media) AS media,
        values(chatNbreRepliqueAgent_lo) AS chatNbreRepliqueAgent_lo,
        values(chatNbreRepliqueClient_lo) AS chatNbreRepliqueClient_lo,
        values(location) AS location,
        values(client) AS client,
        values(session) AS session,
        values(chatAvailability_lo) AS chatAvailability_lo,
        values(step) AS step,
        values(stepResult) AS stepResult,
        values(isEligible_lo) AS isEligible_lo,
        values(context) AS context
  BY _time
I'm assuming you have 15 independent panels? If so, this is tying up 15 CPU cores on your search head and possibly limiting how fast it returns results. You should look into post-processing so that 1-2 base searches drive those 15 panels. Once you have that, you can explore other optimization techniques like tstats, metasearch, or refactoring your search to make the panels load faster.
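In Simple XML, post-processing looks roughly like this. This is a minimal sketch, not your actual dashboard: the index, the `feature="chat"` filter, and the field names come from this thread, but the stats grouping and the `stepResult="success"` filter are assumptions you would need to adapt.

```xml
<dashboard>
  <!-- One base search with a transforming command; runs once for the whole dashboard -->
  <search id="base">
    <query>index="YOUR_INDEX_NAME" feature="chat" | stats count BY step, stepResult</query>
    <earliest>-24h@h</earliest>
    <latest>now</latest>
  </search>
  <row>
    <panel>
      <chart>
        <!-- Post-process search: filters the base results without launching a new search job -->
        <search base="base">
          <query>| where stepResult="success" | stats sum(count) AS total BY step</query>
        </search>
      </chart>
    </panel>
    <!-- ...repeat post-process searches for the remaining panels... -->
  </row>
</dashboard>
```

With this pattern, one search job is dispatched instead of 15, and each panel only filters or re-aggregates the base results on the search head.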
Feel free to share your search and I can help optimize it
Ok, so your issue is with search speed. Coming back to the latter part of my comment: post your search so we can help optimize it.
The answer will be one of the following
1) Refactor the search
2) Use tstats or metasearch if possible
3) Get more indexers
4) Decrease your sample size
5) Use a summary index or accelerated data model
6) Use event sampling
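For option 2, tstats runs against the index-time (tsidx) data rather than raw events, so it is typically much faster, but it only works with indexed fields or against an accelerated data model. A hedged sketch, assuming a hypothetical accelerated data model named Chat (your field and model names will differ):

```spl
Against index-time metadata only:
| tstats count WHERE index="YOUR_INDEX_NAME" BY sourcetype, _time span=1h

Against a (hypothetical) accelerated data model named Chat:
| tstats count FROM datamodel=Chat WHERE Chat.feature="chat" BY Chat.step
```

If the fields you need (like MESSAGE or chatId) are search-time extractions, plain tstats won't see them; that's where the accelerated data model route comes in.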
You need a transforming command.
Share your search for more help or accept the answer. Not much more I can do without seeing your search.
| where feature="chat"
| table _time, MESSAGE, chatId, chatOrigin, media, chatNbreRepliqueAgent_lo, chatNbreRepliqueClient_lo, location, client, session, chatAvailability_lo, step, stepResult, isEligible_lo, context
@taha13, if you have created your base search with a table command pulling _raw events for post-processing, that is not a best practice either; you should use a transforming command instead. Refer to the documentation on post-processing best practices: http://docs.splunk.com/Documentation/Splunk/latest/Viz/Savedsearches#Best_practices
To add on to @skoelpin's comment, where feature="chat" should actually be part of your main search, i.e. index="YOUR_INDEX_NAME" feature="chat" | ...
You should also understand that there is a difference between the table and fields commands, so use table only as your final command, and only when you need to arrange the fields in a specific order.
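To illustrate the difference: fields is a distributable streaming command that can prune fields early, out at the indexers, while table is a formatting command that runs on the search head. A sketch using a few field names from this thread (the stats grouping is an assumption for illustration):

```spl
Prune fields early with fields (streaming, runs at the indexers):
index="YOUR_INDEX_NAME" feature="chat"
| fields _time, chatId, step, stepResult
| stats count BY step

Use table only at the very end, to order columns for display:
index="YOUR_INDEX_NAME" feature="chat"
| stats count BY step, stepResult
| table step, stepResult, count
```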
Refer to the documentation for search query optimization.
Also, if you have 15 panels in a dashboard, do you really need to view all of them at the same time? (Usually it is 4, 6, or 8 panels at a time.) If not, you should use check boxes/links or tabs to switch from one panel to another, or show details only when required. That way, only the searches required by default (the ones users see first when the dashboard loads) actually run. Refer to this blog: https://www.splunk.com/blog/2015/03/30/making-a-dashboard-with-tabs-and-searches-that-run-when-click...
Okay, as you wish. I did not accept your answer because it is not really what I want: with your solution I went from 3min 30sec to 3min 10sec.
I'm searching for a solution that loads my dashboard in 2 minutes.
We are here to help each other, and you are here to collect votes.
Obviously you're not looking for help when you claim that my solution shaved 30 seconds off your load time and give no further details. Then when I followed up asking if this solved your question, you still provided zero details and left it unaccepted. I'm sorry, but I'm not spending any more time on this.
Lots of problems here. How many indexes are you searching over? Is it possible to explicitly call only the indexes needed? How large is your time frame? How many events are you searching over? How many indexers do you have? What storage are you using? You should replace that table command in your search; I think it would be faster with a stats than with a table command too.
Once you replace that table with a transforming command, you can accelerate it as a data model or a report and get exponentially faster speeds.
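As a sketch of that last step: once the base search ends in a transforming command, it can be accelerated as a report in savedsearches.conf (the stanza name, stats grouping, and summary range here are illustrative, not from this thread; acceleration can also be enabled from the UI under Settings > Searches, reports, and alerts):

```ini
# savedsearches.conf sketch (report acceleration)
[Chat Dashboard Base]
search = index="YOUR_INDEX_NAME" feature="chat" | stats count BY step, stepResult
auto_summarize = 1
auto_summarize.dispatch.earliest_time = -7d@d
```

Splunk then builds and maintains a summary in the background, and the dashboard panels read from the summary instead of re-crunching raw events on every load.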