Hi. I get the following error on one of my indexers.
The minimum free disk space (5000MB) reached for /opt/splunk/var/run/splunk/dispatch.
Having looked at the folder, all the files are from today. Is it safe for me to clean this folder? I understand that I could modify parameters in limits.conf; however, I want to leave the indexer's minimum disk space at 5000MB, which I believe is a safe value.
Could you please advise?
Have a look at the docs http://docs.splunk.com/Documentation/Splunk/6.5.3/Search/Dispatchdirectoryandsearchartifacts#Clean_u... and learn about the clean-dispatch command. Otherwise, if you no longer need the search artifacts, you can simply delete the files.
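For reference, the invocation looks roughly like this; the install path and the destination directory are assumptions for a default setup (adjust both to your environment), and per the docs the destination must be on the same partition as the dispatch directory:

```shell
# Move dispatch artifacts last modified more than one day ago
# out of the dispatch directory into a holding directory.
# Both paths below are examples, not required values.
/opt/splunk/bin/splunk cmd splunkd clean-dispatch /opt/splunk/old-dispatch-jobs/ -1d@d
```

Once you have confirmed nothing in the holding directory is needed, you can delete it.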
Hope this helps ...
Hi MuS - I have familiarised myself with the commands already. I wanted to find out how I can determine whether I still need those files. Maybe they are being used by a search currently running in the background? Could you advise?
These files are generated when searches or saved searches run. They can safely be deleted and will be regenerated when you re-run those searches. Think of it as a web cache for Splunk searches. 🙂
No, because this all depends on your searches/use case.
Just my 2cents, I usually remove anything that is older than 15 minutes and never had any problems/troubles. Start deleting/removing the oldest ones only and slowly decrease the age.
But all the files seem to be only 5 to 10 minutes old, which is why I'm concerned that deleting them would cause issues. By the way, I'm talking about the dispatch directory on the indexer; I'm not sure why it should generate files in the first place, as it's not used to carry out searches. Plus, they seem to be constantly updated.
If a search builds out lookup tables, these files can be VERY large. If they were created in the last 5-10 minutes, then someone is running searches right now with LARGE results/outputs.
Did you recently install any new apps?
I have just taken over a project and have no clue what has been done in the past. The lead consultant left last night, so I'm just trying to fix a list of errors, and this one looked like it needed immediate attention.
If it continues to rebuild those large files, you'll want to look at the SID (search ID) to find out which one is creating these large outputs. Either that, or raise the 5000MB free-space threshold, which is controlled by the minFreeSpace setting in server.conf. (Note that dispatch_dir_warning_size = <int> in limits.conf is a separate warning on the number of jobs in the dispatch directory, not on megabytes.)
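To make the relevant settings concrete, here is a hedged sketch; the values shown are the defaults as I understand them, so verify them against the spec files for your Splunk version before changing anything:

```ini
# server.conf -- governs the "minimum free disk space (5000MB) reached"
# check that pauses searches when the partition runs low (value in MB).
[diskUsage]
minFreeSpace = 5000

# limits.conf -- warns when the dispatch directory holds too many
# search jobs (a count of jobs, not a size in megabytes).
[search]
dispatch_dir_warning_size = 5000
```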
Right, but the actual size of the dispatch directory is only 1230 MB, so it's not really large. Plus, given that it's on the indexer (which no one should be using to run searches), isn't it wiser to just go ahead and clean it?