I'm getting this message on my search head:
"Dispatch Manager : The minimum free space (5000MB) reached for /opt/splunk/var/run/splunk/dispatch"
After investigating on the server, I found:
- df -h => /opt/splunk 91% full - 4.2GB free of 47GB (yesterday, 13/01/2020)
- df -h => /opt/splunk 92% full - 3.9GB free of 47GB (today, 14/01/2020)
- the biggest directory is /opt/splunk/var/lib/splunk/csmsisupervisionactive/datamodel_summary => 32GB,
and it is full of directories like 3374-ED6F9A3B-E103-4D86-8F41-xxxxxxxx
- the oldest directories date from October 16 2019, and many of them are refreshed every 10 minutes.
Is it safe to delete the oldest directories?
Why does the alert point at a different directory than the one I found?
What is the best practice here: expand the filesystem? Change server.conf? Delete folders and restart?
Thanks a lot for helping.
Your search head doesn't have enough free space to meet the recommended minimum. The dispatch directory contains artifacts (such as search results) created by each search that gets executed. Depending on each search's time-to-live, these artifacts age out and get recreated, so there is some fluctuation there; you still want enough headroom for your results to reside in that directory. The 5000MB threshold in your message comes from the minFreeSpace setting in the [diskUsage] stanza of server.conf (default 5000MB), so you can lower it if you want the warning to trigger closer to a full disk, though freeing or adding space is the better fix.
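If you do decide to lower the threshold, a minimal sketch of the change (the value 3000 is just an illustration; pick one that fits your environment, and note that a splunkd restart is needed for it to take effect):

```ini
# $SPLUNK_HOME/etc/system/local/server.conf
[diskUsage]
# Splunk warns (and stops dispatching searches) when free space on a
# partition it uses drops below this many MB. The default is 5000.
minFreeSpace = 3000
```

Keep in mind this only moves the warning; it does not reclaim any disk space.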
The directories you are referencing as the largest/oldest are data model acceleration summaries. You can see your data models under "Settings --> Data models". I wouldn't go in and manually delete the directory, but if a data model is not used and you want to reclaim the space, you can remove its acceleration from the Data models page. Alternatively, you can reduce the acceleration summary range so it retains a smaller set of data.
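If you prefer to make that change in configuration rather than the UI, the summary range lives in datamodels.conf. A sketch, assuming a stanza named My_DataModel (replace with your actual data model name and app context):

```ini
# $SPLUNK_HOME/etc/apps/<your_app>/local/datamodels.conf
[My_DataModel]
acceleration = true
# Keep only the last 7 days of acceleration summaries;
# older summary directories are trimmed automatically over time.
acceleration.earliest_time = -7d
```

Letting Splunk trim the summaries this way is safer than deleting the directories by hand.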
I converted the comment to an answer. Feel free to accept if this resolved your issue or let me know if you have any other questions. Thanks!