Splunk Search

What is an easy way to determine what searches are filling up the dispatch directory?

sail4lot
Path Finder

I'm seeing lots of dispatch directory threshold errors.
Is there an easy way to see what searches or reports are driving those values?

yannK
Splunk Employee

The number of job artifacts in the dispatch folder is directly proportional to the number of search jobs and to the time to live of each search artifact (dispatch.ttl).

  • Ad hoc search job artifacts usually stay in the dispatch folder for 10 minutes.
  • Scheduled search job artifacts stay for twice the scheduling interval (a job running every 4 hours stays in dispatch for 8 hours).
  • Scheduled search jobs that triggered an alert action (such as alert tracking or an alert email) stay in dispatch for 24 hours.
  • Shared search jobs are kept for 7 days.
  • Real-time searches stay as long as the search is running.

Knowing that, you can look at your dispatch folder and figure out what constitutes the mass of job artifacts.

https://docs.splunk.com/Documentation/Splunk/latest/Search/Dispatchdirectoryandsearchartifacts
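As a rough sketch, you can group the artifact directory names in dispatch by the search that created them. The directory-name pattern below (`scheduler__<user>__<app>__<search>_at_<epoch>_<n>`) is typical for scheduled searches but may vary by version; the sample names and paths are illustrative only. In a real deployment you would list `$SPLUNK_HOME/var/run/splunk/dispatch` instead of the temporary directory created here:

```shell
# Sketch: count dispatch artifacts per originating search.
# In production, replace the mktemp/mkdir setup with:
#   DISPATCH="$SPLUNK_HOME/var/run/splunk/dispatch"
DISPATCH=$(mktemp -d)
mkdir -p "$DISPATCH/scheduler__admin__search__heavy_report_at_1700000000_101" \
         "$DISPATCH/scheduler__admin__search__heavy_report_at_1700003600_102" \
         "$DISPATCH/scheduler__nobody__itsi__itsi_search_at_1700000000_103"

# Strip the trailing "_at_<epoch>_<n>" suffix so artifacts group by search name,
# then show the biggest offenders first:
ls "$DISPATCH" | sed 's/_at_[0-9].*$//' | sort | uniq -c | sort -rn
```

The search with the most artifact directories shows up at the top, which is usually the one to tune first.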

Options to reduce the TTL:

For searches:
- Edit dispatch.ttl in savedsearches.conf, either for a particular search or in the default stanza.
- Do the same on a per-search basis in the UI (Settings > Searches, reports, and alerts > Advanced edit).
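For example, a savedsearches.conf override might look like this (the stanza name "My Heavy Report" and the 600-second value are placeholders to adapt to your own searches):

```
# savedsearches.conf -- illustrative TTL overrides
[My Heavy Report]
dispatch.ttl = 600

# or lower the value for all searches at once:
[default]
dispatch.ttl = 600
```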

For alerts:
- Reduce the ttl in alert_actions.conf.
- Or reduce the number of unnecessary alerts.
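As a sketch, the alert_actions.conf change could look like this (the 43200-second value is illustrative; 86400 seconds, i.e. 24 hours, is the usual default for triggered-alert artifacts):

```
# alert_actions.conf -- how long artifacts from fired alert actions are kept
[email]
ttl = 43200
```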


adonio
Ultra Champion

What is the error that you are getting?
You can click Activity (top-right drop-down), pick "Jobs", and filter by status.
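The same job information is also reachable from a search via the REST jobs endpoint. A minimal sketch (field names such as label, author, and diskUsage come from the search/jobs endpoint; verify them in your environment before relying on this):

```
| rest /services/search/jobs count=0
| stats count AS artifacts, sum(diskUsage) AS disk_bytes BY label, author
| sort - artifacts
```

Jobs with an empty label are typically ad hoc searches; named labels map back to saved searches and reports.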


sail4lot
Path Finder

I get an error that says "Dispatch Command: The number of search artifacts in the dispatch directory is higher than recommended...."

I am just trying to figure out the best way to determine what specifically is driving the large number of artifacts. (Since we are running ITSI, I'm wondering what part of that, if any, is contributing to the issue.)
