Splunk Search

What is an easy way to determine what searches are filling up the dispatch directory?

Path Finder

I'm seeing lots of dispatch directory threshold errors.
Is there an easy way to see what searches or reports are driving those values?

Splunk Employee

The number of job artifacts in the dispatch folder is directly proportional to the number of search jobs and to the time-to-live of each search artifact (dispatch.ttl):

  • Ad-hoc search job artifacts usually stay in dispatch for 10 minutes.
  • Scheduled search job artifacts stay for 2 times the scheduling interval (a job running every 4 hours stays in dispatch for 8 hours).
  • Scheduled searches that triggered an alert action (like tracking or an alert email) stay in dispatch for 24 hours.
  • Shared search jobs are kept for 7 days.
  • Real-time searches stay as long as the search is running.

Knowing that, you can look at your dispatch folder and figure out what constitutes the mass of job artifacts.
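One quick way to do that from the command line is to tally the artifact directory names by prefix; scheduled-search artifacts embed the owner, app, and (often) the search name in the directory name, so the heaviest contributors float to the top. A minimal sketch, assuming a default $SPLUNK_HOME layout (adjust the path to your install):

```shell
# Tally dispatch artifact directories by name prefix (everything before
# the first digit run), most common first. Scheduled jobs are named like
# scheduler__<owner>__<app>__<search>_at_<epoch>..., so they group together;
# ad-hoc jobs are epoch-named and collapse into one bucket.
count_artifacts() {
  ls "$1" | sed 's/_*[0-9].*$//' | sort | uniq -c | sort -rn
}

# Usage (path assumed for a default install):
# count_artifacts "${SPLUNK_HOME:-/opt/splunk}/var/run/splunk/dispatch"
```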


Options to reduce the TTL:

For searches:
- edit dispatch.ttl in savedsearches.conf, either for a particular search or in the default stanza
- do the same on a per-search basis in the UI under Searches & Reports > Advanced edit
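As a sketch, a savedsearches.conf edit might look like the following (the stanza name is hypothetical; a plain integer is seconds, and a p suffix like 2p means multiples of the scheduling period):

```
# Hypothetical saved search: keep its artifacts for 10 minutes
[My Heavy Scheduled Search]
dispatch.ttl = 600

# Or set a default for searches that don't override it
[default]
dispatch.ttl = 2p
```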

For alerts:
- reduce the ttl in alert_actions.conf
- or reduce the number of unnecessary alerts.
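In alert_actions.conf the TTL is set per action; for example, the email action defaults to 24 hours, and a sketch like this (the value is illustrative) would cut it to 4:

```
# Keep artifacts from email alert actions for 4 hours instead of 24
[email]
ttl = 14400
```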



What is the error that you are getting?
You can click on Activity (top-right drop-down), pick "Jobs", and filter by status.
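If the Jobs page is too noisy to eyeball, a REST-based search can summarize the current jobs from within Splunk itself; something like the following sketch (field names may vary by version):

```
| rest /services/search/jobs count=0
| stats count by label, author
| sort - count
```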


Path Finder

I get an error that says "Dispatch Command: The number of search artifacts in the dispatch directory is higher than recommended...."

I am just trying to figure out the best way to determine what specifically is driving the large number of artifacts. (Since we are running ITSI, I'm wondering what part of that, if any, is contributing to the issue.)
