Splunk Search

Too many search jobs in the dispatch directory + job expiry time keeps extending + search artifacts not removed per TTL

splunker12er
Motivator

I see too many search jobs in the dispatch directory.
Even after the jobs complete, their expiry date keeps extending and they are not removed from the dispatch folder.
This prevents me from creating other search jobs via the REST API ("Job not yet scheduled by server").
Is there any way Splunk can clear those jobs automatically?
Right now I have to clear the dispatch directory manually every time so that other jobs can get scheduled.

OS : Linux 2.6.32-504.16.2.el6.x86_64 x86_64
Splunk v6.0.4
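
A minimal sketch of one way to keep ad-hoc REST jobs from piling up in dispatch, assuming a default splunkd management port and placeholder credentials (the host, password, and example search below are illustrative, not from the thread): set a short per-job timeout when the job is created, and delete the job explicitly once its results are consumed, so the artifact is reaped immediately rather than waiting for the TTL. Verify parameter names against the search/jobs REST reference for your Splunk version.

# Sketch only: placeholder host, credentials, and search string.
import time
import requests

BASE = "https://localhost:8089"          # splunkd management port (assumption)
AUTH = ("admin", "changeme")             # placeholder credentials

# Create the job; "timeout" is the number of seconds the artifact is kept
# after the search stops running.
resp = requests.post(
    f"{BASE}/services/search/jobs",
    auth=AUTH,
    verify=False,                        # splunkd ships with a self-signed cert
    data={
        "search": "search index=_internal | head 10",
        "exec_mode": "normal",
        "timeout": 120,                  # keep the artifact for 2 minutes only
        "output_mode": "json",
    },
)
resp.raise_for_status()
sid = resp.json()["sid"]

# Poll until the job is done.
while True:
    content = requests.get(
        f"{BASE}/services/search/jobs/{sid}",
        auth=AUTH, verify=False, params={"output_mode": "json"},
    ).json()["entry"][0]["content"]
    if content["isDone"]:
        break
    time.sleep(2)

# Fetch results, then delete the job so its dispatch directory is removed
# right away instead of lingering until the TTL expires.
results = requests.get(
    f"{BASE}/services/search/jobs/{sid}/results",
    auth=AUTH, verify=False, params={"output_mode": "json"},
).json()
requests.delete(f"{BASE}/services/search/jobs/{sid}", auth=AUTH, verify=False)

If a particular caller genuinely needs a job to live longer, its TTL can also be adjusted per job after creation through the job's control endpoint (action=setttl) rather than by raising the expiry globally.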

jeremiahc4
Builder

I've noticed this behavior on 6.2.6 also. We have a 6-node search head cluster and I see jobs that get stuck for some reason: status=Done, sitting for nearly a month, and when I refresh the view the expiration moves to the current time. These aren't scheduled searches that triggered an alert action, either; as near as I can tell, they are just users running a search in the UI.

Also of note, we have the artifact cleanup script in place for the known issue in 6.2.6. Any non-scheduler artifacts older than 2 hours get removed by the script.
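
For reference, a rough sketch of the kind of cleanup described above, not the actual workaround script from the known issue: it assumes the default dispatch location and that scheduled-search artifacts can be recognized by a SID prefix of "scheduler", and it moves (rather than deletes) non-scheduler artifacts older than two hours so nothing is lost if a job is still live. Splunk's own clean-dispatch command is the safer first choice where it applies.

# Sketch only: paths and the "scheduler" prefix check are assumptions.
import os
import shutil
import time

DISPATCH = "/opt/splunk/var/run/splunk/dispatch"   # default location (assumption)
HOLDING = "/opt/splunk/old-dispatch-artifacts"     # destination, reviewed/deleted later
MAX_AGE = 2 * 60 * 60                              # two hours, in seconds

os.makedirs(HOLDING, exist_ok=True)
now = time.time()

for name in os.listdir(DISPATCH):
    path = os.path.join(DISPATCH, name)
    if not os.path.isdir(path):
        continue
    # Scheduled-search artifacts normally carry a "scheduler" SID prefix;
    # leave those to the scheduler's own TTL handling.
    if name.startswith("scheduler"):
        continue
    if now - os.path.getmtime(path) > MAX_AGE:
        shutil.move(path, os.path.join(HOLDING, name))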


Lucas_K
Motivator

Known issue with that particular version?


splunker12er
Motivator

It's not a known issue; I verified against the docs:
http://docs.splunk.com/Documentation/Splunk/6.0.3/ReleaseNotes/KnownIssues
