Splunk Search

Bulk-deleting expired search jobs?

w564432
Explorer

Hello,

We have an application that periodically pulls search results from a scheduled search through the Splunk REST API, but we've hit an issue: there is an excess of expired jobs (5,000+) that have been retained for over a month for some reason. Because the application has to iterate through each of these jobs, it takes too long and times out.

We tried deleting the expired jobs through the UI, but they keep reappearing or never go away. Some of them now return "Invalid SID" when I try to inspect them. Is there any way we can clear these in bulk, preferably without resorting to the UI (which only shows 50 at a time)?
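In case it matters, something like the sketch below is what we have in mind: enumerating jobs over the REST management port and DELETEing them by SID. The host, credentials, and port are placeholders (I'm assuming the default management port 8089), and I'm assuming the jobs we want to purge are the ones already marked done:

```python
# Rough sketch (untested): bulk-delete completed search jobs via the
# Splunk REST API. Host and credentials below are placeholders.
import requests

BASE = "https://splunk.example.com:8089"  # placeholder search head, default mgmt port
AUTH = ("admin", "changeme")              # placeholder credentials

# count=0 asks the endpoint for all jobs instead of the default page size.
resp = requests.get(
    f"{BASE}/services/search/jobs",
    params={"output_mode": "json", "count": 0},
    auth=AUTH,
    verify=False,  # only because our mgmt port still uses the self-signed cert
)
resp.raise_for_status()

for entry in resp.json()["entry"]:
    sid = entry["name"]  # the entry name is the job's SID
    # isDone may come back as a bool or as "1"/"0" depending on version,
    # so normalize it before testing.
    if str(entry["content"].get("isDone")).lower() in ("1", "true"):
        # DELETE on a job's endpoint removes the job and its dispatch artifacts.
        r = requests.delete(f"{BASE}/services/search/jobs/{sid}", auth=AUTH, verify=False)
        print(sid, r.status_code)
```

For the "Invalid SID" entries I suspect the REST delete will just 404; if those are really orphaned dispatch directories, the `splunk clean-dispatch` CLI (which archives dispatch directories older than a given time) might be the fallback, though I'd rather not touch the filesystem if the API can handle it.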


tpickle
Splunk Employee

What was the resolution for this issue? Reassign SHC captain? Rolling restart?


w564432
Explorer

Not sure, actually; I'm not the admin of our cluster, but I know both of those things (reassigning the captain and a rolling restart) have happened since then, whether due to other issues or updates.
Either way, it seems to have cleared up some time ago now...
