Bulk-deleting expired search jobs?

w564432
Explorer

Hello,

We have an application that periodically pulls results from a scheduled search through the Splunk API, but we've hit an issue where there is an excess of expired jobs (5000+) that have been kept around for over a month for some reason. Because the application has to look through each of these jobs, it's taking too long and timing out.

We tried deleting the expired jobs through the UI, but they keep popping back up or never go away. Some of them now say "Invalid SID" when I try to inspect them. Is there any way we can clear these in bulk, preferably without resorting to the UI (which only shows 50 at a time)?
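For reference, here is a minimal sketch of bulk-deleting jobs through the REST API, which lists jobs from GET /services/search/jobs and issues a DELETE per SID. The host, credentials, and count values below are placeholders; it assumes splunkd's management port (8089 by default), an account with permission to delete the jobs, and a self-signed certificate (hence verify=False).

```python
import requests
import urllib3

urllib3.disable_warnings()  # assuming a self-signed cert on the management port

BASE = "https://splunk.example.com:8089"  # placeholder search head / management port
AUTH = ("admin", "changeme")              # placeholder credentials

# Collect all job SIDs first (the jobs endpoint pages its results),
# then delete, so the paging offsets don't shift underneath us.
sids = []
offset = 0
while True:
    resp = requests.get(
        f"{BASE}/services/search/jobs",
        params={"output_mode": "json", "count": 100, "offset": offset},
        auth=AUTH,
        verify=False,
    )
    resp.raise_for_status()
    entries = resp.json().get("entry", [])
    if not entries:
        break
    sids.extend(e["content"]["sid"] for e in entries)
    offset += len(entries)

for sid in sids:
    # DELETE /services/search/jobs/{sid} removes the job and its artifacts;
    # a 404 here just means the artifact is already gone (e.g. "Invalid SID").
    r = requests.delete(f"{BASE}/services/search/jobs/{sid}", auth=AUTH, verify=False)
    print(sid, r.status_code)
```

The listing endpoint also accepts a `search` filter parameter, so if you only want to target a subset of jobs (for instance, those from one saved search) you can narrow the list before deleting rather than clearing everything.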


tpickle
Splunk Employee

What was the resolution for this issue? Reassign SHC captain? Rolling restart?


w564432
Explorer

Not sure, actually. I'm not the admin of our cluster, but I know both of those things (reassigning the captain and a rolling restart) have happened between then and now, either due to other issues or due to updates.
Either way, it seems to have cleared up some time ago now...
