Splunk Search

Bulk-deleting expired search jobs?

w564432
Explorer

Hello,

We have an application that periodically pulls search results from a scheduled search via the Splunk API, but we're running into an issue: there is an excess of expired jobs (5000+) that have been kept around for over a month for some reason. Because the application has to look through each of these jobs, it's taking too long and timing out.

We tried deleting the expired jobs through the UI, but they keep popping back up / not going away. Some of them now say "Invalid SID" when I try to inspect them. Is there any way we can clear these in bulk, preferably without resorting to the UI (which only shows 50 at a time)?
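For reference, this is roughly what we were planning to script against the REST API, if that's even the right approach. It's only a rough sketch: the host, port, and credentials are placeholders, and it assumes the expired jobs are visible to the account making the calls.

import requests
import urllib3

urllib3.disable_warnings()  # the management port often uses a self-signed cert

# Placeholders -- adjust host, port, and credentials for your environment
BASE = "https://splunk.example.com:8089"
AUTH = ("admin", "changeme")

def list_job_sids():
    # GET /services/search/jobs lists the jobs visible to this user;
    # count=0 asks for all entries instead of the default page size
    resp = requests.get(
        f"{BASE}/services/search/jobs",
        params={"output_mode": "json", "count": 0},
        auth=AUTH,
        verify=False,
    )
    resp.raise_for_status()
    return [entry["name"] for entry in resp.json()["entry"]]

def delete_job(sid):
    # DELETE /services/search/jobs/<sid> removes the job artifact
    resp = requests.delete(
        f"{BASE}/services/search/jobs/{sid}",
        auth=AUTH,
        verify=False,
    )
    if resp.status_code == 404:
        return  # already gone (e.g. the "Invalid SID" entries)
    resp.raise_for_status()

if __name__ == "__main__":
    sids = list_job_sids()
    print(f"Deleting {len(sids)} jobs...")
    for sid in sids:
        delete_job(sid)

The idea is just to list every SID from /services/search/jobs and issue a DELETE per job, treating 404s (the "Invalid SID" ones) as already gone. Happy to hear if there's a better way to do this in bulk.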


tpickle
Splunk Employee

What was the resolution for this issue? Reassign SHC captain? Rolling restart?


w564432
Explorer

Not sure, actually; I'm not the admin of our cluster, but I know both of those things (reassigning the captain, rolling restart) have happened between then and now, either due to other issues or updates.
Either way, it seems to have cleared up some time ago now...
