All, my /opt/splunk/var/spool/splunk directory has 83,000-plus "*.stash_new" files in it, and I would like to clear them out. I have seen references to this issue but no real solutions. If anyone has figured out how to accomplish this, can you please pass along the procedure?
I've noticed that the files go back to March of last year. Does anyone know the implications of simply deleting these very old files?
Thanks in advance.
UPDATE: I was troubleshooting another issue on this Splunk instance that required a Splunk restart. After the restart I noticed in the splunkd.log file that Splunk was going through all 83,000 files, trying to reread them and failing. I understand that rereading the stash_new files in the spool directory at startup is normal Splunk processing. Now I understand why I did not notice any current data missing.
So I'm back to the consequences of simply deleting the old stash_new files. Does anyone have experience with that?

These are the files created for summary indexing, and they should have been deleted once indexed.
It looks like you encountered the error described here: http://answers.splunk.com/answers/70072/summary-indexing-blocked-and-binary-file-warning
Please upgrade to 5.0.3 or more recent, and verify that no new files get stuck in the folder (they should stay there only a few minutes).
About the old files: they are old summary reports.
- If you do not need them, delete them.
- If you want to force the indexing anyway, you can use the workaround described in that answer.
- If you want to start fresh, cover all your previous months, and have the data available in the original indexes, you can clean the files, clean the summaries, and run the backfill script. See http://docs.splunk.com/Documentation/Splunk/6.0.1/Knowledge/Managesummaryindexgapsandoverlaps
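If you go the "delete them" route, a sketch like the following can help. The spool path comes from the original question; the 30-day age cutoff is only an assumption, so adjust it (and take a backup first) to match what you decide is safe in your environment:

```shell
#!/bin/sh
# Hedged sketch: list, count, then remove old *.stash_new files from the
# Splunk spool directory. The 30-day cutoff (-mtime +30) is an assumption;
# files newer than that may still be pending summary indexing.
SPOOL_DIR=/opt/splunk/var/spool/splunk

# 1. Count the candidates first so you can sanity-check before deleting.
find "$SPOOL_DIR" -name '*.stash_new' -mtime +30 | wc -l

# 2. Optionally archive them instead of deleting outright.
# tar czf /tmp/old_stash_new.tgz -C "$SPOOL_DIR" $(cd "$SPOOL_DIR" && find . -name '*.stash_new' -mtime +30)

# 3. When satisfied, delete them (run as the user that owns the files).
find "$SPOOL_DIR" -name '*.stash_new' -mtime +30 -delete
```

Running the count step on its own first is the key safety check: if the number looks wrong (for example, it includes files Splunk is still actively processing), stop and tighten the `-mtime` filter before running the delete.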
Yannj,
Thanks for the update. We just upgraded to 5.0.5 on Saturday, five days ago, and the last stuck file is from 02/15, when we updated. I just want to make sure that deleting the old files won't break something else.
