
Data in /opt/splunk/var/spool/splunk filling up disk

responsys_cm
Builder

I'm seeing a number of very large files building up in /opt/splunk/var/spool/splunk:

drwx------ 2 root root 4096 Feb 27 02:08 .
drwx--x--x 4 root root 4096 Feb 7 23:12 ..
-rw------- 1 root root 360903734 Feb 27 01:28 1400673619_events.stash_new
-rw------- 1 root root 372663350 Feb 27 01:53 1504785327_1400673619_events.stash_new
-rw------- 1 root root 375269359 Feb 27 02:03 157257541_1400673619_events.stash_new
-rw------- 1 root root 373008730 Feb 27 01:43 1750025097_1400673619_events.stash_new
-rw------- 1 root root 359388989 Feb 27 02:08 1874146970_1400673619_events.stash_new
-rw------- 1 root root 355854760 Feb 27 01:38 314379920_1400673619_events.stash_new
-rw------- 1 root root 375817381 Feb 27 01:33 314379920_events.stash_new
-rw------- 1 root root 372663350 Feb 27 01:48 357150606_1400673619_events.stash_new
-rw------- 1 root root 353926431 Feb 27 01:58 378307516_1400673619_events.stash_new

Is there any way I can configure Splunk so it removes them automatically or times them out? I saw an error message in the GUI saying Splunk had reached the minimum free-disk limit for that directory. Is that value configurable? What is the impact on Splunk when that threshold is hit? (See the note below.)

Thx.

Craig
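
For reference, the disk threshold behind that GUI message is controlled by minFreeSpace under the [diskUsage] stanza in server.conf. A minimal sketch, assuming Splunk Enterprise defaults; the value is in megabytes, and when free space on a partition holding indexes falls below it Splunk pauses indexing, while a dispatch directory below the limit blocks new searches until space is freed:

# $SPLUNK_HOME/etc/system/local/server.conf
[diskUsage]
minFreeSpace = 5000

A splunkd restart is needed for the change to take effect.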


yannK
Splunk Employee

Those are summary indexing results.
The server is probably not picking those files up because it considers them binary (check splunkd.log).

See this answer: http://splunk-base.splunk.com/answers/70072/summary-indexing-blocked-and-binary-file-warning
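
One quick way to look for that warning, assuming your internal logs are searchable:

index=_internal source=*splunkd.log "binary"
| stats count by log_level, component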

yannK
Splunk Employee

Please do not play with the queue sizes; it will not solve the root cause.

Your issue is likely that the server (a search head, I'd bet) is unable to write to its local indexes, OR is unable to forward to the indexers.
Check the indexing queue (the last one before forwarding / disk writing) on your search head, then on the indexers, if any, for signs of congestion.
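
One way to check for that congestion is the queue metrics splunkd writes to metrics.log (the same current_size_kb / max_size_kb fields quoted in the reply below); a sketch, assuming the _internal index is searchable:

index=_internal source=*metrics.log group=queue
| eval fill_pct = round(current_size_kb / max_size_kb * 100, 1)
| timechart avg(fill_pct) by name

Queues that sit near 100% are the bottleneck; the first congested queue in pipeline order (parsing, aggregation, typing, indexing) is usually the one to investigate.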


responsys_cm
Builder

I'm also seeing: Metrics - group=queue, name=stashparsing, max_size_kb=500, current_size_kb=449, current_size=7, largest_size=9, smallest_size=3

Is it possible to increase the size of the stashparsing queue?
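
For completeness: per-queue sizes live in server.conf, so the stashparsing queue could be grown along these lines (a sketch, assuming the 500KB default seen in the metrics line above; requires a splunkd restart):

# $SPLUNK_HOME/etc/system/local/server.conf
[queue=stashparsing]
maxSize = 2MB

But as yannK notes above, this only buys headroom; if a downstream queue is congested, a bigger stashparsing queue just fills more slowly.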


responsys_cm
Builder

I'm not seeing anything about the binary warning in the logs. I am seeing:

BatchReader - Could not send data to output queue (stashparsing), retrying...
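
That retry message usually means a downstream queue is full. To see which queues are actually reporting as blocked (a sketch, assuming the _internal index is searchable):

index=_internal source=*metrics.log group=queue blocked=true
| stats count by host, name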
