Splunk Search

Why are the queues full and searches in error on Search Head?

We are having some issues finalizing the installation of our Splunk environment. We have 2 Linux servers: 1 Search Head and 1 Indexer as a search peer. We had just finished setting up the search peer in "Distributed search", so we tried to run the search "index=_internal sourcetype=splunkd" over the last 60 minutes, but it only returned logs from the Indexer.

Then we noticed that TailReader-0 on the Search Head was in error: "The monitor input cannot produce data because splunkd's processing queues are full. This will be caused by inadequate indexing or forwarding rate, or a sudden burst of incoming data." We also saw related messages such as:

08-19-2020 12:46:50.607 +0200 WARN TailReader - Could not send data to output queue (parsingQueue), retrying...

This is weird because we configured outputs.conf on the Search Head to send data to the Indexer, and we configured inputs.conf on the Indexer to receive data, so we are not sure what's wrong.

outputs.conf on the SH:




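For reference, a minimal outputs.conf that forwards all data from a search head to a single indexer on port 9997 typically looks like the following (the group name and indexer hostname here are placeholders, not the poster's actual values):

```ini
# outputs.conf on the Search Head (illustrative sketch)
[tcpout]
defaultGroup = my_indexers

[tcpout:my_indexers]
# Placeholder hostname; the receiving port must match
# the splunktcp port opened on the indexer.
server = idx.example.com:9997
```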

inputs.conf on the IDX:




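And the matching receiving side on the indexer is usually a single splunktcp stanza like this (illustrative, assuming the same port 9997 mentioned in the post):

```ini
# inputs.conf on the Indexer (illustrative sketch)
[splunktcp://9997]
disabled = 0
```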
Both servers have been restarted. I guess the queues are full because the Search Head can't send the data, but why?

Port 9997 is open and the connection from the SH to the IDX is fine. Also, we don't have any forwarders or data inputs configured, so it shouldn't be because of a sudden burst of incoming data.
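As a sanity check on the "port 9997 is open" claim, basic TCP reachability from the SH to the IDX can be verified with a few lines of Python (the indexer hostname below is a placeholder; this only proves the socket connects, not that Splunk is accepting data on it):

```python
import socket


def port_open(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Covers refused connections, timeouts, and DNS failures.
        return False


# Example usage from the Search Head (placeholder hostname):
# print(port_open("idx.example.com", 9997))
```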

We restarted the Search Head after this, and now we are not able to run searches anymore. All searches fail, and the job inspector says "This search has encountered a fatal error and has been marked as zombied".

Could it be a performance issue? Our servers have only 4 CPUs and 12 GB of RAM. Do we need more CPU to solve these issues?

Thank you very much !




Your outputs.conf and inputs.conf seem to be correct. One thing you could try is adding spaces on both sides of the = signs. It shouldn't change anything, but still...

Can you send what the following commands show?

splunk btool inputs list


splunk btool outputs list

Also, you could check what the Monitoring Console on the IDX shows (Settings - Monitoring Console - Indexing - Performance - Instance).

r. Ismo
