Getting Data In

Forwarder stops accepting connections

jgauthier
Contributor

This morning I opened a dashboard and was greeted with "results not found."
I thought this was peculiar, so I started doing some digging and found that the server I was forwarding from had this in its log file:
08-10-2011 09:10:49.476 -0400 INFO TailingProcessor - Could not send data to output queue (parsingQueue), retrying...
08-10-2011 09:10:53.799 -0400 INFO BatchReader - Could not send data to output queue (parsingQueue), retrying...
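
(For reference, this blocked-queue state also usually shows up in the forwarder's metrics log. Something like the following is a quick check; the path assumes a default install and the exact field names may vary by version:)

# On the forwarder, look for blocked-queue entries in metrics.log
# (use findstr instead of grep if the forwarder is on Windows)
grep "blocked=true" $SPLUNK_HOME/var/log/splunk/metrics.log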

So, I began poking around, and could not figure out what was happening.
Finally, I started to think it was the receiver on the indexer, so I tried to hit the listening port:

jgauthier$ telnet 192.168.74.45 9997
Trying 192.168.74.45...

Nothing. I tried from another system... nothing. It wasn't a "Connection refused"; the port simply wasn't accepting connections at all. I restarted splunkd, and everything started working.
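
(For completeness, the receiving side here is just the splunktcp input on the indexer listening on 9997. Roughly, the setup and a quick port check look like this; the stanza may live in a different app directory depending on how receiving was enabled:)

# On the indexer: receiving is enabled via a splunktcp stanza in inputs.conf,
# e.g. etc/system/local/inputs.conf
[splunktcp://9997]
disabled = 0

# From any host, confirm the port is actually in a LISTEN state:
netstat -an | grep 9997      (or: netstat -an | findstr 9997 on Windows)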

I could not find anything helpful in the logs on the Splunk indexer, because the problem started a few days ago and the relevant entries have since rolled out of the log files.

Any suggestions?


Lex
New Member

I had a similar problem where the issue was that the Splunk server was running into its 1024 open file limit. I edited /etc/security/limits.conf to allow a soft limit of 2048 and a hard limit of 4096 on "nofile" and restarted. Check with ulimit -a whether the new settings have actually taken effect.
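
The lines I mean look roughly like this ("splunk" as the user name is an assumption; use whatever account splunkd actually runs as, and adjust the values for your environment):

# /etc/security/limits.conf -- raise the open file limit for the splunkd account
splunk    soft    nofile    2048
splunk    hard    nofile    4096

# After restarting Splunk (in a fresh login session for that user), verify:
ulimit -a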

Obviously, this only applies if your receiving Splunk server is a Linux server.


jgauthier
Contributor

Yup. My Splunk server is Windows (latest version of both).
