Getting Data In

Too many open files on forwarders

oreoshake
Communicator

I'm starting to get a lot of these errors on my forwarders. Any suggestions? Pushing /etc/security/limits.conf doesn't sound ideal.

I'm running heavy forwarders as root.
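For context, a quick way to see the open-file limit the running forwarder actually has (assuming a single splunkd process) is to read its limits from /proc:

    grep 'open files' /proc/$(pgrep -xo splunkd)/limits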


netwrkr
Communicator

limits.conf shouldn't come into play unless you explicitly have 'nofile' defined.
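For reference, a 'nofile' entry in /etc/security/limits.conf looks something like the lines below (the 'splunk' user is just an illustration; a forwarder running as root would need a root or '*' entry instead):

    # /etc/security/limits.conf -- only matters if nofile lines like these exist
    splunk    soft    nofile    20000
    splunk    hard    nofile    20000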

This is actually a pretty typical issue on the systems I've seen, and it's easily fixed with 'ulimit'.

I typically double the number of open files to start.

On RHEL/Fedora/CentOS, edit the profile with 'vi /etc/profile' and set the number of open files to 20,000:

ulimit -n 20000

I then 'source' /etc/profile so my current shell applies the new value:

source /etc/profile

Verify with 'ulimit -n'.

Then restart Splunk.

Verify Splunk applied the new setting by viewing splunk/var/log/splunkd.log and looking for this line: "INFO ulimit - Limit: open files: 20000 files"
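Putting the steps together, the whole sequence is roughly the following (a sketch; $SPLUNK_HOME stands in for wherever the forwarder is installed):

    # append to /etc/profile (RHEL/Fedora/CentOS)
    ulimit -n 20000

    # apply to the current shell and check the new value
    source /etc/profile
    ulimit -n

    # restart Splunk, then confirm it picked up the new limit
    $SPLUNK_HOME/bin/splunk restart
    grep "ulimit" $SPLUNK_HOME/var/log/splunkd.log | tail -1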

jrodman
Splunk Employee

You probably should work with support and/or investigate which files are open by looking in /proc or using lsof. We might be a bit too aggressive in how many files we open in the new tailing code, but that's just a wild guess.
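For example, either of these will show how many files splunkd actually has open (assuming a single splunkd process):

    # list files held open by splunkd and count them
    lsof -p $(pgrep -xo splunkd) | wc -l

    # or count file descriptors directly from /proc
    ls /proc/$(pgrep -xo splunkd)/fd | wc -l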
