Getting Data In

Too many open files on forwarders

oreoshake
Communicator

I'm starting to get a lot of these errors on my forwarders. Any suggestions? Pushing /etc/security/limits.conf doesn't sound ideal.

I'm running heavy forwarders as root.


netwrkr
Communicator

limits.conf shouldn't come into play unless you explicitly have 'nofile' defined.
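For reference, "having nofile defined" means an entry along these lines in /etc/security/limits.conf (the values are purely illustrative):

# example only - adjust the domain and value for your environment
root soft nofile 20000
root hard nofile 20000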

This is actually a pretty typical issue on the systems I've seen, and quite easily fixed.

The fix is 'ulimit'. I typically double the number of open files to start.

On RHEL/Fedora/CentOS, edit /etc/profile (vi /etc/profile) and set the open-file limit to 20,000:

ulimit -n 20000

I then 'source /etc/profile' so my current shell applies the new value, and verify it with 'ulimit -n'.
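Put together, the whole sequence looks roughly like this (20000 is just my doubled starting value, not a magic number):

echo "ulimit -n 20000" >> /etc/profile    # or add the line by hand with vi
source /etc/profile                       # apply it to the current shell
ulimit -n                                 # should now print 20000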

Then restart splunk.

Verify Splunk applied the new setting by viewing splunk/var/log/splunkd.log and looking for this line:

"INFO ulimit - Limit: open files: 20000 files"
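From the command line, something like this will confirm it (assuming SPLUNK_HOME points at your Splunk install directory):

grep 'open files' "$SPLUNK_HOME/var/log/splunkd.log"    # assumes SPLUNK_HOME is set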

jrodman
Splunk Employee

You probably should work with support and/or investigate which files are open by looking in /proc, or using lsof. We might be a bit too aggressive in how many files we open in the new tailing code, but that's just a wild guess.
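For example, something along these lines will show how many files the forwarder has open and which ones (it assumes the process is named splunkd):

SPLUNKD_PID=$(pgrep -o splunkd)       # assumes the process name is splunkd
ls /proc/$SPLUNKD_PID/fd | wc -l      # count of open file descriptors
lsof -p $SPLUNKD_PID                  # list the files themselves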
