Getting Data In

Too many open files on forwarders

oreoshake
Communicator

I'm starting to get a lot of these errors on my forwarders. Any suggestions? Pushing /etc/security/limits.conf doesn't sound ideal.

I'm running heavy forwarders as root.

netwrkr
Communicator

limits.conf shouldn't come into play unless you explicitly have 'nofile' defined.
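
For reference, an explicit 'nofile' entry in /etc/security/limits.conf looks like the sketch below (the 'splunk' user name is just an example; since you run the forwarders as root, these PAM session limits may not apply to the daemon anyway):

    # /etc/security/limits.conf -- raise the open-file limit for the splunk user
    splunk  soft  nofile  20000
    splunk  hard  nofile  20000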

This is actually a pretty typical issue on the systems I've seen, and quite easily fixed.

'ulimit'

I typically double the number of open files to start.

On RHEL/Fedora/CentOS, edit /etc/profile:

'vi /etc/profile'

Set the number of open files to 20,000:

'ulimit -n 20000'

I then 'source' /etc/profile so my current shell picks up the new value:

'source /etc/profile'

Verify with 'ulimit -n'.
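
Putting those steps together, a minimal sketch (20000 is just my doubled starting point, so size it to your own load; note /etc/profile is only read by login shells):

    # Append the new limit to /etc/profile so future login shells get it
    echo "ulimit -n 20000" >> /etc/profile

    # Apply it to the current shell
    source /etc/profile

    # Confirm the soft limit took effect; this should print 20000
    ulimit -n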

Then restart Splunk.

Verify Splunk applied the new setting by viewing $SPLUNK_HOME/var/log/splunkd.log.

Look for this line: "INFO ulimit - Limit: open files: 20000 files"
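
If you'd rather not page through the log, a quick grep does it (assuming $SPLUNK_HOME points at your install, e.g. /opt/splunk):

    # Show the ulimit values splunkd recorded at its last startup
    grep "ulimit" $SPLUNK_HOME/var/log/splunkd.log | tail -5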

jrodman
Splunk Employee

You probably should work with support and/or investigate which files are open by looking in /proc or using lsof. We might be a bit too aggressive in how many files we open in the new tailing code, but that's just a wild guess.
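
For example, something along these lines (pgrep -o grabs the oldest splunkd process; adjust if you run more than one instance on the box):

    # Count file descriptors held by the main splunkd process via /proc
    ls /proc/$(pgrep -o splunkd)/fd | wc -l

    # Or list the open files themselves to see what they actually are
    lsof -p $(pgrep -o splunkd)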
