Getting Data In

splunk forwarder maximum monitored file limit?

JasonCzerak
Explorer

Is there a maximum file count a single forwarder can monitor? I have some Oracle applications that generate tens of thousands of new files daily, of various sizes. I have a dedicated host that has these shared mounts mounted and just scans for new data. Every now and again it crashes the box. I can't seem to make anything out on the console.

Am I running into some sort of file limit?
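
For context, a monitor input for this kind of setup typically looks something like the sketch below in inputs.conf on the forwarder (the path and values are placeholders, not my actual config; ignoreOlderThan and whitelist are just ways to keep the set of tracked files smaller):

[monitor:///mnt/oracle_app/logs]
disabled = false
# skip files that have not been modified recently, so the tailer tracks fewer files
ignoreOlderThan = 7d
# only pick up the file names of interest (regex against the full path)
whitelist = \.log$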


dart
Splunk Employee

Can you run a local forwarder on the host that generates the data, so we can determine whether the problem is Splunk itself or the shared mounts?


dart
Splunk Employee

The Splunk on Splunk app will give you an idea of what's causing the queue blockage.
It's possible for a single Universal Forwarder to saturate an indexer if it has sufficient event throughput.
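
Assuming the forwarder's _internal logs are reaching your indexers (they are by default on a Universal Forwarder), a quick way to see which queues are blocking is a search along these lines against metrics.log; the fields used here are the standard queue metrics fields:

index=_internal source=*metrics.log* group=queue blocked=true | stats count by host, name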


JasonCzerak
Explorer

09-04-2012 00:10:36.644 -0500 INFO TailingProcessor - Could not send data to output queue (parsingQueue), retrying...
09-04-2012 00:10:37.836 -0500 INFO TailingProcessor - ...continuing.
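
Those messages mean the tailing processor is waiting on a downstream queue. One thing I might try, purely as a sketch, is raising the parsing queue size in server.conf on the forwarder (the 6MB value is just an example, not a recommendation; if the indexer itself is the bottleneck this only buys time):

[queue=parsingQueue]
# default queue size is much smaller; larger values use more memory on the forwarder
maxSize = 6MB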


JasonCzerak
Explorer

For one of the data sources I do have a local forwarder, and it works just fine with about 100,000 files. It's when I added several other data sources that it broke down.

I do have these errors in splunkd.log that I've just noticed:
09-04-2012 03:06:06.730 -0500 INFO TailingProcessor - File descriptor cache is full (100), trimming...
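
That message refers to the tailing processor's file descriptor cache, which defaults to 100 entries (the number in parentheses). One option is raising it in limits.conf on the forwarder, along these lines (256 is just an example value; higher values keep more file descriptors open on the host):

[inputproc]
# size of the tailing processor's file descriptor cache, default 100
max_fd = 256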
