Getting Data In

How to prevent directory bombs on forwarders?

twinspop
Influencer

Spent all day yesterday trying to figure out why a client's logs weren't indexing. Most of the time I had no access to the server in question, so I was simply troubleshooting from internal logs, configs, and the sporadic logs that would show up briefly after a restart.

Finally, when I was just about to throw in the towel, I started poking around the directories above the target files. The monitor stanza had a wildcard at that point in the path, so even though most of the other directories didn't match further down the path, Splunk still had to scan them. Several of them contained 100k+ files, and Splunk got stuck trying to read those directories. Even a plain ls | wc -l took over 10 minutes on a few of them.

I can find oversized directories with something like this and send the results to Splunk for alerting:

find /path -type d -size +100k

(The -type d test comes first so find doesn't evaluate the size test on every regular file; -size on a directory checks the size of the directory entry itself, which grows with the number of entries it holds.) Adjust the size threshold as needed. Is there a better way to avoid these landmines?
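Directory-entry size is filesystem-dependent (and on some filesystems it never shrinks), so a sketch that counts entries directly may be more reliable. This is just one possible approach, assuming bash and a find that supports -print0/-mindepth; the path and threshold in the example are placeholders:

```shell
# List directories under $1 whose direct (non-recursive) entry count
# exceeds $2, printed as "<dir> <count>".
big_dirs() {
  find "$1" -type d -print0 | while IFS= read -r -d '' dir; do
    # Count only the immediate entries of this directory.
    count=$(find "$dir" -mindepth 1 -maxdepth 1 | wc -l | tr -d ' ')
    if [ "$count" -gt "$2" ]; then
      printf '%s %s\n' "$dir" "$count"
    fi
  done
}

# Example (placeholder path and threshold):
# big_dirs /var/log 100000
```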

Thanks,
Jon


yannK
Splunk Employee

It's hard to avoid scanning every file along the potential path when the monitor stanza contains a wildcard.

To avoid indexing unnecessary files in subfolders, you could disable recursive monitoring of subdirectories:

recursive = true|false

See http://docs.splunk.com/Documentation/Splunk/7.0.0/Data/Monitorfilesanddirectorieswithinputs.conf
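For example, in inputs.conf (the monitor path and whitelist pattern here are illustrative, not from your setup):

```
[monitor:///opt/app/logs]
recursive = false
whitelist = \.log$
```

With recursive = false the forwarder reads matching files in the monitored directory itself but does not descend into subdirectories, so a huge subdirectory tree underneath no longer gets scanned.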
