Blacklisting directories without read permission

_smp_
Builder

Hi. I have configured a 6.5.3 Linux Universal Forwarder with an inputs.conf like this:

[monitor:///www/*/logs/access_log*]
disabled = 0
index = web
sourcetype = access_combined
crcSalt = <SOURCE>
blacklist = \.gz$|lost\+found

I am trying to blacklist a directory named '/www/lost+found' because the splunk user does not have read permission to it. But the blacklist regex isn't working, because I am still seeing a WARN FilesystemChangeWatcher - error reading directory "/www/lost+found": Permission denied error in the _internal log. The .gz files do seem to be ignored as I would expect. Is this an issue with the regex? Or is this more of an order-of-operations situation, where the forwarder needs to read the directory before it processes the blacklist?
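As a quick sanity check on the pattern itself, here is a rough sketch using Python's re module (close to, but not identical to, the PCRE syntax Splunk uses for blacklist); the sample paths are made up for illustration:

import re

# blacklist value from inputs.conf; Splunk applies it as a regex
# against the full path of each candidate file
blacklist = re.compile(r"\.gz$|lost\+found")

# hypothetical sample paths, just to sanity-check the pattern
paths = [
    "/www/site1/logs/access_log",        # expect: monitored
    "/www/site1/logs/access_log.2.gz",   # expect: blacklisted (.gz)
    "/www/lost+found",                   # expect: blacklisted (lost+found)
    "/www/lost+found/somefile",          # expect: blacklisted (lost+found)
]

for path in paths:
    status = "blacklisted" if blacklist.search(path) else "monitored"
    print(f"{path}: {status}")

Both the bare directory path and anything beneath it match the pattern here, so the regex itself looks right to me.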

woodcock
Esteemed Legend

Try this:

 blacklist = \.gz$|(lost\+found)

_smp_
Builder

Unfortunately no, that didn't work either.

gcusello
SplunkTrust
SplunkTrust

Hi scottprigge,
try using blacklist = lost\+found and then restart Splunk on the forwarder.
Bye.
Giuseppe

_smp_
Builder

Sorry, maybe I misunderstood something, but I already have that exact blacklist regex in the stanza in my original post. The difference is that I also need to exclude files ending with a .gz extension, so my regex looks like \.gz$|lost\+found

gcusello
SplunkTrust
SplunkTrust

Sorry, I misunderstood.
Try with:

blacklist = \.gz$|lost\+found.

Bye.
Giuseppe

_smp_
Builder

No, that doesn't seem to have made any difference.
