Getting Data In

Does changing log permissions require a forwarder restart?

Ultra Champion

We lost read permission on the logs on numerous servers. When the permissions were restored, it appeared that a forwarder restart was needed. Is this the expected behavior?


Revered Legend

If the files/directories are not accessible, Splunk drops them from its monitoring list, and they will not be monitored again until the forwarder restarts (at which point it re-evaluates the monitoring list; the same behavior applies to the ignoreOlderThan attribute). So yes, it's expected.
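For reference, ignoreOlderThan is set per monitor stanza in inputs.conf. A minimal sketch (the path and value here are examples, not from the thread):

```ini
# Hypothetical monitor stanza -- path and threshold are examples.
[monitor:///var/log/app]
# Files whose modification time is older than this are dropped from the
# monitoring list when the list is evaluated -- much like unreadable
# files, they stay dropped until the forwarder re-evaluates its inputs.
ignoreOlderThan = 7d
```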



Ultra Champion

OK, is this a feature or a bug? It seems to me that if permissions change, the forwarder should pick up these changes at some point without being bounced.


Influencer

It's neither a feature nor a bug, it's just how it works. You can submit a P4 enhancement request if you think there is room for improvement.


Ultra Champion

Fair enough. To me it's obviously a bug. The problem is how an admin can detect that permissions were changed. We have constant issues with this, and only after major commotions, like today's, do we fix these issues by bouncing. How can I, as an admin, detect something like that?


Ultra Champion

Is there a way to detect any of these underlying OS changes in the Splunk logs?


Revered Legend

Things like access denied should be logged in the splunkd log of the forwarder.
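Since those failures land in splunkd.log, a simple check can be scripted. A minimal sketch, assuming the message text contains "Insufficient permissions" -- verify the exact wording in your own forwarder's splunkd.log, and note the demo log lines below are made up:

```shell
# Sketch: count permission errors in a forwarder's splunkd.log.
# The default path and the message text are assumptions -- check your
# own forwarder's log for the exact wording.
LOG="${SPLUNK_HOME:-/opt/splunkforwarder}/var/log/splunk/splunkd.log"

# Demo input so the sketch is self-contained; drop this and point LOG
# at the real file on a forwarder.
LOG=$(mktemp)
cat > "$LOG" <<'EOF'
WARN  TailReader - Insufficient permissions to read file='/var/log/app/app.log'.
INFO  TailingProcessor - Adding watch on path: /var/log/app.
EOF

COUNT=$(grep -c 'Insufficient permissions' "$LOG")
echo "permission errors: $COUNT"
```

A cron job running something like this (or an alert on the forwarder's own internal logs in Splunk) would surface the problem without waiting for missing data to be noticed.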

Ultra Champion

That's great - so we might "just" need an alert based on this logging.


Influencer

I agree, it's a pain when you have hundreds if not thousands of forwarders and you're managing and requesting permissions on the files for the ID the Splunk forwarder runs as. We were in the same situation and have come a long way since we convinced our management to let our Splunk forwarders run as root, which made things a lot easier.

Ultra Champion

Oh man - I don't like this solution at all ;-) It just illustrates what a software limitation/bug forces us to do.


Ultra Champion

Is there maybe anything like http://<host>:8000/en-US/debug/refresh for the forwarder?


Revered Legend

You may try this CLI command:

$SPLUNK_HOME/bin/splunk _internal call /configs/conf-inputs/_reload
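Wrapping that idea into an admin check: test readability of the monitored files first, and only reload (or bounce) when something was actually broken. A hedged sketch -- the paths are examples, and the reload command is the one quoted above:

```shell
# Sketch: flag monitored files the forwarder's user cannot read.
# Run this as the user the forwarder runs as; paths are examples.
DIR=$(mktemp -d)
touch "$DIR/readable.log"

UNREADABLE=0
for f in "$DIR/readable.log" "$DIR/missing.log"; do
    if [ ! -r "$f" ]; then
        echo "not readable: $f"
        UNREADABLE=$((UNREADABLE + 1))
    fi
done

# If anything was unreadable and has since been fixed, ask the forwarder
# to re-evaluate its inputs instead of doing a full restart:
#   $SPLUNK_HOME/bin/splunk _internal call /configs/conf-inputs/_reload
echo "unreadable count: $UNREADABLE"
```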

Ultra Champion

Interesting
