Monitoring Splunk

In Splunk, is there a way to find out whether any changes were made to a config file stanza on a UF?

Hemnaath
Motivator

Hi All,
Scenario

In an enterprise organisation, thousands of Splunk UF agents are installed on remote machines, and all of the UF agents communicate with a deployment server. An end user accidentally changed the config files, saved them, and restarted the agent, and no issues were detected after the change.

Question:

In Splunk, is there a way to find out whether an end user changed a config file stanza on a UF when the change produced no errors? In that case, where and how can we find out which file, and on which UF machine, was changed by the end user?

1 Solution

nickhills
Ultra Champion

If a change is made inside an application on a UF, and that application is managed by the deployment server, the installed apps are compared with the DS copies when the UF restarts and periodically thereafter.

If any changes have been made, the app will be redeployed, erasing the local changes.

For this reason, you should ensure that all apps (not just some of them) are managed by the DS.
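
As a minimal sketch of what "managed by the DS" means in practice (the server class and app names below are placeholders), every app on the UF should appear under a server class in serverclass.conf on the deployment server, for example:

[serverClass:all_uf]
whitelist.0 = *

[serverClass:all_uf:app:org_uf_outputs]
restartSplunkd = true

[serverClass:all_uf:app:org_uf_inputs]
restartSplunkd = true

Any app sitting in etc/apps on the UF that is not deployed through a server class like this is never reconciled, so local edits to it will simply persist.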

With that said, changes in etc/system are not managed by the DS, so it is possible for a user to "break" the link to the DS.
You should make sure users don't have permission to modify files in etc/system (which is the default on Linux and Windows).
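
A quick way to sanity-check that on a Linux UF (the install path is an assumption, adjust to your environment) is to confirm the files are owned by the account running the forwarder and carry no group/other write bits:

ls -ld /opt/splunkforwarder/etc/system/local
ls -l /opt/splunkforwarder/etc/system/local/*.conf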

Admin priv users who muck about and break a UF should be reprimanded and/or shot. 🙂

You can also push an app which includes deploymentclient.conf and any other important settings; then changes need to be made in both system and apps to break it.
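
For example, a small DS-managed app (org_dc_base is a made-up name here, and ds.example.com a placeholder) could carry a deploymentclient.conf like this, so the phone-home settings are reasserted every time the DS reconciles the apps:

[deployment-client]
phoneHomeIntervalInSecs = 60

[target-broker:deploymentServer]
targetUri = ds.example.com:8089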

If my comment helps, please give it a thumbs up!


nickhills
Ultra Champion

If you want to be notified when there are modifications, there are two options I can think of.

Option 1
Add a monitor on etc/system and etc/apps and index all .conf files.
Run a periodic search to compare for changes (a rough sketch follows the pros and cons below).

Pros:
You can see what has changed

Cons:
Lots of duplicate config, and detecting changes like this could be challenging.
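
A rough sketch of option 1, assuming a dedicated index (uf_config, which you would need to create) and the default Linux install path. An inputs.conf pushed to the UFs:

[monitor:///opt/splunkforwarder/etc/system]
whitelist = \.conf$
index = uf_config
sourcetype = uf_conf

[monitor:///opt/splunkforwarder/etc/apps]
whitelist = \.conf$
index = uf_config
sourcetype = uf_conf

Then a scheduled search along these lines lists the hosts and files that produced new config events in the last hour:

index=uf_config sourcetype=uf_conf earliest=-1h
| stats latest(_time) AS last_change BY host, source
| convert ctime(last_change)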

Option 2
Use some kind of file integrity monitoring (or roll your own): md5 all the files in etc/system and etc/apps and dump the hashes to /tmp/allmd5s.
Then md5 /tmp/allmd5s itself and index that hash.

Compare the current hash with the previous one and alert when they differ (a rough sketch follows the pros and cons below).

Pros:
Less duplicate config / low event byte count

Cons:
Can only tell that something changed, not what
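
A rough sketch of option 2 as a scripted input on the UF (the paths, the uf_config index, and the uf_config_md5 sourcetype are all assumptions; md5sum is the Linux tool, macOS uses md5):

#!/bin/sh
# Hash every .conf file under etc/system and etc/apps, then hash the list itself
# and emit one summary event for the UF to index.
SPLUNK_HOME=${SPLUNK_HOME:-/opt/splunkforwarder}
find "$SPLUNK_HOME/etc/system" "$SPLUNK_HOME/etc/apps" -type f -name '*.conf' \
  -exec md5sum {} + | sort > /tmp/allmd5s
echo "$(date -u +%Y-%m-%dT%H:%M:%SZ) host=$(hostname) config_md5=$(md5sum /tmp/allmd5s | awk '{print $1}')"

An alert along these lines then fires whenever a host has reported more than one distinct hash in the search window:

index=uf_config sourcetype=uf_config_md5 earliest=-2h
| stats dc(config_md5) AS distinct_hashes BY host
| where distinct_hashes > 1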

In both cases, expect alerts when you push an app update, and of course this only works if the UF is still sending data.

If my comment helps, please give it a thumbs up!
