My linux_audit logs increased after updating apps, which pushed the license manager over its limit. Does anyone know a fix for this? I have looked for the stanzas on the back end but cannot figure out where these logs are coming from.
This is not strictly a Splunk question.
If your systems started producing more audit events, something must have changed: either the audit rules defined on your systems changed, or the systems' behaviour changed so they report more events. It's something you need to resolve with your Linux admins. You could compare old data with new data to see what changed - whether there are more messages of particular types, or whether new processes started getting "caught" by audit.
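For example, a quick comparison search along these lines can show which message types account for the growth; the index name, sourcetype, and the extracted "type" field are assumptions here, so substitute whatever your environment actually uses:

index=os sourcetype=linux_audit earliest=-30d@d
| timechart span=1d count by type limit=20 useother=f

Running the same search split by host instead (count by host) will show whether the increase is concentrated on a few servers.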
So I did some research into when the uptick happened. It started last Monday, before I started upgrading Splunk. I blacklisted the hosts that were generating the large volume of audit logs and reached out to the department that owns those hosts. It looks like it wasn't an app, but rather servers that were added or are now sending more data because of a change. I will find out more once the department responds. Until then, I will keep them blacklisted so that we stay under our license limit.
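For reference, one common way to keep specific hosts from counting against the license while this gets sorted out is a nullQueue filter on the indexers (or heavy forwarders); the host pattern below is just a placeholder, and note this discards everything from the matching hosts before it is indexed:

props.conf
[host::noisy-audit-host*]
TRANSFORMS-null = setnull_audit_host

transforms.conf
[setnull_audit_host]
REGEX = .
DEST_KEY = queue
FORMAT = nullQueue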
Go to one of the Linux servers that is reporting audit logs and run btool on the CLI.
splunk btool --debug inputs list | grep audit
The output will include the name of the inputs.conf file where the input is defined. Edit that file (or, better, its counterpart in the app's local directory) to disable the input.
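For example, if the input turns out to be the standard auditd log monitor (the path and stanza name below are just the common default - use whatever stanza btool actually reports), disabling it in the app's local inputs.conf on the forwarder would look like this:

[monitor:///var/log/audit/audit.log]
disabled = true

Restart the forwarder afterwards so the change takes effect.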