We would like to consume the error logs from our various SQL databases. Our SQL team does not configure the databases to log to a common directory, and there can be multiple logs on multiple drives on a server. Instead of specifying every possible drive or using resource-intensive recursive inputs, our SQL team tried creating symlinks for us.
They could create a symlink to each log file and store all of those symlinks in one common directory. Then we could just ingest whatever is in that directory.
What I'm noticing is that on a Splunk restart, the errorlog files are read fine, but subsequent updates to those log files are not picked up. I'm wondering whether this is supposed to work the way I expect: as new data is written to the target of a symlink, Splunk should notice that the target file changed and consume the new log lines. If so, am I missing a setting somewhere? If it's not supposed to work, or we can't get it to work, no big deal; I know I have other options, such as specifying the various paths directly in inputs.conf.
These servers are a mix of Windows Server 2003, 2008, and 2012, running Splunk forwarder version 6.2.1. The symlinks are created under the SQL_ERRORLOGS directory on the root of C:. The monitoring stanza is below. Any insight would be appreciated.
[monitor://C:\SQL_ERRORLOGS]
index = upmc_sql
sourcetype = sql:error
disabled = false
followSymlink = true
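For reference, the fallback mentioned above (specifying the paths in inputs.conf instead of relying on symlinks) could look like the sketch below. The drive letter and directory layout are illustrative assumptions, not taken from the actual servers; `...` is Splunk's recursive-directory wildcard and `*` matches within a single path segment.

```ini
# Hypothetical example: monitor ERRORLOG files directly, assuming a
# typical SQL Server install layout. Adjust drive letters and paths to
# match each server; one stanza per drive keeps the recursion bounded.
[monitor://D:\Program Files\Microsoft SQL Server\...\Log\ERRORLOG*]
index = upmc_sql
sourcetype = sql:error
disabled = false
```

Scoping the wildcard to the `Log` directory and the `ERRORLOG*` filename pattern limits how much of the filesystem the forwarder has to scan, which addresses the resource concern with fully recursive inputs.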
I am trying to get dual feeds from a Windows box to two different Splunk indexes. Could you please let me know if you have any approach to achieve this other than symbolic links?
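If "two different Splunk indexes" means two separate Splunk deployments, one common approach is to clone the forwarder's output: a forwarder sends a copy of its data to every target group listed in `defaultGroup`. The group names and server addresses below are hypothetical placeholders, so this is a sketch rather than a drop-in config. (Duplicating events into a second index within a single deployment is a different problem, typically handled with index-time transforms rather than outputs.conf.)

```ini
# Hedged sketch of outputs.conf on the Windows forwarder. Listing two
# target groups in defaultGroup makes the forwarder clone its feed to
# both destinations. Hostnames and ports are illustrative assumptions.
[tcpout]
defaultGroup = primary_indexers, secondary_indexers

[tcpout:primary_indexers]
server = splunk-a.example.com:9997

[tcpout:secondary_indexers]
server = splunk-b.example.com:9997
```

The index each copy lands in is then controlled by the inputs.conf stanza (or by configuration on each receiving deployment), so the same events can be stored under different index names on each side.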
I have the exact same problem on the same forwarder version. New files are not seen until the forwarder is restarted. The destination directory is a symlink to another local drive on the system, and we already use the crcSalt parameter.