I'm running on a system with lower specs than it should have, particularly in the RAM department (which I plan to fix). In the meantime, is there any benefit to reducing the size of a log file that is currently allowed to grow to 2 GB before being rolled over? Thanks!
Not really.
However, if you are asking Splunk to monitor a directory, you can generally improve performance by cleaning out old, dead files from that directory. When you tell Splunk to "monitor" a directory, it has to keep re-checking every file to see whether it has changed. I've seen folks point Splunk at a directory tree with over 15K files in it - a real waste of resources when only a couple hundred files are actually being updated.
This is probably not an issue on most Splunk indexers, but it comes up fairly often on Splunk forwarders.
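If cleaning out old files isn't practical, a `monitor` stanza in `inputs.conf` can tell Splunk to skip stale files instead of re-checking them on every pass. A minimal sketch - the path here is a placeholder for your own log directory:

```ini
# inputs.conf -- hypothetical path; adjust to your environment
[monitor:///var/log/myapp]
# Skip files whose modification time is older than 30 days,
# so Splunk stops re-scanning old, dead files.
ignoreOlderThan = 30d
```

Note that `ignoreOlderThan` is based on the file's modification time, so a stale file that suddenly gets written to again will be picked back up.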