I have three servers, each with a log file. I plan to install a universal forwarder on each server to push the data in these files to the receiver on the main server. The log files currently gather no more than 5 MB a day, so they aren't growing large enough to roll over and start a new file.

My first thought was to use the batch input type: drop each file into the Splunk directory, index it, and delete it. However, because these logs don't roll over often, I'm worried about indexing duplicate event data. That leaves me leaning toward real-time forwarding on each server, but I'm concerned about the resources each forwarder will consume.

With this in mind, is it better to run the forwarders continuously to avoid duplicate data, or is there another way to get the log files indexed while avoiding duplicate events?
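For reference, here is roughly what the two approaches I'm weighing would look like in `inputs.conf` on each forwarder. The paths, index, and sourcetype names are placeholders, not my actual setup:

```ini
# Option 1: continuous monitoring -- the forwarder tracks its read
# position in each file, so it only forwards new events.
[monitor:///var/log/myapp/app.log]
index = main
sourcetype = app_log

# Option 2: batch input -- files dropped into this directory are
# indexed once and then deleted (move_policy = sinkhole).
[batch:///opt/splunk_drop/*.log]
move_policy = sinkhole
index = main
sourcetype = app_log
```

My duplicate-data worry is with option 2: if I drop a copy of a still-growing log file into the batch directory each day, the events already indexed from yesterday's copy would be indexed again.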