If my forwarder's inputs.conf stanza uses batch instead of monitor, will it delete the file after it is indexed?
Let's say my inputs.conf stanza is this:
[batch:///home/bleung/tutorialdata]
disabled = false
index = tutorial
sourcetype = testing
If the files in that path are deleted and I put a new file there with the same data but different timestamps, will the forwarder still send data to the indexer?
Scenario:
I scp a file to the forwarder server and it is oneshot'ed. The next day I scp a file with the same data but different timestamps. The indexer does not receive any data for that day.
Splunk version 5+
Splunk forwarder version 5+
According to the description of "batch" in inputs.conf.spec, you should set move_policy = sinkhole:
move_policy = sinkhole
* IMPORTANT: This attribute/value pair is required. You *must* include "move_policy = sinkhole" when defining batch inputs.
* This loads the file destructively.
* Do not use the batch input type for files you do not want to consume destructively.
* As long as this is set, Splunk won't keep track of indexed files. Without the "move_policy = sinkhole" setting, it won't load the files destructively and will keep track of them.
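
Applying that to the stanza in the question, the batch input would look like this (same path, index, and sourcetype as in the question; only move_policy is added). Note that the files will be deleted after indexing:

[batch:///home/bleung/tutorialdata]
move_policy = sinkhole
disabled = false
index = tutorial
sourcetype = testing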