Getting Data In

inputs.conf using batch in stanza

ben_leung
Builder

My understanding is that if my forwarder's inputs.conf stanza uses batch instead of monitor, Splunk will delete each file after it is indexed.

Let's say my inputs.conf stanza is this:

[batch:///home/bleung/tutorialdata]
disabled = false
index = tutorial
sourcetype = testing

If the files in that path are deleted and I put a new file there with the same data but different timestamps, will the forwarder still send data to the indexer?

Scenario:
I scp a file to the forwarder server and it is oneshot'ed. The next day I scp a file with the same data but different timestamps. The indexer does not receive any data for that day.

Splunk version 5+
Splunk forwarder version 5+

1 Solution

dmlee
Communicator

According to the description of "batch" in inputs.conf.spec, you should set move_policy = sinkhole.

move_policy = sinkhole
* IMPORTANT: This attribute/value pair is required. You *must* include "move_policy = sinkhole" when defining batch inputs.
* This loads the file destructively.
* Do not use the batch input type for files you do not want to consume destructively.
* As long as this is set, Splunk won't keep track of indexed files. Without the "move_policy = sinkhole" setting, it won't load the files destructively and will keep track of them.
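For reference, here is the stanza from the question with the required setting added (a sketch based on the spec excerpt above, assuming the files in that directory really should be consumed destructively):

[batch:///home/bleung/tutorialdata]
move_policy = sinkhole
disabled = false
index = tutorial
sourcetype = testing

With move_policy = sinkhole set, Splunk deletes each file after reading it and does not track indexed files, so a new file dropped into the directory later should be picked up and indexed.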
