Files picked up by my batch-configured input are getting deleted before they can be indexed. I tried setting time_before_close in inputs.conf, but Splunk complains that the configuration is invalid — I'm assuming because I'm using it with a batch input rather than a monitor input.
Is there any other way to delay deletion of the file? It seems Splunk is unable to consume it before the sinkhole move policy kicks in.
[batch:///data/*.csv]
index = main
sourcetype = csv
move_policy = sinkhole
time_before_close = 300
On restart:
Checking conf files for problems...
Invalid key in stanza [batch:///data/*.csv] in /opt/splunk/etc/system/local/inputs.conf, line 5: time_before_close (value: 300)
Your indexes and inputs configurations are not internally consistent. For more information, run 'splunk btool check --debug'
The error doesn't necessarily mean the setting has no effect; it means btool doesn't find time_before_close among the documented keys for that stanza type, so it flags it as invalid.
time_before_close does work for a regular monitor input, and it may work for batch as well.
Try it for a while. As a test, write to a file, pause, then write to it again and see what happens — does Splunk wait before sinkholing it?
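Another way to avoid the race entirely, regardless of whether time_before_close is honored: never let the batch input see a partially written file. Write the CSV somewhere outside the watched glob, then rename it into place — rename on the same filesystem is atomic, so by the time the file matches [batch:///data/*.csv] it is already complete. A minimal sketch, using a relative ./data directory as a stand-in for /data and hypothetical file names:

```shell
#!/bin/sh
set -eu

DEST="${1:-./data}"          # stand-in for the /data dir in the batch stanza
STAGING="$DEST/.staging"     # hidden dir, not matched by the *.csv glob

mkdir -p "$STAGING"
tmp="$STAGING/report.csv.$$"

# Write the complete file outside the watched pattern first...
printf 'host,status\nweb01,ok\n' > "$tmp"

# ...then rename it into place. mv within one filesystem is an atomic
# rename, so the batch input only ever sees a finished file.
mv "$tmp" "$DEST/report.csv"
```

With this pattern the sinkhole move policy can delete the file immediately after indexing without risk, since Splunk never starts reading a file that is still being written.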
Was there ever an answer to this?