Getting Data In

Configure delay in batch input? time_before_close does not work (v6.0.5)

the_wolverine
Champion

Batch-configured inputs are getting deleted before they can be indexed. I tried configuring time_before_close in inputs.conf, but Splunk complains that the configuration is invalid. I'm assuming that's because I'm using it with a batch input rather than a monitor input.

Is there any other way to delay the deletion of the file? It seems Splunk is unable to consume it before the sinkhole move policy kicks in.

[batch:///data/*.csv]
index=main
sourcetype=csv
move_policy = sinkhole
time_before_close = 300

On restart:

Checking conf files for problems...
    Invalid key in stanza [batch:///data/*.csv] in /opt/splunk/etc/system/local/inputs.conf, line 5: time_before_close  (value:  300)
    Your indexes and inputs configurations are not internally consistent. For more information, run 'splunk btool check --debug'

Genti
Splunk Employee

The error doesn't necessarily mean that it's not working; rather, it means the key isn't a documented setting for that stanza, so the conf check flags it.
time_before_close does work for a regular monitor input, and it may work for batch as well.
Try it for a while. One way to test: write to a file, take a break, then write again, and see what happens.
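
For reference, here's a minimal sketch of that kind of test (not from the original thread; it assumes the [batch:///data/*.csv] stanza above, and the filename and 30-second pause are placeholders):

# Hypothetical test script (names and timings are placeholders): write a CSV
# into the watched /data directory, pause, then append more rows. If the
# second set of rows never gets indexed, the file was sinkholed before the
# time_before_close delay elapsed.
import csv
import time
from pathlib import Path

watch_dir = Path("/data")                  # directory from [batch:///data/*.csv]
test_file = watch_dir / "time_before_close_test.csv"
pause_seconds = 30                         # placeholder; shorter than time_before_close = 300

# First write: header plus a few rows, then stop writing for a while.
with test_file.open("w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["timestamp", "phase"])
    for _ in range(5):
        writer.writerow([time.time(), "first_batch"])

print(f"Wrote first batch to {test_file}; sleeping {pause_seconds}s ...")
time.sleep(pause_seconds)

# Second write: if Splunk already moved the file to the sinkhole, append
# mode simply recreates it, which is itself a useful signal.
with test_file.open("a", newline="") as f:
    writer = csv.writer(f)
    for _ in range(5):
        writer.writerow([time.time(), "second_batch"])

print("Wrote second batch; now search the index for first_batch vs second_batch rows.")

If only the first_batch rows show up in search, the move to the sinkhole happened before the second write; if both show up, the delay (or the indexing) was fast enough.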


jlaigo2
Path Finder

Was there ever an answer to this?
