Getting Data In

inputs.conf -> time_before_close

lpolo
Motivator

Have any of you needed to use time_before_close in inputs.conf? If so, could you share your scenario?
I'm having an issue with a source log where events can be quite large, and as a result some events are not broken correctly.

Thanks,
Lp


sowings
Splunk Employee

I have a log file which is a large XML document composed of various sub-documents written by jobs that take a while to run. Each job writes its data to the file as the output is generated, but the overall XML document isn't closed (appropriate closing tags, etc.) until the whole set of jobs is complete. Sometimes the writing of the log pauses for more than 3 seconds (the default value of time_before_close), so Splunk was consuming that file half-way through.
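For reference, a minimal sketch of the kind of monitor stanza involved. The monitored path and sourcetype name here are illustrative, not from the original post; time_before_close is a real inputs.conf setting whose default is 3 seconds.

```ini
# inputs.conf -- illustrative stanza; path and sourcetype are hypothetical
[monitor:///var/log/jobs/report.xml]
sourcetype = job_xml
# Wait 10 seconds of write inactivity (default is 3) before Splunk
# considers the file closed and finishes consuming it
time_before_close = 10
```

Raising time_before_close gives a slow writer more time to resume before Splunk treats the file as done.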

If you're seeing events broken before they're complete, consider MAX_EVENTS (it defaults to 256 lines, so if your multi-line events show a line count of 257, this could be the issue), or possibly TRUNCATE.
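A sketch of how those two settings might be raised in props.conf. The sourcetype name and the specific values are illustrative; MAX_EVENTS and TRUNCATE are real props.conf settings with defaults of 256 lines and 10000 bytes respectively.

```ini
# props.conf -- illustrative; sourcetype name and values are hypothetical
[job_xml]
# Allow up to 1000 lines per merged multi-line event (default is 256)
MAX_EVENTS = 1000
# Raise the per-line byte limit before truncation (default is 10000)
TRUNCATE = 50000
```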


lpolo
Motivator

Thanks for your comment. It is not my scenario.
