Getting Data In

inputs.conf -> time_before_close

lpolo
Motivator

Have any of you needed to use time_before_close in inputs.conf? If so, could you share your scenario?
I am having an issue with a source log whose events can be quite large; as a result, some events are not broken correctly.

Thanks,
Lp


sowings
Splunk Employee

I have a log file which is a large XML document made up of sub-documents produced by jobs that take a while to run. Each job writes its data to the file as the output is generated, but the whole XML document isn't closed (appropriate closing tags, etc.) until the entire set of jobs is complete. Sometimes the writing of the log pauses for more than 3 seconds (the default value of time_before_close), so Splunk was consuming that file halfway through.
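As a rough illustration only (the path, sourcetype, and value below are placeholders, not from this thread), raising time_before_close for a monitored file looks something like this in inputs.conf:

# Hypothetical monitor stanza; the path and sourcetype are examples only.
[monitor:///var/log/jobs/output.xml]
sourcetype = job_xml
# Wait 60 seconds of write inactivity (instead of the default 3) before
# the file monitor treats the file as closed and finishes reading it.
time_before_close = 60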

If you're seeing events broken before they're complete, consider MAX_EVENTS (it defaults to 256 additional lines, so if you have those multi-line events showing a linecount of 257, this could be the issue), or possibly TRUNCATE.
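For reference, a minimal props.conf sketch of those two settings (the sourcetype name and limits are placeholders, not values from this thread):

# Hypothetical sourcetype; adjust the stanza name and limits to your data.
[job_xml]
# Maximum number of lines allowed in a single multi-line event (default 256).
MAX_EVENTS = 2000
# Maximum number of bytes allowed in a single line before truncation (default 10000).
TRUNCATE = 100000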


lpolo
Motivator

Thanks for your comment, but that is not my scenario.
