Getting Data In

inputs.conf -> time_before_close

lpolo
Motivator

Have any of you needed to use time_before_close in inputs.conf? If so, could you share your scenario?
I am having an issue with a source log where events can be quite large, and as a result some events are not broken correctly.

Thanks,
Lp


sowings
Splunk Employee

I have a log file which is a large XML document made up of various sub-documents, produced by jobs that take a while to run. Each job writes its data to the file as the output is generated, but the whole XML document isn't closed (appropriate closing tags, etc.) until the whole set of jobs is complete. Sometimes the writing of the log pauses for more than 3 seconds (the default value of time_before_close), so Splunk was consuming the file halfway through.
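
In a case like this, the fix is to raise time_before_close in the monitor stanza so Splunk waits longer after the writer goes quiet before closing the file. A minimal sketch, assuming a hypothetical monitor path, sourcetype, and a 30-second window (all illustrative, not from the original post):

    [monitor:///var/log/jobs/output.xml]
    # wait 30 seconds of inactivity before treating the file as closed
    # (illustrative value; the default is 3 seconds)
    time_before_close = 30
    sourcetype = job_xml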

If you're seeing events broken before they're complete, consider MAX_EVENTS (it defaults to 256 additional lines, so if your multi-line events show a linecount of 257, this could be the issue), or possibly TRUNCATE.
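
If the symptom is mis-broken multi-line events rather than a file being read too early, those settings live in props.conf. A hedged sketch, with the sourcetype name and limits chosen only for illustration:

    [job_xml]
    SHOULD_LINEMERGE = true
    # allow more lines per merged event than the 256-line default
    MAX_EVENTS = 2000
    # allow longer lines than the 10000-byte default before truncation
    TRUNCATE = 100000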


lpolo
Motivator

Thanks for your comment, but that is not my scenario.
