Splunk Search

LineBreakingProcessor question

willemjongeneel
Communicator

Hello,

I receive errors like the ones below:
LineBreakingProcessor - Truncating line because limit of 132000 bytes has been exceeded with a line length >= 136321

I solved this before by raising the TRUNCATE value, but I still get these errors. Is there a point where I should stop raising this value? Are there risks in setting this value too high?

Thanks, kind regards,
Willem

1 Solution

MuS
SplunkTrust
SplunkTrust

Hi willemjongeneel,

the only risk is setting TRUNCATE = 0, which disables truncation entirely!
Any explicit limit helps Splunk know, instead of guess, where to truncate, and therefore improves the parsing performance of the Splunk instance.

Hope this helps ...

cheers, MuS


willemjongeneel
Communicator

Yes that helps, thanks!

Kind regards,
Willem


lakshman239
Influencer

Ideally you want to cap it at something like TRUNCATE = 999999, which allows genuinely long lines, instead of setting it to 0 (never truncate).
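For reference, a raised limit like that is configured per sourcetype in props.conf on the parsing tier (indexers or heavy forwarders). A minimal sketch; the sourcetype name here is a placeholder, not from the thread:

```ini
# props.conf on the indexer or heavy forwarder (parsing tier)
# [my_sourcetype] is a placeholder -- use your actual sourcetype
[my_sourcetype]
# Allow lines up to ~1 MB instead of the 10000-byte default.
# Avoid TRUNCATE = 0, which disables truncation entirely.
TRUNCATE = 999999
```

A restart of splunkd (or a reload of the parsing configuration) is needed for the change to take effect.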


MuS
SplunkTrust
SplunkTrust

Seriously, if you have to set this value, you are best to either @#$@#$#$% or point your devs to this: http://dev.splunk.com/view/logging/SP-CAAAFCK

Or use black magic, or *cough* Cribl *cough*, to make the events usable 😉

cheers, MuS
