I have an event being imported with a custom source type. In that source type I have:
NO_BINARY_CHECK=1
CHECK_FOR_HEADER=false
LEARN_SOURCETYPE=false
SHOULD_LINEMERGE=false
However, Splunk is still truncating my log lines and then generating a new event with the rest of the line (which may itself be broken up again), producing incorrect data. Is there a way I can tell Splunk to import the whole log line into one event? A log line can be up to 128k. I am fine with it being truncated in the display, but not in the indexed data. Alternatively, I am fine with any one field being limited to a certain size (such as 4k), but as it stands now, any fields after the really long field are missing.
thanks,
rob
Add the following to what you already have in your props.conf file:
MAX_EVENTS = 10000
TRUNCATE = 0
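Putting this together with the settings from the question, the full sourcetype stanza might look something like this (the stanza name `[my_custom_sourcetype]` is a placeholder for your actual source type name):

```ini
[my_custom_sourcetype]
# Settings from the original question
NO_BINARY_CHECK = 1
CHECK_FOR_HEADER = false
LEARN_SOURCETYPE = false
SHOULD_LINEMERGE = false
# Disable the per-event character limit so long lines are indexed whole
TRUNCATE = 0
# Raise the per-event line limit above the 256-line default
MAX_EVENTS = 10000
```

Note that this needs to go in props.conf on the instance that parses the data (indexer or heavy forwarder), and a restart is required for the change to take effect.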
This will stop the data from being truncated no matter how long a line is, and will break the event into a new event after 10,000 lines. If you have more than 10,000 lines in a single event, increase that number accordingly.
TRUNCATE = 0 only stops Splunk from discarding data once a single event exceeds some number of characters; I don't remember the exact default limit off the top of my head.
MAX_EVENTS will not do that.
Line breaking will still occur according to whatever you have defined in your config, which, from the look of it, starts a new event when it detects a timestamp.
MAX_EVENTS = 10000 will allow a single event to go beyond the Splunk default of 256 lines per event. This is the fix for the problem you described, where the remaining part of a single event 'overflowed' into a new event.
So, for instance, if your single event was 300 lines long, 256 lines would go into one event and the remaining 44 lines would be placed into a new event.
I think just adding TRUNCATE = 0 is what I needed. I don't want to join any separate lines into the same event.