Hello,
I have events coming into Splunk Cloud via HEC with an event size of 2641524 bytes, and I see the sourcetype's truncate limit is set to 10000 by default. Is it recommended to raise the TRUNCATE limit to 2700000?
Appreciate all your help.
Thanks
@richgalloway Hello Sir,
Can you please help me with this request? I have been stuck on this issue for a long time. Please help me out.
Thanks
If the TRUNCATE setting is to blame then you'll see a message to that effect in _internal. If you don't see that message then something else is the cause of the truncation.
index=_internal sourcetype=splunkd component=linebreakingprocessor message="truncating*"
@richgalloway I don't see any such message when I run that search. Let me give you more context, along with my props.conf configuration.
Can you please help me with the line-breaking and truncation issue I am seeing for nested JSON events coming to Splunk via HEC? The event size is close to 25 million bytes, whereas the truncate limit is set to only 10000. I was not allowed to set the truncate limit to 0 due to performance concerns. I want to break this nested event into multiple events, each starting at Source_System.
Example of an event:
{"sourcetype": "abc_json","index":"test", "event":{"severity":"INFO","logger":"org.mule.runtime.core.internal.processor.LoggerMessageProcessor","time":"XXX","thread":"[MuleRuntime].xxx.123: [App name].post:\\schedules:application\\json:app.CPU_INTENSIVE @xxxx","message":{"correlationId":"XXXX","inputPayload":[{"Source_System":"TEST","Created_By":"ESB","Created_Date_UTC":"1900-XX-01T02:59:14.783Z","Last_Updated_By":"ESB","Last_Updated_Date_UTC":"2020-07-25T03:34:31.91Z"},{"Source_System":"TEST2","Created_By":"ESB","Created_Date_UTC":"1900-XX-07T02:59:14.783Z","Last_Updated_By":"ESB","Last_Updated_Date_UTC":"1900-XX-25T03:34:31.91Z"},{"Source_System":"TEST3","Created_By":"ESB","Created_Date_UTC":"2019-08-22T23:27:32.123Z","Last_Updated_By":"ESB","Last_Updated_Date_UTC":"1900-xx-20T01:11:45.35Z"}]}}}
My current props.conf configuration:
ADD_EXTRA_TIME_FIELDS=True
ANNOTATE_PUNCT=true
AUTO_KV_JSON=true
BREAK_ONLY_BEFORE_DATE=null
CHARSET=UTF-8
DEPTH_LIMIT=1000
DETERMINE_TIMESTAMP_DATE_WITH_SYSTEM_TIME=false
LB_CHUNK_BREAKER_TRUNCATE=2000000
LEARN_MODEL=true
LEARN_SOURCETYPE=true
LINE_BREAKER=([,|[]){"Source_System":
LINE_BREAKER_LOOKBEHIND=100
MATCH_LIMIT=100000
MAX_DAYS_AGO=2000
MAX_DAYS_HENCE=2
MAX_DIFF_SECS_AGO=3600
MAX_DIFF_SECS_HENCE=604800
MAX_EVENTS=256
MAX_TIMESTAMP_LOOKAHEAD=128
NO_BINARY_CHECK=true
SEGMENTATION=indexing
SEGMENTATION-all=full
SEGMENTATION-inner=inner
SEGMENTATION-outer=outer
SEGMENTATION-raw=none
SEGMENTATION-standard=standard
SHOULD_LINEMERGE=false
TRUNCATE=10000
category=Custom
detect_trailing_nulls=false
disabled=false
maxDist=100
pulldown_type=true
termFrequencyWeightedDist=false
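In case it helps, this is the kind of minimal stanza I am aiming for. The LINE_BREAKER regex here is only my guess based on the sample event above, and the TRUNCATE value is illustrative, not a verified configuration:

```ini
[abc_json]
SHOULD_LINEMERGE = false
# Break before every {"Source_System": ...} object; the capture group
# (the preceding comma or opening bracket, plus optional whitespace)
# is discarded at the break point.
LINE_BREAKER = ([,\[]\s*)\{"Source_System":
# Raise TRUNCATE above the largest expected per-object event rather
# than disabling it with 0 (value shown is illustrative only).
TRUNCATE = 100000
```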
Am I missing something?
25 million bytes is an order of magnitude larger than the size in the OP and changes the problem. Events that large will be rejected by HEC. See 'maxEventSize' in inputs.conf.spec.
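For anyone finding this later: maxEventSize is set in the [http] stanza of inputs.conf on the instance that receives the HEC traffic (in Splunk Cloud this is typically changed through a support ticket rather than by editing the file directly). A sketch, with an illustrative value:

```ini
[http]
# Maximum size of a single HEC event; larger events are rejected
# at the endpoint. The value shown is illustrative only.
maxEventSize = 25MB
```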