Splunk Enterprise

Why some events with a specific field are not parsed

mah
Builder

hi,

I have events like this:

log=log_name {"timestamp":"2020-10-13T13:44:06.242Z","version":"1","message":"xxx","name":"abcd","level":"INFO","id":"123","env":"dev"}

I have set up a props.conf:

[sourcetype]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)log\=
TRUNCATE = 999999

TRANSFORMS-extractions = indexed_log
TRANSFORMS-remove = remove_log
 
And a transforms.conf:
[indexed_log]
REGEX = ^log\=(.+?)\s
FORMAT = log::$1
WRITE_META = true

[remove_log]
REGEX = ^log\=.+?\s((?:\{|\[).+?(?:\}|\]))$
DEST_KEY = _raw
FORMAT = $1
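
As a quick sanity check outside Splunk, the two patterns can be tried against the sample event with Python's `re` module (Splunk uses PCRE, so this is only an approximation of index-time matching, not the real pipeline):

```python
import re

event = ('log=log_name {"timestamp":"2020-10-13T13:44:06.242Z","version":"1",'
         '"message":"xxx","name":"abcd","level":"INFO","id":"123","env":"dev"}')

# [indexed_log]: capture the value after "log=" up to the first whitespace
indexed = re.match(r'^log=(.+?)\s', event)
print(indexed.group(1))  # log_name

# [remove_log]: keep only the JSON payload, which becomes the new _raw
remove = re.match(r'^log=.+?\s((?:\{|\[).+?(?:\}|\]))$', event)
print(remove.group(1))
```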
 
That worked well until I found that some events are not parsed. The only difference I noticed is a field that contains a lot of characters (more than 6000).
Example of a log that is not parsed:
log=test {"timestamp":"2020-10-13T12:10:57.177Z","version":"1","message":"Error ","name":"1234","level":"ERROR","field_with_characters_above_6000":"abcdef…","env":"dev"}
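
For what it's worth, an event with a 6000+ character field still matches the same pattern in a plain regex engine. This is a synthetic reproduction with a made-up value, not Splunk's actual index-time behavior, but it suggests the pattern itself is not the limiting factor:

```python
import re

# Build a synthetic event with a field value longer than 6000 characters
long_value = "a" * 6500
event = ('log=test {"timestamp":"2020-10-13T12:10:57.177Z","version":"1",'
         '"message":"Error ","name":"1234","level":"ERROR",'
         f'"field_with_characters_above_6000":"{long_value}","env":"dev"}}')

# Same pattern as the [remove_log] transform
m = re.match(r'^log=.+?\s((?:\{|\[).+?(?:\}|\]))$', event)
print(m is not None)  # the regex itself still matches the long event
```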
 
The props.conf and transforms.conf no longer take effect on that kind of event.
 
Can you help me, please?