Example raw data:
{"field1": "value1", "field2": "value2", ..., "string": "1" }
{"field1": "value1", "field2": "value2", ... ,"string":"2"}
{"field1": "value1", "field2": "value2", ..., "string":"3" }
{"field1": "value1", "field2": "value2", ..., "string":"4" }
Splunk merges a few of the raw data lines into a single event, so as a result I get 2 events.
Event 1:
{"field1": "value1", "field2": "value2", ..., "string": "1" }
{"field1": "value1", "field2": "value2", ... ,"string":"2"}
{"field1": "value1", "field2": "value2", ..., "string":"3" }
Event 2:
{ [-]
field1: value1
field2: value2
...
string: 4
}
So, about 80% of events look like Event 1 in the example, but some events are captured as a single row and parsed as the JSON type.
I am using a Splunk Enterprise cluster with splunkforwarder for data delivery, version 6.5.5.
I have tried to set up props.conf on the splunkforwarder (in the app that handles the JSON log files) with different LINE_BREAKER values:
1. (\})
2. \}
3. "(^)\{"
Current props.conf:
[json-logs]
SHOULD_LINEMERGE = false
KV_MODE = json
LINE_BREAKER = (\})
TIME_PREFIX = \"time\": \"
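For context, the input on the forwarder that assigns this sourcetype looks roughly like this; the monitored path below is just a placeholder, not my real path:
[monitor:///var/log/myapp/*.json]
# placeholder path; the real input points at the app's JSON log files
sourcetype = json-logs
disabled = false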
I have the same problem not only with JSON-format logs; it looks like the props.conf line-breaking options do not work at all.
What am I doing wrong?