Getting Data In

Issues with multiline events in props.conf

sunnyb147
Path Finder

Hi Everyone,

I need some help configuring props.conf to break multiline events correctly. I am trying to ingest two types of events: for the first one, either only a part of the event is ingested or the event is broken; the second one (a single line) is ingested without any issues.

I tried the props.conf below, but no luck. I am just a newbie, so I am requesting your help. For BREAK_ONLY_BEFORE I added the regex so that it can match and break both types of events.

[testing]
BREAK_ONLY_BEFORE={(\s+|)"transaction-id"(\s+|):(\s+|)"
SHOULD_LINEMERGE=false
NO_BINARY_CHECK=1
TRUNCATE=0
MAX_EVENTS=1024

{
  "transaction-id" : "steve-123",
  "usecase-id" : "123",
  "timestamp" : "2021-03-07T06:51:27,188+0100",
  "timestamp-out" : "2021-03-07T06:51:27,188+0100",
  "component" : "A",
  "payload" : "{\"error\":\"Internal server error\",\"message\":\"Internal server error\",\"description\":\"The server encountered an unexpected condition that prevented it from fulfilling the request\"}",
  "country-code" : "IN",
  "status" : "error",
  "error-code" : "500",
  "error" : "Internal Server Error",
  "message-size" : 176,
  "logpoint" : "response"
}

{"transaction-id":"steve-456","usecase-id":"456","timestamp":"2021-03-07T06:51:27,188+0100","timestamp-out":"2021-03-07T06:51:27,188+0100","component":"B","payload":"{\"error\":\"Internalservererror\",\"message\":\"Internalservererror\",\"description\":\"The server encountered an unexpected condition that prevented it from fulfilling the request\"}","country-code":"IN","status":"error","error-code":"500","error":"Internal Server Error","message-size":176,"logpoint":"response"}

Thanks,

Sunny


scelikok
SplunkTrust

Hi @sunnyb147,

You can use the built-in _json sourcetype; it will ingest these events correctly:

[_json]
CHARSET=UTF-8
INDEXED_EXTRACTIONS=json
KV_MODE=none
SHOULD_LINEMERGE=true
category=Structured
description=JavaScript Object Notation format. For more information, visit http://json.org/
disabled=false
pulldown_type=true
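
To apply it, set the sourcetype on your input, for example in inputs.conf (the monitor path below is just a placeholder, not your actual path):

[monitor:///var/log/app/transactions.log]
sourcetype = _json
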
If this reply helps you, an upvote and "Accept as Solution" is appreciated.

sunnyb147
Path Finder

@scelikok Thanks for the suggestion, but the ingested event is still broken.

{
  "transaction-id" : "novotel-123",
  "usecase-id" : "123",
  "timestamp" : "2021-03-22T06:51:27,188+0100",
  "timestamp-out" : "2021-03-22T06:51:27,188+0100",
  "component" : "A",
  "payload" : "{\"error\":\"Internal server error\",\"message\":\"Internal server error\",\"description\":\"The server encountered an unexpected condition that prevented it from fulfilling the request\"}",
  "country-code" : "IN",
  "status" : "error",
  "error-code" : "500",
  "error" : "Internal Server Error",
  "caller-id" : "",
  "message-size" : 176,
  "logpoint" : "response-out"
}

I appended the above event to the log file, but in the index I received a broken one:

{
  "transaction-id" : "novotel-123",
  "usecase-id" : "123",
  "timestamp" : "2021-03-22T06:51:27,188+0100",
  "timestamp-out" : "2021-03-22T06:51:27,188+0100",
  "component" : "A",

to4kawa
Ultra Champion

[testing]
LINE_BREAKER=([\r\n]+){\"transaction-id
SHOULD_LINEMERGE=false
NO_BINARY_CHECK=1
TRUNCATE=0
MAX_EVENTS=1024

LINE_BREAKER is better here; BREAK_ONLY_BEFORE is a line-merging setting that only applies when SHOULD_LINEMERGE=true, so with SHOULD_LINEMERGE=false it was ignored.
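
The regex above only matches the single-line events. An untested variant that should cover both the pretty-printed and the single-line format, assuming every event starts with { at the beginning of a line:

[testing]
# Break before any "{" that starts a line; the newlines in the first
# capture group are discarded and the "{" stays with the next event.
LINE_BREAKER=([\r\n]+)\{
SHOULD_LINEMERGE=false
NO_BINARY_CHECK=1
TRUNCATE=0
MAX_EVENTS=1024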


sunnyb147
Path Finder

@to4kawa Thanks for your response, but unfortunately it's still the same; the ingested event is broken.

I tried changing the MAX_EVENTS limit too, but that is not helping either 😕

[screenshot attachment]
