Getting Data In

Lines break when indexing JSON data using props.conf attributes

Path Finder

Hi team,

I am not able to index the JSON data below in Splunk 6.2 with the props.conf attributes below. It breaks at every line and treats each line as a separate event with no field extraction. When I add the same file from the search head using the Add Data option and select _json as the source type, the fields are extracted correctly, but it does not work when I set the same attributes under a customized sourcetype name in props.conf. Please suggest.

Data:
{
  "messageId" : "VIPJAPAN40001JCOMPLETE2017220818015450",
  "messageType" : "EVENT",
  "sendingAppId" : "P1",
  "sendTimeStamp" : "2017-22-08T17:09:27.526-05:00",
}

Props:
[APP1_INOUT_App2]
INDEXED_EXTRACTIONS = json
KV_MODE = none
NO_BINARY_CHECK = true
SHOULD_LINEMERGE = false
category = Structured
disabled = false
DATETIME_CONFIG = CURRENT
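
One thing worth checking: INDEXED_EXTRACTIONS is applied where the file is first consumed, so this stanza has to be deployed on the forwarder (or indexer) that monitors the file, not only on the search head. A minimal sketch of the matching inputs.conf on that forwarder, where the monitor path and index are assumptions:

# Hypothetical monitor stanza; the file path and index are assumptions.
[monitor:///var/log/app1/inout.json]
sourcetype = APP1_INOUT_App2
index = main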

New Member

This works perfectly for JSON data, at least for me:

[sourcetype]
INDEXED_EXTRACTIONS = json
JSON_TRIM_BRACES_IN_ARRAY_NAMES = true
CHARSET = AUTO
MAX_DIFF_SECS_AGO = 604800
MAX_EVENTS = 10000
NO_BINARY_CHECK = 1
TRUNCATE = 0
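
Once events are indexed with that sourcetype, a quick sanity check from the search bar (a sketch, assuming the data lands in the default index) could be:

index=main sourcetype=APP1_INOUT_App2
| head 5
| table messageId messageType sendingAppId sendTimeStamp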


Motivator

Try the below config,

[APP1_INOUT_App2]
DATETIME_CONFIG=CURRENT
SHOULD_LINEMERGE=true
NO_BINARY_CHECK=true
CHARSET=UTF-8
INDEXED_EXTRACTIONS=json
KV_MODE=none
pulldown_type=true
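
If the stanza still does not take effect, it may help to confirm which copy of props.conf Splunk is actually reading; btool shows the winning configuration and the file each setting comes from (a standard check, run on the instance that parses the data):

$SPLUNK_HOME/bin/splunk btool props list APP1_INOUT_App2 --debug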

SplunkTrust

@anantdeshpande, can you try the following?

SHOULD_LINEMERGE = true

Also, if you know the event break pattern, such as the start or end of an event, you should use LINE_BREAKER, BREAK_ONLY_BEFORE, MUST_BREAK_AFTER, etc. to identify events correctly. Refer to the documentation: http://docs.splunk.com/Documentation/Splunk/latest/Data/Configureeventlinebreaking
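
For data like the sample above, where every event is a top-level JSON object, one sketch of explicit line breaking (assuming each event starts with { at the beginning of a line) would be:

# Sketch only: break before each line that opens a new JSON object.
# The capture group (the newlines) is discarded; the { stays with the next event.
[APP1_INOUT_App2]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)\{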

Where is the timestamp field in your data? Can you please share one complete event as a sample (or a few)?

____________________________________________
| makeresults | eval message= "Happy Splunking!!!"

Path Finder

I tried with both SHOULD_LINEMERGE = true and false, but got the same results.

I can write BREAK_ONLY_BEFORE = ^{ or (^){, but JSON is a format recognized by Splunk, so I should not have to write line-breaking rules as I would for normal logs.

In my data there are multiple fields with timestamps.
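
If there are several timestamp fields, INDEXED_EXTRACTIONS lets you pin timestamp recognition to one of them with TIMESTAMP_FIELDS; a sketch, assuming sendTimeStamp from the sample is the intended event time and the year-day-month order seen there is real:

# Assumption: sendTimeStamp is the event time; TIME_FORMAT mirrors the
# year-day-month order in the sample and may need adjusting.
TIMESTAMP_FIELDS = sendTimeStamp
TIME_FORMAT = %Y-%d-%mT%H:%M:%S.%3N%:z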
