Getting Data In

How to parse JSON to extract multiple-line events?

sfatnass
Contributor

I have a JSON file that contains many objects, like this:

{
    "id": 1,
    "name": "toto",
    "price": 1.50,
    "tags": ["travel", "red"] }
{
    "id": 2,
    "name": "toto",
    "price": 12,
    "tags": ["home", "green"] }

For this example, I need to extract that as two events. How can I use LINE_BREAKER in props.conf?
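For context, a file like the one above is a stream of concatenated JSON objects rather than a single JSON document, which is why event breaking is needed at all. A quick way to confirm that outside Splunk (a sketch, assuming Python is available) is to parse the objects one at a time with `raw_decode`:

```python
import json

# The two objects from the file, concatenated as they appear on disk.
raw = '''{
    "id": 1,
    "name": "toto",
    "price": 1.50,
    "tags": ["travel", "red"] }
{
    "id": 2,
    "name": "toto",
    "price": 12,
    "tags": ["home", "green"] }'''

# json.loads(raw) would fail with "Extra data" because the file is not
# one JSON document.  raw_decode() parses one object at a time instead.
decoder = json.JSONDecoder()
objs, idx = [], 0
while idx < len(raw):
    obj, end = decoder.raw_decode(raw, idx)
    objs.append(obj)
    idx = end
    while idx < len(raw) and raw[idx].isspace():
        idx += 1  # skip whitespace between objects

print(len(objs))  # -> 2
```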

thx

1 Solution

jkat54
SplunkTrust

This method will index each field in the JSON payload:

[<SOURCETYPE NAME>]
SHOULD_LINEMERGE=true
NO_BINARY_CHECK=true
CHARSET=AUTO
INDEXED_EXTRACTIONS=json
KV_MODE=none
disabled=false
pulldown_type=true

This method would not index the fields, and it comes at a lower performance cost:

[<SOURCETYPE NAME>]
CHARSET=AUTO
SHOULD_LINEMERGE=false
disabled=false
LINE_BREAKER=(^){.*"id":
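To see why this works: Splunk discards whatever the first capture group of LINE_BREAKER matches and starts a new event at the text that follows it. Here the group is `(^)`, an empty match at a line start, so nothing is discarded and each event begins at the `{`. A rough sketch of that behavior in Python (note: `.*` cannot cross the newline between `{` and `"id":` in the pretty-printed sample above, so this sketch widens it to `[^}]*`, an assumption on my part, not the original pattern):

```python
import json
import re

# The two pretty-printed objects from the question, in one stream.
raw = (
    '{\n'
    '    "id": 1,\n'
    '    "name": "toto",\n'
    '    "price": 1.50,\n'
    '    "tags": ["travel", "red"] }\n'
    '{\n'
    '    "id": 2,\n'
    '    "name": "toto",\n'
    '    "price": 12,\n'
    '    "tags": ["home", "green"] }\n'
)

# Splunk breaks the stream where the FIRST capture group matches,
# consuming the group as the delimiter.  (^) is an empty group at a
# line start, so the break lands right before each `{ ... "id":`.
breaker = re.compile(r'(^)\{[^}]*"id":', re.MULTILINE)

cuts = [m.start(1) for m in breaker.finditer(raw)]  # one cut per object
cuts.append(len(raw))
events = [raw[a:b].strip() for a, b in zip(cuts, cuts[1:]) if raw[a:b].strip()]

for e in events:
    print(json.loads(e)["id"])  # -> 1, then 2
```

Each segment between cuts is a complete JSON object, so search-time extraction (KV_MODE) can still parse every event.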


sfatnass
Contributor

This is my current configuration in props.conf:
[json]

[source::.../mysource...]
sourcetype = json
SHOULD_LINEMERGE = false
TRUNCATE=0
NO_BINARY_CHECK = 1
LINE_BREAKER = ([\r\n]+){

So I need to have something like this:

[json]

[source::.../mysource...]
SHOULD_LINEMERGE=true
NO_BINARY_CHECK=true
CHARSET=AUTO
INDEXED_EXTRACTIONS=json
KV_MODE=json
disabled=false
pulldown_type=true

It's OK like that; it works very well and the performance is great. Thanks!


jkat54
SplunkTrust

Do you want the fields extracted at index time or search time?

Both examples I gave you worked with your example data, so either you didn't reindex the data, didn't put the props in the correct place, or the example data you provided isn't exactly like the data you're ingesting.


jkat54
SplunkTrust

The settings you used would index the fields and would need to be placed on the universal forwarder and indexers. It wouldn't apply to data already ingested either.


sfatnass
Contributor

I just want to extract the JSON into multiple events like your example, and to extract all the fields too, because I will run some searches and need to know which field contains the correct value ^^
But it's OK, and thanks for your reply.


jkat54
SplunkTrust

Great. Just so you know, INDEXED_EXTRACTIONS will consume more disk space and requires more CPU on the indexers/forwarders.


sfatnass
Contributor

OK, but it's more performant, no? The objective of my project is to get more speed ^^


jkat54
SplunkTrust

It can be faster when you're searching on the extracted fields, yes.
