Splunk Search

Extract pairdelim kvdelim JSON problems

trtracy81
New Member

I have JSON data that I'm trying to extract into fields, but I'm unable to get all of the data extracted correctly.

My query is:

index=myindex
| spath
| extract pairdelim="," kvdelim=":{}[,]"

My data looks like this:

{"version":"1.0.0","integrationType":"metric","action":"created","metrics":{"General":{"tapid":1,"port":2,"length":16,"timestamp":1580164559,"packet_id":626910,"protocol":"test","Indexes":{"Address1":[0],"FCode":[1],"AddressOut1":[2,3],"outputValue":[4,5],"checksum":[6,7]}},"ApplicationLayer":{"Rail":{"Rail16":1}},"TransportLayer":{"Address2":3,"FCode2":{"code":5,"string":"Read Single Values"},"type":"response","crc":56253}}}

{"version":"1.0.0","integrationType":"metric","action":"created","metrics":{"General":{"tapid":1,"port":2,"length":30,"timestamp":1580164556,"packet_id":626904,"protocol":"test","Indexes":{"Address1":[0],"FCode":[1],"RValues":[2],"reg1":[3,4],"reg2":[5,6],"reg3":[7,8],"reg4":[9,10],"reg5":[11,12],"reg6":[13,14],"reg7":[15,16],"reg8":[17,18],"reg9":[19,20],"reg10":[21,22],"reg11":[23,24],"reg12":[25,26],"reg13":[27,28],"checksum":[29,28]}},"ApplicationLayer":{"Registering":{}},"TransportLayer":{"Address2":3,"FCode2":{"code":3,"string":"Read Multiple Values"},"type":"response","crc":18279}}}

The query works for most of the data but fails on multivalue fields.
For example:
"AddressOut1":[2,3] only gives me AddressOut1 = 2 -- it's not extracting the 3. I was expecting AddressOut1=2,3.
"checksum":[6,7] again only gives me checksum = 6 -- and skips the 7.
The same with "reg1":[3,4] -- I'm only getting the 3. Whenever there are multiple values, I only get the first entry in the array.

I suspect it's because the "," is used to separate the key/value pairs, but since one of the values also uses a "," as a separator, extract isn't able to handle it.
Is there a better way to extract these or am I missing something?
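For reference, spath on its own does keep both array elements, as a multivalue field rather than a truncated single value. A minimal sketch against two keys taken from the sample events above (the `{}` suffix is what spath appends to array field names):

```
| makeresults
| eval _raw="{\"Indexes\":{\"AddressOut1\":[2,3],\"checksum\":[6,7]}}"
| spath
| rename Indexes.*{} as *
| foreach * [ eval <<FIELD>>=mvjoin(<<FIELD>>,",") ]
```

This yields AddressOut1=2,3 and checksum=6,7, suggesting the truncation comes from the extract step rather than from spath.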

Thanks

1 Solution

to4kawa
Ultra Champion
| makeresults 
| eval _raw="{\"version\":\"1.0.0\",\"integrationType\":\"metric\",\"action\":\"created\",\"metrics\":{\"General\":{\"tapid\":1,\"port\":2,\"length\":16,\"timestamp\":1580164559,\"packet_id\":626910,\"protocol\":\"test\",\"Indexes\":{\"Address1\":[0],\"FCode\":[1],\"AddressOut1\":[2,3],\"outputValue\":[4,5],\"checksum\":[6,7]}},\"ApplicationLayer\":{\"Rail\":{\"Rail16\":1}},\"TransportLayer\":{\"Address2\":3,\"FCode2\":{\"code\":5,\"string\":\"Read Single Values\"},\"type\":\"response\",\"crc\":56253}}}|{\"version\":\"1.0.0\",\"integrationType\":\"metric\",\"action\":\"created\",\"metrics\":{\"General\":{\"tapid\":1,\"port\":2,\"length\":30,\"timestamp\":1580164556,\"packet_id\":626904,\"protocol\":\"test\",\"Indexes\":{\"Address1\":[0],\"FCode\":[1],\"RValues\":[2],\"reg1\":[3,4],\"reg2\":[5,6],\"reg3\":[7,8],\"reg4\":[9,10],\"reg5\":[11,12],\"reg6\":[13,14],\"reg7\":[15,16],\"reg8\":[17,18],\"reg9\":[19,20],\"reg10\":[21,22],\"reg11\":[23,24],\"reg12\":[25,26],\"reg13\":[27,28],\"checksum\":[29,28]}},\"ApplicationLayer\":{\"Registering\":{}},\"TransportLayer\":{\"Address2\":3,\"FCode2\":{\"code\":3,\"string\":\"Read Multiple Values\"},\"type\":\"response\",\"crc\":18279}}}"
| eval _raw=split(_raw,"|") 
| stats count by _raw 
| table _raw 
`comment("this is the sample you provided")`
| spath 
| rename *{} as * 
| foreach * 
    [ eval lastname_<<FIELD>>=mvindex(split("<<FIELD>>","."),-1)] 
| foreach lastname_* 
    [ eval {<<FIELD>>} = '<<MATCHSTR>>' 
    | fields - lastname_<<MATCHSTR>> <<MATCHSTR>>] 
| eval version=spath(_raw,"version"),integrationType=spath(_raw,"integrationType"),action=spath(_raw,"action") 
| table version, integrationType, action, tapid, port, length, timestamp, packet_id, protocol, Address1, AddressOut1, outputValue, checksum, Rail16, FCode, RValues, reg*, Address2, code, string, type, crc 
| foreach * 
    [ eval <<FIELD>> = mvjoin(<<FIELD>> ,",")]
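In short: `rename *{} as *` drops the `{}` suffix that spath appends to array fields, the two foreach loops shorten each dotted path to its last segment (so `metrics.General.Indexes.checksum` becomes `checksum`), and the final foreach joins every multivalue field into a comma-separated string -- the step the extract-based query was missing. For a single field, that last step is simply:

```
| eval checksum=mvjoin(checksum,",")
```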



trtracy81
New Member

Hi Kamlesh,
I looked at your three links and did not see any fields with key/value pairs matching the type of data I have. What I was hoping for is something that extracts the fields to look like this:

version : "1.0.0"
integrationType : "metric"
action : "created"
tapid : 1
port : 2
length : 16
timestamp : 1580164559
packet_id : 626910
protocol : "test"
Address1 : 0
FCode : 1
AddressOut1 : 2,3
outputValue : 4,5
checksum : 6,7
Rail16 : 1
Address2 : 3
code : 5
string : "Read Single Values"
type : "response"
crc : 56253

version : "1.0.0"
integrationType : "metric"
action : "created"
tapid : 1
port : 2
length : 30
timestamp : 1580164556
packet_id : 626904
protocol : "test"
Address1 : 0
FCode : 1
RValues : 2
reg1 : 3,4
reg2 : 5,6
reg3 : 7,8
reg4 : 9,10
reg5 : 11,12
reg6 : 13,14
reg7 : 15,16
reg8 : 17,18
reg9 : 19,20
reg10 : 21,22
reg11 : 23,24
reg12 : 25,26
reg13 : 27,28
checksum : 29,28
Address2 : 3
code : 3
string : "Read Multiple Values"
type : "response"
crc : 18279

If I do a simple data import of the samples above using the JSON source type, I seem to get what I'm expecting, except I get two fields, AddressOut1 = 2 and AddressOut2 = 3, rather than a single AddressOut1 = 2,3.
Hope this makes sense.

Thanks


kamlesh_vaghela
SplunkTrust

@trtracy81 Can you please try this?

YOUR_SEARCH | rename metrics.General.* as *, metrics.General.Indexes.*{} as *,metrics.TransportLayer.* as *,metrics.TransportLayer.FCode2.* as *,metrics.ApplicationLayer.Rail.Rail16 as Rail16
| eval AddressOut2 = mvindex(AddressOut1,1),AddressOut1 = mvindex(AddressOut1,0)
| foreach * [ | eval <<FIELD>>=mvjoin(<<FIELD>>,",") ]
| table version, integrationType, action, tapid, port, length, timestamp, packet_id, protocol, Address1, AddressOut1, AddressOut2, outputValue, checksum, Rail16, FCode, RValues, reg*, Address2, code, string, type, crc

Sample Search

| makeresults 
| eval data="{\"version\":\"1.0.0\",\"integrationType\":\"metric\",\"action\":\"created\",\"metrics\":{\"General\":{\"tapid\":1,\"port\":2,\"length\":16,\"timestamp\":1580164559,\"packet_id\":626910,\"protocol\":\"test\",\"Indexes\":{\"Address1\":[0],\"FCode\":[1],\"AddressOut1\":[2,3],\"outputValue\":[4,5],\"checksum\":[6,7]}},\"ApplicationLayer\":{\"Rail\":{\"Rail16\":1}},\"TransportLayer\":{\"Address2\":3,\"FCode2\":{\"code\":5,\"string\":\"Read Single Values\"},\"type\":\"response\",\"crc\":56253}}}|{\"version\":\"1.0.0\",\"integrationType\":\"metric\",\"action\":\"created\",\"metrics\":{\"General\":{\"tapid\":1,\"port\":2,\"length\":30,\"timestamp\":1580164556,\"packet_id\":626904,\"protocol\":\"test\",\"Indexes\":{\"Address1\":[0],\"FCode\":[1],\"RValues\":[2],\"reg1\":[3,4],\"reg2\":[5,6],\"reg3\":[7,8],\"reg4\":[9,10],\"reg5\":[11,12],\"reg6\":[13,14],\"reg7\":[15,16],\"reg8\":[17,18],\"reg9\":[19,20],\"reg10\":[21,22],\"reg11\":[23,24],\"reg12\":[25,26],\"reg13\":[27,28],\"checksum\":[29,28]}},\"ApplicationLayer\":{\"Registering\":{}},\"TransportLayer\":{\"Address2\":3,\"FCode2\":{\"code\":3,\"string\":\"Read Multiple Values\"},\"type\":\"response\",\"crc\":18279}}}",data=split(data,"|") 
| mvexpand data 
| eval _raw=data 
| spath 
| rename metrics.General.* as *, metrics.General.Indexes.*{} as *,metrics.TransportLayer.* as *,metrics.TransportLayer.FCode2.* as *,metrics.ApplicationLayer.Rail.Rail16 as Rail16
| eval AddressOut2 = mvindex(AddressOut1,1),AddressOut1 = mvindex(AddressOut1,0)
| foreach * [ | eval <<FIELD>>=mvjoin(<<FIELD>>,",") ]
| table version,integrationType,action ,tapid,port ,length ,timestamp ,packet_id ,protocol,Address1,AddressOut1,AddressOut2,outputValue ,checksum ,Rail16,FCode ,RValues ,reg*,checksum ,Address2,code,string,type,crc
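The flattening step above can also be written generically (a sketch of the same idea, assuming the JSON is parsed at search time with spath so the dotted `metrics.*` field names exist): rename every `*{}` array field and join each multivalue, instead of splitting AddressOut1 into two fields by hand:

```
| spath
| rename metrics.General.Indexes.*{} as *
| foreach * [ eval <<FIELD>>=mvjoin(<<FIELD>>,",") ]
```

mvjoin leaves single-value fields unchanged, so the foreach is safe to apply across the board; this keeps AddressOut1 as 2,3 rather than producing a separate AddressOut2.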