Getting Data In

How to extract the nested array coordinates from JSON?

weiquanswq
Explorer

Hi!
I am new to Splunk and am trying to extract the coordinates array from JSON.

{"type":"Feature","geometry":{"type":"MultiPoint","coordinates":[[103.62107,1.27478],[103.622,1.29625],  ……., [103.6224,1.28207]]}}

Is anyone able to help me out here?

Thanks

1 Solution

weiquanswq
Explorer

I was able to achieve the results I wanted using spath, extracting by index.

My code is as follows:
... | spath path=features{}.geometry.coordinates{}{0} output=a
| spath path=features{}.geometry.coordinates{}{1} output=b
| table a b | eval x=mvzip(a,b)| mvexpand x
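Outside Splunk, the same pair-up of longitudes and latitudes can be sketched in Python (a minimal sketch; the single-Feature event below is the sample from the question, so the `geometry` key is read directly rather than through a `features{}` array):

```python
import json

# Sample event shaped like the one in the question (a single GeoJSON Feature).
event = '''{"type": "Feature",
            "geometry": {"type": "MultiPoint",
                         "coordinates": [[103.62107, 1.27478],
                                         [103.622, 1.29625],
                                         [103.6224, 1.28207]]}}'''

coords = json.loads(event)["geometry"]["coordinates"]

# spath path=...{0} / path=...{1} pulls the longitudes and latitudes into two
# multivalue fields; mvzip then pairs them back up element by element.
a = [point[0] for point in coords]              # longitudes
b = [point[1] for point in coords]              # latitudes
x = [f"{lon},{lat}" for lon, lat in zip(a, b)]  # mvzip's default "," separator

for row in x:                                   # mvexpand: one row per pair
    print(row)
```

The key point is that `mvzip` pairs the two multivalue fields positionally, so the index-based `{0}`/`{1}` extraction must keep both fields in the same order.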

Hope this will be useful. 🙂


sshelly_splunk
Splunk Employee

Try this in your props.conf, under the sourcetype or monitor stanza:

Example:

[myjson_sourcetype]
EXTRACT-coordinates = \[\[(?P<coord_1>\S+)\],\[(?P<coord_2>\S+)\].+\[(?P<coord_3>\S+)\]\]

The above will create three fields (coord_1, coord_2, and coord_3). Assuming the data comes in like that, you should be good to go. If not, please post more of the data so we can all take a look.
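The extraction can be checked outside Splunk with Python's `re` module (a sketch; the event is the sample from the question, and note the `\S+` captures only land on the right substrings because the regex engine backtracks):

```python
import re

# Same pattern as the EXTRACT above, applied to the sample event.
pattern = r'\[\[(?P<coord_1>\S+)\],\[(?P<coord_2>\S+)\].+\[(?P<coord_3>\S+)\]\]'

event = ('{"type":"Feature","geometry":{"type":"MultiPoint",'
         '"coordinates":[[103.62107,1.27478],[103.622,1.29625],'
         '[103.6224,1.28207]]}}')

m = re.search(pattern, event)
print(m.group("coord_1"))  # first coordinate pair, e.g. "103.62107,1.27478"
print(m.group("coord_2"))  # second coordinate pair
print(m.group("coord_3"))  # last coordinate pair
```

One caveat on the design: `\S+` matches any non-whitespace, including `]` and `,`, so the captures depend on backtracking; a tighter character class such as `[^\]]+` would make each capture stop at its own closing bracket regardless of what follows.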
