Getting Data In

How to extract the nested array coordinates from JSON?

weiquanswq
Explorer

Hi!
I am new to Splunk and am trying to extract the nested coordinates array from JSON.

{"type":"Feature","geometry":{"type":"MultiPoint","coordinates":[[103.62107,1.27478],[103.622,1.29625],  ……., [103.6224,1.28207]]}}

Is anyone able to help me out here?

Thanks

1 Solution

weiquanswq
Explorer

I was able to achieve the results I want using spath, extracting by array index.

My code is as follows:
... | spath path=features{}.geometry.coordinates{}{0} output=a
| spath path=features{}.geometry.coordinates{}{1} output=b
| table a b
| eval x=mvzip(a,b)
| mvexpand x
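If you want to see what this pipeline produces outside Splunk, here is a small Python sketch of the same logic. It assumes the event is wrapped in a `features` array, as the `features{}` spath paths imply (the sample event in the question does not show that wrapper):

```python
import json

# Sketch of what the spath/mvzip/mvexpand pipeline produces, assuming the
# event is wrapped in a "features" array (as the spath paths imply).
event = ('{"features":[{"type":"Feature","geometry":{"type":"MultiPoint",'
         '"coordinates":[[103.62107,1.27478],[103.622,1.29625],'
         '[103.6224,1.28207]]}}]}')

doc = json.loads(event)

# features{}.geometry.coordinates{}{0} -> multivalue field a
a = [c[0] for f in doc["features"] for c in f["geometry"]["coordinates"]]
# features{}.geometry.coordinates{}{1} -> multivalue field b
b = [c[1] for f in doc["features"] for c in f["geometry"]["coordinates"]]

# mvzip(a,b) joins the two multivalue fields pairwise with a comma;
# mvexpand x then emits one row per "lon,lat" pair.
x = [f"{lon},{lat}" for lon, lat in zip(a, b)]
print(x)
```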

Hope this will be useful. 🙂




sshelly_splunk
Splunk Employee

Try this in your props.conf, under the sourcetype or monitor stanza.

Example:

[myjson_sourcetype]
EXTRACT-coordinates = \[\[(?P<coord_1>\S+)],\[(?P<coord_2>\S+)\].+\[(?P<coord_3>\S+)\]\]

The above will create the three fields (coord_1, coord_2, and coord_3). Assuming the data comes in like that, you should be good to go. If not, please post more of the data so we can all take a look.
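To sanity-check the regex before putting it in props.conf, here is a quick Python test of it against the sample event from the question. Note that each captured group is a whole "lon,lat" pair, not a single number:

```python
import re

# The EXTRACT regex from the props.conf stanza above, tried in Python
# against the sample event from the question.
pattern = r"\[\[(?P<coord_1>\S+)],\[(?P<coord_2>\S+)\].+\[(?P<coord_3>\S+)\]\]"

event = ('{"type":"Feature","geometry":{"type":"MultiPoint",'
         '"coordinates":[[103.62107,1.27478],[103.622,1.29625],'
         '[103.6224,1.28207]]}}')

m = re.search(pattern, event)
# Each group captures a whole "lon,lat" pair, not a single number.
print(m.groupdict())
```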
