Splunk Enterprise

How would I extract the data so that I have the data in fields names and ids?

rberman
Path Finder

Suppose I have two events that have a JSON field "groups" containing one or more lists of name/value pairs, so there can be 1...N lists per event and 1...N name/value pairs within each list. I asked the following question https://community.splunk.com/t5/All-Apps-and-Add-ons/How-can-I-parse-key-value-pairs-from-JSON/m-p/6... to find out how to parse a single list of 1...N name/value pairs. Now I am trying to figure out how to parse events that can have 1...N lists of 1...N name/value pairs.

Example

event 1:   groups = [ {"name1":"id1", "name2":"id2", "name3":"id3", "name4":"id4"}, { "name10":"id10", "name11": "id11"} ]  

event 2:  groups = [ { "name20":"id20", "name21": "id21", "name22":"id22", "name23": "id23"}]

 

How would I extract the data so that I end up with the data in fields names and ids as follows:

          names                                      ids
event 1   ["name1", "name2", "name3", "name4"]       ["id1", "id2", "id3", "id4"]
          ["name10", "name11"]                       ["id10", "id11"]
event 2   ["name20", "name21", "name22", "name23"]   ["id20", "id21", "id22", "id23"]

 

You can use this to create the example data: 

| makeresults format=json data="[{\"groups\":[{\"name1\":\"id1\",\"name2\":\"id2\",\"name3\":\"id3\",\"name4\":\"id4\"}, {\"name10\":\"id10\",\"name11\":\"id11\"}]}, {\"groups\":[{\"name20\":\"id20\",\"name21\":\"id21\",\"name22\":\"id22\",\"name23\":\"id23\"}]}]"
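
In case it is useful, the closest I have gotten on my own is the untested sketch below. It assumes groups is available as a field containing the raw JSON array text, and it produces names and ids as multivalue fields with one result row per inner list, rather than the bracketed lists shown above:

| spath input=groups path={} output=group ``` split the groups array into one value per inner list ```
| mvexpand group ``` one result row per list ```
| rex field=group max_match=0 "\"(?<names>[^\"]+)\"\s*:\s*\"(?<ids>[^\"]+)\"" ``` keys go into names, values into ids ```
| table names ids

If the JSON only exists in _raw rather than in a groups field, I believe the first line would become | spath path=groups{} output=group instead.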

Thank you so much in advance for any help you can give!
