I've been working on a project where the events contain JSON and the Tags are stored similar to this...
{
"Name": "example",
"Tags": [
{"Key": "Building", "Value": "1"},
{"Key": "Floor", "Value": "2"},
{"Key": "Color", "Value": "Red"}
]
}
The default extraction from spath provided the Tags{}.Key and Tags{}.Value fields, which were pretty much useless as-is. What I wanted was for each tag to become a field on the event so that you could use them in a search, e.g. Building=1 AND Color=Red. But the number of tags varies, and the same value could appear in multiple tags (e.g. Building=1 AND Floor=1).
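For illustration, this is roughly what the default extraction looks like for the sample event (a quick sketch using makeresults to stand in for the real data):
| makeresults
| eval _raw="{\"Name\": \"example\", \"Tags\": [{\"Key\": \"Building\", \"Value\": \"1\"}, {\"Key\": \"Floor\", \"Value\": \"2\"}, {\"Key\": \"Color\", \"Value\": \"Red\"}]}"
| spath
| table Name "Tags{}.Key" "Tags{}.Value"
Both Tags{} fields come back as positional multivalue lists with nothing tying a Key to its Value, which is why they aren't much use for the kind of search I'm after.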
Here's what I came up with so far... I'm curious if anyone has a better suggestion.
| rename Tags{}.Key as Key, Tags{}.Value as Value
| eval zip=mvzip(Key, Value, ":")
| mvexpand zip
| rex field=zip mode=sed "s/$/\"}/g"
| rex field=zip mode=sed "s/^/{\"tag./g"
| rex field=zip mode=sed "s/:/\": \"/g"
| spath input=zip
| transaction Name
This approach basically uses mvzip and mvexpand to pull the Tags apart, then rex with sed to rebuild each pair as a small JSON object that gets passed back through spath, with transaction recombining the rows at the end. It seems pretty complex, but I just can't see a better way to do it. Has anyone found a cleaner approach?
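In case it helps anyone test it, here's the same pipeline wrapped in makeresults (just simulating the sample event above; the ``` comments describe what zip holds after each step and can be dropped on older Splunk versions):
| makeresults
| eval _raw="{\"Name\": \"example\", \"Tags\": [{\"Key\": \"Building\", \"Value\": \"1\"}, {\"Key\": \"Floor\", \"Value\": \"2\"}, {\"Key\": \"Color\", \"Value\": \"Red\"}]}"
| spath
| rename Tags{}.Key as Key, Tags{}.Value as Value
| eval zip=mvzip(Key, Value, ":")
``` zip is now a multivalue field: Building:1 Floor:2 Color:Red ```
| mvexpand zip
``` one row per pair; the next three sed edits wrap each pair back up as a one-field JSON object keyed tag.<Key> ```
| rex field=zip mode=sed "s/$/\"}/g"
| rex field=zip mode=sed "s/^/{\"tag./g"
| rex field=zip mode=sed "s/:/\": \"/g"
| spath input=zip
``` spath now yields tag.Building, tag.Floor and tag.Color on separate rows ```
| transaction Name
``` transaction stitches those rows back into a single event per Name ```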
This app may help you - developed by Brett Adams, a Splunk Trust member, it is designed to extract arrays of KV pairs from JSON arrays.
https://splunkbase.splunk.com/app/6161/
A different hack, similar in idea to the mvzip approach but a bit more SPL-y, is to use mvmap and extract (kv).
| rename Tags{}.* AS *
| eval ind = mvrange(0, mvcount(Key))
| eval pair=mvmap(ind, mvappend(pair, mvindex(Key, ind). "=" . mvindex(Value, ind)))
| rename _raw AS temp, pair AS _raw
| kv
| rename temp AS _raw
Output from the sample data is thus
Name | Building | Floor | Color |
example | 1 | 2 | Red |
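If you want to verify that without indexed data, the same search can be wrapped in makeresults (sample event from the question; the final table is only there to show the columns):
| makeresults
| eval _raw="{\"Name\": \"example\", \"Tags\": [{\"Key\": \"Building\", \"Value\": \"1\"}, {\"Key\": \"Floor\", \"Value\": \"2\"}, {\"Key\": \"Color\", \"Value\": \"Red\"}]}"
| spath
| rename Tags{}.* AS *
| eval ind = mvrange(0, mvcount(Key))
| eval pair=mvmap(ind, mvappend(pair, mvindex(Key, ind) . "=" . mvindex(Value, ind)))
| rename _raw AS temp, pair AS _raw
| kv
| rename temp AS _raw
| table Name Building Floor Color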
You shouldn't have to zip, as each Key-Value pair is already contained in its own object. Use the path option in spath.
``` assuming field json contains that JSON object ```
| spath input=json path=Tags{}
| mvexpand Tags{}
| spath input=Tags{}
With sample data, you get
Key | Tags{} | Value | _time |
Building | {"Key": "Building", "Value": "1"} | 1 | 2022-07-20 01:45:58 |
Floor | {"Key": "Floor", "Value": "2"} | 2 | 2022-07-20 01:45:58 |
Color | {"Key": "Color", "Value": "Red"} | Red | 2022-07-20 01:45:58 |
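For a self-contained test (assuming, as above, that the object sits in a field called json):
| makeresults
| eval json="{\"Name\": \"example\", \"Tags\": [{\"Key\": \"Building\", \"Value\": \"1\"}, {\"Key\": \"Floor\", \"Value\": \"2\"}, {\"Key\": \"Color\", \"Value\": \"Red\"}]}"
| spath input=json path=Tags{}
| mvexpand Tags{}
| spath input=Tags{}
| table Key "Tags{}" Value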
Does this look like something you want?
Not quite…
I want one record that looks more like this…
Name | Building | Floor | Color |
example | 1 | 2 | Red |
The problem is that you can't effectively search across multiple tags with what you have there. For example, I can search `Key=Floor AND Value=2` to get the second row of the mvexpand, but I can't build a search that links them together. I need to be able to run a search more like `Building=1 AND Floor=2`. Does that make sense?
That doesn't seem to work for me. The table before the transpose looks like this...
Name | Key | Value |
example | Building | 1 |
example | Floor | 2 |
example | Color | Red |
Then the transpose turns the Name values into columns, which, again, doesn't give me something that can be searched with things like Building=1 AND Floor=2.
This will do the trick - using foreach
| makeresults
| eval _raw="{
\"Name\": \"example\",
\"Tags\": [
{\"Key\": \"Building\", \"Value\": \"1\"},
{\"Key\": \"Floor\", \"Value\": \"2\"},
{\"Key\": \"Color\", \"Value\": \"Red\"}
]
}"
| spath
| rename Tags{}.* as *
| foreach 0 1 2 3 4 5 6 7 8 9 10 [ eval k=mvindex(Key, <<FIELD>>), v=mvindex(Value, <<FIELD>>), x_{k}=v ]
| table Name x_*
| rename x_* as *
Just make the foreach loop big enough to accommodate the array length.
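If you're not sure how big the loop needs to be, you can tack something like this onto your base search first to find the largest Tags array in your data (a sketch; field names as in the sample event):
| spath
| eval tag_count=mvcount('Tags{}.Key')
| stats max(tag_count) AS max_tags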
However, I do believe that's exactly what the app is designed to do, more efficiently.
This app may help you - developed by Brett Adams, a Splunk Trust member, it is designed to extract arrays of KV pairs from JSON arrays.
https://splunkbase.splunk.com/app/6161/
I do think this is a good solution, but adding an app will have to remain a fallback option for me in this case. It is very interesting though, and it does do what I want very nicely. I was not aware of it previously, so thank you.
array2object path="Tags" key=Key value=Value
Very easy.
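For example, something along these lines should then let you search the tags directly (a sketch; I'm assuming the command flattens the pairs into fields named after the keys, as described above, and haven't verified its exact output):
| array2object path="Tags" key=Key value=Value
| search Building=1 Floor=2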