I have a log file that is coming into splunk in json format. There appear to be two fields of interest, "key" and "value."
key:
originid
origintype
template
starttime
endtime
justification
value (holds the corresponding value for each of the items in "key"):
12345 (is not always the same id)
BuiltInRole (is not always the same)
85750845e54 (is not always the same)
2022-12-03T14:00:00:00.5661018Z
2022-12-04T14:00:00:00.5661018Z
some reason to satisfy the justification
I want to end up with the following:
originid = 12345
origintype = BuiltInRole
template = 85750845e54
starttime = 2022-12-03T14:00:00:00.5661018Z
endtime = 2022-12-04T14:00:00:00.5661018Z
justification = some reason to satisfy the justification
Thanks for the help and guidance.
I was able to figure this out using the below SPL:
index=myindex sourcetype=my_sourcetype
| fillnull value=na "properties.additionalDetails{}.key" "properties.additionalDetails{}.value"
| eval key_value = mvzip('properties.additionalDetails{}.key', 'properties.additionalDetails{}.value', "=")
| rename _raw as temp, key_value as _raw
| extract pairdelim="\n" kvdelim="="
| rename temp as _raw
Thanks to everyone who helped and pointed me in the right direction.
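For anyone reusing this, here is the same search with each step commented (using SPL's inline ``` comment ``` syntax; the properties.additionalDetails{} paths are specific to this sourcetype and would need adjusting for other data):

index=myindex sourcetype=my_sourcetype
``` replace any missing keys/values so the two multivalue fields stay aligned ```
| fillnull value=na "properties.additionalDetails{}.key" "properties.additionalDetails{}.value"
``` pair each key with its corresponding value as key=value ```
| eval key_value = mvzip('properties.additionalDetails{}.key', 'properties.additionalDetails{}.value', "=")
``` temporarily swap the pairs into _raw so extract can parse them ```
| rename _raw as temp, key_value as _raw
| extract pairdelim="\n" kvdelim="="
``` restore the original event text ```
| rename temp as _raw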
@bt149 Glad you found a solution to your problem! A courtesy reminder: it is customary to credit the first comment that materially informs a custom SPL as the solution, even if there are minor differences such as field names or a small manipulation that is not at the core of the original question.
By the way, when the delimiter is an equals sign (=), you don't need to explicitly specify kvdelim and pairdelim.
Thanks for the information. The SPL only worked with pairdelim and kvdelim specified. Thanks again.
As @richgalloway suggested, it is much easier for others to understand what you mean by using both text and sample data (anonymize as needed; some people just abstract with made-up fields). If I have to read tea leaves, you have a JSON array, each array element contains two keys, one named "key" and the other "value", like
{"log": [{"key":"originid", "value":"12345"}, {"key":"origintype", "value":"BuiltInRole"}, ... ]}
You want to pair the values of "key" and "value", so you have originid=12345, origintype=BuiltInRole, and so on. Is this correct? (It is important to confirm that they are in an array; otherwise the strategy would be different.)
This question gets asked often, in various forms, even recently, but I cannot find that thread at this time. So, here is one method using kv (aka extract):
| eval key_value = mvzip('log{}.key', 'log{}.value', "=") ``` pair values with = ```
| rename _raw as temp, key_value as _raw
| kv
| rename temp as _raw
This is the test data I used:
{"log":[{"key":"originid", "value":"12345"}, {"key":"origintype", "value":"BuiltInRole"}, {"key":"template", "value":"85750845e54"}, {"key":"starttime", "value":"2022-12-03T14:00:00:00.5661018Z"}, {"key":"endtime", "value":"2022-12-04T14:00:00:00.5661018Z"}, {"key":"justification", "value":"some reason to satisfy the justification"}]}
(I don't see any reason why you cannot share sample data like this.) This is emulated with
| makeresults
| fields - _time
| eval _raw = "{\"log\":[{\"key\":\"originid\", \"value\":\"12345\"}, {\"key\":\"origintype\", \"value\":\"BuiltInRole\"}, {\"key\":\"template\", \"value\":\"85750845e54\"}, {\"key\":\"starttime\", \"value\":\"2022-12-03T14:00:00:00.5661018Z\"}, {\"key\":\"endtime\", \"value\":\"2022-12-04T14:00:00:00.5661018Z\"}, {\"key\":\"justification\", \"value\":\"some reason to satisfy the justification\"}]}"
``` data emulation above ```
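Putting the emulation and the extraction together, the complete self-contained test search would look like the following. Note the added | spath: events generated by makeresults don't go through automatic search-time JSON extraction, so the log{}.key and log{}.value fields have to be materialized explicitly (worth verifying in your environment):

| makeresults
| fields - _time
| eval _raw = "{\"key\":\"originid\", \"value\":\"12345\"}"
``` replace the eval above with the full emulated JSON event; shortened here for readability ```
| spath
| eval key_value = mvzip('log{}.key', 'log{}.value', "=")
| rename _raw as temp, key_value as _raw
| kv
| rename temp as _raw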
Here is some sample data for your review. The posted solution did not produce any results. I swapped out "log" for what is in my logs, which is "additionalDetails."
"additionalDetails": [{"key": "RoleDefinitionOriginId", "value": "65555555-69f5-4237-9190-012177145e10"}, {"key": "RoleDefinitionOriginType", "value": "BuiltInRole"}, {"key": "TemplateId", "value": "65555555-69f5-4237-9190-012177145e10"}, {"key": "StartTime", "value": "2022-12-03T03:38:14.3598981Z"}, {"key": "ExpirationTime", "value": "2022-12-03T11:38:14.3598981Z"}, {"key": "Justification", "value": "BAU activity"}]}}
Thanks for the help, it's very much appreciated.
The goal is to take the information from the "additionalDetails" array (I think that's the term) and create new fields named after the data in "key", where the value of each new field would be the data in the "value" section of the log. Hope this makes sense.
Thanks again.
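In case it helps, here is a hedged adaptation of the kv approach to the additionalDetails sample above. The field paths are assumptions based on the snippet; if the array is nested under a parent object in the full event, the paths would need a prefix such as properties.additionalDetails{}:

index=myindex sourcetype=my_sourcetype
``` pair each key with its value, then let kv parse the pairs from _raw ```
| eval key_value = mvzip('additionalDetails{}.key', 'additionalDetails{}.value', "=")
| rename _raw as temp, key_value as _raw
| kv
| rename temp as _raw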
Please share the raw JSON data.