I'm new to Splunk and trying to display a table in the format below after reading data from JSON. Could someone help me with the Splunk query?
Transaction Name | pct2ResTime |
Transaction 1 | 4198 |
Transaction 2 | 1318 |
Transaction 3 | 451 |
JSON file name: statistics.json
{
"Transaction1" : {
"transaction" : "Transaction1",
"pct1ResTime" : 3083.0,
"pct2ResTime" : 4198.0,
"pct3ResTime" : 47139.0
},
"Transaction2" : {
"transaction" : "Transaction2",
"pct1ResTime" : 1151.3000000000002,
"pct2ResTime" : 1318.8999999999996,
"pct3ResTime" : 6866.0
},
"Transaction3" : {
"transaction" : "Transaction3",
"pct1ResTime" : 342.40000000000003,
"pct2ResTime" : 451.49999999999983,
"pct3ResTime" : 712.5799999999997
}
}

It appears you have multiple stats for the same transaction in the event. Try using mvdedup:
| spath
| eval date=strftime(_time,"%m-%d %k:%M")
| table date *.pct2ResTime
| foreach *.pct2ResTime
[| eval <<FIELD>> = mvdedup('<<FIELD>>')]
| untable date transaction pct2ResTime
| eval "Transaction Name"=mvindex(split(transaction,"."),0)
| xyseries "Transaction Name" date pct2ResTime

@ITWhisperer is correct. You should not use regex on JSON; it is structured data. In fact, you do not need spath with raw events either, because Splunk extracts those fields by default, so you can use untable directly.
A more semantic implementation of your intention is to use the JSON functions introduced in Splunk 8.2:
index=jenkins_artifact source="<path to json>/statistics.json"
| eval Transaction_type = json_keys(_raw)
| foreach mode=json_array Transaction_type
[eval jsonTrans = mvappend(jsonTrans, json_object("Transaction Name", <<ITEM>>, "pct2ResTime", json_extract(_raw, <<ITEM>> . ".pct2ResTime")))]
| fields - _raw Transaction*
| mvexpand jsonTrans
| spath input=jsonTrans
| fields - json*
This is an emulation of your mock data:
| makeresults
| eval _raw = "{
\"Transaction1\" : {
\"transaction\" : \"Transaction1\",
\"pct1ResTime\" : 3083.0,
\"pct2ResTime\" : 4198.0,
\"pct3ResTime\" : 47139.0
},
\"Transaction2\" : {
\"transaction\" : \"Transaction2\",
\"pct1ResTime\" : 1151.3000000000002,
\"pct2ResTime\" : 1318.8999999999996,
\"pct3ResTime\" : 6866.0
},
\"Transaction3\" : {
\"transaction\" : \"Transaction3\",
\"pct1ResTime\" : 342.40000000000003,
\"pct2ResTime\" : 451.49999999999983,
\"pct3ResTime\" : 712.5799999999997
}
}"
| spath
``` the above emulates
index=jenkins_artifact source="<path to json>/statistics.json"
```
The output is:
Transaction Name | pct2ResTime |
Transaction1 | 4198 |
Transaction2 | 1318.8999999999996 |
Transaction3 | 451.49999999999983 |
It is possible to use props.conf settings on your indexer machines to pre-process the JSON into distinct events for each transaction, but I will assume that you instead have that one json object as a single event in Splunk.
You can then use the following search:
<Your search for finding the json event>
``` Chop off the first and last brackets ```
| rex field=_raw mode=sed "s/^{//"
| rex field=_raw mode=sed "s/}$//"
``` Add a "SPLITHERE" keyword to target with a makemv command ```
| rex field=_raw mode=sed "s/},/},SPLITHERE/g"
``` Remove the "Transaction1" etc. labels for each sub-object ```
| rex field=_raw mode=sed "s/\s*\"Transaction\d*\"\s:\s//g"
``` To avoid making _raw a multivalue, let's eval it to the "a" field ```
| eval a = _raw
``` Split 'a' into multiple values and table it ```
| makemv a delim=",SPLITHERE"
| mvexpand a
| table a
``` Extract the key values for each json object ```
| spath input=a
``` Filter to desired fields and make it into final table with renaming and rounding ```
| table transaction pct2ResTime
| rename transaction as "Transaction Name"
| eval pct2ResTime = round(pct2ResTime)
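For reference, the props.conf pre-processing mentioned above might look roughly like the stanza below. This is only a sketch: the sourcetype name and the LINE_BREAKER regex are assumptions and would need to be tested against the real file before deploying.

```
[jmeter_statistics_json]
# Break the outer JSON object into one event per "TransactionN" sub-object.
# The capture group is consumed as the event delimiter.
LINE_BREAKER = (,)\s*"Transaction\d+"\s*:
SHOULD_LINEMERGE = false
# Raise the truncation limit in case a sub-object is long
TRUNCATE = 10000
```

With events split this way, a plain `| spath | table transaction pct2ResTime` would suffice at search time.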

Hey, OP! This is the correct answer.
Thank you for your help, but I'm unable to produce the table with this query.
Can you paste a copy of your original event in a code sample format? Perhaps one of the double-quotes is wrong.

Have you tried using spath?
No, I did not try spath. The query I tried so far is below. Also, could you please help with spath? I'm very new to Splunk.
index=jenkins_artifact source="<path to json>/statistics.json"
| rex max_match=0 "(?<keyvalue>\"[^\"]+\":\"[^\"]+\")"
| mvexpand keyvalue
| rex field=keyvalue "\"(?<key>[^\"]+)\":\"(?<value>[^\"]+)\""
| eval {key}=value
| fields - keyvalue key value _raw host eventtype index linecount source sourcetype punct splunk_server tag tag::eventtype timestamp
| untable date Transaction pct2ResTime | where like(Transaction,"%__%")
| xyseries Transaction date pct2ResTime

Assuming your events are as you have shown, you could do this:
| spath
| table _time *.pct2ResTime
| untable _time transaction pct2ResTime
| eval "Transaction Name"=mvindex(split(transaction,"."),0)
| table "Transaction Name" pct2ResTime
If not, please share a more accurate representation of your events, preferably in a code block (as above) to preserve the formatting of the data.
I attached the query I tried, along with screenshots of my makeresults and of how the JSON file looks.
Basically, I would like to compare today's 95th percentile with the previous day's (or some other day's) to check for deviation. Also, this JSON file is generated by JMeter from a JTL file. Please let me know if you know of any way to generate the report in Splunk using the JTL file directly.
index=jenkins_artifact source="job/V8_JMeter_Load_Test_STAGE_Pipeline/*/src/TestResults/*/JMeter/RUN2/statistics.json"
| spath
| eval date = strftime(_time, "%m-%d %k:%M")
| eval "Transaction Name"=mvindex(split(transaction,"."),0)
| eval pct2ResTime = round(pct2ResTime)
| untable date "Transaction Name" pct2ResTime
| xyseries "Transaction Name" date pct2ResTime

It appears you have multiple stats for the same transaction in the event. Try using mvdedup:
| spath
| eval date=strftime(_time,"%m-%d %k:%M")
| table date *.pct2ResTime
| foreach *.pct2ResTime
[| eval <<FIELD>> = mvdedup('<<FIELD>>')]
| untable date transaction pct2ResTime
| eval "Transaction Name"=mvindex(split(transaction,"."),0)
| xyseries "Transaction Name" date pct2ResTime
Thank you! Works well.
I'm struggling to get the last date value to calculate the percentage deviation. Could you please help?
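This is not from the thread, but one way to sketch that calculation: keep the results in the untabled form, use streamstats to carry each transaction's previous value forward, then compute the deviation before (or instead of) pivoting with xyseries. Field names follow the queries above; the sort and streamstats steps are assumptions about the data layout.

```
| spath
| eval date = strftime(_time, "%m-%d %k:%M")
| table date *.pct2ResTime
| untable date transaction pct2ResTime
| eval "Transaction Name" = mvindex(split(transaction, "."), 0)
``` order rows so the previous date comes first per transaction ```
| sort 0 "Transaction Name" date
``` prevRes = the prior date's value for the same transaction ```
| streamstats current=f window=1 last(pct2ResTime) as prevRes by "Transaction Name"
| eval pctDeviation = round((pct2ResTime - prevRes) / prevRes * 100, 2)
| table "Transaction Name" date pct2ResTime pctDeviation
```

The first date per transaction will have no prevRes and hence a null pctDeviation, which you could fill with fillnull if needed.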
