Splunk Search

Help generating table

Thulasiraman
Explorer

I'm new to Splunk and trying to display a table in the format below after reading data from a JSON file. Could someone help me with the Splunk query?

Transaction Name    pct2ResTime
Transaction 1       4198
Transaction 2       1318
Transaction 3       451


JSON file name: statistics.json

{
  "Transaction1" : {
    "transaction" : "Transaction1",
    "pct1ResTime" : 3083.0,
    "pct2ResTime" : 4198.0,
    "pct3ResTime" : 47139.0
  },
  "Transaction2" : {
    "transaction" : "Transaction2",
    "pct1ResTime" : 1151.3000000000002,
    "pct2ResTime" : 1318.8999999999996,
    "pct3ResTime" : 6866.0
  },
  "Transaction3" : {
    "transaction" : "Transaction3",
    "pct1ResTime" : 342.40000000000003,
    "pct2ResTime" : 451.49999999999983,
    "pct3ResTime" : 712.5799999999997
  }
}

1 Solution

ITWhisperer
SplunkTrust
SplunkTrust

It appears you have multiple stats for the same transaction in the event. Try using mvdedup:

| spath
| eval date=strftime(_time,"%m-%d %k:%M")
| table date *.pct2ResTime
| foreach *.pct2ResTime
    [| eval <<FIELD>> = mvdedup('<<FIELD>>')]
| untable date transaction pct2ResTime
| eval "Transaction Name"=mvindex(split(transaction,"."),0)
| xyseries "Transaction Name" date pct2ResTime


yuanliu
SplunkTrust
SplunkTrust

@ITWhisperer is correct. You should not use regex to parse JSON, which is structured data. In fact, you do not need spath with raw events either, because Splunk extracts JSON fields by default. So you can use untable directly.
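For example, a minimal sketch of that direct approach could look like this (assuming one event per statistics.json file and auto-extracted fields such as Transaction1.pct2ResTime, as in ITWhisperer's answer):

```
index=jenkins_artifact source="<path to json>/statistics.json"
| table _time *.pct2ResTime
| untable _time transaction pct2ResTime
| eval "Transaction Name"=mvindex(split(transaction,"."),0)
| table "Transaction Name" pct2ResTime
```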

A more semantic implementation of your intent is to use the JSON functions introduced in Splunk 8.2:

 

index=jenkins_artifact source="<path to json>/statistics.json"
| eval Transaction_type = json_keys(_raw)
| foreach mode=json_array Transaction_type
    [eval jsonTrans = mvappend(jsonTrans, json_object("Transaction Name", <<ITEM>>, "pct2ResTime", json_extract(_raw, <<ITEM>> . ".pct2ResTime")))]
| fields - _raw Transaction*
| mvexpand jsonTrans
| spath input=jsonTrans
| fields - json*

 

This is an emulation of your mock data:

 

| makeresults
| eval _raw = "{
  \"Transaction1\" : {
    \"transaction\" : \"Transaction1\",
    \"pct1ResTime\" : 3083.0,
    \"pct2ResTime\" : 4198.0,
    \"pct3ResTime\" : 47139.0
  },
  \"Transaction2\" : {
    \"transaction\" : \"Transaction2\",
    \"pct1ResTime\" : 1151.3000000000002,
    \"pct2ResTime\" : 1318.8999999999996,
    \"pct3ResTime\" : 6866.0
  },
  \"Transaction3\" : {
    \"transaction\" : \"Transaction3\",
    \"pct1ResTime\" : 342.40000000000003,
    \"pct2ResTime\" : 451.49999999999983,
    \"pct3ResTime\" : 712.5799999999997
  }
}"
| spath
``` the above emulates
index=jenkins_artifact source="<path to json>/statistics.json"
```

 

The output is:

Transaction Name    pct2ResTime
Transaction1        4198
Transaction2        1318.8999999999996
Transaction3        451.49999999999983

 


marnall
Motivator

It is possible to use props.conf settings on your indexers to pre-process the JSON into distinct events for each transaction, but I will assume that you instead have that one JSON object as a single event in Splunk.
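For reference, a hypothetical props.conf sketch of that pre-processing approach (the sourcetype name and the LINE_BREAKER regex are assumptions, not tested against your data):

```
[statistics_json]
# break a new event at each top-level "TransactionN" key (regex is an assumption)
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n,{]+)\s*(?="Transaction\d)
KV_MODE = json
```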

You can then use the following search:

<Your search for finding the json event>
``` Chop off the first and last brackets ```
| rex field=_raw mode=sed "s/^{//"
| rex field=_raw mode=sed "s/}$//"
``` Add a "SplitHere" keyword to target with a makemv command ```
| rex field=_raw mode=sed "s/},/},SPLITHERE/g" max_match=99
``` Remove the Transaction1 etc. labels for each sub-object ```
| rex field=_raw mode=sed "s/\s*\"Transaction\d*\"\s:\s//g" max_match=99
``` To avoid making _raw a multivalue lets eval it to the "a" field ```
| eval a = _raw
``` Split 'a' into multiple values and table it ```
| makemv a delim=",SPLITHERE"
| mvexpand a
| table a
``` Extract the key values for each json object ```
| spath input=a
``` Filter to desired fields and make it into final table with renaming and rounding ```
| table transaction pct2ResTime
| rename transaction as "Transaction Name"
| eval pct2ResTime = round(pct2ResTime)

woodcock
Esteemed Legend

Hey, OP!  This is the correct answer.


Thulasiraman
Explorer

Thank you for your help, but I'm unable to produce the table with this query.


marnall
Motivator

Can you paste a copy of your original event in a code sample format? Perhaps one of the double-quotes is wrong.


ITWhisperer
SplunkTrust
SplunkTrust

Have you tried using spath?


Thulasiraman
Explorer

No, I did not try spath. The query I have tried so far is below. Could you please help with spath? I'm very new to Splunk.

index=jenkins_artifact source="<path to json>/statistics.json"
| rex max_match=0 "(?<keyvalue>\"[^\"]+\":\"[^\"]+\")"
| mvexpand keyvalue
| rex field=keyvalue "\"(?<key>[^\"]+)\":\"(?<value>[^\"]+)\""
| eval {key}=value
| fields - keyvalue key value _raw host eventtype index linecount source sourcetype punct splunk_server tag tag::eventtype timestamp
| untable date Transaction pct2ResTime | where like(Transaction,"%__%")
| xyseries Transaction date pct2ResTime


ITWhisperer
SplunkTrust
SplunkTrust

Assuming your events are as you have shown, you could do this:

| spath
| table _time *.pct2ResTime
| untable _time transaction pct2ResTime
| eval "Transaction Name"=mvindex(split(transaction,"."),0)
| table "Transaction Name" pct2ResTime

If not, please share a more accurate representation of your events, preferably in a code block (as above) to preserve the formatting of the data.


Thulasiraman
Explorer

I attached the query I tried, along with screenshots of my makeresults output and how the JSON files look.

Basically, I would like to compare today's 95th percentile with a previous day's (or some other day's) 95th percentile to check for deviation. Also, this JSON file was generated by JMeter from a JTL file. Please let me know if you know of any way to generate the report in Splunk directly from a JTL file.

index=jenkins_artifact source="job/V8_JMeter_Load_Test_STAGE_Pipeline/*/src/TestResults/*/JMeter/RUN2/statistics.json" 
| spath 
| eval date = strftime(_time, "%m-%d %k:%M") 
| eval "Transaction Name"=mvindex(split(transaction,"."),0)  
| eval pct2ResTime = round(pct2ResTime) 
| untable  date "Transaction Name" pct2ResTime
| xyseries  "Transaction Name" date pct2ResTime



Thulasiraman_0-1735239712826.png

Thulasiraman_1-1735239736490.png

Thulasiraman_2-1735239756893.png

ITWhisperer
SplunkTrust
SplunkTrust

It appears you have multiple stats for the same transaction in the event. Try using mvdedup:

| spath
| eval date=strftime(_time,"%m-%d %k:%M")
| table date *.pct2ResTime
| foreach *.pct2ResTime
    [| eval <<FIELD>> = mvdedup('<<FIELD>>')]
| untable date transaction pct2ResTime
| eval "Transaction Name"=mvindex(split(transaction,"."),0)
| xyseries "Transaction Name" date pct2ResTime

Thulasiraman
Explorer

Thank you! Works well.

Thulasiraman_0-1735337549905.png

I'm struggling to get the previous date's value to calculate the percentage deviation. Could you please help?
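One way to sketch that is to compute the deviation before pivoting with xyseries, using streamstats to fetch each transaction's previous value (this builds on ITWhisperer's query above; the field names prevResTime and pctDeviation are my own, not from the thread):

```
| spath
| eval date=strftime(_time,"%m-%d %k:%M")
| table date *.pct2ResTime
| foreach *.pct2ResTime
    [| eval <<FIELD>> = mvdedup('<<FIELD>>')]
| untable date transaction pct2ResTime
| eval "Transaction Name"=mvindex(split(transaction,"."),0)
| sort "Transaction Name" date
| streamstats current=f window=1 last(pct2ResTime) as prevResTime by "Transaction Name"
| eval pctDeviation=round((pct2ResTime-prevResTime)/prevResTime*100, 2)
```

With window=1 and current=f, last() takes only the immediately preceding row per transaction, so pctDeviation compares each date with the one before it.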
