Splunk Search

Table view using SPL

yuvaraj_m91
Loves-to-Learn Lots
{
  "abcdxyz" : {
    "transaction" : "abcdxyz",
    "sampleCount" : 60,
    "errorCount" : 13,
    "errorPct" : 21.666666,
    "meanResTime" : 418.71666666666664,
    "medianResTime" : 264.5,
    "minResTime" : 0.0,
    "maxResTime" : 4418.0,
    "pct1ResTime" : 368.4,
    "pct2ResTime" : 3728.049999999985,
    "pct3ResTime" : 4418.0,
    "throughput" : 0.25086548592644625,
    "receivedKBytesPerSec" : 0.16945669591340123,
    "sentKBytesPerSec" : 0.3197146692547623
  },
  "efghxyz" : {
    "transaction" : "efghxyz",
    "sampleCount" : 60,
    "errorCount" : 13,
    "errorPct" : 21.666666,
    "meanResTime" : 421.8,
    "medianResTime" : 32.0,
    "minResTime" : 0.0,
    "maxResTime" : 3566.0,
    "pct1ResTime" : 3258.5,
    "pct2ResTime" : 3497.6,
    "pct3ResTime" : 3566.0,
    "throughput" : 0.24752066797577596,
    "receivedKBytesPerSec" : 0.34477244084256037,
    "sentKBytesPerSec" : 0.08463804872238082
  },
  "ijklxyz" : {
    "transaction" : "ijklxyz",
    "sampleCount" : 60,
    "errorCount" : 13,
    "errorPct" : 21.666666,
    "meanResTime" : 27.733333333333338,
    "medianResTime" : 27.5,
    "minResTime" : 0.0,
    "maxResTime" : 241.0,
    "pct1ResTime" : 41.599999999999994,
    "pct2ResTime" : 52.699999999999974,
    "pct3ResTime" : 241.0,
    "throughput" : 0.25115636576738737,
    "receivedKBytesPerSec" : 0.3331214746541367,
    "sentKBytesPerSec" : 0.08588125143891667
  },
  "mnopxyz" : {
    "transaction" : "mnopxyz",
    "sampleCount" : 60,
    "errorCount" : 13,
    "errorPct" : 21.666666,
    "meanResTime" : 491.74999999999994,
    "medianResTime" : 279.5,
    "minResTime" : 0.0,
    "maxResTime" : 4270.0,
    "pct1ResTime" : 381.29999999999995,
    "pct2ResTime" : 4076.55,
    "pct3ResTime" : 4270.0,
    "throughput" : 0.2440254437195985,
    "receivedKBytesPerSec" : 0.16483632755942018,
    "sentKBytesPerSec" : 0.2839297997262848
  }
}

I need to create a table view from the above log, which was captured as a single event, in the format below:

samples                 abcdxyz    efghxyz    ijklxyz    mnopxyz
transaction
sampleCount
errorCount
errorPct
meanResTime
medianResTime
minResTime
maxResTime
pct1ResTime
pct2ResTime
pct3ResTime
throughput
receivedKBytesPerSec
sentKBytesPerSec




livehybrid
SplunkTrust

Hi @yuvaraj_m91 

Try the following:

| table _raw
| eval parent_keys=json_array_to_mv(json_keys(_raw))
| mvexpand parent_keys
| eval _raw=json_extract(_raw, parent_keys)
| spath
| fields - _*
| transpose 0 header_field=parent_keys
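Outside Splunk, the effect of this pipeline (split the event by its top-level keys, extract each transaction's fields, then transpose so transactions become columns) can be sketched in Python. The event below is abbreviated from the sample; it is an illustration of the logic, not Splunk itself:

```python
import json

# Abbreviated stand-in for the single JSON event (two transactions, three metrics)
event = json.dumps({
    "abcdxyz": {"transaction": "abcdxyz", "sampleCount": 60, "errorCount": 13},
    "efghxyz": {"transaction": "efghxyz", "sampleCount": 60, "errorCount": 13},
})

data = json.loads(event)                # parent_keys: the top-level transaction names
columns = list(data)                    # one column per transaction
rows = list(next(iter(data.values())))  # metric names become the row labels

# transpose: each metric becomes a row, each transaction a column
table = [[field] + [data[c][field] for c in columns] for field in rows]
for line in [["samples"] + columns] + table:
    print(*line, sep="\t")
```

The `transpose 0 header_field=parent_keys` step corresponds to the final list comprehension: the expanded per-key rows are flipped so the key names head the columns.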


ITWhisperer
SplunkTrust

Assuming that this is an accurate representation of your data, you could try something like this:

| eval array=json_array_to_mv(json_keys(_raw),false())
| foreach array mode=multivalue 
    [| eval set=mvappend(set,json_extract(_raw,<<ITEM>>))]
| eval row=mvrange(0,mvcount(array))
| mvexpand row
| eval key=mvindex(array,row)
| eval fields=mvindex(set,row)
| table key fields
| fromjson fields
| fields - fields
| transpose 0 header_field=key column_name=samples
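The parallel-multivalue idea here (one list of keys, one list of extracted JSON strings, walked by a shared row index the way mvrange/mvindex do) can be sketched in Python. The event is abbreviated and the variable names are illustrative:

```python
import json

event = json.dumps({
    "abcdxyz": {"sampleCount": 60, "errorCount": 13},
    "efghxyz": {"sampleCount": 60, "errorCount": 13},
})

data = json.loads(event)
keys = list(data)                            # array=json_array_to_mv(json_keys(_raw))
vals = [json.dumps(data[k]) for k in keys]   # set: one JSON string per key, as mvappend builds it

# mvexpand row / mvindex: one output row per index, pairing each key with its JSON blob
parsed = {}
for row in range(len(keys)):                 # row=mvrange(0, mvcount(array))
    key = keys[row]                          # key=mvindex(array, row)
    fields = json.loads(vals[row])           # fromjson fields
    parsed[key] = fields
```

The final `transpose 0 header_field=key` then flips these per-key rows into per-key columns, exactly as in the first answer.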