Splunk Search

Extracting data from JSON key-value (containing no array), using transpose and outer join

ruhibansal
Explorer

I have JSON in the following format.

 

{
  "timestamp": "1625577829075",
  "debug": "true",
  "A_real": {
    "Sig1": {
      "A01": "Pass",
      "A02": "FAIL",
      "A03": "FAIL",
      "A04": "FAIL",
      "A05": "Pass",
      "finalEntry": "true"
    },
    "Sig2": {
      "A01": "Pass",
      "A02": "FAIL",
      "A03": "FAIL",
      "A04": "Pass",
      "A05": "FAIL",
      "finalEntry": "true"
    },
    "finalEntry": "true"
  }
}

 

and one CSV file as follows:

Id  Timestamp
A02  T1
A03  T2
A05  T3

I want to create a saved search using an outer join on Id and a transpose that gives me a result like the following:

Id   Sig1   Sig2
A02  Fail   Fail
A03  Fail   Fail
A05  Pass   Fail

 

Please suggest a query.


kamlesh_vaghela
SplunkTrust

@ruhibansal 

Can you please try this?

YOUR_SEARCH
| fields A_real.S*.A*
| rename A_real.* as *
| eval dummy=null()
| foreach S*
    [ eval dummy=if(isnull(dummy), "<<FIELD>>".":".'<<FIELD>>', dummy."|"."<<FIELD>>".":".'<<FIELD>>') ]
| eval dummy=split(dummy,"|")
| stats count by dummy
| fields - count
| eval f1=mvindex(split(dummy,"."),0), I1=mvindex(split(dummy,"."),1), Id=mvindex(split(I1,":"),0), {f1}=mvindex(split(I1,":"),1)
| fields - dummy I1 f1
| stats values(*) as * by Id
| lookup YOUR_LOOKUP Id
| where isnotnull(Timestamp)
| fields - Timestamp
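
Here YOUR_LOOKUP is just a placeholder for a lookup built from your CSV (columns Id and Timestamp). If you have only uploaded the CSV as a lookup table file, a rough, untested variant of the last three lines using an inputlookup subsearch (your_ids.csv is an assumed filename, so replace it with yours) could be:

| join type=left Id
    [| inputlookup your_ids.csv
     | fields Id Timestamp ]
| where isnotnull(Timestamp)
| fields - Timestamp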

 

Thanks
KV
▄︻̷̿┻̿═━一

If any of my replies helps you solve the problem or gain knowledge, an upvote would be appreciated.

 

kamlesh_vaghela
SplunkTrust

@ruhibansal 

Can you please try this?

YOUR_SEARCH
| fields A_real.S*.A*
| rename A_real.* as *
| eval dummy=null()
| foreach S*
    [ eval dummy=if(isnull(dummy), "<<FIELD>>".":".'<<FIELD>>', dummy."|"."<<FIELD>>".":".'<<FIELD>>') ]
| eval dummy=split(dummy,"|")
| stats count by dummy
| fields - count
| eval f1=mvindex(split(dummy,"."),0), I1=mvindex(split(dummy,"."),1), Id=mvindex(split(I1,":"),0), {f1}=mvindex(split(I1,":"),1)
| fields - dummy I1 f1
| stats values(*) as * by Id

 

My Sample Search:

| makeresults | eval _raw="{\"timestamp\": \"1625577829075\",\"debug\": \"true\",\"A_real\": {\"Sig1\": {\"A01\": \"Pass\",\"A02\": \"FAIL\",\"A03\": \"FAIL\",\"A04\": \"FAIL\",\"A05\": \"Pass\",\"finalEntry\": \"true\"},\"Sig2\": {\"A01\": \"Pass\",\"A02\": \"FAIL\",\"A03\": \"FAIL\",\"A04\": \"Pass\",\"A05\": \"FAIL\",\"finalEntry\": \"true\"},\"finalEntry\": \"true\"}}" | extract 
| rename comment as "Upto Now is sample data only" 
| fields A_real.S*.A*
| rename A_real.* as *
| eval dummy=null()
| foreach S*
    [ eval dummy=if(isnull(dummy), "<<FIELD>>".":".'<<FIELD>>', dummy."|"."<<FIELD>>".":".'<<FIELD>>') ]
| eval dummy=split(dummy,"|")
| stats count by dummy
| fields - count
| eval f1=mvindex(split(dummy,"."),0), I1=mvindex(split(dummy,"."),1), Id=mvindex(split(I1,":"),0), {f1}=mvindex(split(I1,":"),1)
| fields - dummy I1 f1
| stats values(*) as * by Id
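
To also restrict the result to only the Ids in your CSV (as in your expected output), the lookup step from the reply above can be appended to this sample search; your_ids.csv below is an assumed lookup name, so replace it with your own lookup:

| lookup your_ids.csv Id OUTPUT Timestamp
| where isnotnull(Timestamp)
| fields - Timestamp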

 

Thanks
KV
▄︻̷̿┻̿═━一

If any of my replies helps you solve the problem or gain knowledge, an upvote would be appreciated.

ruhibansal
Explorer

Hi @kamlesh_vaghela 

I have input JSON being uploaded to Splunk every 5 minutes. I am using 'head 1' to see the data for the latest upload.

In the above-mentioned data/query, I want to search all the input JSON files for which the following fields have never been null:

fields A_real.S*.A*

Can you please help me with the query?


kamlesh_vaghela
SplunkTrust

@ruhibansal 

Can you please share some sample events and the expected output, so I can help you?
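
In the meantime, a rough sketch for the "never null" check, assuming each uploaded JSON is a single event and the fields are extracted as A_real.S*.A* (YOUR_SEARCH is a placeholder for your index/sourcetype search), might be something like:

YOUR_SEARCH
| fields _time A_real.S*.A*
| eval missing=0
| foreach A_real.S*.A*
    [ eval missing=missing+if(isnull('<<FIELD>>'),1,0) ]
| where missing=0

This keeps only the events in which every matching A_real.S*.A* field is present; note that foreach can only template over fields that actually appear in the result set.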

KV
