Splunk Search

Extracting data from JSON key-value pairs (containing no array), using transpose and outer join

ruhibansal
Explorer

I have JSON in the following format.

 

{
  "timestamp": "1625577829075",
  "debug": "true",
  "A_real": {
    "Sig1": {
      "A01": "Pass",
      "A02": "FAIL",
      "A03": "FAIL",
      "A04": "FAIL",
      "A05": "Pass",
      "finalEntry": "true"
    },
    "Sig2": {
      "A01": "Pass",
      "A02": "FAIL",
      "A03": "FAIL",
      "A04": "Pass",
      "A05": "FAIL",
      "finalEntry": "true"
    },
    "finalEntry": "true"
  }
}

 

and one CSV file, as follows:

Id  Timestamp
A02  T1
A03  T2
A05  T3
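
For reference, this CSV is uploaded to Splunk as a lookup; assuming a placeholder lookup name of YOUR_LOOKUP, its contents can be previewed with:

| inputlookup YOUR_LOOKUP
| table Id Timestamp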

I want to create a saved search, using an outer join on Id and a transpose, which gives me the following result:

Id    Sig1    Sig2
A02   FAIL    FAIL
A03   FAIL    FAIL
A05   Pass    FAIL

 

Please suggest a query.


kamlesh_vaghela
SplunkTrust

@ruhibansal 

Can you please try this?

YOUR_SEARCH
| fields A_real.S*.A*
| rename A_real.* as *
| eval dummy=null()
| foreach S* [
    eval dummy=if(isnull(dummy), "<<FIELD>>".":".'<<FIELD>>', dummy."|"."<<FIELD>>".":".'<<FIELD>>')
  ]
| eval dummy=split(dummy,"|")
| stats count by dummy
| fields - count
| eval f1=mvindex(split(dummy,"."),0), I1=mvindex(split(dummy,"."),1), Id=mvindex(split(I1,":"),0), {f1}=mvindex(split(I1,":"),1)
| fields - dummy I1 f1
| stats values(*) as * by Id
| lookup YOUR_LOOKUP Id
| where isnotnull(Timestamp)
| fields - Timestamp
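
The key step is the {f1} dynamic field name in the eval: it creates a field named after the value of f1 (Sig1, Sig2, ...) and assigns it the Pass/FAIL value, which is what pivots the data. A minimal sketch of just that step, with hardcoded sample values:

| makeresults
| eval f1="Sig1", I1="A02:FAIL"
| eval Id=mvindex(split(I1,":"),0), {f1}=mvindex(split(I1,":"),1)
| table Id Sig1

The final lookup and where then keep only the Ids that exist in your CSV, which takes the place of the outer join. YOUR_LOOKUP is a placeholder for your lookup's actual name.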

 

Thanks
KV
▄︻̷̿┻̿═━一

If any of my replies helps you solve the problem or gain knowledge, an upvote would be appreciated.

 

kamlesh_vaghela
SplunkTrust

@ruhibansal 

Can you please try this?

YOUR_SEARCH
| fields A_real.S*.A*
| rename A_real.* as *
| eval dummy=null()
| foreach S* [
    eval dummy=if(isnull(dummy), "<<FIELD>>".":".'<<FIELD>>', dummy."|"."<<FIELD>>".":".'<<FIELD>>')
  ]
| eval dummy=split(dummy,"|")
| stats count by dummy
| fields - count
| eval f1=mvindex(split(dummy,"."),0), I1=mvindex(split(dummy,"."),1), Id=mvindex(split(I1,":"),0), {f1}=mvindex(split(I1,":"),1)
| fields - dummy I1 f1
| stats values(*) as * by Id

 

My Sample Search:

| makeresults
| eval _raw="{\"timestamp\": \"1625577829075\",\"debug\": \"true\",\"A_real\": {\"Sig1\": {\"A01\": \"Pass\",\"A02\": \"FAIL\",\"A03\": \"FAIL\",\"A04\": \"FAIL\",\"A05\": \"Pass\",\"finalEntry\": \"true\"},\"Sig2\": {\"A01\": \"Pass\",\"A02\": \"FAIL\",\"A03\": \"FAIL\",\"A04\": \"Pass\",\"A05\": \"FAIL\",\"finalEntry\": \"true\"},\"finalEntry\": \"true\"}}"
| extract
| rename comment as "Up to here is sample data only"
| fields A_real.S*.A*
| rename A_real.* as *
| eval dummy=null()
| foreach S* [
    eval dummy=if(isnull(dummy), "<<FIELD>>".":".'<<FIELD>>', dummy."|"."<<FIELD>>".":".'<<FIELD>>')
  ]
| eval dummy=split(dummy,"|")
| stats count by dummy
| fields - count
| eval f1=mvindex(split(dummy,"."),0), I1=mvindex(split(dummy,"."),1), Id=mvindex(split(I1,":"),0), {f1}=mvindex(split(I1,":"),1)
| fields - dummy I1 f1
| stats values(*) as * by Id
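
On the sample event above, this should produce output along these lines:

Id    Sig1    Sig2
A01   Pass    Pass
A02   FAIL    FAIL
A03   FAIL    FAIL
A04   FAIL    Pass
A05   Pass    FAIL

Appending the lookup and where lines from my other reply then filters this down to the Ids present in your CSV (A02, A03, A05).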

 

Thanks
KV
▄︻̷̿┻̿═━一

If any of my replies helps you solve the problem or gain knowledge, an upvote would be appreciated.

ruhibansal
Explorer

Hi @kamlesh_vaghela 

I have input JSON being uploaded to Splunk every 5 minutes. I am using 'head 1' to see the data for the latest one.

In the above-mentioned data/query, I want to search all the input JSON files for which the following fields have never been null:

fields A_real.S*.A*

Can you please help me in the query?


kamlesh_vaghela
SplunkTrust

@ruhibansal 

Can you please share some sample events and the expected output, so I can help you?
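
In the meantime, here is a rough sketch of one interpretation — keeping only events where every A_real.S*.A* field is present. This assumes the JSON fields are auto-extracted (otherwise add | spath first):

YOUR_SEARCH
| eval nullCount=0
| foreach A_real.S*.A* [
    eval nullCount=nullCount + if(isnull('<<FIELD>>'), 1, 0)
  ]
| where nullCount=0

If that is not what you mean by "never been null", the sample events will clarify.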

KV
