Getting Data In

Excluding field values from JSON SPATH Table

mishutts
Explorer

Hi,

Can someone help filter out a nested JSON value in a table?

I have a search with an spath command, and I can't figure out how to exclude stages{}.status=SUCCESS so that only failures are shown in the table. Adding stages{}.status!=SUCCESS doesn't work because the nested stages{}.status field is multivalue and a single event contains both SUCCESS and FAILURE values.

Here is the search I'm using:

index="jenkins_statistics" event_tag=job_event type="completed" stages{}.status=FAILURE | spath stages{} output=Stages | table job_name job_result stages{}.name stages{}.status stages{}.error stages{}.error_type

Thank you

Here is how the table looks:

SPATH-Table.jpg

Here is the raw event:

{"job_type":"Pipeline","metadata":{"BITBUCKET_PR_ID":"","BRANCH_TO_BUILD":"","BRANCH_SOURCE":"","scm":"git"},"upstream":"","job_duration":261.361,"label":"nojobs","type":"completed","queue_time":9.549,"event_tag":"job_event","node":"(master)","job_name":"SomeSite/job/SomeJob_Build","test_summary":{"duration":0.0,"skips":0,"total":0,"failures":0,"passes":0},"stages":[{"duration":59.661,"pause_duration":0.0,"start_time":1603984902,"children":[{"duration":0.002,"pause_duration":0.0,"exec_node":"SomeJenNode","start_time":1603984902,"name":"Print Message","id":"13","status":"SUCCESS"},{"duration":0.004,"pause_duration":0.0,"exec_node":"SomeJenNode","start_time":1603984902,"name":"Print Message","id":"14","status":"SUCCESS"},{"duration":0.019,"pause_duration":0.0,"exec_node":"SomeJenNode","start_time":1603984902,"name":"Notify a build status to BitBucket.","id":"15","status":"SUCCESS"},{"duration":0.006,"pause_duration":0.0,"exec_node":"SomeJenNode","start_time":1603984902,"name":"Print Message","id":"16","status":"SUCCESS"},{"duration":55.35,"pause_duration":0.0,"exec_node":"SomeJenNode","start_time":1603984902,"name":"Check out from version control","id":"17","status":"SUCCESS"},{"duration":0.015,"pause_duration":0.0,"exec_node":"SomeJenNode","start_time":1603984957,"name":"Print Message","id":"18","status":"SUCCESS"},{"duration":0.484,"pause_duration":0.0,"exec_node":"SomeJenNode","start_time":1603984957,"name":"Shell Script","id":"19","status":"SUCCESS"},{"duration":0.004,"pause_duration":0.0,"exec_node":"SomeJenNode","start_time":1603984958,"name":"Print Message","id":"20","status":"SUCCESS"},{"duration":0.652,"pause_duration":0.0,"exec_node":"SomeJenNode","start_time":1603984958,"name":"Shell Script","id":"21","status":"SUCCESS"},{"duration":0.014,"pause_duration":0.0,"exec_node":"SomeJenNode","start_time":1603984958,"name":"Print Message","id":"22","status":"SUCCESS"},{"duration":0.607,"pause_duration":0.0,"exec_node":"SomeJenNode","start_time":1603984958,"name":"Shell Script","id":"23","status":"SUCCESS"},{"duration":0.014,"pause_duration":0.0,"exec_node":"SomeJenNode","start_time":1603984959,"name":"Print Message","id":"24","status":"SUCCESS"},{"duration":0.013,"pause_duration":0.0,"exec_node":"SomeJenNode","start_time":1603984959,"name":"Print Message","id":"25","status":"SUCCESS"},{"duration":0.553,"pause_duration":0.0,"exec_node":"SomeJenNode","start_time":1603984959,"name":"Shell Script","arguments":"touch code_version.properties","id":"26","status":"SUCCESS"},{"duration":0.564,"pause_duration":0.0,"exec_node":"SomeJenNode","start_time":1603984959,"name":"Shell Script","arguments":"\n cat <<EOF > code_version.properties \n GIT_REVISION=$GIT_REVISION\n BITBUCKET_PULL_REQUEST_ID=$BITBUCKET_PULL_REQUEST_ID\n CODE_VERSION=$CODE_VERSION\n GIT_BRANCH_LOCAL=$GIT_BRANCH_LOCAL\n GIT_BRANCH=$GIT_BRANCH\n BB_REPO=$BB_REPO\n BB_CREDS=$BB_CREDS\n OUTLOOK_WEBHOOK=$OUTLOOK_URL\n EOF","id":"27","status":"SUCCESS"},{"duration":0.62,"pause_duration":0.0,"exec_node":"SomeJenNode","start_time":1603984960,"name":"Shell Script","arguments":"cat code_version.properties","id":"28","status":"SUCCESS"},{"duration":0.003,"pause_duration":0.0,"exec_node":"SomeJenNode","start_time":1603984961,"name":"Print Message","id":"29","status":"SUCCESS"},{"duration":0.032,"pause_duration":0.0,"exec_node":"SomeJenNode","start_time":1603984961,"name":"General Build 
Step","arguments":"MASKED_VALUE","id":"30","status":"SUCCESS"},{"duration":0.443,"pause_duration":0.0,"exec_node":"SomeJenNode","start_time":1603984961,"name":"Notify a build status to BitBucket.","id":"31","status":"SUCCESS"}],"name":"CheckoutLogic","id":"10","status":"SUCCESS"},{"duration":112.969,"pause_duration":0.0,"start_time":1603984961,"children":[{"duration":112.809,"pause_duration":0.0,"exec_node":"SomeJenNode","start_time":1603984961,"name":"Shell Script","arguments":"npm i","id":"38","status":"SUCCESS"}],"name":"Install Dependancies","id":"37","status":"SUCCESS"},{"duration":0.021,"pause_duration":0.0,"start_time":1603985074,"name":"Test","id":"42","status":"SUCCESS"},{"duration":0.016,"pause_duration":0.0,"start_time":1603985074,"name":"Branch: Lint, Unit Test","id":"45","status":"SUCCESS"},{"duration":47.47,"pause_duration":0.0,"start_time":1603985074,"children":[{"duration":0.003,"pause_duration":0.0,"exec_node":"SomeJenNode","start_time":1603985074,"name":"Print Message","id":"59","status":"SUCCESS"},{"duration":0.004,"pause_duration":0.0,"exec_node":"SomeJenNode","start_time":1603985074,"name":"Print Message","id":"60","status":"SUCCESS"},{"duration":25.865,"pause_duration":0.0,"exec_node":"SomeJenNode","start_time":1603985074,"name":"Shell Script","arguments":"npm run lint:junit","id":"61","status":"SUCCESS"},{"duration":19.642,"pause_duration":0.0,"exec_node":"SomeJenNode","start_time":1603985100,"error_type":"java.io.IOException","name":"General Build Step","id":"67","error":"MicrosoftAzureStorage - Error occurred while uploading to Azure - SomeDevSite","status":"FAILURE"},{"duration":0.879,"pause_duration":0.0,"exec_node":"SomeJenNode","start_time":1603985120,"name":"Shell Script","id":"70","status":"SUCCESS"},{"duration":0.541,"pause_duration":0.0,"exec_node":"SomeJenNode","start_time":1603985121,"name":"Notify a build status to BitBucket.","id":"73","status":"SUCCESS"}],"error_type":"java.io.IOException","name":"Lint, Unit Test","id":"48","error":"MicrosoftAzureStorage - Error occurred while uploading to Azure - SomeDevSite","status":"FAILURE"},{"duration":0.019,"pause_duration":0.0,"start_time":1603985074,"name":"Branch: Sonar Analysis","id":"46","status":"SUCCESS"},{"duration":0.054,"pause_duration":0.0,"start_time":1603985074,"name":"Sonar Analysis","id":"50","status":"SUCCESS"},{"duration":23.417,"pause_duration":0.0,"start_time":1603985074,"children":[{"duration":22.203,"pause_duration":0.0,"exec_node":"SomeJenNode","start_time":1603985074,"name":"Shell Script","arguments":"curl -k https://SomeJenNode.com:1234/job/Dev_Site/job/scan/lastBuild/consoleFull | sed \"s#<span class=\"timestamp\"><b>##g;s#</b> </span># #g\"","id":"58","status":"SUCCESS"}],"name":"Building Dev_Site » Sonar_scan","id":"57","status":"SUCCESS"},{"duration":0.137,"pause_duration":0.0,"start_time":1603985122,"error_type":"java.io.IOException","name":"Build Code","id":"85","error":"MicrosoftAzureStorage - Error occurred while uploading to Azure - SomeDevSite","status":"FAILURE"},{"duration":0.15,"pause_duration":0.0,"start_time":1603985122,"error_type":"java.io.IOException","name":"Package","id":"89","error":"MicrosoftAzureStorage - Error occurred while uploading to Azure - SomeDevSite","status":"FAILURE"},{"duration":0.156,"pause_duration":0.0,"start_time":1603985122,"error_type":"java.io.IOException","name":"Zip Test Data","id":"93","error":"MicrosoftAzureStorage - Error occurred while uploading to Azure - 
SomeDevSite","status":"FAILURE"},{"duration":0.218,"pause_duration":0.0,"start_time":1603985122,"error_type":"java.io.IOException","name":"Prepare Artifacts","id":"97","error":"MicrosoftAzureStorage - Error occurred while uploading to Azure - SomeDevSite","status":"FAILURE"},{"duration":38.86,"pause_duration":0.0,"start_time":1603985123,"children":[{"duration":0.339,"pause_duration":0.0,"exec_node":"SomeJenNode","start_time":1603985123,"name":"Notify a build status to BitBucket.","id":"102","status":"SUCCESS"},{"duration":38.473,"pause_duration":0.0,"exec_node":"SomeJenNode","start_time":1603985123,"name":"Delete workspace when build is done","id":"103","status":"SUCCESS"}],"name":"Declarative: Post Actions","id":"101","status":"SUCCESS"}],"build_number":1234,"job_result":"FAILURE","trigger_by":"Started by user John Doe: Bitbucket PPR: pull request updated","scm":"git","user":"(scm)","build_url":"job/SomeSite/job/SomeJob_Build/1234/","queue_id":258942,"job_started_at":"2020-10-29T15:21:40Z"}

Here is a highlighted example:

10/29/20
11:26:02.217 AM
{ [-]
build_number: 1234
build_url: job/SomeSite/job/SomeJob_Build/1234/
event_tag: job_event
job_duration: 261.361
job_name: SomeSite/SomeJob_Build
job_result: FAILURE
job_started_at: 2020-10-29T15:21:40Z
job_type: Pipeline
label: nojobs
metadata: { [-]
BITBUCKET_PR_ID:
BRANCH_SOURCE:
BRANCH_TO_BUILD:
scm: git
}
node: (master)
queue_id: 123456
queue_time: 9.549
scm: git
stages: [ [-]
{ [-]
children: [ [+]
]
duration: 59.661
id: 10
name: CheckoutLogic
pause_duration: 0
start_time: 1603984902
status: SUCCESS
}
{ [-]
children: [ [+]
]
duration: 112.969
id: 37
name: Install Dependancies
pause_duration: 0
start_time: 1603984961
status: SUCCESS
}
{ [-]
duration: 0.021
id: 42
name: Test
pause_duration: 0
start_time: 1603985074
status: SUCCESS
}
{ [-]
duration: 0.016
id: 45
name: Branch: Lint, Unit Test
pause_duration: 0
start_time: 1603985074
status: SUCCESS
}
{ [-]
children: [ [+]
]
duration: 47.47
error: MicrosoftAzureStorage - Error occurred while uploading to Azure - SomeDevSite
error_type: java.io.IOException
id: 48
name: Lint, Unit Test
pause_duration: 0
start_time: 1603985074
status: FAILURE
}
{ [-]
duration: 0.019
id: 46
name: Branch: Some Analysis
pause_duration: 0
start_time: 1603985074
status: SUCCESS
}
{ [-]
duration: 0.054
id: 50
name: Some Analysis
pause_duration: 0
start_time: 1603985074
status: SUCCESS
}
{ [-]
children: [ [+]
]
duration: 23.417
id: 57
name: Building Dev_Site » Some_scan
pause_duration: 0
start_time: 1603985074
status: SUCCESS
}
{ [-]
duration: 0.137
error: MicrosoftAzureStorage - Error occurred while uploading to Azure - SomeDevSite
error_type: java.io.IOException
id: 85
name: Build Code
pause_duration: 0
start_time: 1603985122
status: FAILURE
}
{ [-]
duration: 0.15
error: MicrosoftAzureStorage - Error occurred while uploading to Azure - SomeDevSite
error_type: java.io.IOException
id: 89
name: Package
pause_duration: 0
start_time: 1603985122
status: FAILURE
}
{ [-]
duration: 0.156
error: MicrosoftAzureStorage - Error occurred while uploading to Azure - SomeDevSite
error_type: java.io.IOException
id: 93
name: Zip Test Data
pause_duration: 0
start_time: 1603985122
status: FAILURE
}
{ [-]
duration: 0.218
error: MicrosoftAzureStorage - Error occurred while uploading to Azure - SomeDevSite
error_type: java.io.IOException
id: 97
name: Prepare Artifacts
pause_duration: 0
start_time: 1603985122
status: FAILURE
}
{ [-]
children: [ [+]
]
duration: 38.86
id: 101
name: Declarative: Post Actions
pause_duration: 0
start_time: 1603985123
status: SUCCESS
}
]
test_summary: { [+]
}
trigger_by: Started by user John Doe: Bitbucket PPR: pull request updated
type: completed
upstream:
user: (scm)
}

10/29/20
11:26:02.217 AM
{ [-]
build_number: 2999
build_url: job/Dev_Site/job/SomeSite_Build/2999/
event_tag: job_event
job_duration: 261.361
job_name: Dev_Site/SomeSite_Build
job_result: FAILURE
job_started_at: 2020-10-29T15:21:40Z
job_type: Pipeline
label: nojobs
metadata: { [-]
BITBUCKET_PR_ID:
BRANCH_SOURCE:
BRANCH_TO_BUILD:
scm: git
}
node: (master)
queue_id: 123456
queue_time: 9.549
scm: git
stages: [ [-]
{ [-]
children: [ [+]
]
duration: 59.661
id: 10
name: CheckoutLogic
pause_duration: 0
start_time: 1603984902
status: SUCCESS
}
{ [-]
children: [ [+]
]
duration: 112.969
id: 37
name: Install Dependancies
pause_duration: 0
start_time: 1603984961
status: SUCCESS
}
{ [-]
duration: 0.021
id: 42
name: Test
pause_duration: 0
start_time: 1603985074
status: SUCCESS
}
{ [-]
duration: 0.016
id: 45
name: Branch: Lint, Unit Test
pause_duration: 0
start_time: 1603985074
status: SUCCESS
}
{ [-]
children: [ [+]
]
duration: 47.47
error: MicrosoftAzureStorage - Error occurred while uploading to Azure - SomeDevSite
error_type: java.io.IOException
id: 48
name: Lint, Unit Test
pause_duration: 0
start_time: 1603985074
status: FAILURE
}
{ [-]
duration: 0.019
id: 46
name: Branch: Sonar Analysis
pause_duration: 0
start_time: 1603985074
status: SUCCESS
}
{ [-]
duration: 0.054
id: 50
name: Sonar Analysis
pause_duration: 0
start_time: 1603985074
status: SUCCESS
}
{ [-]
children: [ [+]
]
duration: 23.417
id: 57
name: Building Dev_Site » Some_scan
pause_duration: 0
start_time: 1603985074
status: SUCCESS
}
{ [-]
duration: 0.137
error: MicrosoftAzureStorage - Error occurred while uploading to Azure - SomeDevSite
error_type: java.io.IOException
id: 85
name: Build Code
pause_duration: 0
start_time: 1603985122
status: FAILURE
}
{ [-]
duration: 0.15
error: MicrosoftAzureStorage - Error occurred while uploading to Azure - SomeDevSite
error_type: java.io.IOException
id: 89
name: Package
pause_duration: 0
start_time: 1603985122
status: FAILURE
}
{ [-]
duration: 0.156
error: MicrosoftAzureStorage - Error occurred while uploading to Azure - SomeDevSite
error_type: java.io.IOException
id: 93
name: Zip Test Data
pause_duration: 0
start_time: 1603985122
status: FAILURE
}
{ [-]
duration: 0.218
error: MicrosoftAzureStorage - Error occurred while uploading to Azure - SomeDevSite
error_type: java.io.IOException
id: 97
name: Prepare Artifacts
pause_duration: 0
start_time: 1603985122
status: FAILURE
}
{ [-]
children: [ [+]
]
duration: 38.86
id: 101
name: Declarative: Post Actions
pause_duration: 0
start_time: 1603985123
status: SUCCESS
}
]
test_summary: { [+]
}
trigger_by: Started by user John Doe: Bitbucket PPR: pull request updated
type: completed
upstream:
user: (scm)
}

 

1 Solution

ITWhisperer
SplunkTrust
| eval name_status=mvzip('stages{}.name','stages{}.status')
| eval name_status=mvfilter(like(name_status,"%FAILURE"))
| eval stages_name = mvmap(name_status,mvindex(split(name_status,","),0))
| eval stages_status = mvmap(name_status,mvindex(split(name_status,","),1))
| table job_name job_result stages_name stages_status stages{}.error stages{}.error_type
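
Spelled out with the base search from the question, the whole thing would look roughly like this (assuming the stages{}.* fields are already extracted at search time): mvzip pairs each stage name with its status, mvfilter keeps only the pairs ending in FAILURE, and mvmap splits the surviving pairs back into separate name and status columns.

index="jenkins_statistics" event_tag=job_event type="completed" stages{}.status=FAILURE
| eval name_status=mvzip('stages{}.name','stages{}.status') ``` one "name,status" pair per stage ```
| eval name_status=mvfilter(like(name_status,"%FAILURE")) ``` keep only the failed stages ```
| eval stages_name = mvmap(name_status,mvindex(split(name_status,","),0))
| eval stages_status = mvmap(name_status,mvindex(split(name_status,","),1))
| table job_name job_result stages_name stages_status stages{}.error stages{}.error_type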


mishutts
Explorer

Thanks for the help. I can't test until we upgrade to version 8 this month, but I believe the answer is correct and will mark it as solved.


mishutts
Explorer

Thank you for the suggestion. When I add this to the end of my search, I get "Error in 'eval' command: The 'mvmap' function is unsupported or undefined." Am I correct that the query should be run without the spath command? I tried it both ways but got the same result.

"index="jenkins_statistics" event_tag=job_event type="completed" stages{}.status=FAILURE

| eval name_status=mvzip('stages{}.name','stages{}.status')
| eval name_status=mvfilter(like(name_status,"%FAILURE"))
| eval stages_name = mvmap(name_status,mvindex(split(name_status,","),0))
| eval stages_status = mvmap(name_status,mvindex(split(name_status,","),1))
| table job_name job_result stages_name stages_status stages{}.error stages{}.error_type


ITWhisperer
SplunkTrust

You need at least Splunk 8.0 for the mvmap function. The spath command is required to extract the JSON fields.
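
For anyone on a pre-8.0 instance in the meantime, a similar result can usually be had by expanding the zipped pairs into separate rows with mvexpand instead of mapping over them with mvmap; a rough, untested sketch along the same lines (the multivalue stages{}.error fields are left out because they would no longer line up after the expand):

| eval name_status=mvzip('stages{}.name','stages{}.status') ``` one "name,status" pair per stage ```
| mvexpand name_status ``` one result row per stage ```
| where like(name_status,"%FAILURE")
| eval stages_name=mvindex(split(name_status,","),0)
| eval stages_status=mvindex(split(name_status,","),1)
| table job_name job_result stages_name stages_status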



mishutts
Explorer

Now that I have our Splunk instance upgraded to 8, I have verified this solution is correct. Exactly what I needed. Thank you ITWhisperer!
