Getting Data In

How to split JSON events into a usable format?

Barty
Explorer

Good morning all,

Complete novice with JSON workings, but essentially I have managed to configure a REST API that's ingesting the results of an alternative monitoring tool (sacrilege, I know), running version 7.3.0. The events are coming in in JSON format as follows:-

{
  channels: [
    {
      lastvalue: 0 #
      lastvalue_raw: 0
      name: $["Data"]["Attributes"]["ConnectionTime"]
      name_raw: $["Data"]["Attributes"]["ConnectionTime"]
    }
    {
      lastvalue: 15,204 #
      lastvalue_raw: 15204
      name: $["Data"]["Attributes"]["DownloadTime"]
      name_raw: $["Data"]["Attributes"]["DownloadTime"]
    }
    {
      lastvalue: 0 #
      lastvalue_raw: 0
      name: $["Data"]["Attributes"]["ErrorCode"]
      name_raw: $["Data"]["Attributes"]["ErrorCode"]
    }
    {
      lastvalue: 0 #
      lastvalue_raw: 0
      name: $["Data"]["Attributes"]["ResolveTime"]
      name_raw: $["Data"]["Attributes"]["ResolveTime"]
    }
    {
      lastvalue: 2,190 #
      lastvalue_raw: 2190
      name: $["Data"]["Attributes"]["ServerId"]
      name_raw: $["Data"]["Attributes"]["ServerId"]
    }
    ...
  ]
}

What I was hoping to do was produce a simple search that displays the name: $["Data"]["Attributes"] field with the corresponding lastvalue next to it. However, as this is coming in as an array, I cannot simply do that, and despite some search manipulation, I feel I'm moving further and further away from a simple solution. Would someone be able to assist me in separating out the name: $["Data"]["Attributes"] fields into separate usable fields, and taking the lastvalue with it, please?

The raw event looks like this:-

{"version":"19.3.51.2830","treesize":0,"channels":[{"name":"$["Data"]["Attributes"]["ConnectionTime"]","name_raw":"$[\"Data\"][\"Attributes\"][\"ConnectionTime\"]","lastvalue":"0 #","lastvalue_raw":0.0000},{"name":"$["Data"]["Attributes"]["DownloadTime"]","name_raw":"$[\"Data\"][\"Attributes\"][\"DownloadTime\"]","lastvalue":"15,204 #","lastvalue_raw":15204.0000},{"name":"$["Data"]["Attributes"]["ErrorCode"]","name_raw":"$[\"Data\"][\"Attributes\"][\"ErrorCode\"]","lastvalue":"0 #","lastvalue_raw":0.0000},{"name":"$["Data"]["Attributes"]["ResolveTime"]","name_raw":"$[\"Data\"][\"Attributes\"][\"ResolveTime\"]","lastvalue":"0 #","lastvalue_raw":0.0000},{"name":"$["Data"]["Attributes"]["ServerId"]","name_raw":"$[\"Data\"][\"Attributes\"][\"ServerId\"]","lastvalue":"2,190 #","lastvalue_raw":2190.0000},{"name":"$["Data"]["Attributes"]["StagingMode"]","name_raw":"$[\"Data\"][\"Attributes\"][\"StagingMode\"]","lastvalue":"0 #","lastvalue_raw":0.0000},{"name":"$["Data"]["Attributes"]["TotalTime"]","name_raw":"$[\"Data\"][\"Attributes\"][\"TotalTime\"]","lastvalue":"15,204 #","lastvalue_raw":15204.0000},{"name":"$["Data"]["Id"]","name_raw":"$[\"Data\"][\"Id\"]","lastvalue":"51,237,312,666 #","lastvalue_raw":51237312666.0000},{"name":"$["Relationships"]["0"]["Id"]","name_raw":"$[\"Relationships\"][\"0\"][\"Id\"]","lastvalue":"2,190 #","lastvalue_raw":2190.0000},{"name":"$["Relationships"]["1"]["Id"]","name_raw":"$[\"Relationships\"][\"1\"][\"Id\"]","lastvalue":"51,237,312,666 #","lastvalue_raw":51237312666.0000},{"name":"Download Time","lastvalue":"0 s","lastvalue_raw":0.0000},{"name":"Downtime","lastvalue":""},{"name":"Error Code","lastvalue":"0 #","lastvalue_raw":0.0000},{"name":"Response Time","lastvalue":"350 msec","lastvalue_raw":350.0000}]}


DalJeanis
Legend

1) you are looking for the spath command.

2) Your ingestion stanzas should be telling the system that the sourcetype is JSON, so it can do automatic extraction.

3) Splunk happily accepts data from any monitoring or analysis tool, IoT device, or other solution, and it's not sacrilege, it's canon. mmooooarrr daaataaa!!!! 😉
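A minimal sketch of the spath approach from point 1, using the field names visible in the raw event above (extracting the channels array as a multivalue field, expanding it so each channel becomes its own row, then pulling name and lastvalue out of each):

```
your_base_search
| spath path=channels{} output=channel
| mvexpand channel
| spath input=channel path=name output=channel_name
| spath input=channel path=lastvalue output=channel_lastvalue
| table channel_name channel_lastvalue
```

`your_base_search` is a placeholder for whatever index/sourcetype restriction selects these events. Note that mvexpand multiplies row counts (one row per channel per event), so on large volumes it may be worth filtering before the expansion.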
