Splunk Search

About Parsing JSON Log in Splunk

huylbq
Loves-to-Learn Lots

<6>2023-08-17T04:51:52Z 49786672a6c4 PICUS[1]: {"common":{"unique_id":"6963f063-a68d-482c-a22a-9e96ada33126","time":"2023-08-17T04:51:51.668553048Z","type":"","action":"","user_id":0,"user_email":"","user_first_name":"","user_last_name":"","account_id":7161,"ip":"","done_with_api":false,"platform_licences":null},"data":{"ActionID":26412,"ActionName":"Zebrocy Malware Downloader used by APT28 Threat Group .EXE File Download Variant-3","AgentName":"VICTIM-99","AssessmentName":"LAB02","CVE":"_","DestinationPort":"443","File":"682822.exe","Hash":"eb81c1be62f23ac7700c70d866e84f5bc354f88e6f7d84fd65374f84e252e76b","Result":{"alert_result":"","has_detection_result":false,"logging_result":"","prevention_result":"blocked"},"RunID":109802,"SimulationID":36236,"SourcePort":"51967","Time":5}}

I have a raw log like that, can you help me parse it into separate lines?


yuanliu
SplunkTrust

Like @gcusello said, you don't need to parse raw logs into separate lines. You just need to extract the part that is valid JSON, then use spath to extract the JSON nodes into Splunk fields.

 

| eval json = replace(_raw, "^[^\{]+", "")
| spath input=json

 

Your sample event gives

common.account_id = 7161
common.action =
common.done_with_api = false
...
data.Time = 5
json = {"common":{"unique_id":"6963f063-a68d-482c-a22a-9e96ada33126","time":"2023-08-17T04:51:51.668553048Z","type":"","action":"","user_id":0,"user_email":"","user_first_name":"","user_last_name":"","account_id":7161,"ip":"","done_with_api":false,"platform_licences":null},"data":{"ActionID":26412,"ActionName":"Zebrocy Malware Downloader used by APT28 Threat Group .EXE File Download Variant-3","AgentName":"VICTIM-99","AssessmentName":"LAB02","CVE":"_","DestinationPort":"443","File":"682822.exe","Hash":"eb81c1be62f23ac7700c70d866e84f5bc354f88e6f7d84fd65374f84e252e76b","Result":{"alert_result":"","has_detection_result":false,"logging_result":"","prevention_result":"blocked"},"RunID":109802,"SimulationID":36236,"SourcePort":"51967","Time":5}}

Here is an emulation you can play with and compare with real data

 

| makeresults
| eval _raw = "<6>2023-08-17T04:51:52Z 49786672a6c4 PICUS[1]: {\"common\":{\"unique_id\":\"6963f063-a68d-482c-a22a-9e96ada33126\",\"time\":\"2023-08-17T04:51:51.668553048Z\",\"type\":\"\",\"action\":\"\",\"user_id\":0,\"user_email\":\"\",\"user_first_name\":\"\",\"user_last_name\":\"\",\"account_id\":7161,\"ip\":\"\",\"done_with_api\":false,\"platform_licences\":null},\"data\":{\"ActionID\":26412,\"ActionName\":\"Zebrocy Malware Downloader used by APT28 Threat Group .EXE File Download Variant-3\",\"AgentName\":\"VICTIM-99\",\"AssessmentName\":\"LAB02\",\"CVE\":\"_\",\"DestinationPort\":\"443\",\"File\":\"682822.exe\",\"Hash\":\"eb81c1be62f23ac7700c70d866e84f5bc354f88e6f7d84fd65374f84e252e76b\",\"Result\":{\"alert_result\":\"\",\"has_detection_result\":false,\"logging_result\":\"\",\"prevention_result\":\"blocked\"},\"RunID\":109802,\"SimulationID\":36236,\"SourcePort\":\"51967\",\"Time\":5}}"
| eval json = replace(_raw, "^[^\{]+", "")
``` data emulation above ```
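If you want to check the whole chain in the emulation, append the same spath from the search above. This is just a quick sketch; the field names in the table command are taken directly from your sample event.

| spath input=json
| table common.account_id data.ActionName data.Result.prevention_result data.Time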

 

Hope this helps


gcusello
SplunkTrust

Hi @huylbq,

usually JSON logs aren't divided into separate lines because there's a header that's common to multiple definitions.

You can extract all the fields using the "INDEXED_EXTRACTIONS = json" option in the sourcetype definition or the "spath" command (https://community.splunk.com/t5/Splunk-Enterprise/spath-command/m-p/518343).
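For reference, a minimal props.conf sketch of that option could look like the one below. The sourcetype name is only a placeholder, and INDEXED_EXTRACTIONS = json assumes the whole event is JSON, so with the syslog prefix in this sample the search-time replace + spath approach shown above is probably the simpler route.

[picus:events]
# hypothetical sourcetype name; adjust to your own
INDEXED_EXTRACTIONS = json
# only works when the entire event is JSON; with a syslog prefix,
# prefer the search-time replace + spath shown in the other answer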

Ciao.

Giuseppe


huylbq
Loves-to-Learn Lots

Any suggestion about the line breaker in props.conf or transforms.conf?


yuanliu
SplunkTrust

What is the problem with the default line breaker? Unless you can describe a specific problem, the Configure event line breaking documentation is the best suggestion others can give.
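If a concrete problem does show up, a minimal props.conf sketch for single-line events like your sample might look like this. The stanza name is a placeholder, the LINE_BREAKER shown is effectively the default, and the timestamp settings are only inferred from the <6>2023-08-17T04:51:52Z prefix in your event.

[picus:events]
# placeholder sourcetype name
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)
# timestamp follows the <6> syslog priority and is in UTC in the sample
TIME_PREFIX = ^<\d+>
TIME_FORMAT = %Y-%m-%dT%H:%M:%SZ
TZ = UTC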
