Getting Data In

Parse a log that contains unstructured text and inner JSON

gilsegev468
Engager

Hey,

I have a problem parsing my data:

19-04-2021 gil-server-1 {"systemId":"1254", "systemName":"coffe", "message":"hello dor"}

I want to extract the fields before Splunk indexes the data.

How should I configure props.conf or transforms.conf?


venkatasri
SplunkTrust

Hi @gilsegev468 

There are two ways of doing field extraction: search-time and index-time extraction (index-time means the fields are extracted while the data is parsed and then written to the indexer).
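If you truly need the fields extracted at index time (you mentioned wanting to extract before Splunk indexes the data), the config would look roughly like the sketch below, applied on the indexer or heavy forwarder. The sourcetype and stanza names are placeholders, and keep in mind that indexed fields grow your index size, which is one reason search-time extraction is usually preferred:

# props.conf (indexer or heavy forwarder) - "my_sourcetype" is a placeholder
[my_sourcetype]
TRANSFORMS-systemid = extract_systemid

# transforms.conf - pull systemId out of the inner JSON and write it as an indexed field
[extract_systemid]
REGEX = "systemId":"([^"]+)"
FORMAT = systemId::$1
WRITE_META = true

# fields.conf (search head) - tells Splunk this field is index-time
[systemId]
INDEXED = true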

In your case, a search-time extraction is fine, using a combination of an inline rex (the same regex can be configured in props.conf on the search head) and spath, since the inner JSON is what you want to extract fields from.


| makeresults 
| eval log_data="19-04-2021 gil-server-1 {\"systemId\":\"1254\", \"systemName\":\"coffe\", \"message\":\"hello dor\"}" 
| rex field=log_data "(?<inner_json>\{\".*)" 
| spath input=inner_json 
| table systemId systemName message

If you want to run the inline rex against your real events, use field=_raw (the default). The same regex can be configured against your source/sourcetype in props.conf for search-time extraction and deployed to the search head.
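As a rough sketch (the sourcetype name is a placeholder), the props.conf stanza on the search head could look like this; you would still use | spath input=inner_json in your searches to pull the individual JSON fields out:

# props.conf (search head) - "my_sourcetype" is a placeholder
[my_sourcetype]
EXTRACT-inner_json = (?<inner_json>\{".*)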

---------------------------------------------

An upvote would be appreciated if it helps!
