Getting Data In

Separate JSON payload into separate events

Esky73
Builder

My search:

SEARCH...
| rex field=Message "^(?<Short>.*),\sRequestBody:\s(?<ShortMessage>[^\s]+)\".*$" 
| spath input=ShortMessage
| rename sapResponseData{}.contractAccounts{}.nameId AS NameId sapResponseData{}.contractAccounts{}.contractAccount AS ContractAccountNumber sapResponseData{}.contractAccounts{}.sapResponses{}.contractNumber AS ContractNumber sapResponseData{}.contractAccounts{}.sapResponses{}.messageId AS messageId sapResponseData{}.contractAccounts{}.sapResponses{}.messageNumber AS messageNumber sapResponseData{}.contractAccounts{}.sapResponses{}.dataset AS dataSet metadata.correlationId AS CorrelationId metadata.sendDate AS SendDate metadata.sendTime AS SendTime 
| lookup dataset_lookup.csv dataset AS dataSet OUTPUTNEW usage AS Usage
| table _time NameId ContractAccountNumber ContractNumber messageId messageNumber dataSet Usage CorrelationId SendDate SendTime

Contents of the new ShortMessage field after the split:

{
  "sapResponseData": [
    {
      "responseOriginatedIn": "ERROR",
      "contractAccounts": [
        {
          "nameId": "ABC_999999999999999",
          "contractAccount": "888888888888",
          "sapResponses": [
            {
              "contractNumber": 999999999,
              "responseCode": "002",
              "messageId": "DEF456",
              "messageNumber": "031",
              "dataset": "002"
            },
            {
              "contractNumber": 999999999,
              "responseCode": "002",
              "messageId": "ABC123",
              "messageNumber": "094",
              "dataset": "001"
            }
          ]
        }
      ]
    }
  ],
  "metadata": {
    "correlationId": "Correlation_ID",
    "priority": "LOW",
    "sendDate": "2019-03-21",
    "sendTime": 224813
  }
}

Currently all sapResponses appear in the same multivalue field in the table.

How can I create a new event per sapResponse while keeping the metadata on every event?
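For illustration (not part of the original search): spath flattens each nested array into {}-style multivalue fields, which is why both sapResponses collapse into single columns. For the sample payload above the extracted fields look roughly like this:

sapResponseData{}.contractAccounts{}.sapResponses{}.contractNumber = 999999999, 999999999
sapResponseData{}.contractAccounts{}.sapResponses{}.messageId = DEF456, ABC123
sapResponseData{}.contractAccounts{}.sapResponses{}.messageNumber = 031, 094
sapResponseData{}.contractAccounts{}.sapResponses{}.dataset = 002, 001
metadata.correlationId = Correlation_ID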

1 Solution

Esky73
Builder

I think I figured it out with help from here:

https://answers.splunk.com/answers/25653/mvexpand-multiple-multi-value-fields.html

| rex field=Message "^(?<Short>.*),\sRequestBody:\s(?<ShortMessage>[^\s]+)\".*$" 
| spath input=ShortMessage 
| rename sapResponseData{}.contractAccounts{}.* AS * metadata.* AS * sapResponseData{}.contractAccounts{}.sapResponses{}.* AS * 
| eval reading=mvzip(contractNumber, messageId) 
| eval reading=mvzip(reading, dataset) 
| eval reading=mvzip(reading, messageNumber) 
| mvexpand reading 
| makemv reading delim="," 
| eval contractNumber=mvindex(reading, 0) 
| eval messageId=mvindex(reading, 1) 
| eval messageNumber=mvindex(reading, -1) 
| eval dataset=mvindex(reading, 2) 
| lookup dataset_lookup.csv dataset OUTPUTNEW usage AS Usage 
| table _time nameId contractAccount contractNumber messageId messageNumber dataset Usage correlationId sendDate sendTime
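The key idea is that mvzip glues the parallel multivalue fields together value-by-value before mvexpand, so each expanded event keeps its values aligned. A minimal, self-contained sketch of the same pattern, using makeresults and made-up values rather than the real sourcetype:

| makeresults 
| eval contractNumber=split("999999999,999999999", ","), messageId=split("DEF456,ABC123", ","), dataset=split("002,001", ",") 
| eval reading=mvzip(mvzip(contractNumber, messageId), dataset) 
| mvexpand reading 
| eval contractNumber=mvindex(split(reading, ","), 0), messageId=mvindex(split(reading, ","), 1), dataset=mvindex(split(reading, ","), 2) 
| fields - reading

This should return two events, one per sapResponse, each with a single contractNumber/messageId/dataset. Single-valued fields such as correlationId, sendDate and sendTime are simply copied onto every expanded event by mvexpand, which is how the metadata is kept.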


woodcock
Esteemed Legend

Click Accept on your answer to close the question.


somesoni2
Revered Legend

Give this a try

SEARCH...
| rex field=Message "^(?<Short>.*),\sRequestBody:\s(?<ShortMessage>[^\s]+)\".*$" 
| spath input=ShortMessage
| rename sapResponseData{}.contractAccounts{}.* AS * sapResponseData{}.contractAccounts{}.sapResponses{}.* AS * metadata.* AS *
| eval temp=mvzip(contractNumber,mvzip(messageId,mvzip(messageNumber,dataset,"|"),"|"),"|") 
| mvexpand temp 
| rex field=temp "(?<contractNumber>[^\|]+)\|(?<messageId>[^\|]+)\|(?<messageNumber>[^\|]+)\|(?<dataset>[^\|]+)" 
| fields - temp 
| lookup dataset_lookup.csv dataset OUTPUTNEW usage AS Usage
| table _time nameId contractAccount contractNumber messageId messageNumber dataset Usage correlationId sendDate sendTime
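With the "|" delimiter, each temp value should look something like 999999999|DEF456|031|002 (contractNumber|messageId|messageNumber|dataset), and after mvexpand the rex splits every expanded value back into four single-valued fields.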

Esky73
Builder

Hi, thanks for answering - this has no effect:

I still have the following for contractNumber, messageId, messageNumber and dataset:

contractNumber  messageId   messageNumber   dataset
123456          12345       0123            001
123456          67891       4567            002

coccyx
Path Finder

You can look into mvexpand with your JSON array to explode each event into multiple events. As an alternative, Cribl allows you to pre-process events in the ingestion pipeline, and we include a function called JSON Unroll which lets you create an event per entry in a JSON array.


Esky73
Builder

Hi - thanks - I tried mvexpand initially, but that gave me duplicate events.
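If only one multivalue field is expanded, that is expected: mvexpand copies the whole event once per value, but the other multivalue fields stay multivalued in every copy, so the rows look duplicated and misaligned. A tiny sketch of the pitfall (made-up values):

| makeresults 
| eval messageId=split("DEF456,ABC123", ","), dataset=split("002,001", ",") 
| mvexpand messageId

This should yield two events that each still carry both dataset values, which is why the accepted answer zips the parallel fields into one field before expanding.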
