Getting Data In

Separate JSON payload into separate events

Esky73
Builder

My search:

SEARCH...
| rex field=Message "^(?<Short>.*),\sRequestBody:\s(?<ShortMessage>[^\s]+)\".*$" 
| spath input=ShortMessage
| rename sapResponseData{}.contractAccounts{}.nameId AS NameId sapResponseData{}.contractAccounts{}.contractAccount AS ContractAccountNumber sapResponseData{}.contractAccounts{}.sapResponses{}.contractNumber AS ContractNumber sapResponseData{}.contractAccounts{}.sapResponses{}.messageId AS messageId sapResponseData{}.contractAccounts{}.sapResponses{}.messageNumber AS messageNumber sapResponseData{}.contractAccounts{}.sapResponses{}.dataset AS dataSet metadata.correlationId AS CorrelationId metadata.sendDate AS SendDate metadata.sendTime AS SendTime 
| lookup dataset_lookup.csv dataset AS dataSet OUTPUTNEW usage AS Usage
| table _time NameId ContractAccountNumber ContractNumber messageId messageNumber dataSet Usage CorrelationId SendDate SendTime

Contents of the new ShortMessage field after the split:

{
  "sapResponseData": [
    {
      "responseOriginatedIn": "ERROR",
      "contractAccounts": [
        {
          "nameId": "ABC_999999999999999",
          "contractAccount": "888888888888",
          "sapResponses": [
            {
              "contractNumber": 999999999,
              "responseCode": "002",
              "messageId": "DEF456",
              "messageNumber": "031",
              "dataset": "002"
            },
            {
              "contractNumber": 999999999,
              "responseCode": "002",
              "messageId": "ABC123",
              "messageNumber": "094",
              "dataset": "001"
            }
          ]
        }
      ]
    }
  ],
  "metadata": {
    "correlationId": "Correlation_ID",
    "priority": "LOW",
    "sendDate": "2019-03-21",
    "sendTime": 224813
  }
}

Currently, all of the sapResponses end up in the same fields in the table.

How can I create a new event per sapResponse while keeping the metadata on every event?

1 Solution

Esky73
Builder

I think I figured it out with help from here:

https://answers.splunk.com/answers/25653/mvexpand-multiple-multi-value-fields.html

| rex field=Message "^(?<Short>.*),\sRequestBody:\s(?<ShortMessage>[^\s]+)\".*$" 
| spath input=ShortMessage 
| rename sapResponseData{}.contractAccounts{}.* AS * metadata.* AS * sapResponseData{}.contractAccounts{}.sapResponses{}.* AS * 
| eval reading=mvzip(contractNumber, messageId) 
| eval reading=mvzip(reading, dataset) 
| eval reading=mvzip(reading, messageNumber) 
| mvexpand reading 
| makemv reading delim="," 
| eval contractNumber=mvindex(reading, 0) 
| eval messageId=mvindex(reading, 1) 
| eval messageNumber=mvindex(reading, -1) 
| eval dataset=mvindex(reading, 2) 
| lookup dataset_lookup.csv dataset OUTPUTNEW usage AS Usage 
| table _time nameId contractAccount contractNumber messageId messageNumber dataset Usage correlationId sendDate sendTime
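
In case it helps anyone reading later, here is how the mvzip / mvexpand / makemv / mvindex pattern behaves on its own. This is a run-anywhere sketch using makeresults and sample values copied from the payload above (placeholders, not real data): each response-level field is zipped into one string per sapResponse, the zipped field is expanded into one event per response, and the single-valued metadata field gets copied onto every expanded event.

| makeresults
| eval contractNumber=split("999999999,999999999", ","), messageId=split("DEF456,ABC123", ","), messageNumber=split("031,094", ","), dataset=split("002,001", ","), correlationId="Correlation_ID"
| eval reading=mvzip(contractNumber, messageId)
| eval reading=mvzip(reading, dataset)
| eval reading=mvzip(reading, messageNumber)
| mvexpand reading
| makemv reading delim=","
| eval contractNumber=mvindex(reading, 0), messageId=mvindex(reading, 1), dataset=mvindex(reading, 2), messageNumber=mvindex(reading, 3)
| table contractNumber messageId messageNumber dataset correlationId

Running this should return two rows, one per sapResponse, each still carrying correlationId.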


woodcock
Esteemed Legend

Click Accept on your answer to close the question.


somesoni2
Revered Legend

Give this a try

SEARCH...
| rex field=Message "^(?<Short>.*),\sRequestBody:\s(?<ShortMessage>[^\s]+)\".*$" 
| spath input=ShortMessage
| rename sapResponseData{}.contractAccounts{}.* AS * sapResponseData{}.contractAccounts{}.sapResponses{}.* AS * metadata.* AS *
| eval temp=mvzip(contractNumber,mvzip(messageId,mvzip(messageNumber,dataset,"|"),"|"),"|") 
| mvexpand temp 
| rex field=temp "(?<contractNumber>[^\|]+)\|(?<messageId>[^\|]+)\|(?<messageNumber>[^\|]+)\|(?<dataset>[^\|]+)" 
| fields - temp
| lookup dataset_lookup.csv dataset OUTPUTNEW usage AS Usage
| table _time nameId contractAccount contractNumber messageId messageNumber dataset Usage correlationId sendDate sendTime
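
If you want to check the pipe-delimited variant in isolation, here is a run-anywhere sketch using makeresults and the placeholder values from the table in the reply below. The original multivalue fields are dropped before the rex so that only the extracted single values remain (that step is my own addition, not part of the answer above):

| makeresults
| eval contractNumber=split("123456,123456", ","), messageId=split("12345,67891", ","), messageNumber=split("0123,4567", ","), dataset=split("001,002", ",")
| eval temp=mvzip(contractNumber, mvzip(messageId, mvzip(messageNumber, dataset, "|"), "|"), "|")
| fields - contractNumber messageId messageNumber dataset
| mvexpand temp
| rex field=temp "(?<contractNumber>[^\|]+)\|(?<messageId>[^\|]+)\|(?<messageNumber>[^\|]+)\|(?<dataset>[^\|]+)"
| fields - temp

This should expand into two events, each with a single contractNumber, messageId, messageNumber, and dataset.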

Esky73
Builder

Hi, thanks for answering - this has no effect:

I still have the following for contractNumber, messageId, messageNumber, and dataset:

contractNumber  messageId   messageNumber   dataset
123456          12345       0123            001
123456          67891       4567            002

coccyx
Path Finder

You can look into using mvexpand on your JSON array to explode each event into multiple events. As an alternative, Cribl allows you to pre-process events in the ingestion pipeline, and we include a function called JSON Unroll that creates an event per entry in a JSON array.
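
For reference, the mvexpand-on-the-JSON-array idea can also be written purely in SPL by pulling the sapResponses array out as its own multivalue field and expanding that, which keeps the metadata on every row. This is an untested sketch against the sample payload above, and it assumes a single contract account per payload (if there can be several, contractAccounts{} would need the same treatment):

SEARCH...
| rex field=Message "^(?<Short>.*),\sRequestBody:\s(?<ShortMessage>[^\s]+)\".*$"
| spath input=ShortMessage path=metadata.correlationId output=correlationId
| spath input=ShortMessage path=metadata.sendDate output=sendDate
| spath input=ShortMessage path=metadata.sendTime output=sendTime
| spath input=ShortMessage path=sapResponseData{}.contractAccounts{}.nameId output=nameId
| spath input=ShortMessage path=sapResponseData{}.contractAccounts{}.contractAccount output=contractAccount
| spath input=ShortMessage path=sapResponseData{}.contractAccounts{}.sapResponses{} output=sapResponse
| mvexpand sapResponse
| spath input=sapResponse
| lookup dataset_lookup.csv dataset OUTPUTNEW usage AS Usage
| table _time nameId contractAccount contractNumber messageId messageNumber dataset Usage correlationId sendDate sendTime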


Esky73
Builder

Hi - thanks - I tried mvexpand initially, but that just gave me duplicate events.
