Getting Data In

Event Splitting on Nested JSON

JakeInfoSec
Explorer

I have JSON files that I am trying to split into events, since each log contains multiple events within it. Here is an example of what a log looks like.

{
  "vulnerability": [
    {
      "event": {
        "sub1": {
          "complexity": "LOW"
        },
        "sub2": {
          "complexity": "LOW"
        }
      },
      "id": "test",
      "description": "test",
      "state": "No Known",
      "risk_rating": "LOW",
      "sources": [
        {
          "date": "test"
        }
      ],
      "additional_info": [
        {
          "test": "test"
        }
      ],
      "was_edited": false
    },
    {
      "event": {
        "sub1": {
          "complexity": "LOW"
        },
        "sub2": {
          "complexity": "LOW"
        }
      },
      "id": "test",
      "description": "test",
      "state": "No Known",
      "risk_rating": "LOW",
      "sources": [
        {
          "date": "test"
        }
      ],
      "additional_info": [
        {
          "test": "test"
        }
      ],
      "was_edited": false
    }
  ],
  "next": "test",
  "total_count": 109465
}

In this example there are two separate events that I need extracted. I am essentially trying to pull out the two nested event objects. Each log should have this same exact JSON format, but any number of events could be included.

First event

    {
      "event": {
        "sub1": {
          "complexity": "LOW"
        },
        "sub2": {
          "complexity": "LOW"
        }
      },
      "id": "test",
      "description": "test",
      "state": "No Known",
      "risk_rating": "LOW",
      "sources": [
        {
          "date": "test"
        }
      ],
      "additional_info": [
        {
          "test": "test"
        }
      ],
      "was_edited": false
    }

Second event

    {
      "event": {
        "sub1": {
          "complexity": "LOW"
        },
        "sub2": {
          "complexity": "LOW"
        }
      },
      "id": "test",
      "description": "test",
      "state": "No Known",
      "risk_rating": "LOW",
      "sources": [
        {
          "date": "test"
        }
      ],
      "additional_info": [
        {
          "test": "test"
        }
      ],
      "was_edited": false
    }

I also want to exclude the opening

{
  "vulnerability": [

and closing

  ],
  "next": "test",
  "total_count": 109465
}

portions of the log files.

Am I missing something in how this sourcetype should be set up? I currently have the following, but it does not seem to be working:

LINE_BREAKER = \{(\r+|\n+|\t+|\s+)"event":


richgalloway
SplunkTrust

Try this

LINE_BREAKER = ([\r\n]+)\{[\s\S]+?event\d
SEDCMD-stripStart = s/\{[\s\S]+?"vulnerability":\s\[//
SEDCMD-stripEnd = s/\],[\s\S]+?"next": .*//

The [\s\S]+? construct usually works best at matching embedded newlines.
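As a quick sanity check outside Splunk, the difference is easy to demonstrate with Python's `re` module (a standalone illustration of the regex behavior, not Splunk's exact regex engine):

```python
import re

# A fragment with embedded newlines, like the log in the question.
text = '{\n  "vulnerability": [\n    {\n      "event": {'

# "." does not match newlines by default, so this finds nothing...
assert re.search(r'\{.+?"vulnerability"', text) is None

# ...while [\s\S] matches any character at all, including \n.
assert re.search(r'\{[\s\S]+?"vulnerability"', text) is not None
```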

---
If this reply helps you, Karma would be appreciated.

PickleRick
SplunkTrust

Are you sure it will work with multiline events? I'm not 100% sure which regex flags are enabled with SEDCMD.


JakeInfoSec
Explorer

Yeah, I tried the LINE_BREAKER provided above but didn't have any luck. No matter what I try, I haven't been able to get it working as hoped. I think you're right that the layout as-is is just bad, so I'm going to go back to the drawing board and change how the logs are formatted before they hit Splunk.


PickleRick
SplunkTrust

This is simply bad data (at least from Splunk's point of view).

Even if you managed to break it into events (and, honestly, I see no way to reliably ensure you break in the proper places and only those places; manipulating structured data with regexes alone is simply not reliable, because regexes are not structure-aware), you would still have those headers and footers attached to the end of other events.

Also, the resulting events would have inconsistent contents: one event would have an "event1" field, another would have "event2".

The best solution here is to process and split your data before pushing it to Splunk.
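That pre-processing step can be very small. A minimal sketch in Python (the function name `split_vulnerability_log` is hypothetical, and values are trimmed from the question's sample): parse the whole document, emit each entry of the "vulnerability" array as its own compact JSON line, and simply never emit the "next"/"total_count" wrapper:

```python
import json

def split_vulnerability_log(raw: str) -> list[str]:
    """Parse one nested log and return one compact JSON string per
    entry in the "vulnerability" array, ready to write out as NDJSON
    (one event per line). The "next"/"total_count" wrapper fields
    are dropped because they are never emitted."""
    doc = json.loads(raw)
    return [json.dumps(event, separators=(",", ":"))
            for event in doc.get("vulnerability", [])]

# Example using the shape from the question (values trimmed):
raw = json.dumps({
    "vulnerability": [
        {"id": "test", "risk_rating": "LOW", "was_edited": False},
        {"id": "test", "risk_rating": "LOW", "was_edited": False},
    ],
    "next": "test",
    "total_count": 109465,
})
events = split_vulnerability_log(raw)  # two standalone events
```

Each returned string is a complete, self-contained JSON object, so Splunk can ingest the output with ordinary single-line JSON settings and no LINE_BREAKER or SEDCMD gymnastics.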
