Getting Data In

Split json into multiple events and sourcetype

sboogaar
Path Finder

Let's say I have the following JSON data onboarded:

  {
       "slaves": [{
              "id": "1234",
              "hostname": "12556"
       },
       {
              "id": "1245",
              "hostname": "1266"
       }],
      "masters": [{
              "id": "2234",
              "hostname": "22556"
       },
       {
              "id": "2245",
              "hostname": "2266"
       }]
  }

The result I want is that each slave becomes an event with sourcetype indexnamex:slave, and each master becomes an event with sourcetype indexnamex:master.

So in indexnamex:slave I want 2 events

indexnamex:slave Event 1

{"id": "1234","hostname": "12556" }

indexnamex:slave Event 2

{ "id": "1245", "hostname": "1266" }

And in indexnamex:master I also want two events

indexnamex:master Event 1

{ "id": "2234", "hostname": "22556" }

indexnamex:master Event 2

{ "id": "2245", "hostname": "2266" }

I cannot split on e.g. `hostname x }` because it is the same for slaves and masters.
Is it possible to do the splitting in multiple steps?
E.g., first split on `"slaves":` and `"masters":`,
and then split again on what is left after that?

If not are there any other options?

Note: the example is simpler than my real data, which is about 10k lines.


maciep
Champion

I could be wrong, but I don't think this is possible at parse/index time. One of the first things Splunk does is break the stream of data into events, and it only does that once. At that point, if you want just one id/hostname object per event, you've lost the masters/slaves context, and I don't think you can get it back. Keeping large events with multiple objects would let you set the sourcetype the way you want, I think, but then you have giant events, each containing many id/hostname objects.

Maybe you could break the data into large "masters" and "slaves" events, and then come back around at search time with the collect command to split it up the way you want.
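A rough sketch of that search-time idea, for the slaves side (the index, sourcetype names, and field paths here are assumptions based on the question, and writing to a non-stash sourcetype with collect can have licensing implications, so check the docs before using this):

```
index=indexnamex sourcetype=indexnamex:raw
| spath path=slaves{} output=slave
| mvexpand slave
| eval _raw=slave
| collect index=indexnamex sourcetype=indexnamex:slave
```

A second, similar search over `masters{}` would populate indexnamex:master the same way.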

Of course, if you can change the source and split the data up before you send it to Splunk, that's a good option as well.
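If preprocessing is an option, a minimal sketch of the split might look like this (the sourcetype names come from the question; how you then ship each event with its sourcetype, e.g. via the HTTP Event Collector or separate monitored files, is left out):

```python
import json

def split_events(raw):
    """Split the combined JSON document into (sourcetype, event) pairs,
    one pair per slave/master object."""
    doc = json.loads(raw)
    events = []
    for key, sourcetype in (("slaves", "indexnamex:slave"),
                            ("masters", "indexnamex:master")):
        # Emit each object in the array as its own JSON event.
        for obj in doc.get(key, []):
            events.append((sourcetype, json.dumps(obj)))
    return events

raw = '''{"slaves":  [{"id": "1234", "hostname": "12556"},
                      {"id": "1245", "hostname": "1266"}],
          "masters": [{"id": "2234", "hostname": "22556"},
                      {"id": "2245", "hostname": "2266"}]}'''

for sourcetype, event in split_events(raw):
    print(sourcetype, event)
```

For a 10k-line document this still parses the whole thing in memory; a streaming parser would be the next step if that becomes a problem.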

I'm curious too whether this sort of problem might be a use case for the Data Stream Processor that's in beta now:
https://www.splunk.com/en_us/software/splunk-next.html
