Getting Data In

Split JSON into multiple events and sourcetypes

sboogaar
Path Finder

Let's say I have the following JSON data onboarded:

  {
       "slaves": [{
              "id": "1234",
              "hostname": "12556"
       },
       {
              "id": "1245",
              "hostname": "1266"
       }],
       "masters": [{
              "id": "2234",
              "hostname": "22556"
       },
       {
              "id": "2245",
              "hostname": "2266"
       }]
  }

The result I want is one event per slave with sourcetype indexnamex:slave, and one event per master with sourcetype indexnamex:master.

So in sourcetype indexnamex:slave I want two events:

indexnamex:slave Event 1

{ "id": "1234", "hostname": "12556" }

indexnamex:slave Event2

{ "id": "1245", "hostname": "1266" }

And in indexnamex:master also two events:

indexnamex:master Event 1

{ "id": "2234", "hostname": "22556" }

indexnamex:master Event 2

{ "id": "2245", "hostname": "2266" }

I cannot split on a pattern like "hostname": ... } because it is the same for slaves and masters.
Is it possible to do the splitting in multiple steps?
E.g. first split on "slaves": and "masters":,
and then do a second split on what is left?

If not, are there any other options?

Note: the example is simpler than my real data, which is about 10k lines.


maciep
Champion

I could be wrong, but I don't think this is possible at parse/index time. One of the first things Splunk does is break the stream of data into events, and it only does that once. At that point, if you want just one id/host object per event, you've lost the masters/slaves context and I don't think you can get it back. Keeping large events with multiple objects would let you set the sourcetype the way you want, I think, but then you have giant events containing many id/host objects.

Maybe you could break the data into large "masters" and "slaves" events, and then come back around at search time with the collect command to split it up the way you want.
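As a rough sketch of that idea (assuming the large events land in index indexnamex with a placeholder sourcetype such as indexnamex:raw — those names are from the question, not tested config), spath can pull each array element out as a multivalue field, mvexpand splits it into separate results, and collect writes them back with the desired sourcetype. Note that collect with a sourcetype other than stash normally counts against your license:

```spl
index=indexnamex sourcetype=indexnamex:raw
| spath path=slaves{} output=slave
| mvexpand slave
| eval _raw=slave
| collect index=indexnamex sourcetype=indexnamex:slave
```

You would then run a second, analogous search over masters{} collecting into indexnamex:master.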

Of course, if you can change the source and split the data up before you send it to Splunk, that's a good option as well.
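For that route, here's a minimal sketch of the pre-splitting step in Python. The function name, the index prefix indexnamex, and the derived sourcetype names are illustrative assumptions; in practice you'd feed each pair to something like the HTTP Event Collector rather than printing it:

```python
import json

def split_events(raw, index="indexnamex"):
    """Hypothetical pre-processing step: split the combined JSON document
    into one (sourcetype, event) pair per id/host object, so each can be
    sent to Splunk with its own sourcetype."""
    doc = json.loads(raw)
    events = []
    for role in ("slaves", "masters"):
        # "slaves" -> indexnamex:slave, "masters" -> indexnamex:master
        sourcetype = f"{index}:{role.rstrip('s')}"
        for obj in doc.get(role, []):
            events.append((sourcetype, json.dumps(obj)))
    return events

raw = """
{
    "slaves":  [{"id": "1234", "hostname": "12556"},
                {"id": "1245", "hostname": "1266"}],
    "masters": [{"id": "2234", "hostname": "22556"},
                {"id": "2245", "hostname": "2266"}]
}
"""
for sourcetype, event in split_events(raw):
    print(sourcetype, event)
# first line: indexnamex:slave {"id": "1234", "hostname": "12556"}
```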

I'm also curious whether this sort of problem might be a use case for the Data Stream Processor that's in beta now:
https://www.splunk.com/en_us/software/splunk-next.html
