All Apps and Add-ons

Unable to break JSON events from a REST Modular input (SPLUNK CLOUD)

ALXWBR
Path Finder

We are pulling some data from REST using the REST API Modular Input (splunkbase.splunk.com/app/1546/) with response type json, and receiving the response below:

{ 
"request" : "/getAllNearTermPSRT", 
"response" : { 
"now" : "2018-11-29T14:19:00Z", 
"flights" : 
[
{ 
"AODBUniqueField" : "7274920", 
"flightNo" : "BA2233", 
"PSRTc" : "2018-11-28T14:16:00Z", 
"PSRT1" : "2018-11-28T14:15:00Z", 
"PSRT2" : "2018-11-28T14:15:00Z", 
"PSRT3" : "2018-11-28T14:15:00Z" 
}, 
{ 
"AODBUniqueField" : "7274889", 
"flightNo" : "EZY8255", 
"PSRTc" : "2018-11-28T14:26:00Z", 
"PSRT1" : "2018-11-28T14:21:00Z", 
"PSRT2" : "2018-11-28T14:21:00Z", 
"PSRT3" : "2018-11-28T14:21:00Z" 
}, 
{ 
"AODBUniqueField" : "7274797", 
"flightNo" : "EZY8474", 
"PSRTc" : "2018-11-28T14:24:00Z", 
"PSRT1" : "2018-11-28T14:26:00Z", 
"PSRT2" : "2018-11-28T14:25:00Z", 
"PSRT3" : "2018-11-28T14:25:00Z" 
},
{ 
"AODBUniqueField" : "7274825", 
"flightNo" : "D82806", 
"PSRTc" : "2018-11-28T14:26:00Z", 
"PSRT1" : "2018-11-28T14:27:00Z", 
"PSRT2" : "2018-11-28T14:26:00Z", 
"PSRT3" : "2018-11-28T14:26:00Z" 
},
{ 
"AODBUniqueField" : "7274478", 
"flightNo" : "MT556", 
"PSRTc" : "2018-11-28T15:03:00Z", 
"PSRT1" : "2018-11-28T14:41:00Z", 
"PSRT2" : "2018-11-28T15:03:00Z", 
"PSRT3" : "2018-11-28T15:10:00Z" 
}, 
{ 
"AODBUniqueField" : "7274932", 
"flightNo" : "EI239", 
"PSRTc" : "2018-11-28T15:25:00Z", 
"PSRT1" : "2018-11-28T15:10:00Z", 
"PSRT2" : "2018-11-28T15:24:00Z", 
"PSRT3" : "2018-11-28T15:25:00Z" }
]
}
}

We would like to split each individual flight into its own event, using the current time as the timestamp. However, no matter what we have tried, we can't get Splunk to break the events. We've tried multiple different BREAK options in the sourcetype, but still no luck: we either get the whole lot as one single event, or we get just two events, one saying "request" and the other "response".

Is anyone able to help?

1 Solution

Damien_Dallimor
Ultra Champion

There are plenty of examples on Answers already about using custom response handlers to break up JSON into events, such as: https://answers.splunk.com/answers/701556/split-twitter-events-in-multiple-events.html#answer-702459

import json  # needed to parse and re-serialize the raw response

# Note: print_xml_stream is provided by the REST API Modular Input framework.
class FlightsHandler:

    def __init__(self, **args):
        pass

    def __call__(self, response_object, raw_response_output, response_type, req_args, endpoint):
        if response_type == "json":
            output = json.loads(raw_response_output)
            # emit each flight as its own event, stamped with the response's "now" time
            for flight in output["response"]["flights"]:
                flight["timestamp"] = output["response"]["now"]
                print_xml_stream(json.dumps(flight))
        else:
            print_xml_stream(raw_response_output)

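As a sanity check, the handler's splitting logic can be exercised standalone outside Splunk. This is only a sketch: here print_xml_stream is stubbed to collect events in a list, whereas in the real modular input it is supplied by the framework, and the sample payload is a trimmed-down version of the JSON from the question.

```python
import json

# Stub standing in for the framework's print_xml_stream; in the real
# REST API Modular Input this function streams the event to Splunk.
events = []
def print_xml_stream(data):
    events.append(data)

# Trimmed-down version of the sample response from the question
raw_response_output = json.dumps({
    "request": "/getAllNearTermPSRT",
    "response": {
        "now": "2018-11-29T14:19:00Z",
        "flights": [
            {"AODBUniqueField": "7274920", "flightNo": "BA2233"},
            {"AODBUniqueField": "7274889", "flightNo": "EZY8255"},
        ],
    },
})

# Same splitting logic as in the handler's __call__ above
output = json.loads(raw_response_output)
for flight in output["response"]["flights"]:
    flight["timestamp"] = output["response"]["now"]
    print_xml_stream(json.dumps(flight))

# Each flight is now its own event, carrying the response's "now" timestamp
assert len(events) == 2
assert json.loads(events[0])["timestamp"] == "2018-11-29T14:19:00Z"
```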

ALXWBR
Path Finder

Thank you so much!!!! This has been giving me such a headache
