How to handle huge data in a Splunk custom command



We have implemented a custom command that queries an external REST API and pulls the data into the Splunk search page. The challenge we are facing is that when the response data is huge, the Splunk search page waits for several minutes (more than 5) without showing any data.

The API's results come in the form of partitions. Say the API result contains 100k records; those 100k rows are split into 100 partitions, and we need to iterate 100 times to fetch all 100k records.
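To make the partition iteration concrete, here is a minimal sketch of the pattern; `fetch_partition()` is a hypothetical stand-in for one call to the real REST API, and the sizes are illustrative:

```python
def fetch_partition(index, size=1000):
    # Hypothetical placeholder for one REST call; in the real command this
    # would hit the external API and return one partition's records.
    return [{"partition": index, "row": i} for i in range(size)]

def iter_records(num_partitions=100):
    # Yield records partition by partition instead of collecting everything
    # first, so a consumer can start processing after the first fetch.
    for index in range(num_partitions):
        for record in fetch_partition(index):
            yield record

# The generator produces the first record after a single partition fetch,
# without waiting for the remaining partitions.
records = iter_records(num_partitions=100)
first = next(records)
```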

Similarly, if we could send each partition's data to Splunk and have the results appended to the search page as each partition arrives, the end user would see data as soon as possible instead of waiting several minutes.

My custom command is a generating command. I would like to know whether there is any way to send the data in chunks to the Splunk page instead of waiting to pull all 100k records. We have tried a couple of approaches (our code is in Python and uses the Splunk Python SDK), such as using yield and enabling the streaming attribute, without success.
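For reference, the yield-based approach we tried follows the shape below. This is a plain-Python sketch only: the class mimics the structure of an SDK generating command (where `generate()` is the method that produces events), but the SDK import is omitted so it stands alone, and `fetch_partition()` is a hypothetical helper, not real SDK code:

```python
class PartitionGeneratingCommand:
    """Sketch of a generating command that yields results per partition."""

    def __init__(self, num_partitions):
        self.num_partitions = num_partitions

    def fetch_partition(self, index):
        # Hypothetical stand-in for one call to the external REST API;
        # each partition returns a small batch of events here.
        return [{"_raw": "partition=%d row=%d" % (index, i)} for i in range(3)]

    def generate(self):
        # Yielding record by record, as each partition is fetched, is what
        # should allow results to be emitted incrementally rather than
        # buffering all partitions before responding.
        for index in range(self.num_partitions):
            for record in self.fetch_partition(index):
                yield record
```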

Please help me figure out a way to send the data in chunks from the generating custom command.

Thank you.
