Building for the Splunk Platform

How to reduce custom search command result chunk size?

joepjisc
Path Finder

We are developing a custom search command to create events, using a streaming command with version 2 of the protocol. Because the source is quite slow, we'd like to send smaller chunks of results back to Splunk than the default 50,000, e.g. chunks of 1,000 events, so that users can view partial results sooner.

We've tried various approaches, including keeping an incrementing integer and calling self.flush() whenever it is divisible by 1,000, but that caused a buffer full error.

Any suggestions would be really appreciated

...
@Configuration(type='streaming')
class OurSearchCommand(GeneratingCommand):
    ...
    def generate(self):
        for item in OurGenerator():
            item['_time'] = item['timestamp']
            yield item

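For reference, the flush attempt mentioned above looked roughly like this (simplified; the counter and the 1,000 threshold are just what we were experimenting with), and this is the version that produced the buffer full error:

    def generate(self):
        count = 0
        for item in OurGenerator():
            item['_time'] = item['timestamp']
            yield item
            count += 1
            if count % 1000 == 0:
                # ask the SDK to write out buffered records early;
                # this is where the buffer full error was raised
                self.flush()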

DexterMarkley
New Member
I know this is an old question, but for anyone else looking for the answer: you need to override _maxresultrows on the class's record writer. This is working for me, but I am not sure whether there are any other implications of doing this.
 
self._record_writer._maxresultrows = 1000 

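For context, here is a minimal sketch of where that line can go in a protocol version 2 generating command. Note that _record_writer and _maxresultrows are private attributes of the splunklib search command internals rather than documented API, so this may need adjusting between SDK versions; the command class name and the dummy event loop below are only placeholders.

import sys
import time

from splunklib.searchcommands import dispatch, GeneratingCommand, Configuration

@Configuration(type='streaming')
class SmallChunkCommand(GeneratingCommand):

    def generate(self):
        # Emit results to Splunk in chunks of 1,000 records instead of the
        # default 50,000, so partial results appear sooner.
        self._record_writer._maxresultrows = 1000

        for i in range(10000):  # placeholder for the slow upstream source
            yield {'_time': time.time(), '_raw': 'event %d' % i}

dispatch(SmallChunkCommand, sys.argv, sys.stdin, sys.stdout, __name__)

On this reading, the change lives in your own command script (the class that subclasses GeneratingCommand), not in the SDK files themselves.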

analyst
Loves-to-Learn Everything

@DexterMarkley could you provide the location of the file that needs to be changed?
