Splunk Dev

How to reduce custom search command result chunk size?

joepjisc
Path Finder

We are developing a custom search command to create events, implemented as a streaming command using version 2 of the protocol. Because the source is quite slow, we'd like to send smaller chunks of results back to Splunk than the default 50,000 (e.g. chunks of 1,000 events) so that users can view the partial results sooner.

We've tried various approaches, including keeping an incrementing integer and calling self.flush() whenever the count is divisible by 1,000 (roughly as sketched below the snippet), but that caused a buffer full error.

Any suggestions would be really appreciated.

...
@Configuration(type='streaming')
class OurSearchCommand(GeneratingCommand):
    ...
    def generate(self):
        for item in OurGenerator():
            item['_time'] = item['timestamp']
            yield item
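
For reference, the counter-and-flush attempt looked roughly like this (a sketch only, not our exact code; the self.flush() call is where the buffer full error appeared):

    def generate(self):
        emitted = 0
        for item in OurGenerator():
            item['_time'] = item['timestamp']
            yield item
            emitted += 1
            if emitted % 1000 == 0:
                # Intended to push a partial chunk to Splunk every 1,000
                # events, but this call is what raised the buffer full error.
                self.flush()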

 

 

1 Solution

DexterMarkley
Engager
I know this is an old question, but for anyone else looking for the answer: you need to override the record_writer's maximum result rows on the class. This is working for me, but I am not sure whether there are any other implications of doing this.

self._record_writer._maxresultrows = 1000 
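
To show where that one line can go, here is a minimal sketch of a complete command script of the kind that sits in the app's bin directory and is registered in commands.conf. It assumes the splunklib Python SDK and reuses the question's hypothetical OurSearchCommand and OurGenerator names; note that _maxresultrows is an internal attribute of the SDK's RecordWriter, so this is an unsupported tweak that could change between SDK versions.

import sys
from splunklib.searchcommands import dispatch, GeneratingCommand, Configuration


def OurGenerator():
    # Stand-in for the slow source described in the question.
    for i in range(5000):
        yield {'timestamp': 1700000000 + i, 'message': 'event %d' % i}


@Configuration(type='streaming')
class OurSearchCommand(GeneratingCommand):

    def generate(self):
        # Internal SDK attribute: lower the per-chunk row limit from the
        # default 50,000 so partial results reach Splunk sooner.
        self._record_writer._maxresultrows = 1000

        for item in OurGenerator():
            item['_time'] = item['timestamp']
            yield item


if __name__ == '__main__':
    dispatch(OurSearchCommand, sys.argv, sys.stdin, sys.stdout, __name__)

Setting it at the top of generate() keeps the change scoped to this one command rather than patching the SDK itself.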

 

analyst
Loves-to-Learn Everything

@DexterMarkley could you provide the location of the file that needs to be changed?
