joepjisc
Path Finder
12-18-2020
06:19 AM
We are developing a custom search command that creates events, implemented as a streaming command using version 2 of the protocol. Because the source is quite slow, we'd like to send smaller chunks of results back to Splunk than the default 50,000 (e.g. chunks of 1,000 events) so that users can view partial results sooner.
We've tried various approaches, including an incrementing integer counter and calling self.flush() whenever it is divisible by 1,000, but that caused a buffer-full error.
Any suggestions would be really appreciated.
...
@Configuration(type='streaming')
class OurSearchCommand(GeneratingCommand):
    ...
        for item in OurGenerator():
            item['_time'] = item['timestamp']
            yield item
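The counter-and-flush attempt described above can be sketched in isolation. Note that flush here is a stand-in parameter so the chunking logic can run standalone; in the real command it would be splunklib's SearchCommand.flush(), which is what raised the buffer-full error, and OurGenerator is the poster's (elided) event source:

```python
# Sketch of the counter-based flush attempt described in the question.
# flush() is stubbed out; in the actual custom search command it would
# be self.flush() from splunklib's SearchCommand.
def generate(records, chunk_size=1000, flush=lambda: None):
    count = 0
    for item in records:
        count += 1
        yield item
        # Ask the writer to push a partial chunk every chunk_size events
        if count % chunk_size == 0:
            flush()

flushes = []
list(generate(({'event': i} for i in range(2500)),
              chunk_size=1000, flush=lambda: flushes.append(1)))
print(len(flushes))  # → 2 (after events 1000 and 2000)
```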
1 Solution
DexterMarkley
Engager
06-06-2023
11:14 AM
I know this is an old question, but for anyone else looking for the answer: you need to override the record writer's maximum result rows for the class. This works for me, but I'm not sure whether there are any other implications of doing this.
self._record_writer._maxresultrows = 1000
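For context, _record_writer._maxresultrows is a private attribute of the splunklib SDK, so this workaround may break between SDK versions; one place to set it is at the top of the command's generate() method, before the first yield. To illustrate what the setting controls, here is a self-contained sketch of the SDK's buffering behaviour (ChunkedWriter is a stand-in written for this example, not the actual SDK class):

```python
class ChunkedWriter:
    """Buffers records and emits a chunk whenever the buffer reaches
    maxresultrows, mimicking how splunklib's record writer batches
    output back to splunkd."""

    def __init__(self, maxresultrows=50000):
        self.maxresultrows = maxresultrows
        self.buffer = []
        self.chunks = []  # stands in for chunks sent to splunkd

    def write_record(self, record):
        self.buffer.append(record)
        if len(self.buffer) >= self.maxresultrows:
            self.flush()

    def flush(self):
        if self.buffer:
            self.chunks.append(list(self.buffer))
            self.buffer.clear()

# Lowering maxresultrows to 1,000 means a chunk goes back to Splunk
# after every 1,000 events instead of every 50,000, so users see
# partial results sooner.
writer = ChunkedWriter(maxresultrows=1000)
for i in range(2500):
    writer.write_record({'event': i})
writer.flush()  # final partial chunk
print([len(c) for c in writer.chunks])  # → [1000, 1000, 500]
```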
analyst
Loves-to-Learn Everything
09-20-2023
07:17 PM
@DexterMarkley could you provide the location of the file that needs to be changed?
