Hi @BernardEAI
limits.conf has a 50k default limit under the [kvstore] stanza. You would need to change that in the conf file; it cannot be passed in the query, since the platform itself enforces the limit.
max_rows_per_query = <unsigned integer>
* The maximum number of rows that will be returned for a single query to a collection.
* If the query returns more rows than the specified value, then the returned result set will contain the number of rows specified in this value.
* Default: 50000
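For anyone who does have access to the config files, the change would look something like this in a local limits.conf (the 100000 value is just an illustrative example, not a recommendation):

```ini
[kvstore]
# raise the per-query row cap from the 50000 default
max_rows_per_query = 100000
```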
--
An upvote would be appreciated and Accept solution if this reply helps!
Thanks @venkatasri
On our DEV server this would be easy to solve; I could change the max_rows_per_query parameter in limits.conf.
On our production environment, we are tenants on a multi-tenant platform, so we do not have access to the configuration files.
The approach I took here is to make use of the skip parameter that is available in the query function. I can then have a loop that runs through the entire kv store by incrementing the skip parameter:
skip_tracker = 0
end = False
while not end:
    data_list = collection.data.query(skip=skip_tracker)
    if len(data_list) != 0:
        for item in data_list:
            # perform action on entry in kv store (delete, update etc.)
            pass
        # advance past the rows just returned (at most max_rows_per_query per page)
        skip_tracker += len(data_list)
    else:
        end = True
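The loop above can also be factored into a small generator that pages through the whole collection and works with any query callable. This is just a sketch, assuming the callable accepts `skip` and `limit` keyword arguments and returns a list, the way splunklib's `collection.data.query` does:

```python
def iter_kvstore(query, page_size=50000):
    """Yield every record from a KV store collection, paging past
    the max_rows_per_query cap.

    `query` is any callable accepting skip= and limit= keyword
    arguments and returning a list of records, e.g. the
    collection.data.query method from splunklib.
    """
    skip = 0
    while True:
        page = query(skip=skip, limit=page_size)
        if not page:
            return
        yield from page
        # advance past the rows just returned
        skip += len(page)
```

You could then process records with a plain `for item in iter_kvstore(collection.data.query):` without managing the skip counter by hand.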
More details on the query function here: