Hi All,
We are getting the error "HTTP 400 Bad Request -- Request exceeds API limits - see limits.conf for details. (Too many documents for a single batch save.)" from a Python job.
Batch save fails whenever the record count exceeds 200.
I have added the stanzas below in limits.conf under local/, but it still isn't working:
[kvstore]
max_queries_per_batch = 500
max_rows_in_memory_per_dump = 1000
max_threads_per_outputlookup = 10
max_documents_per_batch_save = 500
Please help me resolve this.
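As a client-side workaround that does not depend on limits.conf, you can split the documents into smaller batches before saving. This is only a sketch: the chunk size of 200 matches the threshold you observed, and `collection.data.batch_save(*docs)` assumes the splunk-sdk (splunklib) KV store API; adjust both to your environment.

```python
# Split the document list into chunks so no single batch_save call
# exceeds the server-side per-batch limit.

def chunked(docs, size=200):
    """Yield successive slices of `docs` with at most `size` items each."""
    for i in range(0, len(docs), size):
        yield docs[i:i + size]

def batch_save_in_chunks(collection, docs, size=200):
    """Save `docs` to a KV store collection one chunk at a time.

    Assumes `collection` is a splunklib KVStoreCollection, whose
    data.batch_save(*documents) takes documents as varargs; swap in
    whatever save call your job actually uses.
    """
    for chunk in chunked(docs, size):
        collection.data.batch_save(*chunk)
```

For example, 500 records would be written in three calls of 200, 200, and 100 documents, each safely under the batch limit.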
What are the hardware specifications of your search head?
Have you checked whether the search head runs out of memory when the issue occurs?
If you still see the error below:
"StateStoreError: 'Batch save to KV store failed with code 400. Error details: Request exceeds API limits - see limits.conf for details. (Batch save size=52439798 too large)' "
You may need to increase max_size_per_result_mb in the [kvstore] stanza of limits.conf:
https://docs.splunk.com/Documentation/Splunk/latest/Admin/Limitsconf#.5Bkvstore.5D
max_size_per_result_mb =
* The maximum size, in megabytes (MB), of the result that will be
returned for a single query to a collection.
* Default: 50
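Note that the number in that error message lines up with the 50 MB default: 52,439,798 bytes is just over 50 MiB, which is why the batch is rejected until the limit is raised. A quick sanity check (assuming the limit is interpreted as MiB):

```python
# Failing batch size reported in the error message, in bytes.
batch_size = 52_439_798

# Default [kvstore] size limit of 50 (MB), taken here as MiB.
limit_bytes = 50 * 1024 * 1024  # 52,428,800

print(batch_size > limit_bytes)   # → True
print(batch_size - limit_bytes)   # → 10998 (bytes over the limit)
```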
If you see the error during an ITSI operation, please see our documentation regarding the size limit:
Hope it helps; let us know how it works for you.
H/W specification for the search head:
OS : RedHat 7.5
Arch : x86_64
CPU : 36 x Intel(R) Xeon(R) Platinum 8124M CPU @ 3.00GHz
RAM : 68.54 GB
SELinux is enforcing
No, brother, we increased it, but it is still not working:
[http_input]
max_content_length = 838860800000000

[kvstore]
max_size_per_result_mb = 20000
max_size_per_batch_save_mb = 2000
max_size_per_batch_result_mb = 2000
Hi,
did you have a look at max_content_length?
The maximum size of an HTTP request is specified in limits.conf with this setting.
Brother, we increased max_content_length, but it is still not working:
[http_input]
max_content_length = 838860800000000

[kvstore]
max_size_per_result_mb = 20000
max_size_per_batch_save_mb = 2000
max_size_per_batch_result_mb = 2000