We are trying to find a way to leverage the REST API to perform a bulk delete of KV Store records. Currently, the REST API only supports single record deletion or whole KVStore deletion.
I'm wondering if Splunk will eventually support bulk record deletion of KVStore records via REST.
Any Splunkers' insight into this will be super helpful.
Cheers!
hi,
I use curl to interact with the REST endpoint to do bulk deletions.
You can test with the sample data from this link: http://dev.splunk.com/view/webframework-developapps/SP-CAAAEZG
For example, if I need to delete every record with an id greater than 24, the query is: {"id": {"$gt": 24}}
(Assuming your app is "kvstoretest" and your KV Store collection is "kvstorecoll".)
Use the site http://meyerweb.com/eric/tools/dencoder/ to URL-encode the query value, then run the curl command below:
curl -k -u admin:changeme -X DELETE \
https://localhost:8089/servicesNS/nobody/kvstoretest/storage/collections/data/kvstorecoll?query=%7B%...
Since it is a REST endpoint, I hope it will work smoothly with clustered SHs too.
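For anyone who prefers scripting it, here is a minimal Python sketch of the same bulk delete, assuming a local Splunk instance on port 8089 and the example app, collection, and credentials above; the requests library URL-encodes the query for you.

import json
import requests

# Delete every record whose id is greater than 24, mirroring the curl example.
query = {"id": {"$gt": 24}}

resp = requests.delete(
    "https://localhost:8089/servicesNS/nobody/kvstoretest/storage/collections/data/kvstorecoll",
    params={"query": json.dumps(query)},  # requests URL-encodes the JSON query string
    auth=("admin", "changeme"),
    verify=False,  # same as curl -k; use real certificates in production
)
resp.raise_for_status()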
Suppose I have a list of records to delete. I use a batch save to set a field on every member of that list to an invalid value. For example, if 0 is an invalid value for the field 'id', set id to 0 on every member of the list and do one batch save to the KV Store.
Then a single call performs the batch delete through the query {'id': 0}, e.g.
collection.data.delete(query=json.dumps({'id': 0}))
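A hedged sketch of this mark-then-delete pattern with the Splunk Python SDK (splunklib) follows; the app, collection, field names, and credentials are the same illustrative ones used elsewhere in this thread.

import json
import splunklib.client as client

# Connect to the app that owns the collection.
service = client.connect(host="localhost", port=8089,
                         username="admin", password="changeme",
                         owner="nobody", app="kvstoretest")
collection = service.kvstore["kvstorecoll"]

# Fetch the records we have finished with (any query that identifies them will do).
to_delete = collection.data.query(query=json.dumps({"id": {"$gt": 24}}))

# Step 1: one batch save that stamps every record with the invalid marker value.
for record in to_delete:
    record["id"] = 0
collection.data.batch_save(*to_delete)

# Step 2: one delete call that removes everything carrying the marker.
collection.data.delete(query=json.dumps({"id": 0}))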
I don't think this will work for us.
We cannot simply use a "where" statement, because we have to assume that the independent input process is adding new data that matches the initial query.
Also, not all of the fetched objects may have been processed, for various reasons.
Therefore we need a bulk delete along the lines of "delete the following list of objects ...".
Greetings
Mathias
We also encounter this problem.
We are using the KV Store as the backend/DB for another application.
We create entries on the order of 50k - 200k.
Later we read and process them, and then want to delete them.
But deleting 70k entries takes "forever" ...
Getting the collection, removing the entries that have to be deleted, then deleting the collection and storing the cleaned collection is NOT an option:
the import process is independent and could write new entries at the same time.
We would really welcome either a batch delete or any other performance improvement.
Otherwise we would have to choose another backend, which greatly reduces the benefit of using Splunk for this use case.
Cheers
Would an approach where you create a script to first list all the keys you want to delete and then loop through them, deleting one by one, be acceptable for you?
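For concreteness, a rough sketch of that loop against the REST endpoint is below, reusing the example app, collection, and credentials from earlier in the thread; as the reply that follows notes, one round trip per record tends to be slow for large collections.

import requests

BASE = ("https://localhost:8089/servicesNS/nobody/kvstoretest"
        "/storage/collections/data/kvstorecoll")
AUTH = ("admin", "changeme")

# List only the _key field of every record in the collection.
records = requests.get(BASE, params={"fields": "_key"},
                       auth=AUTH, verify=False).json()

# Delete each record individually by its _key (one HTTP request per record).
for record in records:
    requests.delete(f"{BASE}/{record['_key']}",
                    auth=AUTH, verify=False).raise_for_status()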
Hi somesoni2,
Thanks for your reply. It appears that looping through, let's say, a few hundred or a few thousand records will be very slow, as experienced by someone else in the community:
https://answers.splunk.com/answers/431947/how-can-i-delete-kvstore-keys-at-high-speed.html