
Deleting Data from KV Store

newbie2tech
Communicator

Hi Team,

We have a KV store with roughly 95 million records dating back 3 years.

The key of the KV store is a unique numeric field. We also have a timestamp among other fields.

We have a requirement to retain only 1 year's worth of data.

I would like to know the best way to get rid of the old data.

Also, going forward, is there a way to automatically drop any data older than 1 year, similar to index retention time?

We have a clustered search head and indexer environment.

We are at Splunk version 6.11.

Thank you!


darrenfuller
Contributor

Hi newbie2tech,

You need to create a MongoDB-formatted query (the KV store is MongoDB at its core, albeit a modified MongoDB, so you can't just point a MongoDB tool at it to manage it).

For example, if you have a query to look at your KV store like so:

 |inputlookup my_kv_store where LastUpdateTime<1551139200

in MongoDB query format, that would look like this:

 {"LastUpdateTime": {"$lt": 1551139200}}

Next, you need to URL-encode that query, using a tool like http://meyerweb.com/eric/tools/dencoder/, which turns this:

{"LastUpdateTime": {"$lt": 1551139200}}

Into this:

%7B%22LastUpdateTime%22%3A%20%7B%22%24lt%22%3A%201551139200%7D%7D
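If you would rather not paste the query into a web page, the same encoding can be done locally; here is a Python sketch using the standard library's urllib.parse.quote:

 # Sketch: URL-encode the MongoDB query string locally instead of using a web tool.
 from urllib.parse import quote

 raw = '{"LastUpdateTime": {"$lt": 1551139200}}'
 encoded = quote(raw, safe="")  # percent-encode every reserved character
 print(encoded)
 # %7B%22LastUpdateTime%22%3A%20%7B%22%24lt%22%3A%201551139200%7D%7D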

Lastly, run the following from a command line that supports curl and can reach your Splunk search head:

curl -k -u admin:changeme -X DELETE https://splunkhost:8089/servicesNS/nobody/<appname_where_kvstore_is_defined>/storage/collections/dat...

Make sure you replace the splunkhost and <appname_where_kvstore_is_defined> placeholders with the actual values for your environment, and paste your URL-encoded Mongo query after the query= part of the command.
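For reference, here is a Python sketch of the same DELETE call. The host, app, and collection names (splunkhost, my_app, my_kv_store) are placeholders, and the .../storage/collections/data/<collection>?query=... path is my completion of the truncated command above, so verify it against your environment before running anything.

 # Sketch: issue the same filtered DELETE against the KV store REST endpoint.
 # splunkhost, my_app, and my_kv_store are placeholders for your search head,
 # the app where the collection is defined, and the collection name.
 import json
 import requests
 from urllib.parse import quote

 query = {"LastUpdateTime": {"$lt": 1551139200}}
 url = (
     "https://splunkhost:8089/servicesNS/nobody/my_app"
     "/storage/collections/data/my_kv_store"
     "?query=" + quote(json.dumps(query), safe="")
 )

 # verify=False mirrors curl's -k; point this at a CA bundle if you have one.
 resp = requests.delete(url, auth=("admin", "changeme"), verify=False)
 resp.raise_for_status()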

Last point: measure twice, cut once. Maybe create a test KV store, put some dummy data in it, and test this before you run it against your 3 years of KV store data and accidentally blow it all away. A backup of the KV store might also be in order before this. (Call me paranoid.)

Good luck.
Darren
