Getting Data In

How do I send indexed data older than 3 months to colddb monthly?


I have indexed logs from more than nine months ago in the default directory: $SPLUNK_DB\dbcustom1\db
I would like to run a task every month that sends all logs older than 3 months to $SPLUNK_DB\dbcustom1\colddb

I would appreciate a clue please.




Have a look at this article, which nicely explains the Splunk bucket life cycle and the various factors involved in moving a bucket to the next stage.

My suggestion would be to adjust maxDataSize/maxHotSpanSecs to roll hot buckets to warm, and then to tune the maxWarmDBCount and maxHotBuckets values so that hot and warm buckets together hold only 90 days' worth of data before rolling out to cold.


maxHotSpanSecs = 86400    (daily roll of hot to warm)
maxWarmDBCount = 87       (total hot+warm buckets are 90, so 90 days' worth of data)

NOTE: Ensure that maxTotalDataSizeMB and frozenTimePeriodInSecs are sufficiently high that cold buckets do not get rolled over to frozen.
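As a concrete sketch, and assuming the index is named custom1 with the paths from the question, the indexes.conf stanza could look like this (the size and retention values are placeholders to adjust for your environment):

```ini
[custom1]
homePath   = $SPLUNK_DB/dbcustom1/db
coldPath   = $SPLUNK_DB/dbcustom1/colddb
thawedPath = $SPLUNK_DB/dbcustom1/thaweddb

# Roll hot buckets to warm once a day
maxHotSpanSecs = 86400

# Keep roughly 90 days of hot+warm buckets before they roll to cold
maxWarmDBCount = 87

# Keep these high enough that cold buckets are not frozen prematurely
# (188697600 seconds is the default, about 6 years)
maxTotalDataSizeMB = 500000
frozenTimePeriodInSecs = 188697600
```

Changes to indexes.conf take effect after a restart of the indexer (or a rolling restart of the cluster).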


Hi somesoni2
Thanks for your answer.
I'm testing your example with the following parameters:
maxHotSpanSecs = 3600
maxWarmDBCount = 21

After 15 hours I got results, but I have a question.
As I understand it, once the total bucket count is reached, buckets will be sent to colddb. But what happens if Splunk restarts the service within a maxHotSpanSecs interval? Unless I'm wrong, a new hot bucket is created on restart, so this would affect the total bucket count.
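To see how the buckets are actually distributed after a restart, the dbinspect search command will list each bucket with its state and time range (assuming the index is named custom1, as in this test):

```
| dbinspect index=custom1
| table bucketId, state, startEpoch, endEpoch, path
| sort startEpoch
```

Comparing the number of hot and warm buckets before and after a restart should show whether the restart created an extra hot bucket and shifted when buckets roll to cold.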

