Getting Data In

How do I send indexed data older than 3 months to colddb monthly?


I have indexed logs from more than nine months ago in the default directory: $SPLUNK_DB\dbcustom1\db
I would like to run a task every month that sends all logs older than 3 months to $SPLUNK_DB\dbcustom1\colddb

I would appreciate a clue, please.




Have a look at this article, which nicely explains the Splunk bucket life cycle and the various factors involved in moving a bucket to the next stage.

My suggestion would be to adjust maxDataSize/maxHotSpanSecs to roll hot buckets to warm, and then adjust maxWarmDBCount and maxHotBuckets so that hot and warm buckets together hold only 90 days' worth of data before rolling to cold.


maxHotSpanSecs = 86400   (daily roll of hot buckets to warm)
maxWarmDBCount = 87      (with the default maxHotBuckets = 3, hot + warm buckets total 90, i.e. 90 days' worth of data)

NOTE: Ensure that maxTotalDataSizeMB and frozenTimePeriodInSecs are set sufficiently high so that cold buckets do not roll over to frozen.
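Pulling the settings above together, a minimal indexes.conf sketch for this scenario might look like the following. The stanza name dbcustom1 and the two size/retention values at the bottom are assumptions for illustration, not taken from the original post:

```ini
# indexes.conf -- sketch only; stanza name and the size/retention values are assumptions
[dbcustom1]
homePath   = $SPLUNK_DB/dbcustom1/db
coldPath   = $SPLUNK_DB/dbcustom1/colddb
thawedPath = $SPLUNK_DB/dbcustom1/thaweddb

# Roll a hot bucket to warm once it spans one day of data
maxHotSpanSecs = 86400

# With the default maxHotBuckets = 3, keeping 87 warm buckets means
# hot + warm together hold roughly 90 days of data; when the warm
# count is exceeded, the oldest warm bucket rolls to cold (colddb)
maxWarmDBCount = 87

# Keep these high enough that cold buckets are not rolled to frozen
frozenTimePeriodInSecs = 188697600
maxTotalDataSizeMB     = 500000
```

Note that bucket rolling is approximate: buckets roll by count and span rather than by a strict 90-day cutoff, so the boundary between warm and cold can drift by up to one bucket's span.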


Hi somesoni2,
Thanks for your answer.
I'm testing your example with the following parameters:
maxHotSpanSecs = 3600
maxWarmDBCount = 21

After 15 hours I got results, and I have a question. My understanding is that once the total bucket count is reached, the oldest buckets are rolled to colddb. But what happens if the Splunk service is restarted within a maxHotSpanSecs interval? If I'm not wrong, a new hot bucket is created on restart, so this would affect the total bucket count.
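To see how restarts affect your bucket counts, you can inspect the bucket states directly. A search along these lines (the index name dbcustom1 is an assumption) lists each bucket with its state and time range:

```
| dbinspect index=dbcustom1
| table bucketId, state, startEpoch, endEpoch, path
```

A restart does roll hot buckets to warm and open new hot buckets afterwards, so the warm count can reach maxWarmDBCount earlier than a pure time-based calculation would suggest.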

