I have indexed logs from more than nine months ago in the default directory:
And I wish to run a task every month so that all logs older than 3 months are moved to cold storage (colddb).
I would appreciate a clue, please.
Have a look at this article, which nicely explains the Splunk data bucket life cycle and the various factors involved in a bucket's movement to the next stage.
My guess would be to adjust maxDataSize/maxHotSpanSecs so that hot buckets roll to warm daily, and then adjust the maxHotBuckets and maxWarmDBCount values so that hot and warm buckets together hold only 90 days' worth of data before rolling out to cold.
maxHotSpanSecs = 86400 (daily roll of hot to warm)
maxHotBuckets = 3
maxWarmDBCount = 87 (hot + warm buckets total 90, i.e. 90 days' worth of data)
NOTE: Ensure that maxTotalDataSizeMB and frozenTimePeriodInSecs are sufficiently high so that cold buckets do not get rolled to frozen.
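Putting those settings together, a sketch of the corresponding indexes.conf stanza might look like this (the index name my_index is a placeholder, and the two size/retention values at the end are illustrative only, not recommendations):

```ini
# indexes.conf -- sketch only; "my_index" is a hypothetical index name
[my_index]
# Roll a hot bucket to warm after 24 hours (86400 seconds)
maxHotSpanSecs = 86400
# Keep at most 3 hot buckets open at a time
maxHotBuckets = 3
# Keep 87 warm buckets; hot + warm together cover roughly 90 days
maxWarmDBCount = 87
# Keep these high enough that cold buckets are not rolled to frozen
# prematurely (values below are illustrative)
maxTotalDataSizeMB = 500000
frozenTimePeriodInSecs = 188697600
```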
Thanks for your answer.
I'm testing your example with the following parameters:
maxHotSpanSecs = 3600
maxWarmDBCount = 21
I should get results in 15 hours, and I have one comment:
What I mean is that once the total bucket count is reached, buckets will be sent to colddb. But what happens if the Splunk service restarts within a maxHotSpanSecs interval? If I'm not wrong, a new hot bucket is created on restart, so this would affect the total bucket count.
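One way to observe how the buckets are actually distributed (for example, before and after a restart) is the dbinspect search command, which reports each bucket's state; the index name main here is a placeholder:

```
| dbinspect index=main
| stats count by state
```

Comparing the hot/warm/cold counts over time should show whether restarts are creating extra hot buckets and shifting the roll-to-cold point.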