Deployment Architecture

Forwarding existing events


I think I'm having a duh moment... I thought I could configure a new instance of Splunk, set it to receive, then configure my existing main indexer to forward to it, and have it send everything it knows to the new instance -- essentially making a backup of the existing database and creating a hot-backup kind of setup. Well, I did all that, and events are flowing from the main to the "backup", but existing events are not; only new ones that are coming in. Everything appears to be configured correctly, and it's working; hence the 'duh' moment. Was I wrong when I thought it would forward everything it has? A conceptual error on my part?

~embarrassed to ask...

Yes, I know about the brute-force method of backups with flushing, copying, restarting...



To make the forwarder resend everything it has, you can reset the fishbucket on the forwarder. The "fishbucket" is where Splunk stores all the data about the files it is monitoring. To do that, just stop Splunk on the forwarder (no need to stop the indexers) and then find the fishbucket directory - usually it is under $SPLUNK_HOME/var/lib/splunk. Remove the fishbucket directory and all subdirectories. Restart the forwarder, and it will start from the beginning of all files that it is monitoring. (It will create a new fishbucket when it restarts.)
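The steps above can be sketched as a short shell session. This is a minimal sketch, assuming a default install location; `/opt/splunkforwarder` is an assumption, so substitute your actual `$SPLUNK_HOME`.

```shell
# Reset the fishbucket so the forwarder re-reads all monitored files
# from the beginning. /opt/splunkforwarder is an assumed default path.
SPLUNK_HOME="${SPLUNK_HOME:-/opt/splunkforwarder}"
FISHBUCKET="$SPLUNK_HOME/var/lib/splunk/fishbucket"

# 1. Stop only the forwarder (the indexers can keep running):
#    "$SPLUNK_HOME/bin/splunk" stop

# 2. Remove the fishbucket directory and all its subdirectories:
rm -rf "$FISHBUCKET"

# 3. Restart the forwarder; it recreates an empty fishbucket and
#    starts again from the beginning of every monitored file:
#    "$SPLUNK_HOME/bin/splunk" start
```

The `splunk stop`/`start` calls are shown as comments so the destructive `rm` is the only live command; run them in order on the forwarder itself.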

Of course, the Splunk forwarder can't resend ephemeral data such as the results of scripted inputs or anything received over network inputs (those aren't files it can re-read), so be careful about that!


Egads! I've been using Splunk for five years now, and I totally forgot about the fishbucket...




Not really embarrassing; a forwarder won't forward any existing indexed data. You could copy the index files over to the new server and let it append the new events from now on.
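The copy-the-index approach might look something like the following. This is a hedged sketch, not a definitive procedure: the paths assume a default install with the `main` index in `defaultdb`, `newhost` is a placeholder, and Splunk should be stopped on the source so the buckets are quiescent before copying.

```shell
# Sketch: copy existing index buckets to the new instance, then let
# forwarding (already configured) append new events. All paths and the
# host name are assumptions for a default install.
SRC="${SRC:-/opt/splunk/var/lib/splunk/defaultdb}"   # "main" index on the old indexer
DST="${DST:-/opt/splunk/var/lib/splunk/defaultdb}"   # same layout on the new instance

# 1. Stop Splunk on the source (and ideally the destination) first:
#    /opt/splunk/bin/splunk stop
# 2. Push the index directory to the new server, e.g.:
#    rsync -a "$SRC/" "newhost:$DST/"
# 3. Start both instances; the new one now serves the copied history
#    while newly forwarded events are appended on top.
```

The actual `rsync` is left as a comment since it targets a remote host; any faithful directory copy works, as long as the buckets aren't being written while you copy them.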