Splunk Enterprise

How do I create a secondary backup server for the Splunk App for DB Connect?

tarunmalhotra79
Engager

Hi Team,

There was a recent failure on one of our hosts where we have hosted the Splunk App for DB Connect. As a result, we lost data for almost 5-6 hours.

I want to create a secondary backup server so that, if something goes wrong with our present server, the inputs ideally start working on the secondary server.

Also, I have realized that decrypting the passwords and entering them with a script does not work; the last time I tried, I had to enter all the passwords manually.

Could anyone please guide me or point me in the right direction on how to achieve this?

Regards,


thambisetty
SplunkTrust

Complete failover automation is not possible, but you can use a workaround.

A manual Splunk start is required on the secondary instance, where Splunk_app_dbconnect will sit idle.

1. Open firewall rules from the secondary instance to all destinations from which the active instance is collecting records.
2. Configure all inputs on the secondary instance.
3. Do not start the Splunk service on the secondary instance.
4. rsync the DB checkpoints available at $SPLUNK_HOME/var/lib/splunk/modinputs/splunk_app_dbconnect/* every minute from the active instance to the secondary instance (see the sketch after this list).
5. Configure an alert to fire if the splunkd service on the primary instance goes down.
6. Have a simple alert action, called by that alert, that starts Splunk on the secondary instance (also covered in the sketch below).
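To make steps 4 and 6 concrete, here is a minimal Python sketch. It assumes passwordless SSH from the secondary to the active host as a "splunk" user, rsync installed on both hosts, $SPLUNK_HOME at /opt/splunk, and a placeholder hostname; none of these names come from DB Connect itself, so adjust them for your environment and test before relying on it.

    #!/usr/bin/env python3
    # failover_helper.py - hypothetical sketch, not part of DB Connect.
    # Assumes: passwordless SSH to the active instance as user "splunk",
    # rsync installed, and SPLUNK_HOME at /opt/splunk on both hosts.
    import subprocess
    import sys

    ACTIVE_HOST = "splunk-active.example.com"   # placeholder for your primary DB Connect host
    SPLUNK_HOME = "/opt/splunk"
    CHECKPOINT_DIR = SPLUNK_HOME + "/var/lib/splunk/modinputs/splunk_app_dbconnect/"

    def sync_checkpoints():
        """Pull the DB Connect checkpoints from the active instance (step 4).
        Run this from cron every minute while the primary is healthy."""
        subprocess.check_call([
            "rsync", "-az", "--delete",
            "splunk@{}:{}".format(ACTIVE_HOST, CHECKPOINT_DIR),
            CHECKPOINT_DIR,
        ])

    def start_local_splunk():
        """Start Splunk on this (secondary) instance so its DB inputs take over (step 6)."""
        subprocess.check_call([SPLUNK_HOME + "/bin/splunk", "start",
                               "--accept-license", "--answer-yes"])

    if __name__ == "__main__":
        # "sync" argument: periodic checkpoint copy (step 4, e.g. from cron).
        # No argument: called by the primary-down alert to bring the secondary up (step 6).
        if len(sys.argv) > 1 and sys.argv[1] == "sync":
            sync_checkpoints()
        else:
            start_local_splunk()

Run it with the "sync" argument from cron every minute for step 4, and without arguments as the alert-action script for step 6.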

You have to test this before making any decisions.

This is only my idea; I have not actually tested it.

————————————
If this helps, give a like below.

thambisetty
SplunkTrust

I ran into the same situation today. I followed the steps given in the previous answer, and they worked as expected.

————————————
If this helps, give a like below.

richgalloway
SplunkTrust
Splunk doesn't have a high-availability solution for apps.
You can stand up a warm standby Splunk instance with DB Connect installed on it and use rsync (or a similar utility) to copy data from the primary to the standby. I don't know the exact files you'd have to synchronize.
The takeover process would be manual or automated outside of Splunk. Likewise, when the original server comes back online, you will have to take steps to ensure only one instance performs inputs, or you'll risk duplicate data.
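As a rough, untested sketch of that "only one instance performs inputs" guard, the standby could refuse to bring its inputs online while the primary's splunkd management port still answers. The hostname and port below are assumptions for illustration, not anything DB Connect provides:

    # standby_guard.py - hypothetical duplicate-input guard, not a Splunk feature.
    import socket
    import sys

    PRIMARY_HOST = "splunk-active.example.com"   # placeholder for the original DB Connect server
    MGMT_PORT = 8089                             # default splunkd management port

    def primary_is_up(timeout=5):
        """Return True if the primary's splunkd management port accepts a TCP connection."""
        try:
            with socket.create_connection((PRIMARY_HOST, MGMT_PORT), timeout=timeout):
                return True
        except OSError:
            return False

    if __name__ == "__main__":
        # Refuse to start the standby's inputs while the primary still answers,
        # so only one instance performs DB inputs at a time.
        if primary_is_up():
            sys.exit("Primary splunkd is reachable; not starting standby inputs.")
        print("Primary is down; safe to start the standby instance.")

You could call a check like this at the top of any takeover script, and again before re-enabling the original server's inputs.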
---
If this reply helps you, Karma would be appreciated.