We have the Splunk Add-on for AWS installed on one of our heavy forwarders (HFs). Can we install the same add-on on another HF and create the same inputs, so that when one HF is down the other sends the data? Is this possible, or is there another workaround?
Thanks in Advance
Don't do this.
Yes, you would end up with duplicate events.
A better solution is to install multiple HFs with a different selection of inputs on each. (Depending on your AWS footprint, maybe you configure all inputs for a single region on each HF?)
This won't give you fault tolerance if a single HF fails, but it will reduce the amount of disruption as you would still collect data from the surviving HFs.
The AWS TA is a heavyweight app, particularly if you collect all AWS data sources for a large number of resources.
Spreading the load across a pool of HFs like this can stop any single HF getting too bogged down.
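To make the per-region split concrete, a rough sketch of what each HF's inputs.conf for the add-on might look like. Exact stanza and parameter names depend on the input type and add-on version, and the account name, regions, and index here are placeholders, so treat this as illustrative only:

```ini
# HF-1: collects only us-east-1 (sketch; stanza/parameter names
# vary by input type and add-on version)
[aws_cloudwatch://cw_us_east_1]
aws_account = prod_account        # placeholder account name
aws_region = us-east-1
index = aws

# HF-2: collects only eu-west-1
[aws_cloudwatch://cw_eu_west_1]
aws_account = prod_account
aws_region = eu-west-1
index = aws
```

Because each input exists on exactly one HF, no event is collected twice; if one HF fails, you lose only that HF's regions until it recovers.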
Thank you for your explanation
Hi,
You need to use load balancing to address forwarder failover.
The documentation below will help:
https://docs.splunk.com/Documentation/Splunk/latest/Forwarding/Setuploadbalancingd
https://docs.splunk.com/Documentation/Forwarder/8.0.1/Forwarder/Configureloadbalancing
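Note that these docs cover load balancing a forwarder's output across multiple indexers, which protects the sending side rather than the AWS inputs themselves. A minimal outputs.conf sketch (the indexer hostnames are placeholders):

```ini
[tcpout]
defaultGroup = primary_indexers

[tcpout:primary_indexers]
# The forwarder rotates across these indexers; if one goes down,
# it keeps sending to the survivors
server = idx1.example.com:9997, idx2.example.com:9997
autoLBFrequency = 30
```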
@dindu thank you for your reply. But creating the same inputs on both HFs does create duplicate events, right?
Hi,
Yes, the Splunk indexer treats each input as distinct and will index the same data twice.
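If identical inputs were ever running on both HFs in parallel, you can confirm the duplication with a quick search like the following (the index and sourcetype are placeholders for whatever your AWS inputs write to):

```
index=aws sourcetype=aws:cloudtrail
| stats count by _raw
| where count > 1
```

Any results indicate events that were indexed more than once.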