Getting Data In

Distributed Architecture: If data is sent to two heavy forwarders, how do you prevent duplicate logs?

Alteek
Explorer

Hi,

We are moving to a distributed architecture with 1 search head, 1 indexer and 2 heavy forwarders.

The idea is to forward logs from targets (syslog, universal forwarder) to both heavy forwarders, so that logs are never lost.

But how can we avoid log duplication on the indexer in this case? Is it handled automatically?
Or perhaps there is a better way to do it.

Many thanks,
Regards

1 Solution

MuS
SplunkTrust

Hi Alteek,

You can use the load balancing feature of the universal forwarder; check out the docs on "Configure forwarders with outputs.conf". This way you can avoid event duplication.
Regarding the syslog devices: use a DNS alias or DNS round robin that refers to both heavy forwarders, and use this DNS entry as the syslog target.
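
For illustration, here is a minimal outputs.conf sketch for the universal forwarder, assuming two hypothetical heavy forwarder hosts (hf1/hf2) listening on the conventional receiving port 9997 (adjust the group name, host names, and port to your environment):

[tcpout]
defaultGroup = heavy_forwarders

[tcpout:heavy_forwarders]
# Listing both heavy forwarders in one target group enables automatic
# load balancing: each event is sent to only one of them, so nothing
# is indexed twice, and the other host takes over if one goes down.
server = hf1.example.com:9997, hf2.example.com:9997
# How often (in seconds) the forwarder switches between targets.
autoLBFrequency = 30

For the syslog devices, they would simply send to the DNS alias; for example, in rsyslog (the alias name is just a placeholder):

# Forward everything over TCP (@@) to the alias that resolves to the heavy forwarders.
*.* @@syslog-hf.example.com:514
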

hope this helps to get you started ...

cheers, MuS

Alteek
Explorer

Thank you for your answer.

I'll try to use the load balancing feature of the universal forwarder, and I have found some interesting topics about Linux Heartbeat for the syslog case.

Regards
