Deployment Architecture

Can you configure a Universal Forwarder output to send to two separate Heavy Forwarders?

Log_wrangler
Builder

I need to send two copies of events to two different HFs (not load-balanced).

I want a UF on a server to send events to an HF, which will forward cooked data to the indexers, and I also want the UF to send the same events to a different HF, which will forward raw (uncooked) events to a third party.

Can the UF handle sending the data twice?

Thank you

1 Solution

markusspitzli
Communicator

Hey.

This documentation will help you: https://docs.splunk.com/Documentation/Splunk/latest/Forwarding/Routeandfilterdatad

Basically you have to configure two different destinations in outputs.conf:

[tcpout]
defaultGroup=myroute

[tcpout:myroute]
disabled=false
server=10.1.12.1:9997

[tcpout:anotherroute]
disabled=false
server=10.1.12.2:9997
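Side note on the raw-to-third-party part of the question: if the second destination should ultimately receive uncooked data, outputs.conf also supports sendCookedData per tcpout group. Whether you need it here depends on where the raw conversion happens (the HF in the middle, or the forwarder itself); a sketch of the setting on the second group would be:

[tcpout:anotherroute]
disabled=false
server=10.1.12.2:9997
# send plain raw data instead of Splunk's cooked format
sendCookedData=false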

Then you have to configure props.conf to specify which sourcetype, host, or source you want the data cloned for:

[mysourcetype]
TRANSFORMS-routing = routing

[host::myhost]
TRANSFORMS-routing = routing

[source::/var/log/messages]
TRANSFORMS-routing = routing

Of course, you also have to configure transforms.conf:

[routing]
REGEX=(.)
DEST_KEY=_TCP_ROUTING
FORMAT=myroute,anotherroute

That should do the job.
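One caveat: TRANSFORMS- rules in props.conf run at parse time, so the clone-via-transforms approach above applies on a Heavy Forwarder (or indexer), not on a Universal Forwarder, which does not parse events. If the cloning has to happen on the UF itself, a simpler sketch (assuming you want full copies of all data, with no per-sourcetype filtering) is to list both target groups in defaultGroup, which sends a complete copy to each group:

[tcpout]
# listing multiple groups clones all data to each of them
defaultGroup = myroute, anotherroute

[tcpout:myroute]
server = 10.1.12.1:9997

[tcpout:anotherroute]
server = 10.1.12.2:9997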

