Getting Data In

Why does the forwarder stop sending data to all configured TCP connections when one connection is not available?

dami_rel
Engager

Hello,

I'm new to Splunk and hope you can help me with this problem.
I'm using a Universal Forwarder to send data from Server X to Splunk server A and to a third-party Server B.
When the third-party Server B closes its TCP port for any reason, the Splunk forwarder on Server X also stops sending data to Splunk server A.

How can I prevent the forwarder on Server X from closing the stream to Splunk server A?

My forwarder config on Server X is as follows:

inputs.conf:

[default]
host = TEST
[monitor:///var/log/list.log]
disabled=false
sourcetype=log_iedge
index=vo

[monitor:///var/log/lstat.log]
disabled=false
sourcetype=log_lstat
index=vo

[monitor:///var/log/ISDM.log]
disabled=false
_TCP_ROUTING = Server_B

outputs.conf:

[tcpout]
defaultGroup = default-autolb-group, Server_B

[tcpout:default-autolb-group]
server = A.A.A.A:9998

[tcpout:Server_B]
server = B.B.B.B:9981
sendCookedData = false

felipesewaybric
Contributor

I had the same problem once. It's like the UF can't send to one peer, so it just stops, since it can't deliver the data. I use load balancing in a clustered environment.
Check these docs:
https://docs.splunk.com/Documentation/Splunk/7.1.1/Forwarding/Setuploadbalancingd
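
For reference, a minimal outputs.conf sketch of load balancing within one output group (the second indexer C.C.C.C:9998 is hypothetical, only to illustrate the syntax):

[tcpout]
defaultGroup = default-autolb-group

[tcpout:default-autolb-group]
# The forwarder automatically load balances across every server in this list
server = A.A.A.A:9998, C.C.C.C:9998
# Optional: how often (in seconds) the forwarder switches between targets
autoLBFrequency = 30

Note that this only covers availability of the indexers inside that one group; it does not by itself change how a separate, blocked output group is handled.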

0 Karma

nkpiquette
Path Finder

Even with load balancing configured, though, if B refuses or can't receive the data, the forwarder does not send the data to A, even if A has load balancing configured.
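
One setting that is sometimes suggested for this situation is dropEventsOnQueueFull, which tells the forwarder to drop events for a given output group once its queue has been full for N seconds, instead of blocking the whole output pipeline. A rough sketch against the original poster's config (the value 10 is an arbitrary example, and this trades blocking for data loss on the Server_B route, so treat it as something to test rather than a definitive fix; dropping Server_B from defaultGroup may also be worth considering if only ISDM.log is meant to reach it):

[tcpout]
defaultGroup = default-autolb-group

[tcpout:default-autolb-group]
server = A.A.A.A:9998

[tcpout:Server_B]
server = B.B.B.B:9981
sendCookedData = false
# If this group's queue stays full for 10 seconds, drop its events
# instead of blocking all output groups (the default, -1, blocks).
dropEventsOnQueueFull = 10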

0 Karma