Getting Data In

Why do I have so many established TCP connections from one host?

danielsofoulis
Path Finder

I have a Windows host (192.168.2.2) which has a universal forwarder installed and is set up to talk to my single-instance Splunk server.
I have added the Windows app with only two perfmon counters being monitored.
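
For reference, the perfmon inputs look roughly like the stanza below; the object and counter names here are only placeholders for illustration, not my actual two counters.

[perfmon://Processor]
object = Processor
counters = % Processor Time
instances = *
interval = 10
disabled = 0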

The Windows host runs Hyper-V, which hosts the Splunk instance.

As you can see below, I have normal connections from other hosts, but for some reason the Windows host has established multiple TCP connections to Splunk. The list of connections keeps growing, so I have to stop the splunkd service on the Windows host.

netstat -aon | grep 9997
tcp 0 0 0.0.0.0:9997 0.0.0.0:* LISTEN off (0.00/0/0)
tcp 0 0 192.168.2.3:9997 192.168.2.2:54228 ESTABLISHED keepalive (6993.74/0/0)
tcp 0 0 192.168.2.3:9997 192.168.2.2:54234 ESTABLISHED keepalive (7073.10/0/0)
tcp 0 0 192.168.2.3:9997 192.168.2.2:54241 ESTABLISHED keepalive (7132.44/0/0)
tcp 0 0 192.168.2.3:9997 192.168.2.2:54216 ESTABLISHED keepalive (6921.62/0/0)
tcp 0 0 192.168.2.3:9997 192.168.2.2:54217 ESTABLISHED keepalive (6940.68/0/0)
tcp 0 0 192.168.2.3:9997 192.168.2.4:34608 ESTABLISHED keepalive (4530.40/0/0)
tcp 0 0 192.168.2.3:9997 192.168.2.102:52379 ESTABLISHED keepalive (4516.28/0/0)
tcp 0 0 192.168.2.3:9997 192.168.2.2:54229 ESTABLISHED keepalive (7015.83/0/0)
tcp 0 0 192.168.2.3:9997 192.168.2.1:53925 ESTABLISHED keepalive (4518.96/0/0)
tcp 0 0 192.168.2.3:9997 192.168.2.2:54251 ESTABLISHED keepalive (7191.76/0/0)
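
The forwarder's side of these sessions can also be checked on the Windows host itself with built-in tools; a minimal sketch (the PID below is just a placeholder for whatever netstat reports for splunkd.exe):

netstat -ano | findstr :9997
tasklist /fi "PID eq 1234"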

I have tried restarting the SplunkForwarder service, and uninstalling and reinstalling the universal forwarder.
The connections close, but once I start or reinstall the forwarder, the TCP connections start building up again.

There is nothing special in my etc/system/local/outputs.conf

outputs.conf
[tcpout]
defaultGroup = default-autolb-group

[tcpout:default-autolb-group]
server = serlin001:9997

[tcpout-server://serlin001:9997]

Splunk Free Trial 6.5.2
Splunk Universal Forwarder 6.5.2


danielsofoulis
Path Finder

Not sure what caused this issue, but I suspect it was related to the OS. Previously I had restarted splunkd, which did not fix the issue. Later, after lodging this case, I restarted the operating system, which resolved the issue.


Masa
Splunk Employee

Sorry, but I do not have a concrete answer. Some troubleshooting steps would be:

  1. Check splunkd.log for ERROR or WARN messages.
  2. Check the TCP session status on the indexer side and see whether it matches what the forwarder shows (a sample search is sketched after this list).
  3. Check and monitor netstat to see whether all TCP sessions stay in the ESTABLISHED state, move to CLOSE_WAIT and linger there for a long time, or eventually close.
  4. Check the firewall and any anti-virus software, and try turning them off for testing.
  5. File a Splunk Support case for further troubleshooting.
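
For step 2, one way to get the indexer's view of the forwarder connections is the tcpin_connections metrics in _internal. Something like the search below is only a rough sketch (field names as they appear in metrics.log; substitute your forwarder's IP):

index=_internal source=*metrics.log* group=tcpin_connections sourceIp="192.168.2.2"
| stats count dc(sourcePort) AS distinct_ports latest(_time) AS last_seen by sourceIp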

Richfez
SplunkTrust

Is there anything interesting in the log files on either the UF or the Splunk server? If they're set up with the defaults, check /opt/splunk/var/log/splunk/splunkd.log on the server (assuming *nix since you used grep 🙂 ) and C:\Program Files\SplunkUniversalForwarder\var\log\splunk\splunkd.log on the forwarder. For the latter, try from an Administrator command prompt:

cd \program files\splunkuniversalforwarder\var\log\splunk\
findstr /i tcpout splunkd.log

Though just "type splunkd.log" will likely show you what you need (just scroll back a few pages from the end when it's done). Or even "type splunkd.log | clip" then you should be able to paste that into notepad and look.

I'm not positive what you should see there, but with any luck you'll find errors, warnings, or other text that seems related.
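
On the server side, a rough equivalent (assuming the default *nix path mentioned above) would be something like:

cd /opt/splunk/var/log/splunk
grep -iE "warn|error" splunkd.log | grep -i tcp | tail -n 50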
