Getting Data In

Splunk Web server recv-q filling up, unable to connect

packland
Path Finder

I have a heavy forwarder running on a dedicated RHEL 7.5 server, and I'm trying to connect to its web interface on port 8000. I have tested this port from the client machine, and by all accounts there is sufficient network access for this to work. When I try to connect, the browser spins indefinitely waiting for a response.

After running netstat on the heavy forwarder, I found that every time I initiate a connection I can see the web server process's Recv-Q go up, but the process never seems to do anything with those requests. I've searched all the relevant system and application log files but can't find any indication of what's going on.
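
For reference, something along these lines is what shows the listener's Recv-Q climbing (port 8000 and the stock net-tools/iproute2 commands assumed):

# Recv-Q for the listening socket on port 8000 (ss from iproute2)
ss -ltn | grep ':8000'
# same with net-tools netstat; Recv-Q is the second column
netstat -tln | grep ':8000'
# watch it live while a browser connection is attempted
watch -n 1 "ss -ltn | grep ':8000'"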

I have restarted both the Splunk application and the server itself with no change.

The only other thing pointing towards the problem is that when I restart the Splunk service, it gets stuck on "Waiting for web server to be available..."
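
For anyone wondering which logs, the standard Splunk Web and splunkd logs are the obvious places to look (paths assume a default $SPLUNK_HOME install), e.g. tailing them during a restart:

tail -f $SPLUNK_HOME/var/log/splunk/web_service.log $SPLUNK_HOME/var/log/splunk/splunkd.log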

Has anyone experienced this before? I'm not sure what else I can test to reveal the issue. I have multiple other identically configured Heavy Forwarders that are working fine.

woodcock
Esteemed Legend

Check for this setting in server.conf (it should NOT be present):

[httpServer]
disableDefaultPort = true
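
To see whether that setting is present anywhere, and which .conf file it comes from, btool works (assuming the usual $SPLUNK_HOME/bin location):

$SPLUNK_HOME/bin/splunk btool server list httpServer --debug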

woodcock
Esteemed Legend

Try disabling SELinux and iptables (AKA firewalld) and retest; these often block network traffic and need the appropriate configuration to allow it. Also, make sure that nothing else is using that port (including a zombie Splunk process); use netstat for this.
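
Something like the following covers those checks on RHEL 7 (port 8000 assumed; the -p flags need root):

# SELinux: check the current mode, then temporarily set permissive to rule it out
getenforce
setenforce 0
# firewall state
systemctl status firewalld
# anything already bound to 8000, and which process owns it
ss -ltnp | grep ':8000'
netstat -tlnp | grep ':8000'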

packland
Path Finder

I had suspected host-level firewalls as well, but I can confirm these aren't the issue. I've searched for any processes using that port, and for zombie Splunk processes, and found nothing. I will note that when the Splunk processes start, they successfully grab and bind to that port; it's just that anything sent to that service sits in the Recv-Q and never gets processed. Any other suggestions as to what it could be? I'm thinking at this point I might need to reinstall Splunk...
