Forwarder still forwarding to removed indexer

richard_whiffen
Explorer

I have a strange one. I'm migrating hosts to a different indexer to redistribute my workload. Last week from the command line on the boxes in question I did:


./splunk add forward-server 10.11.12.13:9997 -auth blah:blah
./splunk remove forward-server 10.9.8.7:9997 -auth blah:blah
./splunk restart

I just went back to check things today and I still see the host sending data to the removed server "10.9.8.7", even after a restart (I restarted again just to be sure). Further, when I tried to do the remove again, it said:


In handler 'tcpout-server': 10.21.168.37:9997 forwarded-server not found

So I'm sure it's not in the config the 'normal' way.

Searching through Splunkbase hasn't turned up much that's relevant. The clients in question are all running 4.1.x versions of Splunk, so I'm going to try upgrading to see if that fixes it, but I've never seen this before, where a host keeps forwarding events four days after the indexer was removed from the config.

Any thoughts on what's going on?

Cheers,
Rich

1 Solution

richard_whiffen
Explorer

There's probably a bug at play here, but I don't have time to dig further. On a 4.1 light forwarder with a relatively busy workload (multiple events per second), adding a forward-server and then removing the first one doesn't remove the first server's entry from /opt/splunk/etc/apps/search/local/outputs.conf.

I performed the same action on nine servers running the same Splunk release, all configured as light forwarders and all forwarding to the same indexer:


./splunk add forward-server 10.11.12.13:9997 -auth blah:blah
./splunk remove forward-server 10.9.8.7:9997 -auth blah:blah
./splunk restart

Six correctly stopped forwarding to the removed indexer. The three that are relatively busy never removed the old server's entry from /opt/splunk/etc/apps/search/local/outputs.conf.

The solution was to remove the stale entry by hand with an editor and restart Splunk on the forwarders.
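
For reference, the file looked roughly like this (a sketch; the tcpout-server stanza syntax is standard outputs.conf, but the group name below is a guess and may differ on your hosts):


# sketch of /opt/splunk/etc/apps/search/local/outputs.conf after the failed remove
[tcpout]
# group name is a guess; yours may differ
defaultGroup = default-autolb-group

# stale stanza the CLI failed to delete -- this is what I removed by hand
[tcpout-server://10.9.8.7:9997]

# the new indexer added last week
[tcpout-server://10.11.12.13:9997]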



kristian_kolb
Ultra Champion

Have you manually checked what's in your outputs.conf file(s)?

Not that I've run into this problem myself, but my guess is that you have more than one outputs.conf; since they are merged at runtime, your new settings may be overridden.

What happens if you type

/opt/splunk[forwarder]/bin/splunk btool outputs list

on your forwarder?
Does the old indexer still show up?
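
If btool is there, adding the standard --debug flag also prefixes each line of output with the file it came from, which should point straight at whichever file still references the old indexer:

/opt/splunk[forwarder]/bin/splunk btool outputs list --debug

The path shown in front of the stale tcpout-server stanza is the file to edit.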

/K


kristian_kolb
Ultra Champion

Well, I think you should have btool available. You didn't just copy/paste the line above, did you? The square brackets are just there to show the two possible paths for a default *nix install: the Universal Forwarder lives in /opt/splunkforwarder/... and the old-style Heavy/Lightweight Forwarder in /opt/splunk/...

You can expect to find an outputs.conf in some of the following places:

/opt/splunk[forwarder]/etc/apps/search/local

/opt/splunk[forwarder]/etc/apps/launcher/local

/opt/splunk[forwarder]/etc/system/local
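
A quick way to check all the likely spots at once (a sketch, assuming a default *nix install under /opt):

find /opt/splunk /opt/splunkforwarder -name outputs.conf 2>/dev/null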

good luck.

/k


richard_whiffen
Explorer

I'm running 4.1 as the forwarder, and it seems I don't have btool as an option. I do still see the old indexer if I do:

splunk list forward-server

Which I hadn't thought to do before... So both forward-servers are still active; I just need to grep -r to find out where the old one is coming from, I guess.
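
Something like this should turn it up (assuming the default /opt/splunk install path):

# list every .conf file under etc/ that still mentions the old indexer
grep -rl --include="*.conf" "10.9.8.7" /opt/splunk/etc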
