Deployment Architecture

Can you help me with the following error on one of my indexer peers in a cluster: "Failed to add peer...to the master"?

dkeck
Influencer

Hi,

Yesterday I ran into a weird error: one of my indexer peers was not able to register with the master. This did not happen while adding a new indexer; it started seemingly at random at around 4 pm and lasted for about a couple of hours.

Search peer xxx has the following message: Failed to register with cluster master reason: failed method=POST path=/services/cluster/master/peers/?output_mode=json master=xxxx rv=0 gotConnectionError=0 gotUnexpectedStatusCode=1 actual_response_code=500 expected_response_code=2xx status_line="Internal Server Error" socket_error="No error" remote_error=Cannot add peer=xxxx mgmtport=8089 (reason: non-zero pending job count=1, guid=966B9726-FD5A-4788-9797-E68BD4Exxxx). [ event=addPeer status=retrying AddPeerRequest: { _id= active_bundle_id=73203EA46E3A315FE912143694xxxx add_type=ReAdd-As-Is base_generation_id=68609 batch_serialno=1 batch_size=36 forwarderdata_rcv_port=9997 forwarderdata_use_ssl=0 last_complete_generation_id=69387 latest_bundle_id=73203EA46E3A315FE912143694xxxx mgmt_port=8089 name=966B9726-FD5A-4788-9797-E68BD4Exxxx register_forwarder_address= register_replication_address= register_search_address= replication_port=8080 replication_use_ssl=0 replications= server_name=xxxx site=site1 splunk_version=7.0.7 splunkd_build_number=b803471b1c68 status=Up } ].

On Master: Failed to add peer 'guid=966B9726-FD5A-4788-9797-E68BD4EBxxx server name=x ip=x to the master. Error=non-zero pending job count=1, guid=966B9726-FD5A-4788-9797-E68BDxxxxx
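For reference, the master's current view of its peers can be checked with the clustering CLI or the same REST endpoint that appears in the error; the hostname and credentials below are placeholders, so take this as a rough sketch rather than output from my environment:

    # On the cluster master (CLI):
    splunk show cluster-status

    # Or query the peers endpoint directly over the management port:
    curl -k -u admin "https://<master-host>:8089/services/cluster/master/peers?output_mode=json"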

The error has not come back for now, but I would like to understand what happened.

Could someone point me in the right direction?


prakash007
Builder

@dkeck You can go through this Splunk Answers post as a reference:
https://answers.splunk.com/answers/406947/why-am-i-getting-failed-to-add-peer-to-the-master.html

It might be a network blip in your case. Also, the timestamp in the message dropdown on your search heads or DMC reflects when the message was shown to you, not when it was generated (I would dismiss it and see if more come in). Make sure to check your internal logs for the exact time of the event; a sample search is sketched below.
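To pin down the exact time, something along these lines against the internal index (run on the master or a search head that can see it) should surface the clustering events; the component names can vary between versions, so treat this as a sketch:

    index=_internal sourcetype=splunkd (component=CMMaster OR component=CMSlave) ("Failed to add peer" OR "non-zero pending job count")
    | sort _time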
