Splunk Enterprise

Changing License Manager serverName affects SHC

stei-f
Loves-to-Learn

I have a clustered Splunk environment where the License Manager has a non-existent (cannot be resolved via name lookup) serverName configured in etc/system/local/server.conf -> serverName. It has been running like this for some time.
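For reference, this is roughly the stanza in question in $SPLUNK_HOME/etc/system/local/server.conf (the name shown here is just a placeholder for the unresolvable value):

    [general]
    serverName = some-old-unresolvable-name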

But this is introducing issues with license monitoring in the Monitoring Console. To eliminate the issue and bring this Splunk instance in line with the other existing instances, I tried simply changing the serverName in server.conf to the hostname and restarting the Splunk service.

The Splunk service starts without complaints, but the Monitoring Console suddenly reports all the Search Heads as unreachable.
Querying the Search Heads for shcluster-status results in errors.
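For context, this is roughly how I queried them (standard Splunk CLI on a SHC member); normally it lists the captain and the members, but here it only returned errors:

    $SPLUNK_HOME/bin/splunk show shcluster-status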

Reverting to the old name and restarting fixes the Search Head unreachable issue and status.

This License Manager server has the following roles:
* License manager
* (Monitoring Console)
* Manager Node

I do not see why this change would affect the Search Heads. The Indexers are fine, and the Deployer is a different server. I found documented issues (for this kind of change) for Indexers and the Monitoring Console itself, and notes that it can have side effects for the Deployment Server, but no real hit on Search Heads/SHC.

As I do not have permanent access to this instance, I have to prepare some kind of remediation plan, or at least an analysis.

I'm looking for hints on where to start my investigation. Maybe someone has successfully changed a License Manager name. I'm hoping I'm missing something obvious.

Thanks


isoutamo
SplunkTrust

Hi

As you have the Cluster Manager on the same instance as your LM, this serverName is actually your CM's name, not only your LM's name.

I assume that you are also using e.g. indexer discovery and other features in your environment?

I suppose that when you change that serverName, all the entities which are connected to your CM could have issues, as the serverName they are expecting no longer exists.
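For example, cluster peers and search heads typically point at the CM with something like this in server.conf (hostname and port are placeholders; older versions use master_uri instead of manager_uri), so a name mismatch on the CM side can confuse them:

    [clustering]
    mode = searchhead
    manager_uri = https://cm.example.com:8089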

It's hard to say what all of those are without a deeper look into your environment.

r. Ismo


stei-f
Loves-to-Learn

You are correct. I had found and prepared steps for fixing the Indexers, but they seemed fine.

The configuration for manager_uri and the like is largely based on IPs (which is another topic on its own), and the IP did not change. So the endpoints should still be able to reach the "modified" server (but may expect a different response).
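To illustrate, the references look roughly like this (IP is a placeholder; older installs use master_uri), e.g. the license stanza on the peers:

    [license]
    manager_uri = https://192.0.2.10:8089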

I have to dig into indexer discovery (and the like); I did not prepare for that. According to my documentation, it is not configured.
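If indexer discovery were configured, I would expect to find something along these lines (stanza name, URI and key are made up for illustration), which is what I plan to grep for:

    # server.conf on the Cluster Manager
    [indexer_discovery]
    pass4SymmKey = <some_key>

    # outputs.conf on the forwarders
    [indexer_discovery:cm]
    manager_uri = https://192.0.2.10:8089
    pass4SymmKey = <some_key>

    [tcpout:primary_indexers]
    indexerDiscovery = cm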


isoutamo
SplunkTrust
IMHO, I never use IPs when I configure Splunk infra nodes (I made that mistake once). My primary approach is to use a native DNS service where I put/update node names like xx-IDX-a-1 with an FQDN. Another option is to use static CNAMEs, and the last option is to use the hosts file on the nodes. That way you can do most admin operations without any service breaks.
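For example, a sketch of the hosts-file fallback (name and IP are placeholders); with DNS or a CNAME the idea is the same: the conf files reference the stable name, and only the record changes when a box is replaced:

    # /etc/hosts on the Splunk nodes
    192.0.2.10   splunk-cm.example.com splunk-cm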

livehybrid
Influencer

Hi @stei-f 

It's very odd that this would only affect the SHs, especially as any outbound connection from the Monitoring Console shouldn't be impacted by a change to the MC server name.

From the Monitoring Console, if you go to Settings -> General Setup, what does that screen look like? Do you see the remote SHs in there?

 


stei-f
Loves-to-Learn

I'm pretty sure the SHs were in "Settings -> General Setup" (listed as remote instances), as I wanted to apply the config so the name change would be applied to the app lookups (splunk_ apps). At that time I was still thinking the unreachable status was a timing/communication thing. So to verify my point, I checked shcluster-status (CLI) on the SHs, only to discover that the SHC had failed (I was not able to query the state). That's when I chickened out and reverted the configuration.

I will add this to my checklist.

In reflection, I messed up. I failed to take evidence of the situation (e.g. screenshots and error messages), focusing instead on restoring the service.
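Next time I will capture the state first before touching anything, e.g. with standard CLI checks like these (the first two on the LM/CM, the last on an SHC member):

    $SPLUNK_HOME/bin/splunk show servername
    $SPLUNK_HOME/bin/splunk btool server list general --debug
    $SPLUNK_HOME/bin/splunk show shcluster-status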
