Deployment Architecture

Do I need to have autoLB on the search head?

scottrunyon
Contributor

I am getting the following message:

Forwarding to indexer group default-autolb-group blocked for 100 seconds

I am running a single search head with two indexers and 42 Universal Forwarders. I have [tcpout] and [tcpout:default-autolb-group] configured in Splunk\etc\system\local\outputs.conf on the search head, with server= pointing to the two indexers. My question is: do I need to have autoLB defined on the search head?
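For reference, a configuration like the one described would look roughly like the sketch below. This is an illustrative example only; the indexer hostnames and port 9997 are placeholders, not values from the original post.

[tcpout]
defaultGroup = default-autolb-group

[tcpout:default-autolb-group]
server = indexer1.example.com:9997, indexer2.example.com:9997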

1 Solution

scottrunyon
Contributor

An answer is no longer needed. We have since moved to a single-instance Splunk deployment, so load balancing to indexers is not needed.


phadnett_splunk
Splunk Employee

Hi scottrunyon, autoLB defaults to true, so it is already configured and there is no need to set it manually. Hope this helps!

http://docs.splunk.com/Documentation/Splunk/latest/Admin/Outputsconf

autoLB = true
* Automatic load balancing is the only way to forward data. Round-robin method is not supported anymore.
* Defaults to true.
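In other words, explicitly setting autoLB in the target-group stanza is redundant. As a rough sketch (the hostnames and port below are placeholders), the stanza behaves the same whether or not the autoLB line is present:

[tcpout:default-autolb-group]
server = indexer1.example.com:9997, indexer2.example.com:9997
# autoLB = true is already the default, so this line can be omitted
autoLB = true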


scottrunyon
Contributor

It looks to me that in the outputs.conf documentation, load balancing is described for forwarders. My question concerns the search head: is load balancing needed on a single search head with two indexers? In other words, can I remove the load-balancing configuration and stop the message?
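If the intent is simply to stop the search head from forwarding at all, one possible approach is sketched below. This is an assumption-laden example, not a confirmed fix: removing the [tcpout] stanzas from Splunk\etc\system\local\outputs.conf and restarting Splunk is the most direct route, and the disabled setting shown here should be verified against the outputs.conf spec for your version.

[tcpout]
# Assumed setting: disables forwarding from this instance entirely;
# deleting the [tcpout] stanzas achieves the same end.
disabled = true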
