Splunk Enterprise

Splunk UI is glitchy

_pravin
Contributor

Hi Splunkers,


I have a problem with one of our Splunk instances, where the navbar or the 'Edit'/'Export' buttons sometimes don't appear in the Splunk UI.

After multiple browser refreshes, the issue resolves itself.

The setup is three search heads (SHs) with a load balancer in front; based on availability, the LB routes each search to a free SH. Everything is deployed in the AWS cloud.

Has anyone faced similar problems with Splunk?


Thanks in advance.

Regards,

Pravin

1 Solution

livehybrid
SplunkTrust

Hi @_pravin 

Does the load balancer have stickiness/cookies set up so that all traffic for a user goes to the same destination? I have seen this issue when different requests end up going to different destination servers.
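
For illustration, if your LB is an AWS Application Load Balancer (an assumption, since you mentioned AWS), enabling target-group stickiness could look something like this minimal boto3 sketch; the target group ARN and cookie duration below are placeholders:

    # Sketch only: enable ALB-managed cookie stickiness on the SH target group.
    # The ARN is a placeholder; set the duration to at least your Splunk web
    # session lifetime so a user stays on one SH for the whole session.
    import boto3

    elbv2 = boto3.client("elbv2")
    elbv2.modify_target_group_attributes(
        TargetGroupArn="arn:aws:elasticloadbalancing:REGION:ACCOUNT:targetgroup/splunk-sh/PLACEHOLDER",
        Attributes=[
            {"Key": "stickiness.enabled", "Value": "true"},
            {"Key": "stickiness.type", "Value": "lb_cookie"},
            {"Key": "stickiness.lb_cookie.duration_seconds", "Value": "86400"},
        ],
    )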

Are you able to check the browser's developer console to see whether any requests fail or return a non-2xx status? Clicking on a failed request might provide more information about why the API call is failing and thus why the UI fails to render correctly.
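
You can also check stickiness from outside the browser. With an ALB, a cookie named AWSALB is set on responses when stickiness is enabled, so if no such cookie ever appears, stickiness is likely off. A minimal sketch, assuming the LB is an ALB (the URL is a placeholder):

    # Sketch only: repeat a request through the LB and check for the ALB
    # stickiness cookie. The URL is a placeholder; use verify=False only
    # if the SHs present self-signed certificates.
    import requests

    s = requests.Session()
    for i in range(5):
        r = s.get("https://splunk.example.com/en-US/account/login",
                  verify=False, timeout=5)
        print(i, r.status_code, "AWSALB" in s.cookies)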

🌟 Did this answer help you? If so, please consider:

  • Adding karma to show it was useful
  • Marking it as the solution if it resolved your issue
  • Commenting if you need any clarification

Your feedback encourages the volunteers in this community to continue contributing.

 


_pravin
Contributor

Hi @livehybrid ,

 

Thanks for your response. It was really useful.

When the UI is glitchy, the error in the browser console is usually 'Failed to load resource: the server responded with a status of 400 (Bad Request)' or 'net::ERR_CONNECTION_RESET'.

[screenshot: browser console errors]

Does load balancer stickiness also affect how searches are managed and loaded in Splunk?

Does it also slow down search speed when LB caching is present?

 

Regards,

Pravin


PickleRick
SplunkTrust

Generally, without properly configured session stickiness the SHC will not work properly, because you'd be randomly redirected between backend SHs.


_pravin
Contributor

Thanks @PickleRick 

I understand that the SHC is not configured properly, which is causing the issue with the UI, but does it affect Splunk search performance as well?

The redirection between different SHs is not happening, which is my concern now, but how does this cause Splunk slowness?

Thanks,

Pravin


PickleRick
SplunkTrust

With the LB not consistently directing requests from one session to the same backend SH, you can expect a lot of issues: random logouts, UI glitches, errors fetching results from searches...

But it should not affect search performance as such.

BTW, nobody has said so far that your LB is misconfigured. It might be, but that would usually manifest itself much more often than just failing to load some static assets.
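
One way to tell backend problems apart from LB routing problems is to fetch the same static asset from each SH directly, bypassing the LB. A minimal sketch, assuming direct network access to the SHs (hostnames, port, and asset path below are placeholders):

    # Sketch only: fetch one static asset from each SH directly to see
    # whether any single backend returns 400s or resets the connection.
    # Hostnames, port, and path are placeholders for your environment.
    import requests

    ASSET = "/en-US/static/build/css/bootstrap-enterprise.css"
    for host in ("sh1.internal", "sh2.internal", "sh3.internal"):
        try:
            r = requests.get(f"https://{host}:8000{ASSET}", verify=False, timeout=5)
            print(host, r.status_code, len(r.content))
        except requests.RequestException as exc:
            print(host, "error:", exc)

If all three SHs serve the asset cleanly on repeated runs but the LB URL intermittently fails, that points back at the LB or its stickiness configuration.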


