Getting Data In

vSphere Client Splunk VM using high Host Memory

wuming79
Path Finder

Hi,

I noticed my Splunk VM is eating up a lot of host memory. I started at 1GB, increased to 2GB, and now 8GB, and the vSphere Client always shows it at the full bar. What can I do to reduce the host memory usage? Is this due to too much data being indexed?

1 Solution

s2_splunk
Splunk Employee
Splunk Employee

Are you aware of the requirements to run a Splunk indexer on VMware?
Splunk has pretty significant resource requirements for indexers to begin with, and adhering to the best practices for virtualized indexers outlined in the document linked above is pretty much the only path to happiness.

There are a lot of things an indexer needs memory for. The indexing pipeline is one of them, but indexers are also handling search requests from the search heads. Things like large lookup files can increase your memory footprint.

As long as you don't experience symptoms of a memory leak, where RAM usage keeps increasing until the VM fails, anything between 1 and 16GB is pretty normal for a busy indexer.
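One way to tell a leak-like trend apart from a normally busy indexer is to sample splunkd's memory usage periodically and check whether it climbs steadily rather than plateauing. This is a minimal illustrative sketch of that heuristic, not a Splunk tool; the function name, thresholds, and sample values are all hypothetical:

```python
def looks_like_leak(samples, min_growth_ratio=0.5):
    """Flag a leak-like trend: memory rises across most consecutive
    samples AND ends well above where it started (hypothetical heuristic)."""
    if len(samples) < 3:
        return False
    # Fraction of sample-to-sample steps that went up
    rises = sum(1 for a, b in zip(samples, samples[1:]) if b > a)
    steadily_rising = rises / (len(samples) - 1) >= 0.8
    # Net growth of at least min_growth_ratio over the window
    net_growth = samples[-1] >= samples[0] * (1 + min_growth_ratio)
    return steadily_rising and net_growth

# Hourly RSS samples in MB (made-up values):
steady = [2048, 2100, 2050, 2080, 2060, 2090]  # busy but stable indexer
leaky  = [2048, 2400, 2900, 3500, 4200, 5100]  # climbing toward failure
```

With samples like `steady`, the usage fluctuates around a plateau and would not be flagged; `leaky` rises on every step and ends more than 50% above its start, which is the pattern worth investigating.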

