Getting Data In

Aside from disabling Red Hat's Out of Memory (OOM) manager, which is killing splunkd, is there any other way to manage the splunk-optimize process to reduce the memory it uses?

rsolutions
Path Finder

Splunk-optimize is launching on our indexers and eating up a few GB of memory, and then Red Hat's out-of-memory (OOM) manager kills the splunkd process (as seen in /var/log/messages). The 3 indexers have 16 CPUs and 16 GB of RAM each (roughly 70 GB/day of logging across the 3 indexers), so this shouldn't be a resource issue. Aside from disabling the OOM killer (or reducing the likelihood of it killing Splunk, per http://www.oracle.com/technetwork/articles/servers-storage-dev/oom-killer-1911807.html ), is there any other way to manage the splunk-optimize process to reduce the memory it uses?
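
For anyone hitting the same symptom, a quick way to confirm that it really is the OOM killer terminating splunkd (a sketch; the exact log wording varies by kernel version):

# Look for OOM-killer activity in the system log
grep -iE 'out of memory|killed process' /var/log/messages
# Or check the kernel ring buffer directly
dmesg | grep -iE 'out of memory|killed process'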


srioux
Communicator

There are options for splunk-optimize in indexes.conf:
http://docs.splunk.com/Documentation/Splunk/6.2.1/Admin/Indexesconf

If splunk-optimize is consuming that many resources to do its job, you might be running into bucket configuration issues as well. Have you increased bucket sizing (auto_high_volume) on your higher-volume indexes?

Here are some of the settings we've got on some of our higher-volume indexes:

maxMemMB = 20
maxConcurrentOptimizes = 6
maxHotIdleSecs = 86400
maxHotBuckets = 10
maxDataSize = auto_high_volume
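
For context, here's a minimal sketch of how such a stanza might look in indexes.conf (the index name and paths are hypothetical, and note that comments in Splunk .conf files must sit on their own lines):

# Hypothetical high-volume index; adjust the name and paths to your environment
[my_high_volume_index]
homePath   = $SPLUNK_DB/my_high_volume_index/db
coldPath   = $SPLUNK_DB/my_high_volume_index/colddb
thawedPath = $SPLUNK_DB/my_high_volume_index/thaweddb
# Roll buckets at ~10 GB on 64-bit systems instead of the ~750 MB "auto" default
maxDataSize = auto_high_volume
# Allow more concurrent splunk-optimize processes against a hot bucket
maxConcurrentOptimizes = 6
# Memory (MB) for buffering a single tsidx file before it is flushed to disk
maxMemMB = 20
# Permit up to 10 hot buckets in this index
maxHotBuckets = 10
# Roll a hot bucket to warm after 24 hours with no new data
maxHotIdleSecs = 86400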

esix_splunk
Splunk Employee

Be sure to read about Transparent Huge Pages (THP) in Red Hat. Splunk recommends disabling this.

http://docs.splunk.com/Documentation/Splunk/6.2.1/ReleaseNotes/SplunkandTHP
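
For reference, checking and disabling THP at runtime might look like the following (a sketch; on some RHEL 6 systems the path is /sys/kernel/mm/redhat_transparent_hugepage instead, and a boot-time option such as transparent_hugepage=never is needed to make it persistent):

# Check the current THP setting; the bracketed value is the active one
cat /sys/kernel/mm/transparent_hugepage/enabled
# Disable THP and its defrag component until the next reboot (run as root)
echo never > /sys/kernel/mm/transparent_hugepage/enabled
echo never > /sys/kernel/mm/transparent_hugepage/defrag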


rsolutions
Path Finder

We have confirmed THP is disabled. We also set the OOM adjustment for the Splunk process to -16. We bumped the memory on the indexers to 32 GB, but the indexers have since crashed again. We'll go back to Splunk support, but something has to be causing the splunk-optimize process to use much more memory than normal.

We're a little lost on this one, as we have never seen this before. We do have ES running in this environment, but other than that there isn't much data, nor are there many indexes. Any other thoughts or suggestions?


tamasmarton
Explorer

Did you come to a solution here?


rsolutions
Path Finder

The initial problem was that the indexers were crashing. Splunk determined it was a bug, and a fix was released as part of 6.0.3, I believe. The cause was an error with memory mapping between Splunk, the OS, and VMware.


rsolutions
Path Finder

I will definitely try disabling THP and see what happens. I'll let you know how it goes.


martin_mueller
SplunkTrust

There are some settings around splunk-optimize and similar indexing helper processes in http://docs.splunk.com/Documentation/Splunk/6.2.1/Admin/indexesconf (search for "optimize"), but I'm not familiar enough with those to give a qualified recommendation. The directly memory-related settings all come with big caveats.
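
If you want to see what those settings currently resolve to on an indexer, one way (a sketch using btool; the filter pattern is just a guess at the relevant setting names) is:

# Show effective indexes.conf settings and the files they come from,
# filtered to the optimize/memory-related ones
$SPLUNK_HOME/bin/splunk btool indexes list --debug | grep -Ei 'maxMemMB|maxConcurrentOptimizes|maxRunningProcessGroups'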

I would seek configuration relief from the OOM killer/OS though; killing other people's processes is mean.


rsolutions
Path Finder

This is the article from Oracle that explains how to manage the OOM killer. While you can tell it to be "nicer" to Splunk, even they don't recommend turning it off.

http://www.oracle.com/technetwork/articles/servers-storage-dev/oom-killer-1911807.html
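
Following that article's approach, a minimal sketch of making the OOM killer less likely to pick splunkd (this assumes the older oom_adj interface the article describes; newer kernels use oom_score_adj with a -1000..1000 range) could be:

# Find the main splunkd PID (assumes a single splunkd instance on the host)
SPLUNKD_PID=$(pgrep -o splunkd)
# Lower the OOM adjustment so the OOM killer is less likely to choose splunkd
# (-17 exempts the process entirely; -16 just makes it an unlikely target)
echo -16 > /proc/$SPLUNKD_PID/oom_adj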


martin_mueller
SplunkTrust

16 GB is not a lot, especially when you have processes being killed because the host is out of memory. Consider adding memory.


rsolutions
Path Finder

Thanks Martin. We originally started with 12 GB of RAM, but bumped it to 16 GB. The only time we seem to run into issues is when the splunk-optimize process runs. Otherwise it is fine. The only other thing that is unique to this environment is that it runs Splunk ES (along with a select few other apps).

I realize we can add more RAM, and that may ultimately be what is necessary. I'm just trying to figure out if there is anything else we can do to help manage it or reduce the memory requirements for the splunk-optimize process.
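
In case it helps narrow this down, a rough way to watch how much memory splunk-optimize actually uses when it kicks in (a sketch; it assumes the process shows up under the command name splunk-optimize):

# Sample resident memory (RSS, in KB) of any running splunk-optimize processes every 5 seconds
while true; do
    ps -o pid,rss,etime,args -C splunk-optimize
    sleep 5
done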
