Monitoring Splunk

DMC Alert - Critical System Physical Memory Usage

sunnyparmar
Communicator

Hi,

I am getting a physical memory usage alert from my main Splunk server, for the very server Splunk is installed on. I have found that splunkd, mongod, and python (all three of them Splunk's own processes) are consuming the most memory on the server, so how do I get rid of this? I have restarted the Splunk services twice but did not get the expected result, so any advice would be appreciated.
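
For reference, a plain ps listing sorted by resident set size is how I see them at the top (standard procps options, nothing Splunk-specific):

# top ten processes by physical memory (RSS)
ps aux --sort=-rss | head -n 11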

Thanks

1 Solution

ddrillic
Ultra Champion

http://docs.splunk.com/Documentation/Splunk/6.2.4/Admin/Platformalerts says -
Critical system physical memory usage -
Fires when one or more instances exceeds 90% memory usage. On most Linux distributions, this alert can trigger if the OS is engaged in buffers and filesystem caching activities. The OS releases this memory if other processes need it, so it does not always indicate a serious problem.

So it says Critical, but it's not necessarily critical; it's always a bit tricky to figure out how much memory is actually in use once caching is excluded...
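
One rough way to check that (a quick sketch; older procps prints a "-/+ buffers/cache" row, while kernels 3.14+ expose MemAvailable instead):

# used memory with and without buffers/cache
free -m
# the same figures straight from the kernel
grep -E '^(MemTotal|MemFree|MemAvailable|Buffers|Cached)' /proc/meminfo

If the usage excluding buffers/cache is well below 90%, the alert is most likely just reacting to filesystem caching.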

carvay
Engager

I have exactly the same problem in several physical indexers.

For example:

# top
top - 09:53:50 up 14 days, 15:24,  2 users,  load average: 13.25, 31.35, 35.30
Tasks: 869 total,   1 running, 868 sleeping,   0 stopped,   0 zombie
Cpu(s):  7.2%us,  1.5%sy,  0.0%ni, 91.3%id,  0.0%wa,  0.0%hi,  0.0%si,  0.0%st
Mem:  65842216k total, 57581624k used,  8260592k free,  1530800k buffers

But if we run "free":

# free -m
             total       used       free     shared    buffers     cached
Mem:         64299      56687       7611        915       1496      46310
-/+ buffers/cache:       8881      55417
Swap:         2047        120       1927

Looks like"reserved" memory is being presented as "used" memory.

Any hints?

s2_splunk
Splunk Employee

What are your server specs and what kind of workload is the server processing (daily ingest, number of searches)?

Is "your main server" an indexer or a SH+indexer? Where are you running the DMC?

What operating system is your indexer running on?

sunnyparmar
Communicator
  1. Number of Cores - 4
  2. Physical Memory Capacity (MB) - 7865
  3. Operating System - Linux
  4. CPU Architecture - x86_64

The main server is not an indexer; the indexer is on a different server, which runs Windows. The DMC is also on the main server where Splunk is installed.

Thanks

martin_mueller
SplunkTrust

Add more memory 🙂

More to the point, what version are you on? There used to be a bug causing that alert to include disk cache in the calculation - resulting in critical usage all the time.
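
To confirm the exact version and build before planning an upgrade (assuming $SPLUNK_HOME points at your installation; otherwise use the full path to the splunk binary):

# print the installed Splunk version and build number
$SPLUNK_HOME/bin/splunk version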

sunnyparmar
Communicator

Thanks for replying. I am using version 6.2.1.

martin_mueller
SplunkTrust

Consider upgrading; many things have been improved or fixed since 6.2.1, and this might just be one of them.
