Getting Data In

Splunk indexer (also Search Head) is using an enormous amount of RAM.

damiko
Communicator

Hello, dear Splunk Ninjas!
I've an issue with Search Head, it is using too much RAM. At first I thought that was our VM fault. But after troubleshooting little bit, I found out, that Splunk is using a LOT of RAM. 24 GB out of 24.8GB.

What might be the reason?

With kind regards,
Damiko
PS: I've added screenshot of "htop" command.

0 Karma
1 Solution

VatsalJagani
Motivator

Hi @damiko,

This is a troubleshooting problem: you need to work through a tree of steps, following each condition, to reach the root cause and solve it.

Go to Settings > Monitoring Console > Resource Usage > Resource Usage: Instance.
Check the "Physical Memory Usage by Process Class" panel; it shows which part of Splunk is using the most memory (splunkd, search, KV store, etc.).

Based on that, you can debug further. For example, if the search process class is using the most memory, check the following to see how many searches are running and which ones take the most time:

  • Activity > Jobs
  • Activity > Triggered Alerts
  • The saved searches scheduled on the machine, and how long they usually take
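To rank individual searches by memory, you can again lean on the introspection data. A sketch, assuming the `data.search_props.*` fields that typically appear on search-process resource_usage events (verify them on your version):

```
index=_introspection sourcetype=splunk_resource_usage component=PerProcess
    data.process_type=search earliest=-4h
| stats max(data.mem_used) AS peak_mem_mb BY data.search_props.sid, data.search_props.user
| sort - peak_mem_mb
| head 10
```

The `sid` values returned here can be matched against Activity > Jobs to find the offending saved search or user.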

If the splunkd service itself is using the most memory, you may need to go through the splunkd logs in the _internal index to see what is going wrong.
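If the UI is too slow to search `_internal` (as can happen when the box is starved), you can scan the raw splunkd log from the command line. A minimal sketch, assuming a default /opt/splunk install path; adjust SPLUNK_HOME for your system:

```shell
# Scan splunkd.log for memory-related messages (install path is an assumption).
LOG="${SPLUNK_HOME:-/opt/splunk}/var/log/splunk/splunkd.log"
if [ -f "$LOG" ]; then
    # Show the 20 most recent lines mentioning memory pressure or ulimits.
    grep -iE "out of memory|malloc|ulimit" "$LOG" | tail -n 20
else
    echo "splunkd.log not found at $LOG"
fi
```

The same file is what `index=_internal sourcetype=splunkd` indexes, so anything you find here can be pivoted on in Splunk once it is responsive again.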

Similarly, go through the different parts of Splunk to find what is wrong. The troubleshooting process will take some time, so be patient and carry on. I hope you find the issue soon.


martynoconnor
Communicator

Hi there,

Which version of Splunk are you running, and does this happen by default or only when you run particular searches? I'm asking about the version because the mvexpand search command has some serious memory usage issues in 7.1.x that were only fixed in 7.2.5.1.
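If upgrading isn't immediately possible, mvexpand's memory use can also be capped in limits.conf. A config-fragment sketch; the stanza exists in recent versions, but verify the setting name and default for yours before relying on it:

```
# limits.conf (example; verify stanza and default for your Splunk version)
[mvexpand]
# Cap the memory mvexpand may use per invocation, in MB.
max_mem_usage_mb = 500
```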

Similarly, if it's a particular search that seems to eat up all the memory, we can look at that search and see how to improve its efficiency.

0 Karma

damiko
Communicator

Hello! I'm running Splunk version 7.2.4.

0 Karma

damiko
Communicator

@VatsalJagani I went to Settings > Monitoring Console > Resource Usage > Resource Usage: Instance.
However, the dashboard shows a "No results found" status.

0 Karma

VatsalJagani
Motivator

Make sure there are no error messages and that you have access to the _internal index data.

0 Karma

damiko
Communicator

@VatsalJagani After I stopped Splunk with "$SPLUNK_HOME/bin/splunk stop", it is not responding to the "./splunk start" command...
Do you know why?... 😞

0 Karma

damiko
Communicator

Ok, I've copied the splunk binary out of the .rpm file and put it into our Splunk installation 🙂

0 Karma

VatsalJagani
Motivator

I didn't get that. What do you mean? Please make sure you don't lose your indexed Splunk data.

0 Karma

damiko
Communicator

I downloaded the splunk .rpm archive and copied bin/splunk from it into my bin/ directory.
Now I can start Splunk again.

0 Karma

VatsalJagani
Motivator

Check the htop output again; it might be a resource starvation issue.
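Besides htop, plain OS tools can confirm resource starvation; for example, on Linux:

```shell
# Show the top 10 memory consumers by resident set size (RSS, in KB).
# splunkd and its search processes should stand out if Splunk is the culprit.
ps -eo rss,pid,comm --sort=-rss | head -n 10

# Show overall memory and swap usage in MB; heavy swap use explains the slowness.
free -m
```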

0 Karma

damiko
Communicator

Thank you, Vatsal.
I'm trying to navigate through Splunk at the moment; it's really slow...

0 Karma