Splunk Enterprise

How to resolve this error: Splunk service restarts due to "Checking max_mem_usage_mb resultsSize"

jinnypt
Explorer

Hello. 

The Splunk service is restarting with the error shown below while a scheduled report runs at a specific time of day.

 

Search failure log:

08-17-2022 06:00:04.164 INFO SearchOperator:inputcsv [12673 phase_1] - sid:scheduler__admin__search__RMD52e8470291689a839_at_1660683600_5272 Successfully read lookup file '/opt/splunk/etc/apps/search/lookups/xxx.csv'.
08-17-2022 06:00:04.166 INFO MultiValueProcessor [12673 phase_1] - Checking max_mem_usage_mb resultsSize=100 maxHeapSize=15728640000 memoryUsage=1824925 earlyExit=0
08-17-2022 06:00:04.169 INFO MultiValueProcessor [12673 phase_1] - Checking max_mem_usage_mb resultsSize=200 maxHeapSize=15728640000 memoryUsage=6273048 earlyExit=0
08-17-2022 06:00:04.170 INFO MultiValueProcessor [12673 phase_1] - Checking max_mem_usage_mb resultsSize=300 maxHeapSize=15728640000 memoryUsage=7531940 earlyExit=0
....
08-17-2022 06:00:06.484 INFO MultiValueProcessor [12673 phase_1] - Checking max_mem_usage_mb resultsSize=25200 maxHeapSize=15728640000 memoryUsage=531030711 earlyExit=0
08-17-2022 06:00:06.485 INFO MultiValueProcessor [12673 phase_1] - Checking max_mem_usage_mb resultsSize=25300 maxHeapSize=15728640000 memoryUsage=531809607 earlyExit=0
08-17-2022 06:00:13.237 FATAL ProcessRunner [9783 ProcessRunner] - Unexpected EOF from process runner child!
08-17-2022 06:00:13.238 FATAL ProcessRunner [9783 ProcessRunner] - Helper process was killed by SIGKILL. Usually this indicates that the kernel's OOM-killer has decided to terminate the daemon process.
08-17-2022 06:00:13.238 FATAL ProcessRunner [9783 ProcessRunner] - Check the kernel log (possibly /var/log/messages) for more info
08-17-2022 06:00:13.238 ERROR ProcessRunner [9783 ProcessRunner] - helper process seems to have died (child killed by signal 9: Killed)!
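As the FATAL messages suggest, I also want to check the kernel log for OOM-killer activity around the failure time. A minimal sketch of the check (the exact log path depends on the distribution):

# Look for OOM-killer activity around 06:00
dmesg -T | grep -iE 'out of memory|oom-killer|killed process'

# Or, on hosts that write kernel messages to /var/log/messages
grep -iE 'out of memory|oom' /var/log/messages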

 

Splunk config information

/opt/splunk/etc/system/local/limits.conf

[default]
max_mem_usage_mb = 30000

 

/opt/splunk/etc/apps/search/local/limits.conf

[default]
max_mem_usage_mb = 10000

 

Even with the above settings in place, the actual memory usage seems to stay well below the configured limits before the process is killed.
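To confirm which limits.conf values the search processes actually pick up after configuration merging, btool can show the effective settings (a sketch, assuming the default install path /opt/splunk):

# Show the effective max_mem_usage_mb value and the file it comes from
/opt/splunk/bin/splunk btool limits list --debug | grep -i max_mem_usage_mb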

Splunk server spec: 16 cores, 64 GB RAM

 

If anyone knows about this issue, please share.
