
Search queries causing high memory utilization on indexers

harshsri21
New Member

Hi,

I am trying to find the search queries that consumed high memory on our indexers during a specific time frame.
We have an indexer cluster of 40 indexers and a search head cluster of 4 SHs. Suddenly, for a short span of time, we experienced high memory utilization on 33 of the indexers, and consequently 2 of the SHs also went down.

Please help with building such a query and with understanding the cause of this behavior.


DalJeanis
Legend

Try something like this...

index=_audit action="search" info="completed" NOT user="splunk-system-user"
| table user, is_realtime, total_run_time, exec_time, result_count
| eval exec_time=strftime(exec_time,"%m/%d/%Y %H:%M:%S:%3Q") 
| sort 0 - total_run_time

If something is chewing up a lot of resources, it's going to have a high total_run_time, so that query should float it up to the top. You can limit it to the time in question, plus a little before and after, and it should give you a few candidates to check for a resource hog.
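
For example, here's a sketch of the same search bounded to a hypothetical one-hour window (swap in your actual outage times), with the query text itself added to the table, since the search field on _audit events holds the SPL that was run:

index=_audit action="search" info="completed" NOT user="splunk-system-user" earliest=-4h latest=-3h
| table user, search, is_realtime, total_run_time, exec_time, result_count
| sort 0 - total_run_time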

You can also add is_realtime=1 to the initial search to look only at real-time searches. They tend to be massive CPU hogs, so check them out as well.
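
For instance, a rough cut that surfaces just the real-time searches and their heaviest users:

index=_audit action="search" info="completed" is_realtime=1 NOT user="splunk-system-user"
| stats count, max(total_run_time) AS max_run_time by user
| sort 0 - max_run_time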


harshsri21
New Member

Thanks! Can we also get a Splunk query to find out which processes are consuming high memory on the indexers?
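
For per-process memory, one place to look is the _introspection index, which each Splunk instance populates with resource-usage data by default. A minimal sketch, assuming the splunk_resource_usage sourcetype is being collected on your indexers:

index=_introspection sourcetype=splunk_resource_usage component=PerProcess
| eval mem_used_mb='data.mem_used'
| stats max(mem_used_mb) AS peak_mem_mb by host, data.process, data.args
| sort 0 - peak_mem_mb

For search processes specifically, the data.search_props.sid and data.search_props.user fields on the same events can tie a heavy process back to the search and user that launched it.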
