Splunk Search

Search queries causing high memory utilization on indexers

harshsri21
New Member

Hi,

I am trying to find the search queries that consumed high memory on the indexers during a specific time frame.
We have an indexer cluster of 40 indexers and a search head cluster of 4 search heads. Suddenly, for a short span of time, we experienced high memory utilization on 33 of the indexers, and as a consequence 2 of the SHs also went down.

Please help me build the query and understand the cause of this behavior.


DalJeanis
Legend

Try something like this...

index=_audit action="search" info="completed" NOT user="splunk-system-user"
| table user, is_realtime, total_run_time, exec_time, result_count
| eval exec_time=strftime(exec_time,"%m/%d/%Y %H:%M:%S:%3Q") 
| sort 0 - total_run_time

If something is chewing up a lot of resources, it's going to have a high total_run_time, so that query should float it up to the top. You can limit it to the time in question, plus a little before and after, and it should give you a few candidates to check for a resource hog.
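For example, bounded to the window in question (the earliest/latest values below are only placeholders for your actual spike window, padded a little on each side):

index=_audit action="search" info="completed" NOT user="splunk-system-user" earliest="06/01/2025:10:00:00" latest="06/01/2025:12:00:00"
| table user, is_realtime, total_run_time, exec_time, result_count
| sort 0 - total_run_time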

You can also add is_realtime=1 to the initial search to look at just the real-time searches. They tend to be massive CPU hogs, so check them out as well.
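Something like this for that variant:

index=_audit action="search" info="completed" is_realtime=1 NOT user="splunk-system-user"
| table user, total_run_time, exec_time, result_count
| sort 0 - total_run_time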


harshsri21
New Member

Thanks. Can we also get a Splunk query to see which processes are consuming high memory on the indexers?

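One way to approach that, assuming the platform instrumentation data is available (Splunk Enterprise collects it into the _introspection index by default), is a sketch like the one below, which charts per-process memory (data.mem_used, in MB) on the indexers over time. The host=idx* filter and the 5-minute span are placeholders; adjust them to your indexer host names and the window you care about.

index=_introspection sourcetype=splunk_resource_usage component=PerProcess host=idx*
| timechart span=5m max(data.mem_used) by data.process

If the heavy consumers turn out to be splunkd search processes, the _audit query above should help map them back to the specific searches and users.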