Monitoring Splunk

How can I find the searches with high memory usage on the Splunk search heads?

abhisplunk1
Explorer
 
1 Solution

NareshKilaru
Engager

You can use an introspection search to find the searches that consume the most memory:

 

index=_introspection sourcetype=splunk_resource_usage data.search_props.sid::* data.search_props.mode!=RT data.search_props.user!="splunk-system-user"
| eval process = 'data.process'
| eval args = 'data.args'
| eval pid = 'data.pid'
| eval ppid = 'data.ppid'
| eval elapsed = 'data.elapsed'
| eval mem_used = 'data.mem_used'
| eval mem = 'data.mem'
| eval pct_memory = 'data.pct_memory'
| eval pct_cpu = 'data.pct_cpu'
| eval sid = 'data.search_props.sid'
| eval app = 'data.search_props.app'
| eval label = 'data.search_props.label'
| eval type = 'data.search_props.type'
| eval mode = 'data.search_props.mode'
| eval user = 'data.search_props.user'
| eval role = 'data.search_props.role'
| eval label = if(isnotnull('data.search_props.label'), 'data.search_props.label', "")
| eval provenance = if(isnotnull('data.search_props.provenance'), 'data.search_props.provenance', "unknown")
| eval search_head = case(isnotnull('data.search_props.search_head') AND 'data.search_props.role' == "peer", 'data.search_props.search_head', isnull('data.search_props.search_head') AND 'data.search_props.role' == "head", "_self", isnull('data.search_props.search_head') AND 'data.search_props.role' == "peer", "_unknown")
| eval search_label = if('label'!="", 'label', 'sid')
| eval instance = if(isnotnull(dns_alt_name), dns_alt_name, host)
| stats max(elapsed) as runtime max(mem_used) as mem_used earliest(_time) as _time by search_label, provenance, type, mode, app, role, user, instance
| eval mem_used = round(mem_used, 2)
| sort 20 - mem_used, runtime
| eval runtime = tostring(round(runtime, 2), "duration")
| fields search_label, provenance, mem_used, instance, runtime, _time, type, mode, app, user, role
| eval _time=strftime(_time,"%+")
| rename search_label as Name, provenance as Provenance, mem_used as "Memory Usage (KB)", instance as Instance, runtime as "Search Duration", _time as Started, type as Type, mode as Mode, app as App, user as User, role as Role
| appendpipe
[ stats count
| eval Name="data unavailable"
| where count==0
| table Name ]


abhisplunk1
Explorer

Thank you, this is what I was looking for.



SVL
Splunk Employee

Hi, in the SPL query the rename of mem_used to "Memory Usage (KB)" should say MB instead: data.mem_used from the resource usage introspection data is reported in megabytes.
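For example, a minimal sketch of that fix, replacing the mem_used part of the final rename in the query above (the GB column is optional and purely illustrative):

| eval mem_used_gb = round(mem_used / 1024, 2)
| rename mem_used as "Memory Usage (MB)", mem_used_gb as "Memory Usage (GB)"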
