Monitoring Splunk

How can I find the searches with high memory usage on the Splunk search heads?

abhisplunk1
Explorer
 

NareshKilaru
Engager

You can use an introspection search to find the searches that consume the most memory:

 

index=_introspection sourcetype=splunk_resource_usage data.search_props.sid::* data.search_props.mode!=RT data.search_props.user!="splunk-system-user"
| eval process = 'data.process'
| eval args = 'data.args'
| eval pid = 'data.pid'
| eval ppid = 'data.ppid'
| eval elapsed = 'data.elapsed'
| eval mem_used = 'data.mem_used'
| eval mem = 'data.mem'
| eval pct_memory = 'data.pct_memory'
| eval pct_cpu = 'data.pct_cpu'
| eval sid = 'data.search_props.sid'
| eval app = 'data.search_props.app'
| eval label = 'data.search_props.label'
| eval type = 'data.search_props.type'
| eval mode = 'data.search_props.mode'
| eval user = 'data.search_props.user'
| eval role = 'data.search_props.role'
| eval label = if(isnotnull('data.search_props.label'), 'data.search_props.label', "")
| eval provenance = if(isnotnull('data.search_props.provenance'), 'data.search_props.provenance', "unknown")
| eval search_head = case(isnotnull('data.search_props.search_head') AND 'data.search_props.role' == "peer", 'data.search_props.search_head', isnull('data.search_props.search_head') AND 'data.search_props.role' == "head", "_self", isnull('data.search_props.search_head') AND 'data.search_props.role' == "peer", "_unknown")
| eval search_label = if('label'!="", 'label', 'sid')
| eval instance = if(isnotnull(dns_alt_name), dns_alt_name, host)
| stats max(elapsed) as runtime max(mem_used) as mem_used earliest(_time) as _time by search_label, provenance, type, mode, app, role, user, instance
| eval mem_used = round(mem_used, 2)
| sort 20 - mem_used, runtime
| eval runtime = tostring(round(runtime, 2), "duration")
| fields search_label, provenance, mem_used, instance, runtime, _time, type, mode, app, user, role
| eval _time=strftime(_time,"%+")
| rename search_label as Name, provenance as Provenance, mem_used as "Memory Usage (MB)", instance as Instance, runtime as "Search Duration", _time as Started, type as Type, mode as Mode, app as App, user as User, role as Role
| appendpipe
[ stats count
| eval Name="data unavailable"
| where count==0
| table Name ]
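
Since the question is specifically about search heads, a trimmed-down variant of the same search can be used as a quick check. This is a sketch built from the same introspection fields used above; the filter data.search_props.role="head" keeps only processes reported by the search head itself, and the field name peak_mem_used is just an illustrative choice. mem_used comes from the resource_usage data and is reported in MB. Run it over a bounded time range (for example, the last 24 hours), since _introspection data is sampled frequently:

index=_introspection sourcetype=splunk_resource_usage data.search_props.sid::* data.search_props.mode!=RT data.search_props.role="head" data.search_props.user!="splunk-system-user"
| eval sid = 'data.search_props.sid'
| eval user = 'data.search_props.user'
| eval app = 'data.search_props.app'
| eval mem_used = 'data.mem_used'
| stats max(mem_used) as peak_mem_used earliest(_time) as started by sid, user, app
| sort 20 - peak_mem_used
| eval started = strftime(started, "%+")

Grouping by sid gives one row per search job, so a long-running job that was sampled many times still appears once, with its peak memory footprint.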


abhisplunk1
Explorer

Thank you! This is what I was looking for.


