Dashboards & Visualizations

How do you identify which saved searches consume the most resources, so they can be monitored on a dashboard panel?

kamal_jagga
Contributor

I have 65 Saved searches.

Now I want to build a dashboard panel that shows the top 10 searches consuming the most resources.

I have also read the answers to other questions, but none of them mentions a specific search.

Kindly advise.

0 Karma

gjanders
SplunkTrust

In the Alerts for Splunk Admins app (or the GitHub links to its dashboards) I've got a few dashboards that could help here.

Troubleshooting indexer CPU and Troubleshooting resource usage per user: you could tweak either of those to look for the search IDs with "scheduler" in the name. They measure CPU usage, memory usage, et cetera.

As pointed out in another post, you can get some of this information via the monitoring console as well; it depends on how much information you want about the impact of scheduled searches (or searches in general).
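
As a rough sketch of the kind of search those dashboards run (this assumes the standard _introspection resource-usage fields data.pct_cpu, data.mem_used, and data.search_props.sid, so verify against your own data before relying on it), you could total CPU and memory per scheduler search ID:

index=_introspection sourcetype=splunk_resource_usage component=PerProcess data.search_props.sid=scheduler_* | stats sum(data.pct_cpu) AS total_pct_cpu max(data.mem_used) AS peak_mem_mb by data.search_props.sid | sort 10 - total_pct_cpu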

0 Karma

prakash007
Builder

@kamal_jagga : If you have a DMC configured in your environment, there are prebuilt dashboards in there...
DMC --> Search --> Activity --> Search Activity: Instance --> select your search head instance (scroll down to the "Top 20 Memory-Consuming Searches" panel). You can use its underlying search, with some tweaks, to build a dashboard.
https://docs.splunk.com/Documentation/Splunk/7.2.1/DMC/SearchactivityDeploymentwide#Interpret_result...
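
If you'd rather not copy the search out of that DMC panel, a rough approximation (again assuming the standard _introspection resource-usage fields; the real DMC search is more elaborate) is:

index=_introspection sourcetype=splunk_resource_usage component=PerProcess data.search_props.sid=* | stats max(data.mem_used) AS peak_mem_mb by data.search_props.sid data.search_props.user | sort 20 - peak_mem_mb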

0 Karma

epeeran
New Member

index=_internal savedsearch_name=* NOT savedsearch_name="" | stats count by savedsearch_name | sort - count | head 30

0 Karma

David
Splunk Employee

Two options for you. The specific answer to your question is:

index=_internal savedsearch_name=* NOT savedsearch_name="" (sourcetype=splunk_audit OR sourcetype=audittrail) | stats count sum(total_run_time) avg(total_run_time) avg(scan_count) avg(event_count) by savedsearch_name | sort 10 - "sum(total_run_time)"

That said, if you want to do large-scale analysis of your search logs, I recommend checking out my app, Search Activity. The current version of the app doesn't include this report (I'm adding it in the next version), but you can run it with the following search:

| tstats count sum(total_run_time) avg(total_run_time) avg(scan_count) avg(event_count) values(user) from `SA_SearchHistory` where searchtype=scheduled groupby savedsearch_name | sort 10 - "sum(total_run_time)"

The benefit of my app for this analysis (in addition to all the other visibility you get) is that in my lab it is over 50 times faster. If you want to do more, or the first search isn't fast enough, check out the app.
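
If the end goal is a dashboard panel, you could wrap the first search in a Simple XML panel roughly like this (the panel title and time range are just placeholders to adjust):

<panel>
  <title>Top 10 scheduled searches by total run time</title>
  <table>
    <search>
      <query>index=_internal savedsearch_name=* NOT savedsearch_name="" (sourcetype=splunk_audit OR sourcetype=audittrail) | stats count sum(total_run_time) avg(total_run_time) by savedsearch_name | sort 10 - "sum(total_run_time)"</query>
      <earliest>-24h@h</earliest>
      <latest>now</latest>
    </search>
  </table>
</panel>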

0 Karma

kamal_jagga
Contributor

Hi,

The search you provided above didn't return any results. But I removed part of the string and got results with the query below.

index=_internal savedsearch_name=* NOT savedsearch_name="" | stats count by savedsearch_name | sort 30

But it still didn't give the resource-consumption metrics for the individual searches.

0 Karma
