Can you please provide instructions on how to check the history of report searches that have been run in Splunk Cloud?
The reason I ask is that a number of users have complained about performance issues when running certain reports. For example, particular reports that normally take 5–10 minutes to load are now intermittently taking 2–4 hours, or even timing out.
I would like to correlate these searches with the affected times mentioned by the users to see if there are any cloud compute issues or if it's as simple as too many searches being run at the same time.
Are you a Splunk Cloud customer, or are you just running Splunk yourself in AWS?
Are these scheduled searches or just normal ad hoc searches?
This will get you general information. (Set the visualization options "Multi Series Mode" to Yes and the Y-Axis "Axis Range" to Independent.)
Change host= to your appropriate search head.
index=_audit action=search host=<hostname> NOT search_id="rsa_scheduler*" NOT *ACCELERATE_DM*
(info="completed" OR status=success)
| timechart span=5m count as NumSearches p90(total_run_time) AS p90TimeInSeconds avg(total_run_time) AS AvgRunTimeInSeconds median(total_run_time) AS MedianRunTimeInSeconds
| eval MedianRunTimeInSeconds=round(MedianRunTimeInSeconds,1)
| eval p90TimeInSeconds=round(p90TimeInSeconds,1)
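If the timechart shows spikes at the times your users reported, a follow-up search over the same _audit data can show which saved searches and users were running then, which helps distinguish a busy search tier from one slow report. This is a sketch along the same lines as the search above; <hostname> is a placeholder for your search head, as before:

```
index=_audit action=search host=<hostname> info=completed earliest=-24h
| stats count AS runs avg(total_run_time) AS AvgSec max(total_run_time) AS MaxSec by savedsearch_name user
| sort - MaxSec
```

A high `runs` count concentrated in one 5-minute window points at concurrency, while a high `MaxSec` on one report points at that report's own SPL.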
We currently have a development team building a separate user interface that calls an API to retrieve these Splunk reports, so I would like to confirm that search runtimes in Splunk Cloud are performing as expected in order to isolate the issue.
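One way to isolate the UI from the search tier is to time the same search through Splunk's REST search endpoint (`/services/search/jobs/export`), which is the kind of call the new interface would be making. Below is a minimal, hedged sketch in Python using only the standard library; `BASE_URL` and the bearer token are placeholders you would replace with your own stack and credentials, and the query shown is just an example:

```python
import time
import urllib.parse
import urllib.request

# Placeholder -- replace with your Splunk Cloud stack's management endpoint.
BASE_URL = "https://yourstack.splunkcloud.com:8089"


def build_export_request(base_url, query, earliest="-24h", latest="now"):
    """Build the URL and form body for the /services/search/jobs/export
    endpoint, which runs a one-shot search and streams the results."""
    url = base_url.rstrip("/") + "/services/search/jobs/export"
    # The endpoint expects the SPL to begin with the 'search' command.
    spl = query if query.startswith("search") else "search " + query
    body = urllib.parse.urlencode({
        "search": spl,
        "earliest_time": earliest,
        "latest_time": latest,
        "output_mode": "json",
    })
    return url, body


def timed_export(base_url, query, token):
    """Run the search via REST and return (elapsed_seconds, raw_response),
    so the API path can be compared with times seen in Splunk Web."""
    url, body = build_export_request(base_url, query)
    req = urllib.request.Request(
        url,
        data=body.encode("utf-8"),
        headers={"Authorization": "Bearer " + token},
    )
    start = time.monotonic()
    with urllib.request.urlopen(req) as resp:
        payload = resp.read()
    return time.monotonic() - start, payload


# Example invocation (requires network access to your stack):
# elapsed, _ = timed_export(BASE_URL, "index=_internal | head 10", "YOUR_TOKEN")
# print(f"search returned in {elapsed:.1f}s")
```

If the REST call is slow at the same times users report slow reports, the bottleneck is the search tier rather than the new interface.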