Splunk Search

"Your maximum number of concurrent searches has been reached" Why are scheduled reports not cached?

Path Finder

Background: I created a dashboard (actually a few dashboards) that used many heavy-hitting searches. The Splunk servers couldn't handle the load, so I redesigned it to use scheduled reports. The users lost flexibility and I had to do a lot more work (instead of four dashboards with user flexibility for report time ranges, I'll have 200+ reports with fixed time ranges) to create all these reports and dashboards, but it seemed to be working.

Problem: Two days later I go to redesign one of the dashboards and the server is failing with:

In handler 'savedsearch': Search not executed: Your maximum number of concurrent searches has been reached. usage=10 quota=10 user=...

It seems like the server isn't caching and remembering that it has already run the report. How can I check on this? How can I get it to work reliably instead of pounding the server by re-running these reports over and over again?
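For reference, the quota in that error message is typically set per role. In authorize.conf the relevant settings look roughly like this (the role name and values below are illustrative, not taken from this system):

```ini
# authorize.conf -- per-role concurrent search quotas (illustrative values)
[role_power]
# maximum concurrent historical searches one user holding this role may run
srchJobsQuota = 10
# cap on concurrent searches across all users holding this role
cumulativeSrchJobsQuota = 50
```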

0 Karma

Path Finder

I've been told that jobs must be scheduled daily. Apparently this functionality doesn't work if the report is scheduled to run monthly.

0 Karma

Builder

I may not understand this issue completely, but if you want to schedule a report to run monthly and keep a record of it, couldn't you set it to expire after 31 days? What is the current expiration period for the report?

0 Karma

Path Finder

The reports power panels on dashboards. I'd like the users to be able to view the dashboards throughout the month without triggering the reports to be re-run. I therefore set up the dashboards using "scheduled reports", but it seems to only work if the reports run daily, not monthly.

I've never seen anywhere to set an "expiration" for a report.

0 Karma

Builder

This will work for monthly reports as well. You can set the expiration of the report in the scheduler settings. Under the alert section there is an 'Expiration' setting; by default it is set to 24 hours. You can change this to 30 days, and you can verify the expiration after the report runs by checking the 'expires' date in job management.
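The same expiration can also be set in configuration. In savedsearches.conf, the lifetime of a scheduled report's results is `dispatch.ttl` (the stanza name below is the report from this thread; the value is illustrative):

```ini
# savedsearches.conf -- keep this scheduled report's results for ~30 days
# dispatch.ttl is the search artifact's time-to-live, in seconds
[Some URL - Activity - Visits by Day of Week - Last Month]
dispatch.ttl = 2592000
```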

0 Karma

Path Finder

Where are "scheduler settings"? These reports do not generate any alerts; they are simply displayed on dashboards. Can this be done on a report-by-report basis, or is it a server-wide setting?

If I inspect an individual "job" that has run, I can see that it has "alert.expires" as "24h". Where do we change / set that for a job/report that doesn't use alerting? The options I see for editing a report are "Edit Description", "Edit Permissions", "Edit Schedule", "Edit Acceleration" (as well as Clone/Embed/Delete).

0 Karma

Builder

Go to Settings > Searches, reports, and alerts. Find and click the search you want to edit. The expiration setting is about halfway down.

0 Karma

Path Finder

Thank you! I very much appreciate the accurate instructions.

I've noticed that if I change the expiration setting, then change the schedule, then view the expiration again, it has been reset to 24 hours. This makes it very easy to change the schedule and forget to re-adjust the expiration length.

0 Karma

Builder

Glad it helped! I don't remember the expiration period changing when adjusting the schedule, but I guess it's good practice to verify all the settings when making any changes.

0 Karma

Path Finder

Easier said than done when fiddling with a few hundred reports on a dozen dashboards.

0 Karma

Builder

Have you looked at using accelerated reports and summary indexes? These should make the heavier searches run much more quickly. I found they eliminated the pain of having to schedule a bunch of reports as well.

0 Karma

Path Finder

These are all driven from DBQuery searches; I don't believe accelerated reports or summary indexes would be possible.

0 Karma

Builder

My previous comments seem to have not posted.
I believe it is possible to index data from the DB into Splunk. You can also create a nightly report that saves its results to a summary index, and then use that index to create daily, weekly, and monthly reports. The index can also back your dashboard and let users select specific time ranges. I would probably go this route instead of creating many fixed-time-range reports.

0 Karma

Builder

I believe you can have Splunk index data from the database if you choose. You can also save search results to an index that future searches can use. For example, schedule a nightly search to populate a summary index; then you can quickly search against that index to create your daily, weekly, and monthly reports and allow users to filter on the dashboard.
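A rough sketch of that pattern in SPL (the index, sourcetype, and field names below are made up for illustration). The nightly scheduled search summarizes the heavy data and writes the results into a summary index with the `collect` command:

```
index=db_visits sourcetype=dbquery
| stats count AS visits BY date_wday
| collect index=summary_visits
```

Dashboard panels then search the small summary index instead, e.g. `index=summary_visits | timechart sum(visits) AS visits`, which stays fast regardless of how heavy the original search was.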

0 Karma

Builder

I believe you can configure Splunk to index from the DB if you choose. You can also schedule the reports to run and save the results to an index, and then future searches can run against that index instead. How are you using DBQuery? I think it would be best to use SQL on the DB to create your reports/views and then just query the view from Splunk. That shouldn't put any load on the Splunk server.

0 Karma

Motivator

TTL in limits.conf, I believe; I'm not sure.
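If that's the setting meant, it's the server-wide default lifetime for search artifacts; it would look roughly like this (the value is illustrative):

```ini
# limits.conf -- default time-to-live for completed search artifacts, in seconds
# (a report's own dispatch.ttl in savedsearches.conf takes precedence)
[search]
ttl = 86400
```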

0 Karma


Path Finder

I don't think that is the approach or concern. The reports are scheduled to run at 3 AM with a two-hour time window, so there should be plenty of flexibility for the server to run them as it can handle them.

What I'd like to understand more is how to make sure the server saves (caches) a report it has run, and how I can check on this. It seems like I should be able to pull up a listing of all cached reports (and see when they were last run).

Whatever went wrong, it seems to be re-running (some of) the reports when the dashboard is viewed rather than instantly displaying already-calculated data.

0 Karma

Motivator

I see. Can you paste a piece of the code from the dashboard?

Perhaps using the "useHistory" parameter (auto or true) could fix your problem; if not, you should try summary indexing.

0 Karma

Path Finder

Here is a snippet (anonymized) from the dashboard:

<panel>
  <chart>
    <title>Visits by Day of the Week</title>
    <search ref="Some URL - Activity - Visits by Day of Week - Last Month"></search>
  </chart>
</panel>

So as far as I can tell the XML just refers to the report. But apparently it can't find the already-run results of the report, so it runs it again... do this a few dozen times with very hard-hitting searches and the server bogs down and starts failing.

0 Karma

Motivator

Try adding the useHistory parameter, like this. I can't tell whether this still works.

<panel>
  <chart>
    <title>Visits by Day of the Week</title>
    <search ref="Some URL - Activity - Visits by Day of Week - Last Month"></search>
    <option name="useHistory">auto</option>
  </chart>
</panel>

0 Karma