
Why is Splunk freezing when another dashboard is being used?


Hi,

I have two dashboards; one runs a very heavy tstats query that takes about 60 seconds. When I check the running jobs I can see two searches.

While this is running I can't load my other dashboard. It just hangs until dashboard one is finished.

The CPU on the box is only about 18% used. It is a 28-core (56 threads with hyper-threading) Intel box:
Intel(R) Xeon(R) CPU E5-2690 v4 @ 2.60GHz

Below are the settings I have in my limits.conf - perhaps I have something set that I should not have:

[search]
batch_search_max_pipeline = 4
base_max_searches = 100
max_searches_per_cpu = 10
max_rt_search_multiplier = 4
allow_batch_mode = 1
allow_inexact_metasearch = 0
default_allow_queue = 1
disabled = 0
enable_cumulative_quota = 0
enable_datamodel_meval = 1
enable_history = 1
enable_memory_tracker = 0
force_saved_search_dispatch_as_user = 0
load_remote_bundles = 0
remote_timeline = 1
timeline_events_preview = 0
track_indextime_range = 1
truncate_report = 0
unified_search = 0
use_bloomfilter = 1
use_metadata_elimination = 1
write_multifile_results_out = 1
max_rawsize_perchunk = 0

[join]
subsearch_maxout = 200000
subsearch_maxtime = 60
subsearch_timeout = 120

[searchresults]
maxresultrows = 200000
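For reference, the search-concurrency ceiling these settings imply can be computed from the formulas in limits.conf.spec. A rough sketch (assuming splunkd counts the 28 physical cores as the CPU number):

```python
# Sketch of how splunkd derives its search concurrency limits,
# using the formulas from limits.conf.spec.
# The CPU count of 28 is an assumption for this particular box.

def search_limits(num_cpus, max_searches_per_cpu, base_max_searches,
                  max_rt_search_multiplier):
    """Return (max historical searches, max real-time searches)."""
    max_hist = max_searches_per_cpu * num_cpus + base_max_searches
    max_rt = max_rt_search_multiplier * max_hist
    return max_hist, max_rt

# With the limits.conf values above (10 per CPU, base 100, rt multiplier 4):
hist, rt = search_limits(28, 10, 100, 4)
print(hist, rt)  # 380 1520
```

With only two searches in flight, this box is nowhere near that 380-search ceiling, which is one reason to suspect the hang is not a plain concurrency-limit problem.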

Re: Why is Splunk freezing when another dashboard is being used?


Hey @robertlynch2020,

Are you using the dashboards as the admin user or as some other user?
Try changing the user-level concurrent search limits, or the role-level search job limits under Settings > Access controls > Roles.

Let me know if this helps!
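Those role limits ultimately live in authorize.conf. A minimal sketch of what a role stanza with raised quotas might look like (the role name and values here are hypothetical; the setting names are the standard authorize.conf ones). Note that the cumulative role-wide quotas are only enforced when enable_cumulative_quota = 1 in limits.conf:

```ini
# authorize.conf - hypothetical role stanza for illustration
[role_dashboard_user]
srchJobsQuota = 50               # per-user concurrent historical searches
rtSrchJobsQuota = 100            # per-user concurrent real-time searches
cumulativeSrchJobsQuota = 200    # role-wide historical search quota
cumulativeRTSrchJobsQuota = 400  # role-wide real-time search quota
srchDiskQuota = 10000            # per-user search artifact disk quota, in MB
```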


Re: Why is Splunk freezing when another dashboard is being used?


Hi,

These are my current settings; I think they should be fine?

User-level concurrent search jobs limit: 50
(The maximum number of concurrent search jobs for each user of this role.)

User-level concurrent real-time search jobs limit: 100
(The maximum number of concurrent real-time search jobs for each user of this role. This count is independent of the normal search jobs limit.)

Role-level concurrent search jobs limit: 200
(The maximum number of cumulative concurrent search jobs for this role.)

Role-level concurrent real-time search jobs limit: 400
(The maximum number of cumulative concurrent real-time search jobs for this role. This count is independent of the normal search jobs limit.)

Limit total jobs disk quota: 10000
(The total disk space in MB that can be used by a user's search jobs. For example, '100' would limit this role to 100 MB total.)


Re: Why is Splunk freezing when another dashboard is being used?


Assuming you are running the latest Splunk version - does dashboard 2 still hang even with Splunk admin permissions?

limits.conf

# This section contains settings for search concurrency limits.

base_max_searches = <int>
* A constant to add to the maximum number of searches, computed as a 
  multiplier of the CPUs.
* Default: 6

max_rt_search_multiplier = <decimal number>
* A number by which the maximum number of historical searches is multiplied
  to determine the maximum number of concurrent real-time searches.
* Note: The maximum number of real-time searches is computed as:
  max_rt_searches = max_rt_search_multiplier x max_hist_searches
* Default: 1

max_searches_per_cpu = <int>
* The maximum number of concurrent historical searches for each CPU. 
  The system-wide limit of historical searches is computed as:
  max_hist_searches =  max_searches_per_cpu x number_of_cpus + base_max_searches
* NOTE: The maximum number of real-time searches is computed as:
  max_rt_searches = max_rt_search_multiplier x max_hist_searches
* Default: 1

max_hist_searches = max_searches_per_cpu x number_of_cpus + base_max_searches
                  = (1 * 28) + 6 = 34

max_rt_searches = max_rt_search_multiplier x max_hist_searches
                = 1 * 34 = 34

It's not advised to modify the defaults in limits.conf without a proper assessment or guidance from Splunk Support.

With the default limits.conf settings you can run the above number of concurrent searches. If your Splunk instance is that heavily loaded, dashboard 2 should at least show "Queued job waiting to start" - in that case you can conclude that the job is waiting for resources.


Re: Why is Splunk freezing when another dashboard is being used?


Hi,

Thanks for this. I am running Splunk 7.

Does dashboard 2 still hang with Splunk admin permissions? - Yes, it hangs on most screens. Only 3 jobs are running and nothing is queued. This screen will not load: http://splunk:8000/en-GB/app/murex_mlc/job_manager

So to test, I run a very large search on dashboard 1 - about 5 minutes of heavy tstats. Then, once it gets going, I try to open another dashboard, but it is as if the browser can't connect to Splunk. I don't even get as far as clicking into dashboard 2 to launch its difficult search; at this point Splunk is not really working at all.

Even http://splunk:8000/en-GB/app/murex_mlc/job_manager does not load.

Any help would be great 🙂

procs -----------memory---------- ---swap-- -----io---- -system-- ------cpu-----
 r  b   swpd   free   buff  cache   si   so    bi    bo   in   cs us sy id wa st
 9  0      0 3628604   4292 379449056    0    0   233   195    1    1  3  1 96  0  0
 6  0      0 4089044   4292 379941632    0    0 65061   301 113781 23623  9  4 87  0  0
 8  0      0 2759252   4292 380373920    0    0 51228   436 75845 24211 14  4 81  0  0
 7  0      0 2921780   4292 380504128    0    0 42862   325 75419 20671 13  4 83  0  0
 7  0      0 2370636   4292 380812768    0    0 38909 55606 91095 21279 10  3 87  0  0
 8  0      0 2458808   4292 380990176    0    0 50385 12916 88924 20976 10  3 86  0  0
 8  0      0 3018360   4292 381120640    0    0 48407   286 94513 16208 11  3 86  0  0
11  0      0 1739156   4292 380724224    0    0 46630   286 338549 23542 12  4 84  0  0
11  0      0 1399432   4292 380125376    0    0 10394  7102 790517 23970 15  4 81  0  0
10  0      0 1481324   4292 380425664    0    0     0   111 628053 21174 15  4 81  0  0
11  0      0 2428852   4292 380682944    0    0     0 50158 955866 22223 13  5 82  0  0
11  0      0 1462988   4292 380900480    0    0     0  8618 799501 21775 13  5 82  0  0
 9  0      0 1520836   4292 380361088    0    0     0    53 632406 16847 14  5 82  0  0
10  0      0 2000348   4292 380087680    0    0     0     2 815441 23082 14  5 81  0  0
30  0      0 1548704   4292 380093952    0    0     0     0 792144 20819 16  5 79  0  0
14  0      0 4115076   4292 376929664    0    0     6 46813 889905 24017 32  9 59  0  0
13  0      0 4267528   4292 376846592    0    0     3  4637 818489 25031 21  6 73  0  0
13  0      0 5025256   4292 376267840    0    0     3  4088 818270 25629 19  6 75  0  0
12  0      0 5800516   4292 375848128    0    0     0     0 796476 18840 19  5 76  0  0
11  0      0 6252184   4292 375760096    0    0     0   766 863946 18655 16  5 79  0  0
10  0      0 5578976   4292 376279008    0    0     0 48390 735564 23757 15  4 80  0  0
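When Splunk Web hangs like this, the splunkd management port (8089 by default) often still responds, so you can list the active search jobs over REST (e.g. `curl -k -u admin:pass "https://splunk:8089/services/search/jobs?output_mode=json"`) and check each job's dispatch state. A minimal sketch of summarising that JSON, using a made-up payload in the shape the endpoint returns (the sids here are invented for illustration):

```python
import json

def summarize_jobs(payload):
    """Map each search job's sid to its dispatchState, from the
    JSON form of the /services/search/jobs response."""
    return {
        entry["name"]: entry["content"]["dispatchState"]
        for entry in payload.get("entry", [])
    }

# Hypothetical response snippet for illustration:
sample = json.loads("""
{"entry": [
  {"name": "1520000000.123", "content": {"dispatchState": "RUNNING"}},
  {"name": "1520000000.456", "content": {"dispatchState": "QUEUED"}}
]}
""")
print(summarize_jobs(sample))
# {'1520000000.123': 'RUNNING', '1520000000.456': 'QUEUED'}
```

If the REST endpoint answers while the UI does not, that points at Splunk Web (or the browser session) rather than the search tier itself.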