The installation part:

09:10:20 Collecting splunk-appinspect
09:10:20 Downloading https://myartifactserver/artifact/api/pypi/pypi/packages/packages/c6/b0/5cda84ecdb188e6e1480a9f934fb1e09f3496ba933e4997e22edbacbf3cc/splunk-appinspect-2.38.0.tar.gz (1.2 MB)
09:10:21 Installing build dependencies: started
09:10:33 Installing build dependencies: finished with status 'done'
09:10:33 Getting requirements to build wheel: started
09:10:34 Getting requirements to build wheel: finished with status 'done'
09:10:34 Preparing wheel metadata: started
09:10:35 Preparing wheel metadata: finished with status 'done'
09:10:35 Collecting packaging==21.3
09:10:35 Downloading https://myartifactserver/artifact/api/pypi/pypi/packages/packages/05/8e/8de486cbd03baba4deef4142bd643a3e7bbe954a784dc1bb17142572d127/packaging-21.3-py3-none-any.whl (40 kB)
09:10:35 Collecting markdown==3.*,>=3.1.1
09:10:35 Downloading https://myartifactserver/artifact/api/pypi/pypi/packages/packages/bb/c1/50caaec6cadc1c6adc8fe351e03bd646d6e4dd17f55fca0f4c8d7ea8d3e9/Markdown-3.5-py3-none-any.whl (101 kB)
09:10:36 Collecting ipaddress==1.*,>=1.0.22
09:10:36 Downloading https://myartifactserver/artifact/api/pypi/pypi/packages/packages/c2/f8/49697181b1651d8347d24c095ce46c7346c37335ddc7d255833e7cde674d/ipaddress-1.0.23-py2.py3-none-any.whl (18 kB)
09:10:36 Collecting langdetect==1.*,>=1.0.7
09:10:36 Downloading https://myartifactserver/artifact/api/pypi/pypi/packages/packages/0e/72/a3add0e4eec4eb9e2569554f7c70f4a3c27712f40e3284d483e88094cc0e/langdetect-1.0.9.tar.gz (981 kB)
09:10:38 Collecting python-magic==0.4.24
09:10:38 Downloading https://myartifactserver/artifact/api/pypi/pypi/packages/packages/d3/99/c89223c6547df268596899334ee77b3051f606077317023617b1c43162fb/python_magic-0.4.24-py2.py3-none-any.whl (12 kB)
09:10:38 Collecting mako==1.*,>=1.0.12
09:10:38 Downloading https://myartifactserver/artifact/api/pypi/pypi/packages/packages/03/3b/68690a035ba7347860f1b8c0cde853230ba69ff41df5884ea7d89fe68cd3/Mako-1.2.4-py3-none-any.whl (78 kB)
09:10:38 Requirement already satisfied: six==1.*,>=1.12.0 in /usr/lib/python3/dist-packages (from splunk-appinspect) (1.16.0)
09:10:38 Collecting pyyaml==6.*,>=6.0.1
09:10:38 Downloading https://myartifactserver/artifact/api/pypi/pypi/packages/packages/7d/39/472f2554a0f1e825bd7c5afc11c817cd7a2f3657460f7159f691fbb37c51/PyYAML-6.0.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (738 kB)
09:10:41 Collecting lxml==4.*,>=4.6.0
09:10:41 Downloading https://myartifactserver/artifact/api/pypi/pypi/packages/packages/c5/a2/7876f76606725340c989b1c73b5501fc41fb21e50a8597c9ecdb63a05b27/lxml-4.9.3-cp39-cp39-manylinux_2_28_x86_64.whl (8.0 MB)
09:10:41 Requirement already satisfied: future==0.*,>=0.18.0 in /usr/lib/python3/dist-packages (from splunk-appinspect) (0.18.2)
09:10:42 Collecting beautifulsoup4==4.*,>=4.8.1
09:10:42 Downloading https://myartifactserver/artifact/api/pypi/pypi/packages/packages/57/f4/a69c20ee4f660081a7dedb1ac57f29be9378e04edfcb90c526b923d4bebc/beautifulsoup4-4.12.2-py3-none-any.whl (142 kB)
09:10:45 Collecting regex==2022.1.18
09:10:46 Downloading https://myartifactserver/artifact/api/pypi/pypi/packages/packages/89/8c/d587899aee993e201b369fb4007419b8a627190c52b40c2de0615f46dec1/regex-2022.1.18-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (763 kB)
09:10:46 Requirement already satisfied: jinja2<4,>=2.11.3 in /usr/local/lib/python3.9/dist-packages (from splunk-appinspect) (3.1.2)
09:10:46 Collecting semver>=2.13.0
09:10:46 Downloading https://myartifactserver/artifact/api/pypi/pypi/packages/packages/9a/77/0cc7a8a3bc7e53d07e8f47f147b92b0960e902b8254859f4aee5c4d7866b/semver-3.0.2-py3-none-any.whl (17 kB)
09:10:46 Collecting click==7.*,>=7.0.0
09:10:46 Downloading https://myartifactserver/artifact/api/pypi/pypi/packages/packages/d2/3d/fa76db83bf75c4f8d338c2fd15c8d33fdd7ad23a9b5e57eb6c5de26b430e/click-7.1.2-py2.py3-none-any.whl (82 kB)
09:10:47 Collecting enum34==1.*,>=1.1.6
09:10:47 Downloading https://myartifactserver/artifact/api/pypi/pypi/packages/packages/63/f6/ccb1c83687756aeabbf3ca0f213508fcfb03883ff200d201b3a4c60cedcc/enum34-1.1.10-py3-none-any.whl (11 kB)
09:10:47 Collecting croniter<2,>0.3.34
09:10:47 Downloading https://myartifactserver/artifact/api/pypi/pypi/packages/packages/f2/91/e5ae454da8200c6eb6cf94ca05d799b51e2cb2cc458a7737aebc0c5a21bb/croniter-1.4.1-py2.py3-none-any.whl (19 kB)
09:10:47 Collecting futures-then==0.*,>=0.1.1
09:10:47 Downloading https://myartifactserver/artifact/api/pypi/pypi/packages/packages/17/f3/e8a942af4d2eeb68974bbfe5a6ef8a3eb25baf361a44ad6583b2d34bbc38/futures_then-0.1.1.tar.gz (3.3 kB)
09:10:48 Collecting jsoncomment==0.3.3
09:10:48 Downloading https://myartifactserver/artifact/api/pypi/pypi/packages/packages/45/4e/f35502e602cd6c48d233719a5bb823aa7348f112456299469109a60564d6/jsoncomment-0.3.3-py3-none-any.whl (5.8 kB)
09:10:48 Collecting defusedxml==0.7.1
09:10:48 Downloading https://myartifactserver/artifact/api/pypi/pypi/packages/packages/07/6c/aa3f2f849e01cb6a001cd8554a88d4c77c5c1a31c95bdf1cf9301e6d9ef4/defusedxml-0.7.1-py2.py3-none-any.whl (25 kB)
09:10:49 Collecting chardet==3.0.4
09:10:49 Using cached https://myartifactserver/artifact/api/pypi/pypi/packages/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl (133 kB)
09:10:49 Collecting painter==0.*,>=0.3.1
09:10:49 Downloading https://myartifactserver/artifact/api/pypi/pypi/packages/packages/82/a1/6b98ddf98374c29f930eae8cbdcde25480e1b83d21c32a3c4e61b0df019c/painter-0.3.1.tar.gz (5.7 kB)
09:11:06 Collecting pillow==9.5.0
09:11:06 Downloading https://myartifactserver/artifact/api/pypi/pypi/packages/packages/3b/2b/57915b8af178e2c20bfd403ffed4521947881f9dbbfbaba48210dc59b9d7/Pillow-9.5.0-cp39-cp39-manylinux_2_28_x86_64.whl (3.4 MB)
09:11:07 Collecting soupsieve>1.2
09:11:07 Downloading https://myartifactserver/artifact/api/pypi/pypi/packages/packages/4c/f3/038b302fdfbe3be7da016777069f26ceefe11a681055ea1f7817546508e3/soupsieve-2.5-py3-none-any.whl (36 kB)
09:11:07 Requirement already satisfied: MarkupSafe>=0.9.2 in /usr/local/lib/python3.9/dist-packages (from mako==1.*,>=1.0.12->splunk-appinspect) (2.1.3)
09:11:07 Collecting importlib-metadata>=4.4
09:11:07 Using cached https://myartifactserver/artifact/api/pypi/pypi/packages/packages/cc/37/db7ba97e676af155f5fcb1a35466f446eadc9104e25b83366e8088c9c926/importlib_metadata-6.8.0-py3-none-any.whl (22 kB)
09:11:07 Requirement already satisfied: pyparsing!=3.0.5,>=2.0.2 in /usr/lib/python3/dist-packages (from packaging==21.3->splunk-appinspect) (2.4.7)
09:11:08 Requirement already satisfied: python-dateutil in /usr/lib/python3/dist-packages (from croniter<2,>0.3.34->splunk-appinspect) (2.8.1)
09:11:08 Requirement already satisfied: zipp>=0.5 in /usr/lib/python3/dist-packages (from importlib-metadata>=4.4->markdown==3.*,>=3.1.1->splunk-appinspect) (1.0.0)
09:11:08 Building wheels for collected packages: splunk-appinspect, futures-then, langdetect, painter
09:11:08 Building wheel for splunk-appinspect (PEP 517): started
09:11:10 Building wheel for splunk-appinspect (PEP 517): finished with status 'done'
09:11:10 Created wheel for splunk-appinspect: filename=splunk_appinspect-2.38.0-py3-none-any.whl size=1345524 sha256=357f01e587ba950015c961fc4d87302d1daa2e898a91866586e7c9e2aa26790f
09:11:10 Stored in directory: /home/jkagent/.cache/pip/wheels/76/8b/a9/fe23bb819b710aedd68789c2c0edad998f97d4d3f21741a584
09:11:10 Building wheel for futures-then (setup.py): started
09:11:10 Building wheel for futures-then (setup.py): finished with status 'done'
09:11:10 Created wheel for futures-then: filename=futures_then-0.1.1-py3-none-any.whl size=3644 sha256=7a91593c59f54ae86d89a54bff202d4130f96a54cb8561e790a7d0e57c6756ae
09:11:10 Stored in directory: /home/jkagent/.cache/pip/wheels/6d/a2/e5/d68f808ac4d624e28e0856e004a3092987a7adedd61a901c81
09:11:10 Building wheel for langdetect (setup.py): started
09:11:11 Building wheel for langdetect (setup.py): finished with status 'done'
09:11:11 Created wheel for langdetect: filename=langdetect-1.0.9-py3-none-any.whl size=993222 sha256=6133a0efa46d94abd6e038b9ea949881648885fb168ae0d08bcea8c6d0145358
09:11:11 Stored in directory: /home/jkagent/.cache/pip/wheels/2a/b9/fd/df0c29965aef4c9a549f0e60c5a82a753cc41ac4711cba8872
09:11:11 Building wheel for painter (setup.py): started
09:11:12 Building wheel for painter (setup.py): finished with status 'done'
09:11:12 Created wheel for painter: filename=painter-0.3.1-py3-none-any.whl size=7078 sha256=1b204b1a645f5bbff9b4a79ecd290a496fb0b55ae9ac9dd97e3a7eada6666df2
09:11:12 Stored in directory: /home/jkagent/.cache/pip/wheels/90/1c/11/f0b2176bc83665853e2990a78b741b5b38ae3ef76b719bd176
09:11:12 Successfully built splunk-appinspect futures-then langdetect painter
09:11:12 Installing collected packages: soupsieve, importlib-metadata, semver, regex, pyyaml, python-magic, pillow, painter, packaging, markdown, mako, lxml, langdetect, jsoncomment, ipaddress, futures-then, enum34, defusedxml, croniter, click, chardet, beautifulsoup4, splunk-appinspect
09:11:12 WARNING: The script pysemver is installed in '/home/jkagent/.local/bin' which is not on PATH.
09:11:12 Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.
09:11:13 WARNING: The script strip-color is installed in '/home/jkagent/.local/bin' which is not on PATH.
09:11:13 Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.
09:11:14 WARNING: The script markdown_py is installed in '/home/jkagent/.local/bin' which is not on PATH.
09:11:14 Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.
09:11:14 WARNING: The script mako-render is installed in '/home/jkagent/.local/bin' which is not on PATH.
09:11:14 Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.
09:11:15 WARNING: The script chardetect is installed in '/home/jkagent/.local/bin' which is not on PATH.
09:11:15 Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.
09:11:16 WARNING: The script splunk-appinspect is installed in '/home/jkagent/.local/bin' which is not on PATH.
09:11:16 Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.
09:11:17 Successfully installed beautifulsoup4-4.12.2 chardet-3.0.4 click-7.1.2 croniter-1.4.1 defusedxml-0.7.1 enum34-1.1.10 futures-then-0.1.1 importlib-metadata-6.8.0 ipaddress-1.0.23 jsoncomment-0.3.3 langdetect-1.0.9 lxml-4.9.3 mako-1.2.4 markdown-3.5 packaging-21.3 painter-0.3.1 pillow-9.5.0 python-magic-0.4.24 pyyaml-6.0.1 regex-2022.1.18 semver-3.0.2 soupsieve-2.5 splunk-appinspect-2.38.0
I do a local splunk-appinspect on packages before uploading them to Splunk Cloud. Each Jenkins run does a 'pip install splunk-appinspect'. If it is already installed on the agent, it will of course not get installed again. Here are the job run console logs:

09:11:18 LEVEL="CRITICAL" TIME="2023-10-10 01:11:18,718" NAME="root" FILENAME="main.py" MODULE="main" MESSAGE="An unexpected error occurred during the run-time of Splunk AppInspect"
09:11:18 Traceback (most recent call last):
09:11:18   File "/home/jkagent/.local/lib/python3.9/site-packages/splunk_appinspect/main.py", line 581, in validate
09:11:18     groups_to_validate = splunk_appinspect.checks.groups(
09:11:18   File "/home/jkagent/.local/lib/python3.9/site-packages/splunk_appinspect/checks.py", line 205, in groups
09:11:18     check_group_modules = import_group_modules(check_dirs)
09:11:18   File "/home/jkagent/.local/lib/python3.9/site-packages/splunk_appinspect/checks.py", line 73, in import_group_modules
09:11:18     group_module = imp.load_source(group_module_name, filepath)
09:11:18   File "/usr/lib/python3.9/imp.py", line 171, in load_source
09:11:18     module = _load(spec)
09:11:18   File "<frozen importlib._bootstrap>", line 711, in _load
09:11:18   File "<frozen importlib._bootstrap>", line 680, in _load_unlocked
09:11:18   File "<frozen importlib._bootstrap_external>", line 790, in exec_module
09:11:18   File "<frozen importlib._bootstrap>", line 228, in _call_with_frames_removed
09:11:18   File "/home/jkagent/.local/lib/python3.9/site-packages/splunk_appinspect/checks/check_source_and_binaries.py", line 17, in <module>
09:11:18     import splunk_appinspect.check_routine as check_routine
09:11:18   File "/home/jkagent/.local/lib/python3.9/site-packages/splunk_appinspect/check_routine/__init__.py", line 15, in <module>
09:11:18     from .find_endpoint_usage import find_endpoint_usage
09:11:18   File "/home/jkagent/.local/lib/python3.9/site-packages/splunk_appinspect/check_routine/find_endpoint_usage.py", line 7, in <module>
09:11:18     from pyparsing import Generator
09:11:18 ImportError: cannot import name 'Generator' from 'pyparsing' (/usr/lib/python3/dist-packages/pyparsing.py)

Any idea what exactly is broken, and suggestions on how to solve it?
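One thing the traceback itself reveals: the failing import is resolved from /usr/lib/python3/dist-packages/pyparsing.py, i.e. the old distro-packaged pyparsing 2.4.7 that pip reported as "Requirement already satisfied", not a copy installed into ~/.local. A quick, generic way to diagnose this kind of shadowing (a minimal sketch, not AppInspect-specific) is to ask Python which file a module name actually resolves to:

```python
import importlib.util

def module_origin(name: str) -> str:
    """Return the file path Python would load for the given module name."""
    spec = importlib.util.find_spec(name)
    if spec is None:
        raise ModuleNotFoundError(name)
    return spec.origin or "<builtin or namespace package>"

# On the Jenkins agent you would run module_origin("pyparsing"); if it points
# at /usr/lib/python3/dist-packages/pyparsing.py, the stale Debian package is
# shadowing whatever pip installed. The stdlib example below always works:
print(module_origin("json"))
```

A likely fix, worth verifying in your environment, is forcing a newer copy into the user site (`pip install --user --upgrade pyparsing`), since user site-packages take precedence over dist-packages on sys.path.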
My apologies, I read the first three lines and missed the rest. Please try this:

index=sample ServiceName="cet.prd.*" earliest=-3d latest=now

Let us know what happens, thanks.
@inventsekar Please see my post. I had already added the query there.
Hi @krish1733 .. please share your current Splunk search query and how you are calculating earliest and latest, etc.
I'm getting an error when I add the earliest and latest keywords to my search query. I have set the time picker to match the values used in the query. It shows "Unknown search command 'earliest'" when I try to use them. I'm using Splunk Enterprise.

This is my query:

index=sample ServiceName="cet.prd.*" |  earliest=-3d latest=now()
@Bazza_12  could you please clarify the part "append your site certs", is this referring to the contents under "splunk_ta_o365/lib/certifi/cacert.pem" ?
Hi @bowesmana, it works great, as expected, but is there any way to flag or highlight the differing values? Because three fields are compared, I need to check both lookups to find the missing info.
@Akmal57 Something like this:

| inputlookup lookup_A
| eval origin="A"
| inputlookup append=t lookup_B
| eval origin=coalesce(origin, "B")
| stats dc(origin) as originCount values(origin) as origins by Hostname IP OS
| where originCount=1

This loads both lookups, sets origin to record where each row came from, joins the two together with stats, and shows only the rows that have a single origin.
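For anyone who wants to sanity-check the logic outside Splunk, the same "keep rows present in only one lookup" idea can be sketched in plain Python (purely illustrative; the field names Hostname/IP/OS come from the question, and `unmatched` is a hypothetical helper name):

```python
def unmatched(rows_a, rows_b, keys=("Hostname", "IP", "OS")):
    """Return key tuples that appear in exactly one of the two lookups,
    mirroring the stats dc(origin)=1 trick in the SPL above."""
    a = {tuple(r[k] for k in keys) for r in rows_a}
    b = {tuple(r[k] for k in keys) for r in rows_b}
    return sorted(a ^ b)  # symmetric difference = present in only one lookup

lookup_a = [{"Hostname": "web01", "IP": "10.0.0.1", "OS": "linux"},
            {"Hostname": "web02", "IP": "10.0.0.2", "OS": "linux"}]
lookup_b = [{"Hostname": "web01", "IP": "10.0.0.1", "OS": "linux"}]

print(unmatched(lookup_a, lookup_b))  # → [('web02', '10.0.0.2', 'linux')]
```

The SPL version scales the same way: rows matching on all three key fields collapse into one stats group with two origins, and only single-origin groups survive the `where`.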
@rikinet Just make it a stacked chart; as you have only a single value per time, it will show one series or the other. Here's an example:

<dashboard>
  <label>colourgreen</label>
  <row>
    <panel>
      <chart>
        <search>
          <query>| makeresults count=20
| streamstats c
| eval _time=now() - (c * 60)
| eval digital_value=if (random() % 2 == 1, 0.1, 1)
| eval analog_value=mvindex(split("0,100,500,1000,5000,10000",","), random() % 6)
| fields - c
| eval digital_value_red = if(digital_value=0.1, 0.1, null())
| eval digital_value_green = if(digital_value=1, 1, null())
| fields - digital_value</query>
          <earliest>-24h@h</earliest>
          <latest>now</latest>
          <sampleRatio>1</sampleRatio>
        </search>
        <option name="charting.axisLabelsX.majorLabelStyle.overflowMode">ellipsisNone</option>
        <option name="charting.axisLabelsX.majorLabelStyle.rotation">0</option>
        <option name="charting.axisTitleX.visibility">visible</option>
        <option name="charting.axisTitleY.visibility">visible</option>
        <option name="charting.axisTitleY2.visibility">visible</option>
        <option name="charting.axisX.abbreviation">none</option>
        <option name="charting.axisX.scale">linear</option>
        <option name="charting.axisY.abbreviation">none</option>
        <option name="charting.axisY.scale">linear</option>
        <option name="charting.axisY2.abbreviation">none</option>
        <option name="charting.axisY2.enabled">1</option>
        <option name="charting.axisY2.scale">log</option>
        <option name="charting.chart">column</option>
        <option name="charting.chart.bubbleMaximumSize">50</option>
        <option name="charting.chart.bubbleMinimumSize">10</option>
        <option name="charting.chart.bubbleSizeBy">area</option>
        <option name="charting.chart.nullValueMode">gaps</option>
        <option name="charting.chart.overlayFields">analog_value</option>
        <option name="charting.chart.showDataLabels">none</option>
        <option name="charting.chart.sliceCollapsingThreshold">0.01</option>
        <option name="charting.chart.stackMode">stacked</option>
        <option name="charting.chart.style">shiny</option>
        <option name="charting.drilldown">none</option>
        <option name="charting.fieldColors">{digital_value_red: 0xFF0000, digital_value_green: 0x00FF00}</option>
        <option name="charting.layout.splitSeries">0</option>
        <option name="charting.layout.splitSeries.allowIndependentYRanges">0</option>
        <option name="charting.legend.labelStyle.overflowMode">ellipsisMiddle</option>
        <option name="charting.legend.mode">standard</option>
        <option name="charting.legend.placement">right</option>
        <option name="charting.lineWidth">2</option>
        <option name="height">406</option>
        <option name="trellis.enabled">0</option>
        <option name="trellis.scales.shared">1</option>
        <option name="trellis.size">medium</option>
      </chart>
    </panel>
  </row>
</dashboard>
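The core trick in that XML is the pair of eval lines: one series is split into two mutually exclusive series (each point lands in exactly one, the other gets null()), so charting.fieldColors can colour them independently while stacking makes them look like a single series. The same split, sketched in plain Python for illustration (`split_series` is a made-up helper name):

```python
def split_series(values, red_val=0.1, green_val=1):
    """Split one series into two exclusive ones, as the eval lines do:
    each point appears in exactly one series; the other gets None (null())."""
    red = [v if v == red_val else None for v in values]
    green = [v if v == green_val else None for v in values]
    return red, green

red, green = split_series([0.1, 1, 1, 0.1])
print(red)    # [0.1, None, None, 0.1]
print(green)  # [None, 1, 1, None]
```

Because the two series never overlap in time, stacking them never actually adds two values together; it just lets both share one column position.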
Hi, I have two lookups, lookup A and lookup B. Lookup A is kept up to date by a Splunk query, and lookup B is maintained manually. Both lookups contain the same fields: Hostname, IP, and OS. I need to compare both lookups and bring out the non-matching Hostname and IP. Please assist me with this. Thank you.
Is the forwarder logging any errors about failing to connect to the indexers?
It probably means the Splunkbase page hasn't been updated yet.
Note that with Splunk, there are often multiple ways to achieve the same goal. For example, you could use this instead of streamstats:

| accum gbu

or

| accum gbu as cum_gbu

In the long run, streamstats is the more useful command (though it takes more time to get your head around), as it supports split-by clauses, whereas accum does not.
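To make the accum vs. streamstats distinction concrete: both compute a running total, but streamstats can keep a separate running total per split-by group. A small Python sketch of the two behaviours (illustrative only; the function names are made up):

```python
from collections import defaultdict
from itertools import accumulate

def accum_like(values):
    """Single running total over all rows, like `| accum gbu`."""
    return list(accumulate(values))

def streamstats_like(rows, by, field):
    """Running total maintained per group, like `| streamstats sum(field) by <by>`."""
    totals = defaultdict(float)
    out = []
    for row in rows:
        totals[row[by]] += row[field]
        out.append(totals[row[by]])
    return out

print(accum_like([1, 2, 3]))  # [1, 3, 6]

rows = [{"host": "a", "gbu": 1}, {"host": "b", "gbu": 2}, {"host": "a", "gbu": 3}]
print(streamstats_like(rows, "host", "gbu"))  # [1.0, 2.0, 4.0]
```

Without a by-clause the two are equivalent, which is why accum works fine for the simple cumulative-license-usage case in this thread.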
Exactly what I was looking for. I haven't come across the streamstats term yet so this is great. Thank you!
So you want your usage to show a cumulative value rather than the value for the specific hour? If so, just add this to the end:

| streamstats sum(gbu) as gbu

which will accumulate the hourly values and replace them with the cumulative total. If you want both values, then add this to the end instead:

| streamstats sum(gbu) as cum_gbu

This will create a new field with the cumulative total.
Thanks for this. The results don't seem to "add up" every hour. I was hoping each hour the number would be greater, but it seems to be giving different numbers, if that makes sense.
Sure you can, just use timechart, like this:

index=_internal source=*license_usage.log type=Usage pool=*
| timechart span=1h sum(b) as gbu
| eval gbu=round(gbu/1024/1024/1024,3)
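The eval at the end just converts the summed bytes into GB with three decimal places. As a quick sanity check of that arithmetic (the helper name is made up for illustration):

```python
def bytes_to_gb(b: float, ndigits: int = 3) -> float:
    """Mirror `eval gbu=round(gbu/1024/1024/1024,3)`: bytes to gibibytes."""
    return round(b / 1024 / 1024 / 1024, ndigits)

print(bytes_to_gb(1_610_612_736))  # 1.5  (exactly 1.5 GiB of license usage)
```

Note this is the binary (1024-based) convention; dividing by 1000**3 instead would give decimal GB.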
I have a search that gives me the total license usage in GB for a given time:

index=_internal source=*license_usage.log type=Usage pool=*
| stats sum(b) as bu
| eval gbu=round(bu/1024/1024/1024,3)
| fields gbu

I'd like a timechart/graph showing what the total is for each hour of a given day. Is this possible with timechart?
So to clarify: we have a distributed environment, with a cluster of indexers managed by a Cluster Master. The Search Heads are configured as standalone search heads. The search peers are not configured in distsearch.conf on the search heads; they just get the list of indexers from the Cluster Master.

We attempted to remove the peers from the list of Search Peers under Distributed Search in Settings, and got an error stating, "Cannot remove peer... This peer is a part of a cluster.", as you would expect in a clustered environment. We were able to delete the peers from the Cluster Master, but deleting them there is what causes the Search Heads to complain about losing connection to search peers, as it appears the Cluster Master doesn't inform the Search Heads about the change in the search peer list.

We were also able to find a window with no scheduled searches running in which we could restart the search heads. Restarting them caused the list of search peers to be reloaded from the Cluster Master, and the error stopped. Is there another way to force the search heads to refresh this cached list of search peers from the Cluster Master without restarting them?