After moving to Splunk 6.4.x, the following error can occur in the UI when navigating to:
Apps —> Browse more apps
Error connecting: error:14090086:SSL routines:ssl3_get_server_certificate:certificate verify failed
In the splunkd.log file, the following errors also occur:
08-04-2016 03:40:40.509 -0400 ERROR ApplicationUpdater - Error checking for update, URL=https://apps.splunk.com/api/apps:resolve/checkforupgrade: error:14090086:SSL routines:ssl3_get_server_certificate:certificate verify failed
08-04-2016 05:06:21.332 -0400 ERROR X509Verify - X509 certificate (CN=XXXXXX,DC=XXXXX,DC=local) failed validation; error=19, reason="self signed certificate in certificate chain"
The issue occurs when a device on your network intercepts/decodes the SSL traffic and then re-encrypts the stream with certificates signed by your own internal root CA. To solve the issue, append your internal root CA certificate to Splunk's apps CA bundle ($SPLUNK_HOME/etc/auth/appsCA.pem) so the file contains both certificates:
-----BEGIN CERTIFICATE-----
SPLUNK CERT HERE
-----END CERTIFICATE-----
-----BEGIN CERTIFICATE-----
CUSTOMER INTERNAL CA ROOT HERE
-----END CERTIFICATE-----
You must restart Splunk after these changes.
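Sketched as shell steps, the change looks like the following. The bundle path is the usual default; the internal root CA filename is an example, and the two dummy PEM files are created here only so the sketch runs outside a real install (in a real environment appsCA.pem already exists):

```shell
# Stand-ins so the sketch is runnable; skip this block on a real install.
SPLUNK_HOME="${SPLUNK_HOME:-/tmp/splunk-demo}"
mkdir -p "$SPLUNK_HOME/etc/auth"
printf -- '-----BEGIN CERTIFICATE-----\nSPLUNK CERT\n-----END CERTIFICATE-----\n' \
    > "$SPLUNK_HOME/etc/auth/appsCA.pem"
printf -- '-----BEGIN CERTIFICATE-----\nINTERNAL ROOT CA\n-----END CERTIFICATE-----\n' \
    > /tmp/myInternalRootCA.pem

# Back up the shipped bundle, then append your internal root CA to it.
cp "$SPLUNK_HOME/etc/auth/appsCA.pem" "$SPLUNK_HOME/etc/auth/appsCA.pem.bak"
cat /tmp/myInternalRootCA.pem >> "$SPLUNK_HOME/etc/auth/appsCA.pem"

# The bundle should now contain both certificates (prints 2).
grep -c 'BEGIN CERTIFICATE' "$SPLUNK_HOME/etc/auth/appsCA.pem"
```

After appending, restart Splunk so splunkd rereads the bundle.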
I found jbarlow's answer to work best for my environment. I can now access https://splunkbase.splunk.com using my own self-signed certs. I did finally have to stop trying to route this through my Squid proxy, though; it seems app access does not support an https_proxy pointing at an HTTPS Squid endpoint. I had to open my external firewall to allow the SplunkLight server out over HTTP/HTTPS, which had previously been disabled per a PCI QSA request.
I only did this on the SplunkLight server (adding appsCA.pem to my splunkCA.tds.pci.pem). I did not do the same on my forwarders, which still use the original splunkCA.tds.pci.crt (in PEM format), and that could be a concern now that they differ. I may need to do the same on the forwarders to ensure a single version of the truth.
This all stemmed from me replacing all of the Splunk certs (except those dang apps certs) with my own self-signed certs, because the default Splunk CA cert is signed with a weak algorithm (yep, PCI QSA again).
This can also occur in the following scenario: in server.conf, a custom CA is defined via the "sslRootCAPath" parameter in the [sslConfig] stanza. For example:
[sslConfig]
..
sslRootCAPath = /opt/certs/myCA.pem
..
In this case, per the docs, if "sslRootCAPath" has been set in the [sslConfig] stanza, then "caCertFile" in [applicationsManagement] is ignored:
http://docs.splunk.com/Documentation/Splunk/latest/Admin/Serverconf
[applicationsManagement]
…
caCertFile =
* Full path to a CA (Certificate Authority) certificate(s) PEM format file.
* This must refer to a PEM format file containing one or more root CA
  certificates concatenated together.
* Used only if 'sslRootCAPath' is unset.
* Used for validating SSL certificate from https://apps.splunk.com/
…
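Putting the two stanzas together, a server.conf along these lines (the paths are examples) produces exactly that behavior: the custom CA in sslRootCAPath is used for all validation, and the apps CA file is never consulted:

```
[sslConfig]
sslRootCAPath = /opt/certs/myCA.pem

[applicationsManagement]
# Ignored while sslRootCAPath is set above; splunkbase certificate
# validation falls back to the sslRootCAPath bundle instead.
caCertFile = /opt/splunk/etc/auth/appsCA.pem
```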
splunkd ---> splunkbase
splunkd <--- server cert signed by the GlobalSign CA
To get around this, concatenate the contents of appsCA.pem onto your custom CA file (the one defined in sslRootCAPath in the [sslConfig] stanza):
cp yourCustomCA.pem yourCustomCA.pem.backup
cat $SPLUNK_HOME/etc/auth/appsCA.pem >> yourCustomCA.pem
Then restart Splunk.
I just had this exact issue installing Splunk on a Windows 2022 Server running on ESXi. Followed your advice and it worked like a charm. Thank you, sir.
I am having this exact same problem but the suggested resolution did not correct it. Any other suggestions?
This also worked where a Palo Alto firewall was encrypting the data coming into Splunk.
Hi Corey,
The solution really depends on your network architecture. You may also have a root and/or an intermediate CA in the chain, in which case you may need to add those as well, for example:
-----BEGIN CERTIFICATE-----
SERVER CERT HERE
-----END CERTIFICATE-----
-----BEGIN CERTIFICATE-----
INTERMEDIATE CA IF REQUIRED
-----END CERTIFICATE-----
-----BEGIN CERTIFICATE-----
ROOT CA
-----END CERTIFICATE-----
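One way to sanity-check what is actually inside a concatenated bundle, and in which order, is to split it at each BEGIN line and print every certificate's subject. The bundle below is built from two throwaway self-signed certs purely so the sketch runs standalone; point BUNDLE at your real file instead:

```shell
# Build a demo bundle from two throwaway certs (stand-ins for a real chain).
tmp=$(mktemp -d)
openssl req -x509 -newkey rsa:2048 -nodes -keyout "$tmp/k1.pem" \
    -out "$tmp/server.pem" -subj "/CN=My Server" -days 1 2>/dev/null
openssl req -x509 -newkey rsa:2048 -nodes -keyout "$tmp/k2.pem" \
    -out "$tmp/root.pem" -subj "/CN=My Root CA" -days 1 2>/dev/null
BUNDLE="$tmp/bundle.pem"
cat "$tmp/server.pem" "$tmp/root.pem" > "$BUNDLE"

# Split the bundle at each BEGIN line, then print subjects in file order.
awk -v dir="$tmp" '/BEGIN CERTIFICATE/{n++} {print > (dir "/cert-" n ".pem")}' "$BUNDLE"
for f in "$tmp"/cert-*.pem; do
    openssl x509 -in "$f" -noout -subject
done
```

The subjects print top-to-bottom in the same order the certificates appear in the file, which makes it easy to confirm whether your server, intermediate, and root certs ended up in the order you intended.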
If this does not help, you may like to raise a case with Splunk support.
Thanks for the reply.
I've tried this order (what I had when I posted):
===OOTB SPLUNK CERTS (x3)===
===MY ROOT===
===MY INTERMEDIATE===
===MY SERVER===
and this order (based on your post, in case the order matters):
===OOTB SPLUNK CERTS (x3)===
===MY SERVER===
===MY INTERMEDIATE===
===MY ROOT===
So far neither order corrects my issue. I do not have support (yet).
I think my appsCA.pem got corrupted. I copied it from another instance and was then able to get this working without any unusual config or root/intermediate changes.
Hi Corey,
It is difficult to say without looking at a Splunk diag, your certs, and a network trace. I would recommend running a network trace (e.g., Wireshark) to see where in the SSL handshake the connection is failing.
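Before going as far as a packet capture, you can reproduce the certificate check splunkd performs offline with openssl verify. The certs below are throwaway ones generated on the spot so the sketch runs standalone; in practice you would substitute your real CA bundle and the server cert your proxy actually presents:

```shell
tmp=$(mktemp -d)
# Throwaway internal root CA (stand-in for your real one).
openssl req -x509 -newkey rsa:2048 -nodes -keyout "$tmp/ca.key" \
    -out "$tmp/myRootCA.pem" -subj "/CN=Example Internal Root CA" -days 1 2>/dev/null
# Server cert signed by that CA (stand-in for what the proxy presents).
openssl req -newkey rsa:2048 -nodes -keyout "$tmp/srv.key" \
    -out "$tmp/srv.csr" -subj "/CN=proxy.example.local" 2>/dev/null
openssl x509 -req -in "$tmp/srv.csr" -CA "$tmp/myRootCA.pem" \
    -CAkey "$tmp/ca.key" -CAcreateserial -out "$tmp/srv.pem" -days 1 2>/dev/null

# "OK" means the bundle trusts the chain; a failure here should mirror
# the "certificate verify failed" error splunkd logs.
openssl verify -CAfile "$tmp/myRootCA.pem" "$tmp/srv.pem"
```

If this verification fails against your appended bundle, the problem is in the certs themselves (order, corruption, missing intermediate) rather than in Splunk's configuration.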