Security

Accessing apps in 6.4.x results in "Error connecting: error:14090086:SSL routines:ssl3_get_server_certificate:certificate verify failed"

Splunk Employee

After moving to Splunk 6.4.x, the following error can occur in the UI when navigating to:

Apps > Browse more apps

Error connecting: error:14090086:SSL routines:ssl3_get_server_certificate:certificate verify failed

In the splunkd.log file, the following errors also occur:

08-04-2016 03:40:40.509 -0400 ERROR ApplicationUpdater - Error checking for update, URL=https://apps.splunk.com/api/apps:resolve/checkforupgrade: error:14090086:SSL routines:ssl3_get_server_certificate:certificate verify failed

08-04-2016 05:06:21.332 -0400 ERROR X509Verify - X509 certificate (CN=XXXXXX,DC=XXXXX,DC=local) failed validation; error=19, reason="self signed certificate in certificate chain"
1 Solution

Splunk Employee

The issue occurs when a device on the network intercepts and decrypts SSL traffic, then re-encrypts the stream using your own internal CA root keys. To solve the issue:

  1. Find the appsCA.pem file in $SPLUNK_HOME/etc/auth on the Splunk search head.
  2. Back this file up before making any changes.
  3. Open the file with a text editor and append your internal CA root certificate to appsCA.pem, for example:
-----BEGIN CERTIFICATE-----
SPLUNK CERT HERE
-----END CERTIFICATE-----
-----BEGIN CERTIFICATE-----
CUSTOMER INTERNAL CA ROOT HERE
-----END CERTIFICATE-----

You must restart Splunk after these changes.
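The steps above can be sketched from the command line. The sketch below builds a throwaway demo directory with stand-in certificate contents; on a real search head you would point it at your actual $SPLUNK_HOME and your organization's real internal root CA file.

```shell
# Demo location only -- on a real system, use your actual Splunk install path.
SPLUNK_HOME=${SPLUNK_HOME:-/tmp/splunk-demo}
mkdir -p "$SPLUNK_HOME/etc/auth"

# Stand-ins for the real files; in a live install appsCA.pem already exists
# and internalRootCA.pem would be your internal CA root certificate.
printf -- '-----BEGIN CERTIFICATE-----\nSPLUNK CERT HERE\n-----END CERTIFICATE-----\n' \
  > "$SPLUNK_HOME/etc/auth/appsCA.pem"
printf -- '-----BEGIN CERTIFICATE-----\nCUSTOMER INTERNAL CA ROOT HERE\n-----END CERTIFICATE-----\n' \
  > /tmp/internalRootCA.pem

# 1. Back up the original before changing anything.
cp "$SPLUNK_HOME/etc/auth/appsCA.pem" "$SPLUNK_HOME/etc/auth/appsCA.pem.bak"

# 2. Append the internal CA root to appsCA.pem.
cat /tmp/internalRootCA.pem >> "$SPLUNK_HOME/etc/auth/appsCA.pem"

# 3. The bundle should now contain both certificates.
grep -c 'BEGIN CERTIFICATE' "$SPLUNK_HOME/etc/auth/appsCA.pem"   # prints 2
```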


Explorer

I found jbarlow's answer to work best for my environment. I can now access https://splunkbase.splunk.com using my own self-signed certs. I did finally have to stop trying to route the traffic through my Squid proxy, though; apparently app access does not support pointing https_proxy at an HTTPS Squid endpoint. I had to open my external firewall to allow the SplunkLight server out for HTTP/HTTPS, which had previously been disabled per a PCI QSA request.

I only did this on the SplunkLight server (adding appsCA.pem to my splunkCA.tds.pci.pem). I did not do the same on my forwarders, which still use the original splunkCA.tds.pci.crt (in PEM format), and that could be a concern now that they differ. I may need to do the same on the forwarders to ensure a single version of the truth.
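One quick way to confirm the search head and forwarders carry the same trust bundle is to compare the files byte for byte. This is just an illustration with stand-in files; in practice you would first copy the bundle from each host (the file names below are hypothetical).

```shell
# Stand-ins for bundles gathered from two hosts; on real systems these would
# be copied over with scp or collected by your config management tool.
server_bundle=/tmp/ca-bundle.server.pem
fwd_bundle=/tmp/ca-bundle.forwarder.pem
printf 'CA BUNDLE CONTENTS\n' > "$server_bundle"
printf 'CA BUNDLE CONTENTS\n' > "$fwd_bundle"

# cmp is silent on a match, so report the result explicitly.
if cmp -s "$server_bundle" "$fwd_bundle"; then
  echo "bundles match"
else
  echo "bundles differ"
fi
```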

This was related to me replacing all of the Splunk certs (except for those dang apps certs) with my own self-signed certs, because the default Splunk CA cert is signed with a weak algorithm (yep, PCI QSA again).


Splunk Employee

This can also occur in the following scenario: in server.conf, a custom CA is defined in the "sslRootCAPath" parameter of the [sslConfig] stanza, for example:

[sslConfig]
..
sslRootCAPath = /opt/certs/myCA.pem
..

In this case, per the docs, if "sslRootCAPath" has been set in the [sslConfig] stanza, then caCertFile is ignored:

http://docs.splunk.com/Documentation/Splunk/latest/Admin/Serverconf
[applicationsManagement]

caCertFile =
* Full path to a CA (Certificate Authority) certificate(s) PEM format file.
* The value must refer to a PEM format file containing one or more root CA
certificates concatenated together.
* Used only if 'sslRootCAPath' is unset.
* Used for validating SSL certificate from https://apps.splunk.com/
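Putting the two stanzas side by side makes the interaction clearer. A sketch of the configuration the excerpt describes (the paths are illustrative, not defaults):

```
[sslConfig]
# Custom CA bundle -- once this is set, it is used for all validation.
sslRootCAPath = /opt/certs/myCA.pem

[applicationsManagement]
# Ignored here, because sslRootCAPath is set above.
caCertFile = /opt/certs/appsCA.pem
```

So with a custom sslRootCAPath in place, the Splunkbase CA must be present in that bundle; caCertFile cannot supply it.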

For example, in this flow:

splunkd -->  splunkbase
splunkd <--  server cert signed by the GlobalSign CA

splunkd verifies the server cert against your custom CA in the file defined by sslRootCAPath, which does not contain the GlobalSign CA (that certificate ships in $SPLUNK_HOME/etc/auth/appsCA.pem).

To get around this, concatenate the appsCA.pem contents onto your custom CA file (as defined by sslRootCAPath in the [sslConfig] stanza).

Make a backup of your custom CA first:

cp yourCustomCA.pem yourCustomCA.pem.backup
cat $SPLUNK_HOME/etc/auth/appsCA.pem >> yourCustomCA.pem

Then restart Splunk.
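After concatenating, it is worth a quick sanity check that the merged file is still a well-formed PEM bundle, i.e. that every BEGIN marker has a matching END marker. A minimal sketch, using a stand-in path and stand-in cert contents:

```shell
# Stand-in for the merged custom CA bundle produced by the steps above.
bundle=/tmp/yourCustomCA.pem
printf -- '-----BEGIN CERTIFICATE-----\nYOUR CA\n-----END CERTIFICATE-----\n' >  "$bundle"
printf -- '-----BEGIN CERTIFICATE-----\nAPPS CA\n-----END CERTIFICATE-----\n' >> "$bundle"

# A well-formed bundle has equal numbers of BEGIN and END markers.
begins=$(grep -c 'BEGIN CERTIFICATE' "$bundle")
ends=$(grep -c 'END CERTIFICATE' "$bundle")
if [ "$begins" -eq "$ends" ]; then
  echo "bundle OK: $begins certificate(s)"
else
  echo "bundle malformed: $begins BEGIN vs $ends END"
fi
```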


Path Finder

I am having this exact same problem, but the suggested resolution did not correct it. Any other suggestions?


Splunk Employee

This also worked in an environment where a Palo Alto firewall was decrypting and re-encrypting traffic coming into Splunk.


Splunk Employee

Hi Corey,

The solution really depends on your network architecture. You may also have a root and/or an intermediate CA, in which case you may need to add those as well, for example:

-----BEGIN CERTIFICATE-----
SERVER CERT HERE
-----END CERTIFICATE-----

-----BEGIN CERTIFICATE-----
INTERMEDIATE CA IF REQUIRED
-----END CERTIFICATE-----

-----BEGIN CERTIFICATE-----
ROOT CA
-----END CERTIFICATE-----
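If you keep the certificates in separate files, assembling the bundle in that order is a single concatenation. A sketch with hypothetical stand-in file names and contents:

```shell
# Work in a throwaway directory with stand-in cert files.
workdir=$(mktemp -d)
printf 'SERVER CERT HERE\n'            > "$workdir/server.pem"
printf 'INTERMEDIATE CA IF REQUIRED\n' > "$workdir/intermediate.pem"
printf 'ROOT CA\n'                     > "$workdir/root.pem"

# Server first, then intermediate, then root -- matching the order above.
cat "$workdir/server.pem" "$workdir/intermediate.pem" "$workdir/root.pem" \
  > "$workdir/chain.pem"

head -n 1 "$workdir/chain.pem"   # prints: SERVER CERT HERE
```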

If this does not help, you may want to raise a case with Splunk Support.


Path Finder

Thanks for the reply.

I've tried this order (what I had when I posted):
===OOTB SPLUNK CERTS (x3)===
===MY ROOT===
===MY INTERMEDIATE===
===MY SERVER===

and this order (based on your post, in case the order matters):

===OOTB SPLUNK CERTS (x3)===
===MY SERVER===
===MY INTERMEDIATE===
===MY ROOT===

So far neither order corrects my issue. I do not have support (yet).


Path Finder

I think my appsCA.pem got messed up. I copied it from another instance and was able to get this to function without any weird config or root/intermediate changes.


Splunk Employee

Hi Corey,

It is difficult to say without looking at a Splunk diag, your certs, and a network trace. I would recommend running a network trace (e.g. with Wireshark) to see where in the SSL handshake the connection is failing.
