All Apps and Add-ons

Getting a fluentd SSL error when using splunk-connect for Kubernetes

hifimarko
Engager

We are using Splunk Connect for Kubernetes (https://splunkbase.splunk.com/app/3991/).

Once installed via Helm, all of the components seem to get set up correctly except for splunk-kubernetes-objects. The splunk-kubernetes-objects pod fails with:

2018-07-20 23:59:59 +0000 [error]: config error file="/fluentd/etc/fluent.conf" error_class=Fluent::ConfigError error="Invalid Kubernetes API v1 endpoint https://100.64.0.1:443/api: SSL_connect returned=1 errno=0 state=error: certificate verify failed (unable to get local issuer certificate)"

To work around this we are forced to set insecure_ssl to true in the fluent.conf ConfigMap.
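For reference, here is a minimal sketch of that workaround, assuming the objects collection is configured through a kubernetes_objects source block in the ConfigMap (the surrounding details are illustrative, not the chart's exact defaults):

# fluent.conf in the splunk-kubernetes-objects ConfigMap -- illustrative sketch
<source>
  @type kubernetes_objects
  # Workaround: skip TLS verification of the Kubernetes API endpoint.
  # This silences "certificate verify failed" but leaves the connection
  # open to man-in-the-middle attacks, so it is not a real fix.
  insecure_ssl true
  # (other pull/watch options from the chart omitted)
</source>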

What causes the issue and how can we fix it?

1 Solution

mattymo
Splunk Employee

Hey hifimarko,

This is caused by certificate verification failing when fluentd calls the Kubernetes API. That's why flipping it to insecure works: it simply skips the validation.

If you want to enable SSL cert validation, check out:
https://kubernetes.io/docs/reference/access-authn-authz/authentication/#authentication-strategies
https://kubernetes.io/docs/tasks/tls/managing-tls-in-a-cluster/

I believe you need to ensure the API server has the proper certificate configuration, then pass the proper key and cert to the ConfigMap.
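As a rough sketch only: assuming the plugin exposes kubeclient-style TLS options (ca_file, client_cert, and client_key are assumptions here -- check the fluent-plugin-kubernetes-objects README for the exact parameter names), a validated setup could look something like this:

# fluent.conf -- hypothetical sketch of validated TLS against the API server
<source>
  @type kubernetes_objects
  # Trust the cluster CA that signed the API server certificate.
  # In-cluster, the service account mount normally provides it at:
  ca_file /var/run/secrets/kubernetes.io/serviceaccount/ca.crt
  # Optional mutual TLS, if the API server requires client certificates:
  # client_cert /path/to/client.crt
  # client_key  /path/to/client.key
  # With a trusted CA configured, verification can stay on:
  insecure_ssl false
</source>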

- MattyMo



nickkwiecien
New Member

@mattymo can you do validation with a self-signed cert, or does it need to be trusted by a CA?
