How to Configure Splunk Heavy Forwarder to Consume Kafka Topics over SSL/TLS

yongyuthvis
New Member

Hello,
we are using Splunk Heavy Forwarder to consume data from Kafka topics (flow #1) and forward it to the Splunk Server (flow #2), i.e.

Kafka Cluster --- (1) ----> Splunk HF ----- (2) -----> Splunk Backend system

The Kafka cluster has been configured to support SSL/TLS encryption on port 9093, e.g. bootstrap-endpoint:9093.

Could you please provide some guidance on how to configure the Splunk Heavy Forwarder to consume the Kafka topics over SSL/TLS?

Thank you very much in advance for your guidance.

Best regards
Yongyuth

1 Solution

chskm
Path Finder

@yongyuthvis This is something we did about a year ago. Could you please let me know whether you are using TLS 1.2 or something else? Also, check with your Kafka team whether the cluster is currently able to accept a connection and forward data to all consuming products; if not, you may need to work with the Kafka team to get that in place (FYI, this only applies if you are using Kafka to feed multiple applications in your organization, such as Splunk, ELK, etc.).
Once you are good with these, download Splunk Connect for Kafka (https://splunkbase.splunk.com/app/3862/) and update the required configuration according to the Splunk docs: https://docs.splunk.com/Documentation/KafkaConnect/latest/User/About. Make sure to generate a Splunk HEC token so Splunk can accept the incoming data from the Kafka bus. After that, start the Kafka broker and the Kafka Connect server on the Kafka side and execute the command provided by Splunk in the doc above. That will start forwarding the data to the Splunk HF; processing happens at the HF level, which then sends it on to the Splunk indexers.
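For illustration, a minimal connector creation request against the Kafka Connect REST API might look like the sketch below; the connector name, topic, HEC host, and token are placeholders, not values from this thread:

# Sketch only: substitute your own connector name, topic, HEC URI, and HEC token
curl http://localhost:8083/connectors -X POST -H "Content-Type: application/json" -d '{
  "name": "splunk-sink-example",
  "config": {
    "connector.class": "com.splunk.kafka.connect.SplunkSinkConnector",
    "tasks.max": "3",
    "topics": "example-topic",
    "splunk.hec.uri": "https://your-splunk-hf:8088",
    "splunk.hec.token": "<your-HEC-token>",
    "splunk.hec.ssl.validate.certs": "false"
  }
}'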
Before executing the Splunk commands or starting the Kafka servers, make sure to use the certificates your organization requires, e.g. self-signed certs or Kerberos. Check with your Kafka team.
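For example, if the Kafka team hands you a self-signed CA certificate, it could be imported into a client truststore with keytool along these lines (paths, alias, and password are placeholders):

# Import the CA certificate provided by your Kafka team into a client truststore
keytool -importcert -alias kafka-ca \
    -file /path/to/kafka-ca.pem \
    -keystore /path/to/kafka.client.truststore.jks \
    -storepass changeit -noprompt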
You might need to set up that forwarding per Kafka topic every time a new topic is created. I used Ansible to automate identifying new topics and executing the command, but any other automation would work as well; see the sketch below. Please accept the answer if it helps your scenario. Thanks.
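As a rough shell sketch of that automation (client-ssl.properties is a hypothetical client config file, and create_connector stands in for whatever posts the connector config shown above):

# Sketch: create a Splunk sink connector for any topic that does not have one yet
for t in $(kafka-topics.sh --bootstrap-server bootstrap-endpoint:9093 \
        --command-config client-ssl.properties --list); do
  curl -sf http://localhost:8083/connectors/splunk-sink-$t > /dev/null \
    || create_connector "$t"   # create_connector is a placeholder, not a real command
done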

rita201
Loves-to-Learn

Please, I have an SSL certificate from the Kafka team that I need to install on Splunk. Which configuration file should this certificate be referenced from?

yongyuthvis
New Member

Hi chskm,
Thanks for your answer.
Please let me add some details for you.

Kafka cluster:
In Kafka, we have added the following configuration in server.properties:
...
listeners=PLAINTEXT://:9092,SSL://bootstrap-endpoint:9093
ssl.keystore.location=/opt/SP/apps/kafka/current/config/kafka01.keystore.jks
ssl.keystore.password=keystorepassword_kafka
ssl.key.password=keypassword_kafka
ssl.truststore.location=/opt/SP/apps/kafka/current/config/kafka.truststore.jks
ssl.truststore.password=truststorepassword_kafka
ssl.enabled.protocols=TLSv1.2,TLSv1.1
ssl.client.auth=none

Based on the above configuration, client applications (producers) can inject data into the topics using TLS encryption via port 9093, and client applications (consumers) can also retrieve data from the topics using TLS via port 9093.
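For reference, a client consuming from this SSL listener typically needs properties along these lines (the file name, paths, and password below are examples, not our actual values; no keystore is required since ssl.client.auth=none):

# client-ssl.properties (example name)
security.protocol=SSL
ssl.truststore.location=/path/to/kafka.client.truststore.jks
ssl.truststore.password=truststorepassword_client
ssl.enabled.protocols=TLSv1.2

# Quick check with the console consumer (topic name is a placeholder)
kafka-console-consumer.sh --bootstrap-server bootstrap-endpoint:9093 \
    --consumer.config client-ssl.properties --topic example-topic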

Splunk HF:
In our case, the Splunk HF works as a client which consumes the data from Kafka topics and forwards it to the Splunk server.
Until now, our Splunk HF has been using the PLAINTEXT port 9092, but we'd like to configure the Splunk HF to use SSL/TLS on port 9093 instead.
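From what we understand, since Splunk Connect for Kafka runs on Kafka Connect, switching to 9093 would presumably mean pointing the Connect worker at the SSL listener and adding the consumer-prefixed SSL overrides in connect-distributed.properties, something like the sketch below (paths and password are placeholders; please correct us if this is wrong):

# Worker's own connections (internal topics)
bootstrap.servers=bootstrap-endpoint:9093
security.protocol=SSL
ssl.truststore.location=/path/to/kafka.client.truststore.jks
ssl.truststore.password=truststorepassword_client

# Overrides applied to the sink connector's consumers
consumer.security.protocol=SSL
consumer.ssl.truststore.location=/path/to/kafka.client.truststore.jks
consumer.ssl.truststore.password=truststorepassword_client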

Hope this helps you understand more about our use case.

Please guide us on how to configure the Splunk HF to consume data from Kafka over SSL/TLS.

Thank you in advance for your support.
