
Splunk Add-on for Kafka: Why doesn't the add-on read data from a Kafka topic?

Explorer

I configured the connection between Splunk (ver 6.5.2) and Kafka (ver 0.11) via the Splunk Add-on for Kafka (ver 1.1.0) following these steps:
- Added a Kafka Cluster in the add-on, setting the Kafka Cluster Name and Kafka Broker, setting the partition offset to earliest, and leaving the remaining fields blank
- In the Data Inputs menu I selected Add New in the "Splunk Add-on for Kafka" row and:
1. added a Kafka Data Input Name
2. selected the previously created Kafka Cluster,
3. saw the correct list of topics in the cluster and selected a non-empty one
4. set the Kafka Partition Offset to earliest
5. selected a brand-new index to write data to, with proper read and write permissions for the app

When I search the index, it is empty, even though running a console consumer command in the same environment shows all the logs in the selected topic.
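
For reference, the check I did on the Kafka side is equivalent to reading the topic from the earliest offset with a small script. A minimal sketch using the kafka-python client (not part of the add-on; the broker address and topic name below are placeholders for my environment):

# Read the topic from the earliest offset and print the messages,
# mirroring what the console consumer shows.
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "my_topic",                             # placeholder topic name
    bootstrap_servers="kafka-broker:9092",  # placeholder broker address
    auto_offset_reset="earliest",           # same behaviour as the add-on's "earliest" setting
    consumer_timeout_ms=5000,               # stop iterating when no more messages arrive
)
for message in consumer:
    print(message.offset, message.value)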

What could the problem be?

1 Solution

Explorer

I found the problem: checking the log files, I saw an error in the encoding process, so I followed the steps for solving it described in https://answers.splunk.com/answers/421857/splunk-add-on-for-check-point-opsec-lea-non-audit.html.

Basically, I edited line 71 of the file $SPLUNK_HOME/etc/apps/Splunk_TA_kafka/bin/splunktalib/common/util.py
from:

data = data.encode("utf-8", errors="xmlcharrefreplace")

to:

data = data.decode("latin-1").encode("utf-8", errors="xmlcharrefreplace")
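
To illustrate why this helps, here is a small sketch of the underlying issue (not from the add-on, and written in Python 3 syntax for clarity; the add-on itself runs under Splunk's bundled Python 2, where the failure shows up in the encode step): strict UTF-8 handling fails on bytes that are not valid UTF-8, whereas latin-1 maps every byte to a code point, so decoding with latin-1 and re-encoding as UTF-8 always produces valid output.

raw = b"caf\xe9"  # a message containing a non-UTF-8 (latin-1) byte

try:
    raw.decode("utf-8")  # strict UTF-8 decoding fails on the 0xe9 byte
except UnicodeDecodeError as err:
    print("plain UTF-8 handling fails:", err)

# latin-1 decodes every possible byte value, so the data can then be
# re-encoded as valid UTF-8 before being handed to Splunk for indexing.
fixed = raw.decode("latin-1").encode("utf-8", errors="xmlcharrefreplace")
print(fixed)  # b'caf\xc3\xa9'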

I also noticed that even though I selected the "earliest offset" option to collect historical data, I had to push new data into Kafka before the connector started indexing all the existing logs in the Kafka topic.
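
If anyone needs to give the input the same nudge, any producer will do. A minimal sketch with the kafka-python client (an assumption on my side, not part of the add-on; the broker address and topic name are placeholders):

# Push one throwaway message so the input starts reading the topic
# from the configured offset.
from kafka import KafkaProducer

producer = KafkaProducer(bootstrap_servers="kafka-broker:9092")            # placeholder broker
producer.send("my_topic", b"kick-start message for the Splunk add-on")     # placeholder topic
producer.flush()   # make sure the message is actually written before exiting
producer.close()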


SplunkTrust

If your problem is resolved, please accept one of the answers (it's OK to accept your own answer) so future readers know what the solution is.

---
If this reply helps you, an upvote would be appreciated.

Splunk Employee

I just tried Kafka version 0.11 (kafka_2.11-0.11.0.0) with the Kafka add-on, and it worked for me.
Can you share these values?
The value of the listeners from this properties file: kafka_2.11-0.11.0.0/config/server.properties
listeners = PLAINTEXT://your.host.name:9092

The list of your topics from: bin/kafka-topics.sh --list --zookeeper localhost:2181

The values of these Splunk configurations: /opt/splunk/etc/apps/Splunk_TA_kafka/local/kafka_credentials.conf and kafka_forwarder_credentials.conf
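
If it helps, a quick way to confirm from the Splunk host that the broker advertised in listeners is reachable and serving the expected topics is a short kafka-python check (a sketch on my side, not part of the add-on; the broker address is a placeholder you should replace with your listeners value):

# Connect to the broker from the machine the add-on runs on and list the
# topics it exposes; this should match the output of kafka-topics.sh above.
from kafka import KafkaConsumer

consumer = KafkaConsumer(bootstrap_servers="your.host.name:9092")  # use the value from listeners
print(consumer.topics())
consumer.close()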


Explorer

Thank you very much. I found the problem was related to encoding; see the answer above.


Splunk Employee

Based on this link, the add-on does not support Kafka version 0.11: http://docs.splunk.com/Documentation/AddOns/latest/Kafka/About
