
Any quick start guide for end-to-end testing from Kafka to Splunk?

daniel_splunk
Splunk Employee

If you have a cheat sheet I can use to set up Kafka to send events to Splunk, that would help.



daniel_splunk
Splunk Employee

I have done that Kafka setup before; the steps below are based on CentOS 7.

  1. yum upgrade

  2. yum install java-1.8.0-openjdk

  3. Download Kafka and the Splunk Connect for Kafka connector.
    Here is the Splunkbase link for the connector, but it will point you to GitHub:
    https://splunkbase.splunk.com/app/3862/
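
    Apache Kafka itself can be fetched from the Apache archive, for example (adjust the version and mirror as needed):

    wget https://archive.apache.org/dist/kafka/2.1.0/kafka_2.11-2.1.0.tgz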

  4. Once you have downloaded Kafka, untar it and rename the directory:

    tar -xzf kafka_2.11-2.1.0.tgz
    mv kafka_2.11-2.1.0 kafka

  5. Start the ZooKeeper and Kafka server processes:

    cd kafka
    bin/zookeeper-server-start.sh config/zookeeper.properties > zookeeper.log &
    bin/kafka-server-start.sh config/server.properties > kafka_server.log &
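
    To confirm both processes came up, check that ZooKeeper (port 2181) and the Kafka broker (port 9092) are listening, for example:

    ss -ltn | grep -E '2181|9092'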

  6. Create a Kafka topic:

    bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic test

    The commands below can be used to list or delete topics as well:

    bin/kafka-topics.sh --list --zookeeper localhost:2181
    bin/kafka-topics.sh --zookeeper localhost:2181 --delete --topic test
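
    You can also describe a topic to confirm its partition and replication settings:

    bin/kafka-topics.sh --describe --zookeeper localhost:2181 --topic test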

  7. Send some events to the topic "test" so that Kafka Connect can pick them up later:

    bin/kafka-console-producer.sh --broker-list localhost:9092 --topic test

    this is a test message
    ^C
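
    To double-check that the message landed in the topic, read it back with the console consumer (Ctrl+C to stop):

    bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic test --from-beginning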

  8. Check the Kafka Connect worker properties file (config/connect-distributed.properties) and make sure the following settings are set correctly:

    key.converter=org.apache.kafka.connect.storage.StringConverter
    value.converter=org.apache.kafka.connect.storage.StringConverter

    key.converter.schemas.enable=false
    value.converter.schemas.enable=false

    internal.key.converter.schemas.enable=false
    internal.value.converter.schemas.enable=false
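
    Also make sure the worker can find the splunk-kafka-connect jar. For example, if you placed the jar under /opt/connectors (a hypothetical path; use wherever you actually put it), the worker properties would also need:

    plugin.path=/opt/connectors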
    
  9. Start splunk-kafka-connect (the Kafka Connect worker in distributed mode):

    bin/connect-distributed.sh config/connect-distributed.properties > Kafka_connect.log &
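
    Once the worker is up, confirm the Splunk sink connector plugin is loaded via the Connect REST API:

    curl http://localhost:8083/connector-plugins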

  10. Create an HTTP Event Collector (HEC) token in Splunk
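
    In Splunk Web this is under Settings > Data Inputs > HTTP Event Collector > New Token; also create the index referenced below (kafka_event) if it does not exist. To sanity-check the token before wiring up the connector, you can post a test event directly to HEC using the same token as in step 11:

    curl -k https://localhost:8088/services/collector/event -H "Authorization: Splunk 26faccb6-a0af-45d6-996e-7df97afb81fd" -d '{"event": "hec smoke test", "index": "kafka_event"}'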

  11. Create splunk-kafka-connect task

    curl http://localhost:8083/connectors -X POST -H "Content-Type: application/json" -d '{
      "name": "test-single-event",
      "config": {
        "connector.class": "com.splunk.kafka.connect.SplunkSinkConnector",
        "tasks.max": "1",
        "topics": "test",
        "splunk.sources": "test_kafka_event",
        "splunk.indexes": "kafka_event",
        "splunk.hec.uri": "https://localhost:8088",
        "splunk.hec.token": "26faccb6-a0af-45d6-996e-7df97afb81fd",
        "splunk.hec.raw": "false",
        "splunk.hec.ack.enabled": "false",
        "splunk.hec.ack.poll.interval": "10",
        "splunk.hec.ack.poll.threads": "1",
        "splunk.hec.ssl.validate.certs": "false",
        "splunk.hec.http.keepalive": "true",
        "splunk.hec.max.http.connection.per.channel": "4",
        "splunk.hec.total.channels": "8",
        "splunk.hec.max.batch.size": "500",
        "splunk.hec.threads": "1",
        "splunk.hec.event.timeout": "300",
        "splunk.hec.socket.timeout": "120",
        "splunk.hec.track.data": "true"
      }
    }'
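
    After the POST returns, verify that the connector and its task are running via the Connect REST API, then search Splunk for the event you produced in step 7:

    curl http://localhost:8083/connectors/test-single-event/status

    In Splunk, search: index=kafka_event source=test_kafka_event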
