
Can we run the Kafka modular input on a forwarder?

anandhim
Path Finder

We don't use the UI in our deployment at all for configuration, but use the config files.
How would we configure a Kafka modular input on a forwarder so it can distribute the data to all indexers?

1 Solution

Damien_Dallimor
Ultra Champion

You certainly can. Any Modular Input can be set up on a UF (where there is no Web UI to configure it).

When you use the Web UI, the fields you configure simply get persisted to inputs.conf in the background.

So when deploying on a UF, you just have to edit your inputs.conf stanza yourself and then restart the UF.

Example stanza:

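A sketch based on the full stanza posted further down in this thread (the index, sourcetype, topic, and ZooKeeper values are placeholders for your environment; disabled is set to 0 here so the input actually runs):

[kafka://kafka_test]
disabled = 0
group_id = my_test_group
index = main
sourcetype = kafka
topic_name = test
zookeeper_connect_host = localhost
zookeeper_connect_port = 2181
output_type = stdout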

Also, since UFs do not ship with a Python runtime, you will need to ensure that there is a system Python 2.7 runtime installed.
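A quick way to check for that on the forwarder host (assuming a POSIX shell; this just reports whether a python2.7 binary is on the PATH and prints its version):

which python2.7 && python2.7 -V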


ajayvvs
New Member

Can we get a sample inputs.conf for Kafka pasted here? Thanks in advance.



jagadeeshm
Contributor

As per the code, [https://github.com/damiendallimore/SplunkModularInputsJavaFramework/blob/master/src/com/splunk/modin...], the app is supposed to be deployed on the same Splunk instance where HEC is enabled (because the HEC endpoint is hardcoded to localhost rather than taken as an input, unlike the port etc.). Is there a strong reason for this approach?

Also, I am trying to test the performance/throughput of this app, and it looks like I am not able to post more than 800 messages/sec to the HEC endpoint. Do you by any chance have any benchmarks/metrics on how much load this app can handle? I have topics where 4k messages are produced per second.


anandhim
Path Finder

Thanks Damien, I just wasn't sure whether I could specify the index and sourcetype in the same stanza. I'll verify this soon.


jagadeeshm
Contributor

@Damien - Can you please share that example stanza? Somehow the screenshot is not loading. Also, when you deploy the app on a Universal Forwarder, do we just untar the app, create the inputs.conf (would this go into /etc/system/local?), and restart the forwarder? Please advise. Thanks!


Damien_Dallimor
Ultra Champion

[kafka://kafka_test]
disabled = 1
group_id = my_test_group
index = main
sourcetype = kafka
topic_name = test
zookeeper_connect_host = localhost
zookeeper_connect_port = 2181
message_handler_params =
additional_jvm_propertys =
zookeeper_connect_rawstring =
output_type = stdout
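One detail worth noting: disabled = 1 in the stanza above leaves the input switched off. Set disabled = 0 and restart the forwarder when you want it to start consuming, e.g.:

$SPLUNK_HOME/bin/splunk restart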


Damien_Dallimor
Ultra Champion

inputs.conf should go in a local directory under /etc/apps (in whatever app makes sense in your environment).
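As a sketch of what that might look like on the forwarder (kafka_ta is just a placeholder app name, not something the add-on mandates):

$SPLUNK_HOME/etc/apps/kafka_ta/
    bin/             <- the modular input's scripts and jars, as shipped
    local/
        inputs.conf  <- the [kafka://...] stanza shown above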


strive
Influencer

If you are facing issues, I think a better idea would be to email ddallimore@splunk.com. He initiated this project.

strive
Influencer

I haven't used this. As per the documentation, the inputs.conf configurations are:

[kafka://name]

# name of the topic
topic_name =

# consumer connection properties
zookeeper_connect_host =
zookeeper_connect_port =
group_id =
zookeeper_session_timeout_ms =
zookeeper_sync_time_ms =
auto_commit_interval_ms =
additional_consumer_properties =

# message handler
message_handler_impl =
message_handler_params =

# additional startup settings
additional_jvm_propertys =
