All Apps and Add-ons

How would the Kafka Messaging Modular Input behave if a Splunk failure occurs?

markbatesplunk
New Member

Hi,

I have a question from our Splunk team. We intend to use the Kafka Messaging Modular Input to ingest Kafka events. If a Splunk failure occurs, how would the modular input behave? The events we are consuming are being used to reconcile an end-to-end process, so we need to know whether the modular input would continue to ingest events from Kafka and then fail when it tried to index those events, or whether it would be aware of the problem with Splunk and stop ingesting Kafka events.


ryanoconnor
Builder

Part of the answer to this question may depend on how your deployment is configured.

The modular input for this TA runs as part of Splunk. In this case, the modular input monitors the stdout of the kafka.py file. If the Splunk process has some sort of failure, then you will most likely stop ingesting data as well.

Indexing may still be going on if a separate Splunk system is handling your indexing; however, if the Splunk system running the Kafka modular input fails, then indexing won't be happening for your Kafka data. Other data may still be coming in, though, if indexing is happening on another system.

You can set up two different types of monitoring that might help detect these sorts of failures.

  1. Process monitoring outside of Splunk using something like Zabbix.
  2. You could also run some sort of scheduled job in Splunk to watch for lapses in data from Kafka.
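The second option can be implemented as a scheduled alert search. Here is a minimal sketch, assuming your Kafka events land in an index named `kafka` (adjust the index, sourcetype, and time window to match your deployment); schedule it to run every 15 minutes and alert when it returns a result:

```
index=kafka earliest=-15m
| stats count
| where count == 0
```

If no Kafka events have arrived in the last 15 minutes, the search returns a row with `count=0`, which you can use to trigger an alert action such as an email notification.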


markbatesplunk
New Member

Makes sense, Ryan. Thanks for your input.


ppablo
Retired

Hi @markbatesplunk

Just to clarify for other users, but are you talking about modular inputs in general, or are you referring specifically to the Kafka Messaging Modular Input?
https://splunkbase.splunk.com/app/1817/


markbatesplunk
New Member

Correct ;O)
