Use Kafka Streams to perform Splunk Transaction

zineddine
New Member

Hello,

My client uses an email solution that produces a log event for each step of email processing, so we end up with a variable number of events for each email sent or received.

In order to work with the received data and build queries around it, we use Splunk's transaction command to aggregate the logs by email ID.

The problem is that this is an extremely heavy computational task, and searches over long time ranges cannot complete.

I'm thinking of using Kafka Streams to aggregate the logs by email ID before sending them to Splunk. I started by reading and experimenting with the Java examples offered by Apache, but I'm running into some difficulties:

  1. First, is this feasible? Has anyone here successfully achieved this?
  2. How can I assign the email ID extracted from a log as the key of a KTable? (I've put a rough sketch of what I mean after this list.)
  3. How can I manage windowing, i.e. how long to wait for further logs with the same email ID? (Also covered in the sketch below.)
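
For questions 2 and 3, here is the kind of topology I have in mind, as a rough sketch rather than working code. I'm assuming a recent Kafka Streams (3.x); the topic names email-logs and email-transactions, the extractEmailId helper, and the five-minute inactivity gap are placeholders I made up for illustration:

```java
import java.time.Duration;
import java.util.Properties;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.KeyValue;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.Grouped;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.KTable;
import org.apache.kafka.streams.kstream.Produced;
import org.apache.kafka.streams.kstream.SessionWindows;
import org.apache.kafka.streams.kstream.Suppressed;
import org.apache.kafka.streams.kstream.Windowed;

public class EmailLogAggregator {

    // Placeholder parser: pull the email ID out of a raw log line.
    // Our real logs differ; the parsing would be adjusted to the actual format.
    private static String extractEmailId(String logLine) {
        int idx = logLine.indexOf("emailId=");
        if (idx < 0) {
            return "unknown";
        }
        return logLine.substring(idx + "emailId=".length()).split("\\s")[0];
    }

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "email-log-aggregator");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

        StreamsBuilder builder = new StreamsBuilder();

        // Raw log lines arrive on "email-logs" (placeholder name) with no useful key.
        KStream<String, String> logs =
                builder.stream("email-logs", Consumed.with(Serdes.String(), Serdes.String()));

        // Question 2: re-key every record by the extracted email ID. selectKey()
        // marks the stream for repartitioning, so all records sharing an ID end
        // up in the same partition and can be aggregated together.
        KStream<String, String> keyedById =
                logs.selectKey((oldKey, logLine) -> extractEmailId(logLine));

        // Question 3: a session window stays open per key until a gap of
        // inactivity passes, which matches "wait for more logs with the same
        // email ID". The 5-minute gap is a guess; it would be tuned to traffic.
        KTable<Windowed<String>, String> transactions = keyedById
                .groupByKey(Grouped.with(Serdes.String(), Serdes.String()))
                .windowedBy(SessionWindows.ofInactivityGapWithNoGrace(Duration.ofMinutes(5)))
                .reduce((merged, next) -> merged + "\n" + next)
                // Only emit a session once it has closed, i.e. the full "transaction".
                .suppress(Suppressed.untilWindowCloses(Suppressed.BufferConfig.unbounded()));

        // Write one merged event per email ID to an output topic that a Splunk
        // connector can consume; flatten the windowed key back to the plain ID.
        transactions.toStream()
                .map((windowedKey, mergedLogs) -> KeyValue.pair(windowedKey.key(), mergedLogs))
                .to("email-transactions", Produced.with(Serdes.String(), Serdes.String()));

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```

If I understand the API correctly, selectKey() answers the re-keying question (the aggregation result is a KTable keyed by the windowed email ID), and SessionWindows handles the waiting: a session stays open as long as logs for that ID keep arriving within the gap, so emails with more processing steps still collapse into one merged event. Does this look like the right approach?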

I find the Apache examples a bit short, and I'm having a hard time figuring out where to start learning to build my own Java Kafka Streams app. If anyone can help me with this, it would be highly appreciated.

Thank you very much for your time.

Best regards,
Zineddine
