
Query - SQS-based S3 Input

chintu_jain
Explorer

In the Splunk Add-on for AWS, if I create a CloudTrail input using SQS-based S3, does Splunk clean up the messages in the queue after it processes the file?

Can I have two forwarders where identical inputs are created for the same SQS queue, but sending data to different indexes?


ameyap16
Engager

Yes, the Splunk heavy forwarders delete the messages from the SQS queue once they are read. Of course, you have to grant the required permissions to the Splunk user in AWS; the credentials used by the input need to be able to both receive and delete messages from the queue, and to read the objects from the S3 bucket.
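For illustration, here is a minimal Python (boto3) sketch of that read-then-delete lifecycle. The queue URL and region are placeholders, and this only approximates what the add-on does internally (it also assumes the plain S3 event notification format, where the message body carries a "Records" list):

    import json
    import boto3

    sqs = boto3.client("sqs", region_name="us-east-1")
    s3 = boto3.client("s3", region_name="us-east-1")

    # Placeholder URL -- substitute your own queue.
    QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/cloudtrail-notifications"

    # Poll for S3 event notifications that have been delivered to the queue.
    resp = sqs.receive_message(
        QueueUrl=QUEUE_URL, MaxNumberOfMessages=10, WaitTimeSeconds=10
    )

    for msg in resp.get("Messages", []):
        body = json.loads(msg["Body"])
        # Each notification names the bucket and key of a newly delivered file.
        for record in body.get("Records", []):
            bucket = record["s3"]["bucket"]["name"]
            key = record["s3"]["object"]["key"]
            obj = s3.get_object(Bucket=bucket, Key=key)
            # ... decompress and forward the file contents for indexing ...
        # The message is removed from the queue only after the file is handled.
        sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=msg["ReceiptHandle"])

Note that the receive/delete calls are why the IAM permissions matter: they need sqs:ReceiveMessage and sqs:DeleteMessage, and reading the file needs s3:GetObject.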

"Can i have two forwarders where identical inputs is created for the same SQS queue, but sending data to different indexes?"
If you have 2 HFs with identical inputs pointed at the same queue, the traffic will be distributed between them. This makes processing the messages and ingesting the data faster and is very useful when ingesting from large S3 buckets. However, that will not solve your purpose: each message is handed to only one of the forwarders and is deleted once it is processed, so the data is split between the two indexes rather than duplicated into both, as the sketch below illustrates.
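A minimal Python (boto3) sketch of that competing-consumer behavior, again with a placeholder queue URL: SQS delivers each message to only one consumer while its visibility timeout is in effect, and deleting it removes it for everyone.

    import boto3

    sqs = boto3.client("sqs", region_name="us-east-1")
    # Placeholder URL -- substitute your own queue.
    QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/cloudtrail-notifications"

    # Two "heavy forwarders" competing on the same queue: the work is split,
    # not duplicated, because a received message is hidden from the other
    # consumer and is gone for good once it is deleted.
    for consumer in ("HF-1", "HF-2"):
        resp = sqs.receive_message(QueueUrl=QUEUE_URL, MaxNumberOfMessages=5)
        for msg in resp.get("Messages", []):
            print(consumer, "got", msg["MessageId"])
            sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=msg["ReceiptHandle"])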
To send all the S3 data to two different indexes, you need two SQS queues that receive the same messages. You can do this by subscribing both queues to the same SNS topic that gets the event notifications from the S3 bucket, and then pointing each forwarder's input at its own queue with its own index.
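A rough Python (boto3) sketch of that fan-out wiring; the topic and queue names are made up for the example:

    import boto3

    sns = boto3.client("sns", region_name="us-east-1")
    sqs = boto3.client("sqs", region_name="us-east-1")

    # One SNS topic receives the S3 event notifications from the bucket.
    topic_arn = sns.create_topic(Name="cloudtrail-s3-events")["TopicArn"]

    # Two queues, one per heavy forwarder / index.
    for name in ("cloudtrail-to-index-a", "cloudtrail-to-index-b"):
        queue_url = sqs.create_queue(QueueName=name)["QueueUrl"]
        queue_arn = sqs.get_queue_attributes(
            QueueUrl=queue_url, AttributeNames=["QueueArn"]
        )["Attributes"]["QueueArn"]
        # Subscribe each queue to the topic so both receive a copy of every
        # notification. (A queue access policy that allows the topic to call
        # sqs:SendMessage is also required; omitted here for brevity.)
        sns.subscribe(TopicArn=topic_arn, Protocol="sqs", Endpoint=queue_arn)

    # Finally, configure the S3 bucket to publish ObjectCreated events to the
    # SNS topic, and point each forwarder's SQS-based S3 input at its own
    # queue with a different index setting.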
