
Query - SQS based S3 Input

chintu_jain
Explorer

In the AWS Add-on for Splunk, if I create a CloudTrail input using SQS-based S3, does Splunk clean up the messages in the queue after it processes the file?

Can I have two forwarders with identical inputs created for the same SQS queue, but sending data to different indexes?


ameyap16
Engager

Yes, the Splunk Heavy Forwarders delete the messages from the SQS queue once they have been read. Of course, you have to grant the required permissions to the Splunk user in AWS.
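
For reference, the input's behavior is roughly the receive-then-delete pattern sketched below with boto3. The queue URL, region, and message parsing here are hypothetical and for illustration only, not the add-on's actual code. The AWS user or role behind the input needs at least sqs:ReceiveMessage, sqs:DeleteMessage, and sqs:GetQueueAttributes on the queue, plus read access to the S3 bucket (check the add-on documentation for the full permission list).

# Rough illustration of the SQS-based S3 flow: receive a notification,
# fetch the referenced S3 object, then delete the message from the queue.
# Queue URL and message parsing are hypothetical, not the add-on's code.
import json
import boto3

sqs = boto3.client("sqs", region_name="us-east-1")
s3 = boto3.client("s3", region_name="us-east-1")
queue_url = "https://sqs.us-east-1.amazonaws.com/123456789012/cloudtrail-notifications"

resp = sqs.receive_message(QueueUrl=queue_url, MaxNumberOfMessages=10, WaitTimeSeconds=10)
for msg in resp.get("Messages", []):
    body = json.loads(msg["Body"])
    # S3 event notifications (possibly wrapped in an SNS envelope) carry the bucket and key.
    records = json.loads(body["Message"])["Records"] if "Message" in body else body["Records"]
    for rec in records:
        bucket = rec["s3"]["bucket"]["name"]
        key = rec["s3"]["object"]["key"]
        obj = s3.get_object(Bucket=bucket, Key=key)
        data = obj["Body"].read()  # the CloudTrail file that gets indexed
    # Only after the file has been processed is the message removed from the queue.
    sqs.delete_message(QueueUrl=queue_url, ReceiptHandle=msg["ReceiptHandle"])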

"Can i have two forwarders where identical inputs is created for the same SQS queue, but sending data to different indexes?"
If you have 2 HFs with identical inputs, the traffic will be distributed by them. This essentially makes the processing of the messages and the ingestion faster and is very useful for ingesting from large S3 buckets. However, I am not sure that will solve your purpose.
In order to send all the S3 data to 2 different indexes, you should have 2 SQS queues with the same messages. You can do this by having 2 SQS queues subscribed to the same SNS which get the notification from the S3 bucket.
