
Query - SQS-based S3 Input

Explorer

In the AWS Add-on for Splunk, if I create a CloudTrail input using SQS-based S3, does Splunk clean up the messages in the queue after it processes the file?

Can I have two forwarders with identical inputs created for the same SQS queue, but sending data to different indexes?


Engager

Yes, the Splunk heavy forwarders delete the messages from the SQS queue once the messages have been read and processed. Of course, you have to grant the required permissions to the Splunk user in AWS.
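
Conceptually, the cycle looks like the boto3 sketch below. This is not the add-on's actual code; the queue URL, region, and message layout are illustrative assumptions. The comments note the kind of permissions (sqs:ReceiveMessage, sqs:DeleteMessage, sqs:GetQueueAttributes, s3:GetObject) the Splunk user needs.

```python
# Conceptual sketch only -- not the Splunk Add-on for AWS source code.
# Queue URL and region are placeholders.
import json
import boto3

QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/cloudtrail-notifications"

sqs = boto3.client("sqs", region_name="us-east-1")  # needs sqs:ReceiveMessage, sqs:DeleteMessage, sqs:GetQueueAttributes
s3 = boto3.client("s3")                             # needs s3:GetObject on the CloudTrail bucket

resp = sqs.receive_message(QueueUrl=QUEUE_URL, MaxNumberOfMessages=1, WaitTimeSeconds=10)
for msg in resp.get("Messages", []):
    body = json.loads(msg["Body"])
    # S3 event notifications (optionally wrapped in an SNS envelope) list the new objects.
    notification = json.loads(body["Message"]) if "Message" in body else body
    for rec in notification.get("Records", []):
        bucket = rec["s3"]["bucket"]["name"]
        key = rec["s3"]["object"]["key"]
        obj = s3.get_object(Bucket=bucket, Key=key)
        events = obj["Body"].read()  # the CloudTrail file; the add-on parses and indexes it
    # The message is deleted only after the file has been processed,
    # so unprocessed notifications remain on the queue.
    sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=msg["ReceiptHandle"])
```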

"Can i have two forwarders where identical inputs is created for the same SQS queue, but sending data to different indexes?"
If you have 2 HFs with identical inputs, the messages will be distributed between them. This essentially makes the processing of the messages and the ingestion faster, and it is very useful for ingesting from large S3 buckets. However, I am not sure that will solve your purpose.
To send all of the S3 data to 2 different indexes, you need 2 SQS queues that receive the same messages. You can do this by subscribing both SQS queues to the same SNS topic that receives the notifications from the S3 bucket, as sketched below.
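
Here is a minimal boto3 sketch of that fan-out, assuming placeholder names, ARNs, and region, and omitting the SQS access policy that must also allow the SNS topic to publish to each queue:

```python
# Hedged sketch: all names, ARNs, and the region are placeholders.
import boto3

REGION = "us-east-1"
TOPIC_ARN = "arn:aws:sns:us-east-1:123456789012:cloudtrail-events"  # topic notified by the S3 bucket

sns = boto3.client("sns", region_name=REGION)
sqs = boto3.client("sqs", region_name=REGION)

for queue_name in ("cloudtrail-hf1", "cloudtrail-hf2"):
    queue_url = sqs.create_queue(QueueName=queue_name)["QueueUrl"]
    queue_arn = sqs.get_queue_attributes(
        QueueUrl=queue_url, AttributeNames=["QueueArn"]
    )["Attributes"]["QueueArn"]
    # Each subscribed queue gets its own copy of every S3 notification,
    # so both forwarders see the same files and can index them independently.
    sns.subscribe(TopicArn=TOPIC_ARN, Protocol="sqs", Endpoint=queue_arn)
    # Note: the queue's access policy must also allow the SNS topic to send
    # messages (sqs:SendMessage conditioned on the topic ARN), omitted here.
```

Each heavy forwarder's SQS-based S3 input then points at a different queue URL and targets a different index.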
