All Apps and Add-ons

Splunk Add-on for AWS: CloudTrail SQS-Based S3 input limitation

dragonchen
New Member

Hi

I have configured an SQS-Based S3 input to collect CloudTrail logs, but the SQS Batch Size setting is limited to a value between 1 and 10.

I have also set Interval (in seconds) to zero, but my queue keeps growing. Can I remove this limitation, or is there another approach to ingesting CloudTrail logs from S3?
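As a quick sanity check on the backlog, the AWS CLI can report the queue depth, which confirms whether the input is keeping up (the queue URL below is a placeholder, not from the original post):

```
# Report the approximate backlog of the CloudTrail notification queue.
# Substitute your own queue URL.
aws sqs get-queue-attributes \
  --queue-url https://sqs.us-east-1.amazonaws.com/123456789012/cloudtrail-queue \
  --attribute-names ApproximateNumberOfMessages ApproximateNumberOfMessagesNotVisible
```

If ApproximateNumberOfMessages rises over time, the consumers are not draining the queue as fast as CloudTrail is filling it.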

I am using the architecture below (I have more than one hundred accounts).

[Screenshot attachment: 截圖 2020-07-16 10.57.33.png]

 

Thanks

Labels (2)
0 Karma

dragonchen
New Member

Hi

I have found the answer: you can create more than one SQS-based S3 input to consume messages faster. One input gets about 10 messages per second, so 10 inputs get about 100 messages per second.
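The scaling above is simple multiplication. As a rough sketch, assuming each input polls about once per second and every poll returns a full batch (real throughput will be lower due to message size, network latency, and indexing capacity):

```python
# Back-of-envelope estimate of how adding parallel SQS-based S3 inputs
# scales drain rate. Assumes one poll per second per input, each poll
# returning a full batch -- an idealized upper bound, not a measurement.

def drain_rate(num_inputs, batch_size=10, polls_per_second=1.0):
    """Approximate messages consumed per second across all inputs."""
    return num_inputs * batch_size * polls_per_second

print(drain_rate(1))   # 1 input  -> about 10 msg/s
print(drain_rate(10))  # 10 inputs -> about 100 msg/s
print(drain_rate(20))  # 20 inputs -> about 200 msg/s
```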

I have created 20 inputs and it works fine.
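For anyone looking for the concrete shape, a minimal sketch of what parallel inputs might look like in local/inputs.conf — the stanza names, account name, and queue URL are placeholders, and the parameter names should be verified against the add-on's inputs.conf.spec:

```
[aws_sqs_based_s3://cloudtrail_sqs_1]
aws_account = my_aws_account
sqs_queue_region = us-east-1
sqs_queue_url = https://sqs.us-east-1.amazonaws.com/123456789012/cloudtrail-queue
sqs_batch_size = 10
s3_file_decoder = CloudTrail
sourcetype = aws:cloudtrail
interval = 0

[aws_sqs_based_s3://cloudtrail_sqs_2]
aws_account = my_aws_account
sqs_queue_region = us-east-1
sqs_queue_url = https://sqs.us-east-1.amazonaws.com/123456789012/cloudtrail-queue
sqs_batch_size = 10
s3_file_decoder = CloudTrail
sourcetype = aws:cloudtrail
interval = 0
```

All stanzas can safely point at the same queue: SQS's visibility timeout hides an in-flight message from the other consumers, so the parallel inputs do not process the same message twice.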

 

 

0 Karma

dragonchen
New Member

I also tried raising the sqs_batch_size limit in {splunk_home}/etc/apps/Splunk_TA_aws/bin/aws_sqs_based_s3_inputs_rh.py and setting sqs_batch_size in {splunk_home}/etc/apps/Splunk_TA_aws/local/inputs.conf, but it did not work. The cap appears to match the AWS SQS ReceiveMessage API itself, which limits MaxNumberOfMessages to 10 per request, so raising the batch size beyond 10 has no effect.

 

0 Karma