All Apps and Add-ons

Splunk Add-on for AWS: CloudTrail SQS-Based S3 input limitation

dragonchen
New Member

Hi

I have configured an SQS-Based S3 input to collect CloudTrail logs, but the SQS Batch Size is limited to a value between 1 and 10.

I have also configured the Interval (in seconds) to zero, but my queue keeps growing. Can I get around this limitation, or is there another approach to collecting CloudTrail logs from S3?

I am using the architecture below (I have more than one hundred accounts).

[Screenshot: architecture diagram, 2020-07-16]

Thanks


rmorlen
Splunk Employee

For SQS-based S3 and CloudTrail, just clone your input and create multiple inputs. Just watch resources (CPU, load average) on the host where the inputs are defined.

If you modify any Python code, the TA will no longer be supported and it will break with upgrades.
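Cloning an input in the UI just writes another stanza to inputs.conf, so each clone polls the queue independently. A minimal sketch of what two cloned stanzas might look like (the stanza names, account name, and queue URL here are illustrative assumptions; clone through the UI and check your own local/inputs.conf for the exact fields the TA writes):

```ini
# $SPLUNK_HOME/etc/apps/Splunk_TA_aws/local/inputs.conf
# Illustrative sketch only -- names and values below are placeholders.

[aws_sqs_based_s3://cloudtrail_sqs_01]
aws_account = my_aws_account
sqs_queue_region = us-east-1
sqs_queue_url = https://sqs.us-east-1.amazonaws.com/123456789012/cloudtrail-queue
sqs_batch_size = 10
s3_file_decoder = CloudTrail
sourcetype = aws:cloudtrail
interval = 0

# Identical clone -- only the stanza name differs; both drain the same queue.
[aws_sqs_based_s3://cloudtrail_sqs_02]
aws_account = my_aws_account
sqs_queue_region = us-east-1
sqs_queue_url = https://sqs.us-east-1.amazonaws.com/123456789012/cloudtrail-queue
sqs_batch_size = 10
s3_file_decoder = CloudTrail
sourcetype = aws:cloudtrail
interval = 0
```

Because SQS delivers each message to only one consumer, the clones do not duplicate data; they simply share the backlog.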


dragonchen
New Member

Hi

I have found the answer: you can create more than one SQS-based S3 input to consume messages faster. One input gets about 10 messages per second, so 10 inputs get about 100 messages per second.

I have created 20 inputs and it works fine.
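The arithmetic behind this can be sketched as follows. The 10-message cap comes from the SQS ReceiveMessage API itself (MaxNumberOfMessages must be 1-10), so a single input can only drain so fast; cloned inputs poll independently and multiply the drain rate. The poll rate and arrival rate below are illustrative assumptions, not measured TA behavior:

```python
# Back-of-the-envelope model of SQS backlog growth with N cloned inputs.
# SQS_MAX_BATCH is the hard ReceiveMessage API cap; polls_per_sec is an
# assumed per-input polling rate, not a measured Splunk TA figure.

SQS_MAX_BATCH = 10  # hard API limit per ReceiveMessage call


def drain_rate(num_inputs: int, polls_per_sec: int = 1) -> int:
    """Approximate messages/second drained by num_inputs cloned pollers."""
    return num_inputs * SQS_MAX_BATCH * polls_per_sec


def queue_growth(arrival_rate: int, num_inputs: int,
                 polls_per_sec: int = 1) -> int:
    """Net messages/second the queue grows (negative means it shrinks)."""
    return arrival_rate - drain_rate(num_inputs, polls_per_sec)


# Hypothetical load: ~150 msg/s arriving from 100+ accounts.
print(queue_growth(150, 1))   # 140  -> one input falls behind
print(queue_growth(150, 20))  # -50  -> twenty inputs keep up with headroom
```

The exact break-even point depends on message size and host resources, which is why the advice above is to watch CPU and load average as you add clones.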


dragonchen
New Member

I have also tried editing the sqs_batch_size limit in {splunk_home}/etc/apps/Splunk_TA_aws/bin/aws_sqs_based_s3_inputs_rh.py and the sqs_batch_size value in {splunk_home}/etc/apps/Splunk_TA_aws/local/inputs.conf, but it did not work.
