All Apps and Add-ons

Splunk Add-on for AWS for cloudtrail SQS-Based S3 input limitation

dragonchen
New Member

Hi

I have configured an SQS-Based S3 input to collect CloudTrail logs, but the SQS Batch Size is limited to a range of 1 to 10.

I have also set Interval (in seconds) to zero, but my queue is still growing. Can I get around this limit, or is there another approach to collect CloudTrail logs from S3?

I am using the architecture below (I have more than one hundred accounts).

[Screenshot attached: 截圖 2020-07-16 10.57.33.png (architecture diagram)]

Thanks


rmorlen
Splunk Employee

For SQS-based S3 and CloudTrail, just clone your input to create multiple inputs. Just watch resources (CPU, load average) on the host where the inputs are defined.

If you modify any Python code, the TA will no longer be supported and will break with upgrades.
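For reference, cloned inputs live as extra stanzas in {splunk_home}/etc/apps/Splunk_TA_aws/local/inputs.conf. The sketch below shows the general shape; the stanza type and key names are my reading of the TA and may differ by version, and the account name and queue URL are placeholders, not values from this thread.

```ini
# Two cloned SQS-based S3 inputs reading the same queue.
# Stanza/key names are assumptions; check your TA version's inputs.conf.spec.
[aws_sqs_based_s3://cloudtrail_sqs_01]
aws_account = my_aws_account
sqs_queue_url = https://sqs.us-east-1.amazonaws.com/111111111111/cloudtrail-queue
# 10 is the maximum SQS returns per ReceiveMessage call
sqs_batch_size = 10
interval = 0

[aws_sqs_based_s3://cloudtrail_sqs_02]
aws_account = my_aws_account
sqs_queue_url = https://sqs.us-east-1.amazonaws.com/111111111111/cloudtrail-queue
sqs_batch_size = 10
interval = 0
```

Because all clones consume from the same queue and SQS hides in-flight messages from other consumers, the inputs don't duplicate each other's work.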


dragonchen
New Member

Hi

I have found the answer: you can create more than one SQS-based S3 input to fetch messages faster. One input gets 10 messages per second, so 10 inputs get 100 messages per second.

I have created 20 inputs and it works fine.
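The arithmetic above can be sketched with a self-contained simulation (this is not the TA's actual code): each poll returns at most 10 messages, so cloning the input multiplies throughput roughly linearly.

```python
import concurrent.futures
import queue

MAX_BATCH = 10  # hard SQS ReceiveMessage cap per call


def drain(q, polls):
    """One simulated input: each poll fetches at most MAX_BATCH messages."""
    got = 0
    for _ in range(polls):
        for _ in range(MAX_BATCH):
            try:
                q.get_nowait()
                got += 1
            except queue.Empty:
                return got  # queue exhausted, stop early
    return got


# A backlog of 1000 queued messages.
backlog = queue.Queue()
for i in range(1000):
    backlog.put(i)

# 20 cloned inputs polling in parallel, 5 polls each:
# up to 20 * 5 * 10 = 1000 messages drained per cycle.
with concurrent.futures.ThreadPoolExecutor(max_workers=20) as pool:
    totals = list(pool.map(lambda _: drain(backlog, 5), range(20)))

print(sum(totals))  # 1000 -- the whole backlog is drained
```

With a single input and the same 5 polls, only 50 messages would have moved, which is why a growing queue is a sign to add clones rather than chase a bigger batch size.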



dragonchen
New Member

I have also tried editing the sqs_batch_size limit in {splunk_home}/etc/apps/Splunk_TA_aws/bin/aws_sqs_based_s3_inputs_rh.py and the sqs_batch_size setting in {splunk_home}/etc/apps/Splunk_TA_aws/local/inputs.conf, but it did not work.
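That outcome is expected: the 1-10 range is enforced by AWS itself, not only by the TA's validation. The SQS ReceiveMessage API rejects MaxNumberOfMessages values above 10, so editing the TA's Python or inputs.conf cannot raise it. A minimal illustration of that server-side rule (no AWS call, just the same range check):

```python
def validate_batch_size(n: int) -> int:
    """Mirror the SQS ReceiveMessage rule: MaxNumberOfMessages must be 1-10."""
    if not 1 <= n <= 10:
        raise ValueError("MaxNumberOfMessages must be between 1 and 10")
    return n


print(validate_batch_size(10))   # 10 is the most SQS returns per call

try:
    validate_batch_size(20)      # what raising sqs_batch_size to 20 asks for
except ValueError as e:
    print("rejected:", e)        # AWS would reject the request the same way
```

So the supported lever is more inputs, not a bigger batch.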
