All Apps and Add-ons

Splunk Add-on for AWS for cloudtrail SQS-Based S3 input limitation

dragonchen
New Member

Hi

I have configured an SQS-Based S3 input to collect CloudTrail logs, but the SQS Batch Size is limited to a value between 1 and 10.

I have also set Interval (in seconds) to zero, but my queue keeps growing. Can I get around this limitation, or is there another approach to collecting CloudTrail logs from S3?

I am using the architecture below (I have more than one hundred accounts).

(attached screenshot: 截圖 2020-07-16 10.57.33.png)

Thanks


rmorlen
Splunk Employee

For SQS-based S3 and CloudTrail, just clone your input and create multiple inputs. Just watch resources (CPU, load average) on the host where the inputs are defined.
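As a sketch, cloned stanzas in the add-on's local/inputs.conf might look like the following (the stanza names, account name, queue URL, and region are illustrative placeholders, and other required parameters are omitted):

```ini
# Two cloned SQS-based S3 inputs draining the same queue in parallel.
# All values below are placeholders for illustration.
[aws_sqs_based_s3://cloudtrail_sqs_1]
aws_account = my_aws_account
sqs_queue_url = https://sqs.us-east-1.amazonaws.com/123456789012/cloudtrail-queue
sqs_queue_region = us-east-1
sqs_batch_size = 10
sourcetype = aws:cloudtrail
interval = 0

[aws_sqs_based_s3://cloudtrail_sqs_2]
aws_account = my_aws_account
sqs_queue_url = https://sqs.us-east-1.amazonaws.com/123456789012/cloudtrail-queue
sqs_queue_region = us-east-1
sqs_batch_size = 10
sourcetype = aws:cloudtrail
interval = 0
```

Because every input polls the same queue and SQS hides in-flight messages from other consumers, the clones do not process duplicate messages.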

If you modify any Python code, the TA will no longer be supported and will break on upgrades.


dragonchen
New Member

Hi

I have found the answer: you can create more than one SQS-based S3 input to fetch messages faster. One input gets 10 messages per second, so 10 inputs get 100 messages per second.

I have created 20 inputs and it works fine.
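The scaling above is just the number of inputs multiplied by the per-poll batch cap; a tiny illustration of the upper bound (the function name is mine, not the add-on's):

```python
def max_messages_per_poll(num_inputs: int, batch_size: int = 10) -> int:
    """Upper bound on messages fetched per polling cycle across all inputs.

    Each SQS-based S3 input can receive at most `batch_size` (<= 10)
    messages per ReceiveMessage call.
    """
    return num_inputs * batch_size

# 20 cloned inputs, each pulling up to 10 messages per poll
print(max_messages_per_poll(20))  # prints 200
```

Actual throughput also depends on message size, download time for the referenced S3 objects, and host resources, so the real rate will be lower than this ceiling.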

 

 


dragonchen
New Member

I have also tried editing the sqs_batch_size limit in {splunk_home}/etc/apps/Splunk_TA_aws/bin/aws_sqs_based_s3_inputs_rh.py and the sqs_batch_size setting in {splunk_home}/etc/apps/Splunk_TA_aws/local/inputs.conf, but it did not work.
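That result is expected: the 1-10 range mirrors the SQS ReceiveMessage API itself, where MaxNumberOfMessages can be at most 10, so raising the add-on's validation cannot raise the real ceiling. A minimal sketch of that clamping (the function name is mine, not the TA's):

```python
def clamp_sqs_batch_size(requested: int) -> int:
    """Clamp a requested batch size to the range SQS actually accepts.

    The SQS ReceiveMessage API allows MaxNumberOfMessages from 1 to 10,
    regardless of what a client asks for.
    """
    return max(1, min(requested, 10))

print(clamp_sqs_batch_size(50))  # prints 10
```

This is why cloning inputs, rather than patching the batch size, is the supported way to drain the queue faster.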

 
