
Splunk Add-on for AWS: CloudTrail SQS-Based S3 input limitation

dragonchen
New Member

Hi

I have configured an SQS-Based S3 input to collect CloudTrail logs, but the SQS Batch Size setting only accepts values from 1 to 10.

I have also set Interval (in seconds) to zero, but my queue keeps growing. Is there a way around this limitation, or another approach to collecting CloudTrail logs from S3?

I am using the architecture below (I have more than one hundred accounts).

[Screenshot 2020-07-16 10.57.33.png: architecture diagram]

 

Thanks


rmorlen
Splunk Employee

For SQS-based S3 and CloudTrail, simply clone your input and create multiple inputs. Just watch resources (CPU, load average) on the host where the inputs are defined.
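For illustration, two cloned stanzas in local/inputs.conf might look like the sketch below. The account name and queue URL are placeholders, and the attribute names follow the TA's SQS-based S3 stanza as I remember it, so copy whatever your existing working stanza uses. Pointing both inputs at the same queue is safe because SQS hands each consumer a different batch of messages.

[aws_sqs_based_s3://cloudtrail_sqs_1]
aws_account = my_aws_account
sqs_queue_region = us-east-1
sqs_queue_url = https://sqs.us-east-1.amazonaws.com/123456789012/cloudtrail-queue
s3_file_decoder = CloudTrail
sqs_batch_size = 10
sourcetype = aws:cloudtrail
interval = 0

[aws_sqs_based_s3://cloudtrail_sqs_2]
# Same queue, same settings; SQS distributes messages across consumers.
aws_account = my_aws_account
sqs_queue_region = us-east-1
sqs_queue_url = https://sqs.us-east-1.amazonaws.com/123456789012/cloudtrail-queue
s3_file_decoder = CloudTrail
sqs_batch_size = 10
sourcetype = aws:cloudtrail
interval = 0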

If you modify any Python code, the TA will no longer be supported, and your changes will break with upgrades.


dragonchen
New Member

Hi

I have found the answer: you can create more than one SQS-based S3 input to pull messages faster. One input retrieves 10 messages per second, so 10 inputs retrieve 100 messages per second.

I have created 20 inputs and it works fine.

 

 


dragonchen
New Member

I also tried editing the sqs_batch_size limit in {splunk_home}/etc/apps/Splunk_TA_aws/bin/aws_sqs_based_s3_inputs_rh.py and the sqs_batch_size setting in {splunk_home}/etc/apps/Splunk_TA_aws/local/inputs.conf, but it did not work.
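As far as I can tell, the cap is not something the TA invented: AWS's own SQS ReceiveMessage API accepts at most 10 messages per call, so anything above 10 is rejected by the service no matter what the UI or inputs.conf says. A minimal boto3 sketch (the queue URL is a placeholder) shows the ceiling:

import boto3

# Placeholder queue URL; substitute your own CloudTrail notification queue.
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/cloudtrail-queue"

sqs = boto3.client("sqs", region_name="us-east-1")

# MaxNumberOfMessages=10 is the largest value the SQS API accepts;
# anything higher fails validation, which is why patching sqs_batch_size
# past 10 in the add-on cannot work.
resp = sqs.receive_message(
    QueueUrl=QUEUE_URL,
    MaxNumberOfMessages=10,
    WaitTimeSeconds=10,  # long polling cuts down on empty responses
)
for msg in resp.get("Messages", []):
    print(msg["MessageId"])

So the only supported way to drain the queue faster is to run more inputs in parallel, as described above.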

 
