Hi
I have configured an SQS-Based S3 input to collect CloudTrail logs, but the SQS Batch Size is limited to a value between 1 and 10.
I have also set Interval (in seconds) to zero, yet my queue keeps growing. Can I get past this limit, or is there another approach to collecting CloudTrail logs from S3?
I am using the architecture below (I have more than one hundred accounts).
Thanks
For SQS-based S3 and CloudTrail, just clone your input and create multiple inputs. Just watch resources (CPU, load average) on the host where the inputs are defined.
If you modify any Python code, the TA will no longer be supported and will break with upgrades.
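As a rough sketch, the cloned inputs end up as extra stanzas in local/inputs.conf. The stanza type below is inferred from the aws_sqs_based_s3_inputs_rh.py filename mentioned later in this thread, and the queue/account key names are assumptions — copy the exact keys from a stanza that the UI generated for you:

```
# {splunk_home}/etc/apps/Splunk_TA_aws/local/inputs.conf (sketch; verify key names)
[aws_sqs_based_s3://cloudtrail_sqs_1]
# assumed key names below -- compare with a UI-created stanza
aws_account = my_aws_account
sqs_queue_url = https://sqs.us-east-1.amazonaws.com/111111111111/cloudtrail-queue
sqs_batch_size = 10
interval = 0

[aws_sqs_based_s3://cloudtrail_sqs_2]
aws_account = my_aws_account
sqs_queue_url = https://sqs.us-east-1.amazonaws.com/111111111111/cloudtrail-queue
sqs_batch_size = 10
interval = 0
```

Multiple inputs can safely poll the same queue: SQS visibility timeouts keep a message hidden from other consumers while one input is processing it, so the inputs share the backlog rather than duplicating it.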
Hi
I have found the answer: you can create more than one SQS-based S3 input to consume messages faster. One input reads about 10 messages/second, so 10 inputs read about 100 messages/second.
I have created 20 inputs and it works fine.
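Hand-cloning 20 inputs is tedious, so here is a small sketch that generates the stanzas for local/inputs.conf instead. The stanza type is inferred from the aws_sqs_based_s3_inputs_rh.py filename, and the aws_account / sqs_queue_url key names are assumptions — check them against an input created through the UI before pasting:

```python
def make_stanzas(n, queue_url, account="my_aws_account"):
    """Generate n SQS-based S3 input stanzas sharing one queue.

    Stanza type and key names are assumptions based on this thread,
    not taken from official Splunk_TA_aws documentation.
    """
    stanzas = []
    for i in range(1, n + 1):
        stanzas.append(
            f"[aws_sqs_based_s3://cloudtrail_sqs_{i}]\n"
            f"aws_account = {account}\n"
            f"sqs_queue_url = {queue_url}\n"
            "sqs_batch_size = 10\n"
            "interval = 0\n"
        )
    return "\n".join(stanzas)

# Append the output to local/inputs.conf, then restart Splunk.
print(make_stanzas(20, "https://sqs.us-east-1.amazonaws.com/111111111111/cloudtrail-queue"))
```

Since every stanza points at the same queue, adding inputs scales the read rate roughly linearly, which matches the 10-messages-per-second-per-input observation above.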
I also tried raising the sqs_batch_size limit in {splunk_home}/etc/apps/Splunk_TA_aws/bin/aws_sqs_based_s3_inputs_rh.py and setting sqs_batch_size in {splunk_home}/etc/apps/Splunk_TA_aws/local/inputs.conf, but that did not work.