The "answer" above is not valid since the S3 add-on does not seem to traverse into subdirectories correctly unless you add an entire bucket as the target.
i.e. given a bucket "log-bucket" that contains ELB logs you would only be able to monitor the entire bucket with a single input or a single directory/object. When setting up ELB and CloudTrail logging AWS manages the directory structure and organization of those logs in the S3 bucket you specify.
So in "log-bucket" you will have.
/AWSLogs
/AWSLogs/12345678890 (this is your account number)
/AWSLogs/12345678890/elasticloadbalancing
/AWSLogs/12345678890/elasticloadbalancing/us-east-1
Now you get to the actual log directories organized by date:
/AWSLogs/12345678890/elasticloadbalancing/us-east-1/{YEAR}/{Month}/{Day}/something.log
If you were to put CloudTrail logs in this same bucket, they would follow the same directory structure:
/AWSLogs/12345678890/CloudTrail/us-east-1/{YEAR}/{Month}/{Day}/something.log
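For reference, you can confirm this nesting by listing keys under one of the prefixes yourself. A minimal sketch using boto 2 (the bucket name and account number are the placeholders from above, not real values):

import boto

# Credentials are picked up from environment variables or ~/.boto.
conn = boto.connect_s3()
bucket = conn.get_bucket('log-bucket')

# bucket.list() transparently pages through the full listing for you.
prefix = 'AWSLogs/12345678890/elasticloadbalancing/us-east-1/'
for key in bucket.list(prefix=prefix):
    print(key.name)  # e.g. .../us-east-1/2014/01/15/something.log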
If you also have CloudFront and S3 access logs in this same bucket, monitoring the entire bucket becomes even more problematic.
Using an input of s3://log-bucket/AWSLogs/12345678890/CloudTrail/ gives the following error:
Encountered the following error while trying to update: In handler 's3': Invalid configuration specified: No objects found inside s3://log-bucket/AWSLogs/12345678890/CloudTrail/.
In addition to these problems, the S3 add-on's s3.py script does not appear to handle "paging" of bucket listings properly: if there are more than 1000 objects in a bucket, the script will only ever see the first 1000, because it does not use markers to page through the results.
See: http://answers.splunk.com/answers/66611/splunk-for-amazon-s3-add-on-not-able-to-fetch-all-logs
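For anyone patching s3.py, here is a minimal sketch of what marker-based paging looks like with boto 2 (again using the placeholder bucket and account number from above; a single listing call returns at most 1000 keys):

import boto

conn = boto.connect_s3()
bucket = conn.get_bucket('log-bucket')

prefix = 'AWSLogs/12345678890/elasticloadbalancing/us-east-1/'
marker = ''
keys = []
while True:
    # Each call returns at most 1000 keys; 'marker' tells S3 to resume
    # the listing after the named key.
    page = bucket.get_all_keys(prefix=prefix, marker=marker)
    keys.extend(page)
    if not page.is_truncated:
        break
    marker = page[-1].name  # resume after the last key of this page

print('found %d objects' % len(keys))

(Alternatively, bucket.list() handles the paging for you, as in the earlier sketch.)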