
AWS CloudTrail does not index if the S3 bucket has organization ID in the folder path

New Member


We have enabled CloudTrail at the AWS Organizations level; as a result, CloudTrail creates objects in the bucket with the following folder structure.
{bucket-name}/AWSLogs/{organization-id}/{account-id}/CloudTrail/{Region ID}/{YYYY}/{MM}/{DD}/{file_name}.json.gz.

When using the "Incremental S3" input, Splunk does not index the logs because of the "organization-id" segment in the path.

Is there a way I can tell Splunk to accept the "organization-id" and proceed with indexing?

Thanks in advance.


Path Finder

Unfortunately, we ran into this same issue. A previous administrator had set it up using Generic S3. Over time the performance got so bad that we tried Incremental S3 and hit this problem. Given the history and the need to 'just make it work', we went the route of modifying the app to include the org ID in the path.

The caveat is that you can't use the CloudTrail Incremental S3 input for anything else unless the bucket uses the same org ID structure. It also means we're running the app on an on-prem forwarder and not within Splunk Cloud...

In our case the plan is to move to streaming logs through HEC via Amazon Kinesis, so the workaround is temporary. Otherwise I would not have agreed to hacking up the add-on in the first place...
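For reference, the Firehose side of that plan takes a Splunk destination configuration. Below is a minimal sketch of the JSON you would pass to Firehose's create-delivery-stream call; every endpoint, token, and ARN is a placeholder, so substitute your own values:

```python
import json

# All names, tokens, and ARNs below are placeholders for illustration.
splunk_destination = {
    "HECEndpoint": "https://splunk-hec.example.com:8088",  # placeholder HEC URL
    "HECEndpointType": "Raw",  # raw mode; "Event" is the alternative
    "HECToken": "00000000-0000-0000-0000-000000000000",  # placeholder token
    "S3BackupMode": "FailedEventsOnly",
    "S3Configuration": {  # Firehose requires an S3 backup location
        "RoleARN": "arn:aws:iam::111122223333:role/firehose-role",
        "BucketARN": "arn:aws:s3:::firehose-backup-bucket",
    },
}

print(json.dumps(splunk_destination, indent=2))
```

This only shows the shape of the destination block; role permissions and the HEC token setup on the Splunk side are separate steps.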

File: Splunk_TA_aws/bin/splunk_ta_aws/modinputs/incremental_s3/

    def _enumerate_partitions(self, s3, bucket):
        partitions = list()
        #prefix = self._prefix + 'AWSLogs/'
        prefix = self._prefix + 'AWSLogs/o-1234567890/'
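If hardcoding the org ID feels too brittle, the prefix could instead be detected from the key layout. The helper below is a hypothetical sketch (not part of the add-on) that just shows the string handling: given the configured prefix and a sample of object keys, it returns the Organizations-style prefix when one is present:

```python
def org_prefix(base_prefix, keys):
    """Return base_prefix + 'AWSLogs/<org-id>/' if the bucket uses the
    AWS Organizations layout, otherwise base_prefix + 'AWSLogs/'."""
    plain = base_prefix + "AWSLogs/"
    for key in keys:
        if key.startswith(plain):
            first = key[len(plain):].split("/", 1)[0]
            if first.startswith("o-"):  # org IDs look like o-xxxxxxxxxx
                return plain + first + "/"
            return plain  # account-id comes first; no org segment
    return plain
```

For example, a key like `AWSLogs/o-1234567890/111122223333/CloudTrail/...` would yield the prefix `AWSLogs/o-1234567890/`, while a non-Organizations key falls back to plain `AWSLogs/`.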



Splunk Employee

Try using the SQS-Based S3 input to collect the data from the S3 bucket. Here are the steps to set it up:

1) Setup SQS queue - cloudtrail
2) Setup SQS queue (dead letter queue) - cloudtrail-dlq
3) Grant the S3 bucket permission to write to the SQS queue (cloudtrail), or provisionally open it to all services
4) Configure CloudTrail S3 bucket event to trigger the SQS queue (cloudtrail) on Object PUT
5) Setup SQS Based S3 input on your Heavy Forwarder
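Steps 3 and 4 boil down to two JSON documents: an SQS access policy that lets S3 send messages, and an S3 event notification configuration. A minimal sketch, with the account ID, bucket name, and ARNs as placeholders:

```python
import json

account_id = "111122223333"  # placeholder account ID
queue_arn = f"arn:aws:sqs:us-east-1:{account_id}:cloudtrail"
bucket_arn = "arn:aws:s3:::my-cloudtrail-bucket"  # placeholder bucket

# Step 3: queue policy allowing only this S3 bucket to send messages
queue_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"Service": "s3.amazonaws.com"},
        "Action": "sqs:SendMessage",
        "Resource": queue_arn,
        "Condition": {"ArnEquals": {"aws:SourceArn": bucket_arn}},
    }],
}

# Step 4: bucket notification firing the queue on object creation (PUT)
notification_config = {
    "QueueConfigurations": [{
        "QueueArn": queue_arn,
        "Events": ["s3:ObjectCreated:Put"],
    }]
}

print(json.dumps(queue_policy, indent=2))
print(json.dumps(notification_config, indent=2))
```

The `ArnEquals` condition scopes the queue policy to the one bucket instead of opening it to all services, which is preferable to the "open it provisionally" shortcut in step 3.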


New Member

@amiracl I believe the SQS method will ingest logs from the time the setup is configured, but how will I import the existing logs?


Ultra Champion

Are you using the latest version of the AWS TA?

If my comment helps, please give it a thumbs up!

New Member

Yes, I'm using Splunk 8.0.1
