
AWS CloudTrail logs are not indexed if the S3 bucket has an organization ID in the folder path

supratiksekhar
New Member

Hello

We have enabled CloudTrail at the AWS Organization level; as a result, CloudTrail writes to the bucket with the following folder structure:
{bucket-name}/AWSLogs/{organization-id}/{account-id}/CloudTrail/{region-id}/{YYYY}/{MM}/{DD}/{file_name}.json.gz

When using the "Incremental S3" input, Splunk does not index the logs because of the extra "organization-id" segment in the path.

Is there a way I can tell Splunk to accept the "organization-id" and proceed with indexing?

Thanks in advance.


JosephHobbs
Path Finder

We ran into this same issue, unfortunately. A previous administrator set this up using the Generic S3 input. Over time the performance got so bad that we tried the Incremental S3 input and hit this issue. Given that history and the need to 'just make it work', we went the route of modifying the app to include the org ID in the path.

The caveat is that you then can't use the CloudTrail Incremental S3 input for anything else unless it has the same org ID structure. It also means we're running the app on an on-prem forwarder and not within Splunk Cloud...

In our case the plan is to move to streaming logs through HEC via Amazon Kinesis, so the workaround is temporary. Otherwise I would not have agreed to hacking up the add-on in the first place...

In the file Splunk_TA_aws/bin/splunk_ta_aws/modinputs/incremental_s3/cloudtrail_logs.py:

    def _enumerate_partitions(self, s3, bucket):
        partitions = list()
        # Original line: the input only enumerates AWSLogs/{account-id}/...
        #prefix = self._prefix + 'AWSLogs/'
        # Modified line: hardcode the organization ID so org-level paths match
        prefix = self._prefix + 'AWSLogs/o-1234567890/'
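
If hardcoding the organization ID feels too brittle, the org folders could instead be discovered at runtime. A minimal standalone sketch of that idea (hypothetical, not part of the add-on; assumes boto3 with standard AWS credentials, and discover_org_prefixes is a made-up helper name):

    import boto3

    def discover_org_prefixes(bucket, base_prefix='AWSLogs/'):
        # A delimited listing returns one CommonPrefixes entry per 'folder'
        # directly under AWSLogs/, e.g. AWSLogs/o-1234567890/
        s3 = boto3.client('s3')
        resp = s3.list_objects_v2(Bucket=bucket, Prefix=base_prefix, Delimiter='/')
        return [p['Prefix'] for p in resp.get('CommonPrefixes', [])]

Each returned prefix could then be appended to self._prefix in place of the hardcoded string.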



amiracle
Splunk Employee

Try using the SQS-Based S3 approach to collect the data from the S3 bucket. Here are the steps to set it up (a boto3 sketch of steps 1-4 follows the list):

1) Set up an SQS queue - cloudtrail
2) Set up an SQS dead letter queue - cloudtrail-dlq
3) Grant S3 permission to write to the SQS queue (cloudtrail), or provisionally open it to all services
4) Configure an event on the CloudTrail S3 bucket to trigger the SQS queue (cloudtrail) on object PUT
5) Set up an SQS-Based S3 input on your Heavy Forwarder
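
For reference, steps 1-4 might look roughly like this in boto3 (a sketch under assumptions: the queue names cloudtrail and cloudtrail-dlq from the list above, and my-cloudtrail-bucket as a placeholder bucket name; step 5 is done in the add-on's input configuration, not code):

    import json
    import boto3

    sqs = boto3.client('sqs')
    s3 = boto3.client('s3')

    # 1) and 2) Create the dead letter queue first, then the main queue
    #    with a redrive policy pointing at it.
    dlq_url = sqs.create_queue(QueueName='cloudtrail-dlq')['QueueUrl']
    dlq_arn = sqs.get_queue_attributes(
        QueueUrl=dlq_url, AttributeNames=['QueueArn'])['Attributes']['QueueArn']

    queue_url = sqs.create_queue(
        QueueName='cloudtrail',
        Attributes={'RedrivePolicy': json.dumps(
            {'deadLetterTargetArn': dlq_arn, 'maxReceiveCount': '5'})},
    )['QueueUrl']
    queue_arn = sqs.get_queue_attributes(
        QueueUrl=queue_url, AttributeNames=['QueueArn'])['Attributes']['QueueArn']

    # 3) Allow S3 (scoped to the CloudTrail bucket) to send messages to the queue.
    policy = {
        'Version': '2012-10-17',
        'Statement': [{
            'Effect': 'Allow',
            'Principal': {'Service': 's3.amazonaws.com'},
            'Action': 'sqs:SendMessage',
            'Resource': queue_arn,
            'Condition': {'ArnLike': {
                'aws:SourceArn': 'arn:aws:s3:::my-cloudtrail-bucket'}},
        }],
    }
    sqs.set_queue_attributes(
        QueueUrl=queue_url, Attributes={'Policy': json.dumps(policy)})

    # 4) Fire the queue whenever CloudTrail PUTs a new object into the bucket.
    s3.put_bucket_notification_configuration(
        Bucket='my-cloudtrail-bucket',
        NotificationConfiguration={'QueueConfigurations': [{
            'QueueArn': queue_arn,
            'Events': ['s3:ObjectCreated:Put'],
        }]},
    )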


supratiksekhar
New Member

@amiracle I believe the SQS method will ingest logs from the time the setup is configured, but how will I import the existing logs?


nickhills
Ultra Champion

Are you using the latest version of the AWS TA?

If my comment helps, please give it a thumbs up!

supratiksekhar
New Member

Yes, I'm using Splunk 8.0.1
