
Splunk Add-on for AWS

phuha
Loves-to-Learn Everything

Dear Team,

This is my setup for Splunk at home:

1 - Splunk Enterprise standalone

2 - multiple AWS accounts with CloudTrail enabled

3 - Splunk Add-on for AWS

I've completed the setup and have the Splunk Add-on for AWS running, with Cloudflare logs received from S3 into my index=cloudflare.

I've completed the CloudTrail log setup for index=cloudtrail1. However, I now have about 8 different AWS accounts, used for different purposes, all with CloudTrail enabled. How can I put them all into one index without having to create 8 different ones?


phuha

@alonsocaio 

The input that I'm using is Generic S3.

1 - For the first option, my worry is that when I consolidate all the logs into the main account, I will have to create a new S3 bucket and move the logs from the other accounts into it. Will this make it difficult for Splunk to differentiate which log came from which account?

2 - For the second option, if I do that, how do I configure the index setup so Splunk knows which log is which, or do I not need to care at all?

Thank you so much for your time and support, @alonsocaio.


alonsocaio
Contributor

1 - Yes, you could consolidate all logs into one bucket in the main account and then create only one input in Splunk. Since each CloudTrail record contains the ID of the AWS account that generated the event/action, you don't need to worry about differentiating the logs.
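For reference, here is a trimmed sketch of what a CloudTrail record looks like. The `recipientAccountId` field (surfaced by the add-on as `aws_account_id` in searches, as in the example below) identifies the originating account even when all logs land in one bucket. The values here are placeholders:

```json
{
  "eventVersion": "1.08",
  "eventSource": "ec2.amazonaws.com",
  "eventName": "RunInstances",
  "awsRegion": "us-east-1",
  "recipientAccountId": "111111111111",
  "userIdentity": {
    "type": "IAMUser",
    "accountId": "111111111111"
  }
}
```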

2 - You could set up 2 or more Generic S3 inputs, from different accounts, to send logs to the same index. Since the records contain the AWS account ID, you will be able to differentiate logs from account A and account B in your search query.

Just an example, for both scenarios, you would search something like:

index=AWS sourcetype=aws:cloudtrail (aws_account_id=XXXX OR aws_account_id=YYYY) ...

Also, I suggest you test and validate which option is best for your environment. The second option may be easier, since it needs only Splunk configuration.


alonsocaio

Hi @phuha 

Which input are you using for CloudTrail?

One alternative you could try is centralizing your AWS CloudTrail logs in your main account. Then, in Splunk, you would only need to add one input to get the logs from your S3/SQS.

Also, you can have multiple inputs configured in the Splunk Add-on for AWS sending logs to the same index and the same sourcetype. For example, Account 1 and Account 2 would have two inputs configured, but both would send their logs to the index "cloudtrail" with sourcetype "aws:cloudtrail".
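The two-inputs setup above would produce an inputs.conf on the heavy forwarder / data-collection node along these lines. This is a rough sketch, not a verbatim config: the stanza names and bucket names are made up, and you should verify the parameter names against the add-on's inputs.conf.spec (inputs are normally created through the add-on UI anyway):

```ini
# Hypothetical stanzas for two Generic S3 inputs feeding one index.
# "account1" / "account2" are AWS account configurations defined in the add-on.
[aws_s3://cloudtrail-account1]
aws_account = account1
bucket_name = my-cloudtrail-bucket-account1
sourcetype = aws:cloudtrail
index = cloudtrail

[aws_s3://cloudtrail-account2]
aws_account = account2
bucket_name = my-cloudtrail-bucket-account2
sourcetype = aws:cloudtrail
index = cloudtrail
```

Both inputs land in index=cloudtrail, and the per-record AWS account ID lets you tell the sources apart at search time.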
