Installation

Splunk Add-on for AWS

phuha

Dear Team,

This is my setup for Splunk at home:

1 - Splunk Enterprise standalone

2 - multiple AWS accounts with CloudTrail enabled

3 - Splunk Add-on for AWS

I've completed the setup and have the Splunk Add-on for AWS running, with Cloudflare logs received from S3 into my index=cloudflare.

I've also completed the CloudTrail log setup for index=cloudtrail1. However, I now have about 8 different AWS accounts, each with CloudTrail enabled for a different purpose. How can I put them all into one index without having to create 8 different ones?


phuha

@alonsocaio 

The input that i'm using is Generic S3.

1 - For the first option, my worry is that when I consolidate all the logs into the main account, I will have to create a new S3 bucket and put the logs from the other accounts into it. Will this make it difficult for Splunk to differentiate which log came from which account?

2 - For the second option, if I do that, how do I configure the index setup so Splunk knows which log is which, or do I not need to care at all?

Thank you so much for your time, and support @alonsocaio .


alonsocaio

1 - Actually yes, you could consolidate all logs into one bucket in the main account. Then you would create only one input in Splunk. Since each CloudTrail record contains the AWS Account ID that generated that event/action, you don't need to worry about differentiating the logs.
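For the consolidation to work, the central bucket needs a policy that allows CloudTrail to write from every account. A rough sketch of the standard CloudTrail bucket policy is below (the bucket name and account IDs are placeholders, and you'd add one Resource ARN per source account):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AWSCloudTrailAclCheck",
      "Effect": "Allow",
      "Principal": {"Service": "cloudtrail.amazonaws.com"},
      "Action": "s3:GetBucketAcl",
      "Resource": "arn:aws:s3:::my-central-trail-bucket"
    },
    {
      "Sid": "AWSCloudTrailWrite",
      "Effect": "Allow",
      "Principal": {"Service": "cloudtrail.amazonaws.com"},
      "Action": "s3:PutObject",
      "Resource": [
        "arn:aws:s3:::my-central-trail-bucket/AWSLogs/111111111111/*",
        "arn:aws:s3:::my-central-trail-bucket/AWSLogs/222222222222/*"
      ],
      "Condition": {
        "StringEquals": {"s3:x-amz-acl": "bucket-owner-full-control"}
      }
    }
  ]
}
```

A nice side effect: CloudTrail prefixes each account's objects with AWSLogs/&lt;account-id&gt;/, so the logs stay separated by account even inside one bucket.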

2 - You could set up 2 or more Generic S3 inputs, from different accounts, to send logs to the same index. Since the records contain the AWS Account ID, you will be able to differentiate logs from account A or account B in your search query.

Just as an example, for both scenarios you would search something like:

index=aws sourcetype=aws:cloudtrail (aws_account_id=XXXX OR aws_account_id=YYYY) ...
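For reference, the account ID is present in every raw CloudTrail record, which is where the add-on's aws_account_id field comes from. A trimmed sample record (field values made up) looks like:

```json
{
  "eventVersion": "1.08",
  "eventTime": "2023-05-01T12:00:00Z",
  "eventSource": "s3.amazonaws.com",
  "eventName": "PutObject",
  "awsRegion": "us-east-1",
  "recipientAccountId": "111111111111",
  "userIdentity": {
    "type": "IAMUser",
    "accountId": "111111111111"
  }
}
```

So whichever option you choose, the per-account attribution is carried inside the events themselves.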

Also, I suggest you test and validate which option is best for your environment. Maybe the second option will be easier, since it needs only Splunk-side configuration.


alonsocaio

Hi @phuha 

Which input are you using for CloudTrail?

One alternative you could try is centralizing your AWS CloudTrail logs into your main account. Then, in Splunk, you would need to add just one input for getting logs from your S3/SQS.

Also, you can have multiple inputs configured in the Splunk Add-on for AWS sending logs to the same index and the same sourcetype. E.g.: Account 1 and Account 2 will have two inputs configured, but their logs will be sent to the index "cloudtrail" with sourcetype "aws:cloudtrail".
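Under the hood, the inputs you create in the add-on UI end up as stanzas in the add-on's inputs.conf, so the two-accounts-one-index setup would look roughly like this (stanza names, account names, and bucket names are placeholders for whatever you configured):

```
[aws_s3://cloudtrail_account1]
aws_account = account1
bucket_name = cloudtrail-logs-account1
index = cloudtrail
sourcetype = aws:cloudtrail

[aws_s3://cloudtrail_account2]
aws_account = account2
bucket_name = cloudtrail-logs-account2
index = cloudtrail
sourcetype = aws:cloudtrail
```

You normally wouldn't edit this by hand; configuring the inputs through the add-on's web UI produces equivalent stanzas.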
