All Apps and Add-ons

Splunk Add-on for Amazon Web Services: How to set up a CloudTrail/S3 input?

cloudlockmatt
New Member

I'm unable to set up an install of Splunk Enterprise (6.6.0) to read CloudTrail logs from an S3 bucket using the Splunk Add-on for Amazon Web Services. The CloudTrail logs are encrypted with KMS. The instance running Splunk has an instance role (named "splunk"), to which I have added the single policy defined here: https://docs.splunk.com/Documentation/AddOns/released/AWS/ConfigureAWSpermissions#Configure_one_poli...

I also granted the role the ability to decrypt using the KMS key:

    {
      "Effect": "Allow",
      "Action": [
        "kms:Decrypt",
        "kms:ReEncryptFrom"
      ],
      "Resource": "<my_kms_key_id>"
    }

When adding the AWS S3 input:

  • I chose "splunk" as the AWS account - I assume this is the instance role, but it's odd that it shows up under Account.
  • Left the assume role blank
  • Default S3 host name of s3.amazonaws.com
  • Specified my S3 bucket
  • On the Settings tab, chose CloudTrail as the log type and left the rest at defaults
  • On the Templates tab, I provided a prefix of "auditlog", as that is the prefix CloudTrail is configured to write with.

When I click "Create", it spins for a few seconds then reports an error of:

{"messages":[{"type":"ERROR","text":"Unexpected error \"<class 'boto.exception.S3ResponseError'>\" from python handler: \"S3ResponseError: 403 Forbidden\n\". See splunkd.log for more details."}]}

I do have a VPC endpoint for S3 set up for the VPC, and it has the default policy, which allows everything ("*").

Anyone know what I'm doing wrong? Any help is appreciated, thanks!

0 Karma

santosh407
Engager

@cloudlockmatt I also faced the same issue, and I was able to resolve it by granting the permissions below to the AWS account or role configured in the Splunk Add-on for AWS.

  1. kms:ListAliases
  2. kms:Decrypt
  3. s3:GetBucketLocation
  4. s3:GetObject - in my case, I had missed this permission
  5. s3:ListAllMyBuckets
  6. s3:PutBucketAcl
  7. s3:PutBucketPolicy
  8. s3:PutObject
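
Combined into a single IAM policy document, the list above might look like the following sketch. Two caveats: the standard S3 action name is s3:GetObject (there is no "GetBucketObject" action), and the wildcard Resource is for illustration only; scoping it to the actual bucket and key ARNs is advisable:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "kms:ListAliases",
        "kms:Decrypt",
        "s3:GetBucketLocation",
        "s3:GetObject",
        "s3:ListAllMyBuckets",
        "s3:PutBucketAcl",
        "s3:PutBucketPolicy",
        "s3:PutObject"
      ],
      "Resource": "*"
    }
  ]
}
```
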
0 Karma

mreynov_splunk
Splunk Employee

Incremental S3 does not yet support KMS out of the box.

0 Karma

cloudlockmatt
New Member

Good to know, thanks for the help. Unfortunately, this also fails with ELB access logs (for incremental only, in the same bucket), which are not encrypted with KMS. So while that could be related, I'm not sure how: could the presence of any encrypted files in the same bucket cause a failure, even though I am pointing the input at a prefix that does not contain any KMS-encrypted files? If so, why? And how do I work around it?

For future reference, when you say it doesn't work OOTB, does that mean that there is a way to get it to work with KMS with some tweaking?

0 Karma

JamieTaschetti
New Member

I'm having the same issue 😞

0 Karma

cloudlockmatt
New Member

If incremental is unchecked, I am able to get past the create-input dialog, and logs do get pulled from S3. Does anyone know what it is about the incremental setting that would cause this permissions failure?

0 Karma

cloudlockmatt
New Member

This also happens when I try reading ELB access logs from the same bucket, so it's not specific to CloudTrail, and the KMS setup is not to blame. The bucket policy does not have any explicit DENYs either, and to be safe, I grant access to the splunk role in both the bucket policy and the IAM policy attached to the role.

0 Karma

amiracle
Splunk Employee

Were you able to pull data from another non-secured S3 bucket with the same ELB logs? Also, have you tried sending your data into Splunk using HEC instead of using the S3 modular input?

0 Karma

cloudlockmatt
New Member

Thanks for the help. No, we don't have any non-secured buckets. We have also standardized on keeping our raw logs in S3 for long-term storage, sent there via the CloudWatch Logs agent on each instance; changing that would be too intrusive at this stage. I'm sure there are many other alternatives we could try to step around this S3 permissions problem, but that doesn't help us move forward at this time.

I am able to perform the required S3 actions using the AWS CLI on the same instance, using the same instance profile role that Splunk is supposed to be using. So my permissions are more than likely correct, and the fault lies in whatever I am doing wrong, or not doing, when setting up Splunk. This is a pretty basic stock install of the Enterprise version with the AWS add-on/app installed and nothing else, so I have no idea why it won't work.

I also tried adding "S3_USE_SIGV4 = True" to the bottom of "/opt/splunk/etc/splunk-launch.conf" and restarting, but that had no effect: the add-on still fails to connect to the bucket to read the logs, failing on a head_bucket call. It is, however, able to get a list of all buckets for me to choose from when setting up the input; it just fails when I click "Create".

0 Karma

cloudlockmatt
New Member

Note that I am able to use the aws cli to access the bucket from the instance (thereby using the same instance role the input is being configured to use):

    aws s3 ls s3://my-logs-bucket/auditlog/
    aws s3 cp s3://my-logs-bucket/auditlog/AWSLogs/1234567890/CloudTrail/us-east-1/2017/05/21/1234567890_CloudTrail_us-east-1_20170521T2355Z_okixa7EyfCNkfXnt.json.gz .
    aws s3api head-bucket --bucket my-logs-bucket

0 Karma

cloudlockmatt
New Member

The splunkd.log shows that it is failing to perform a head_bucket, yet I am able to do so using the AWS CLI.

    05-23-2017 02:08:32.030 +0000 ERROR AdminManagerExternal - Stack trace from python handler:
    Traceback (most recent call last):
      File "/opt/splunk/lib/python2.7/site-packages/splunk/admin.py", line 130, in init
        hand.execute(info)
      File "/opt/splunk/lib/python2.7/site-packages/splunk/admin.py", line 593, in execute
        if self.requestedAction == ACTION_CREATE:   self.handleCreate(confInfo)
      File "/opt/splunk/etc/apps/Splunk_TA_aws/bin/splunk_ta_aws_rh_inputs_logs.py", line 42, in handleCreate
        datainput.DataInputHandler.handleCreate(self, confInfo)
      File "/opt/splunk/etc/apps/Splunk_TA_aws/bin/splunktalib/rest_manager/datainput.py", line 28, in handleCreate
        args = self.encode(self.callerArgs.data)
      File "/opt/splunk/etc/apps/Splunk_TA_aws/bin/splunk_ta_aws_rh_inputs_logs.py", line 61, in encode
        args['bucket_region'] = self._get_bucket(args)
      File "/opt/splunk/etc/apps/Splunk_TA_aws/bin/splunk_ta_aws_rh_inputs_logs.py", line 76, in _get_bucket
        return get_region_for_bucketname(config)
      File "/opt/splunk/etc/apps/Splunk_TA_aws/bin/s3_mod/aws_s3_common.py", line 109, in get_region_for_bucketname
        conn.get_bucket(config[asc.bucket_name])
      File "/opt/splunk/etc/apps/Splunk_TA_aws/bin/boto/s3/connection.py", line 506, in get_bucket
        return self.head_bucket(bucket_name, headers=headers)
      File "/opt/splunk/etc/apps/Splunk_TA_aws/bin/boto/s3/connection.py", line 539, in head_bucket
        raise err
    S3ResponseError: S3ResponseError: 403 Forbidden

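
Since HeadBucket returns no response body, the HTTP status is the only clue S3 gives back. As a rough diagnostic aid (based on S3's documented HeadBucket behaviour, not on the add-on's code), the common statuses map to:

```python
# Rough diagnostic map for S3 HeadBucket results. Per the S3 API docs,
# HeadBucket requires the s3:ListBucket permission and returns only a
# status code, no body.
HEAD_BUCKET_HINTS = {
    200: "bucket exists and the caller has access",
    301: "bucket exists but in another region (endpoint/region mismatch)",
    403: "request authenticated but denied; check s3:ListBucket, bucket "
         "policy, and any VPC endpoint policy",
    404: "no such bucket",
}

def explain_head_bucket(status):
    """Return a human-readable hint for a HeadBucket HTTP status code."""
    return HEAD_BUCKET_HINTS.get(status, "unexpected HTTP status %d" % status)

print(explain_head_bucket(403))
```

If the CLI's head-bucket succeeds with the same role while the add-on's call does not, comparing which credentials each process actually picks up (environment variables vs. instance metadata) would be a reasonable next step.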
0 Karma