All Apps and Add-ons

Splunk Add-on for Amazon Web Services: Why are we not able to pull metrics for the CloudWatch AWS/S3 namespace?

rtennysonad
Engager

The environment has one search head and two indexers running Splunk 7.2.1. The Splunk App for AWS (5.1.2) and the Splunk Add-on for AWS (4.6.0) are installed, and the add-on is loaded on the search head to manage the inputs.

We have set up multiple inputs that are functioning correctly, including the CloudWatch AWS/EC2, AWS/EBS, and AWS/RDS inputs. However, when we set up inputs for the CloudWatch AWS/S3 namespace, no data is ingested.

Per the documentation, we have verified that all index macros are configured correctly and that the AWS permissions are set correctly.

We see no errors in the aws:cloudwatch:log internal logs:
(index=_internal sourcetype=aws:cloudwatch:log)
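The level-filtered form of that check, in case it helps anyone reproduce our verification, is below; the log level keywords are what we expect the add-on to emit, so adjust them if your logs differ.

index=_internal sourcetype=aws:cloudwatch:log (ERROR OR WARN OR CRITICAL)
| stats count by host, source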

We see in the logs that the following steps complete for the AWS/S3 inputs without error:

  • Create task for data input.
  • Start querying data points.
  • Start running batches.
  • Batches completed.
  • Querying data points finished.
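Despite that, nothing shows up when we check the target index for AWS/S3 data. The search we used to look for ingested data points is below; we understand the source value for CloudWatch events to be of the form region:namespace, so treat that breakdown as an assumption.

index=aws sourcetype=aws:cloudwatch earliest=-7d
| stats count by source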

Our inputs.conf is listed below. Is there something missing in our configuration, or are there any additional troubleshooting steps we could attempt?

Thanks for any help.

inputs.conf:

[aws_cloudwatch://XXXXXXXX_aws_cloudwatch_56f74518-6038-4bcd-bf6d-c51f687860aa]
aws_account = XXXXXXXX
aws_region = us-west-1
index = aws
metric_dimensions = [{"StorageType":["AllStorageTypes"],"BucketName":["rmp-files"]}]
metric_names = ["NumberOfObjects"]
metric_namespace = AWS/S3
period = 60
polling_interval = 3600
sourcetype = aws:cloudwatch
statistics = ["Average"]
use_metric_format = false

[aws_cloudwatch://XXXXXXXX_aws_cloudwatch_6ff56c72-8097-460d-bbfc-a0e126f6953c]
aws_account = XXXXXXXX
aws_region = us-west-1
index = aws
metric_dimensions = [{"StorageType":["StandardStorage"],"BucketName":["rmp-files"]}]
metric_names = ["BucketSizeBytes"]
metric_namespace = AWS/S3
period = 60
polling_interval = 3600
sourcetype = aws:cloudwatch
statistics = ["Average"]
use_metric_format = false

s_alatroshi
New Member

I am facing the same issue and I am out of troubleshooting options. I hope someone can help here.

Thanks


junchen2019
Engager

We have the same issue here.


dpsoukup
Engager

I am not sure if it will work for you, but here is what I did to get around the issue.

In the add-on's configuration of inputs:

  • Edit the existing CloudWatch input, choose "Edit in advanced mode", remove the AWS/S3 namespace entirely, and save/update.
  • Create a new CloudWatch input (with a slightly different name) and delete all of the namespaces except AWS/S3.
  • In this new AWS/S3-only input, change the Period value under "Advanced Settings" to 3600.

The Period value has to be the same for all of the namespaces in a single input, so you cannot change it for just AWS/S3 without creating a separate input; even editing inputs.conf directly does not help, because the add-on does not tolerate any variation in period within one defined input. Also, the S3 bucket storage metrics (BucketSizeBytes and NumberOfObjects) are only reported by CloudWatch once per day, which may be the underlying issue with the original 60-second period.
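For illustration, the resulting stanza for the S3-only input would look roughly like the sketch below; the stanza name is made up, the account value is a placeholder, and the only substantive change from the original configuration is the period.

# Hypothetical S3-only CloudWatch input; period raised to 3600 because
# the S3 storage metrics are only reported once per day.
[aws_cloudwatch://XXXXXXXX_aws_cloudwatch_s3_only]
aws_account = XXXXXXXX
aws_region = us-west-1
index = aws
metric_dimensions = [{"StorageType":["StandardStorage"],"BucketName":["rmp-files"]}]
metric_names = ["BucketSizeBytes"]
metric_namespace = AWS/S3
period = 3600
polling_interval = 3600
sourcetype = aws:cloudwatch
statistics = ["Average"]
use_metric_format = false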

tvergov
Explorer

Thanks, buddy. Saved the day!


carlkennedy
Path Finder

I thought there was no way this solution would work, but sure enough, it fixed things right up. Running v4.6.1 of the Splunk Add-on for AWS in Splunk Cloud, and now getting all metrics plus S3 NumberOfObjects and BucketSizeBytes.


adcallahan
Engager

dpsoukup,

Your suggestion fixed my issue! I greatly appreciate your assistance!

Thanks


rtennysonad
Engager

Thank you dpsoukup, this fixed the original issue. Much appreciated.


D2SI
Communicator

It resolved our issue as well, thanks!!
