Getting Data In

Splunk Add-on for AWS

Jenifer
Loves-to-Learn

Hi Team,

We are trying to onboard AWS CloudWatch metrics and events data to Splunk, and we decided to go with the Splunk Add-on for AWS pull mechanism. I am trying to get a custom namespace and metrics created in AWS into Splunk, but I am unable to see the metrics there. I edited the default AWS namespaces and added my custom namespace. Is this the right method to add my custom metrics? Can someone guide me here?


Jenifer
Loves-to-Learn

Even though I configured it correctly in AWS, I am getting this error in Splunk:

Metric event data without a metric name and properly formated numerical values are invalid and cannot be indexed. Ensure the input metric data is not malformed, have one or more keys of the form "metric_name:< metric > " (e.g..."metric_name:cpu.idle") with corresponding floating point values.
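
(For reference, that is the generic message Splunk raises when data lands in a metrics index without the metric fields such an index requires. Roughly, a measurement a metrics index can accept looks like the sketch below, where the metric name and the dimension are made-up examples:

metric_name:cpu.idle = 97.3
region = eu-west-1

i.e. one or more metric_name:<metric> keys with floating-point values. If the index the input writes to is a metrics index but the add-on is emitting plain CloudWatch events, that mismatch alone can produce this error.)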


datadevops
Path Finder

Hey there,

Adding custom AWS metrics to Splunk with the pull mechanism can be tricky! Editing the default namespaces isn't quite the way to go. Here's the key:

  1. New stanza in inputs.conf: Create a new aws_cloudwatch input stanza for your custom namespace, with metric_namespace set to its exact name (e.g., MyCompany/CustomMetrics).
  2. Specify metrics (optional): Add metric_names if you only want specific metrics; otherwise use ".*" to collect everything in the namespace.
  3. Set sourcetype and other params: Ensure sourcetype is aws:cloudwatch and adjust index and period as needed.

Remember to restart the Splunk Forwarder for the changes to take effect.

If you're still facing issues, double-check your namespace name and Splunk logs for errors. And feel free to ask if you need more help!

~ If the reply helps, a Karma upvote would be appreciated


datadevops
Path Finder

Hey there,

Adding custom metrics from AWS CloudWatch to Splunk using the Splunk Add-on for AWS pull mechanism can be tricky, but I'm here to help!

Here's the key: While editing the default aws namespaces might seem intuitive, it's not the recommended approach for custom metrics.

Instead, follow these steps:

  1. Identify your custom namespace: Make sure you know the exact namespace (e.g., "MyCompany/CustomMetrics") created in AWS CloudWatch.
  2. Configure inputs.conf:
    • Add a new aws_cloudwatch input stanza for your custom namespace (see the example below).
    • Use metric_namespace to specify your custom namespace (e.g., metric_namespace = MyCompany/CustomMetrics).
    • Optionally, define specific metric_names, or use ".*" for all metrics.
    • Set sourcetype to aws:cloudwatch.
    • Adjust other parameters like index and period as needed.
  3. Restart Splunk: For the changes to take effect, restart the Splunk instance (typically a heavy forwarder) where you edited inputs.conf.

Example inputs.conf stanza:

[aws_cloudwatch://custom_metrics]
# The AWS account and region already configured in the add-on
aws_account = <your AWS account name in the add-on>
aws_region = eu-west-1
# Replace with your actual namespace
metric_namespace = MyCompany/CustomMetrics
# Optional: restrict to specific metrics; omit to collect all of them
# metric_names = ["metric1", "metric2"]
sourcetype = aws:cloudwatch
index = main
period = 60
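
Once the input is collecting, a quick sanity check (assuming the index and sourcetype values from the example stanza above) is to search for the data directly, e.g.:

index=main sourcetype=aws:cloudwatch earliest=-15m
| stats count by source, sourcetype

If that returns nothing after a couple of polling periods, the problem is usually in the input configuration or AWS permissions rather than in search.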

Additional Tips:

  • Double-check your namespace name for accuracy.
  • Use Splunk Web's "Inputs" section to verify if your new input is active.
  • If you still face issues, check Splunk logs for errors related to your custom namespace input.
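
For that last point, the add-on's own logs are usually the fastest way to see why a CloudWatch input returns nothing. A generic internal-log search along these lines works (the source wildcard is an assumption; the exact log file name varies by add-on version, but the CloudWatch input logs under $SPLUNK_HOME/var/log/splunk/ on the instance running the input):

index=_internal source=*aws*cloudwatch* (ERROR OR WARN)
| stats count by source

Permission problems (e.g., missing cloudwatch:ListMetrics or cloudwatch:GetMetricData rights) and namespace typos typically show up there.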

Remember: This approach specifically caters to custom metrics. If you're dealing with custom events, the process might differ slightly. Feel free to share more details if you need further assistance!

~ If the reply helps, a Karma upvote would be appreciated


Jenifer
Loves-to-Learn

Hi Team,

We are configuring it in Splunk Web. My AWS namespace is custom/namespace, and metric 1, 2, .... are created with no dimensions.

Configured in Splunk:
AWS account: xxx
AWS region: eu-west-1
Index: B
Metrics Configuration: custom/namespace
Name: A
SourceType: aws:cloudwatch

I still don't see any logs. Am I configuring it the right way?


Jenifer
Loves-to-Learn

I used aws:cloudwatch:metrics to get the custom metrics.

Is there any way to pull in all the AWS CloudWatch log groups directly, rather than specifying them one by one? Whenever a new log group is created we have to reconfigure the input, and there is a chance of forgetting to add the new log group.
