Splunk Observability Cloud

Assistance Needed: Configuring StatsD Exporter for Splunk OpenTelemetry Collector

rahusri2
Path Finder

Hello,

I'm setting up StatsD to send custom metrics from an AWS EC2 instance (where the Splunk OpenTelemetry Collector is running) to Splunk Observability Cloud.

I've configured StatsD as a receiver using the guidelines from https://github.com/open-telemetry/opentelemetry-collector-contrib/tree/main/receiver/statsdreceiver. Here's my StatsD configuration in the agent_config.yaml file:

receivers:
  statsd:
    endpoint: "localhost:8125"
    aggregation_interval: 60s
    enable_metric_type: false
    is_monotonic_counter: false
    timer_histogram_mapping:
      - statsd_type: "histogram"
        observer_type: "histogram"
        histogram:
          max_size: 50
      - statsd_type: "distribution"
        observer_type: "histogram"
        histogram: 
          max_size: 50    
      - statsd_type: "timing"
        observer_type: "summary"

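(Not from the original post: for anyone verifying the receiver end to end, a minimal Python sketch that speaks the plain StatsD text protocol over UDP. The metric names are made up for illustration; the port matches the receiver endpoint above.)

```python
import socket

def statsd_payload(name: str, value, mtype: str, sample_rate: float = 1.0) -> str:
    """Format one metric line in the plain StatsD wire protocol: name:value|type[|@rate]."""
    line = f"{name}:{value}|{mtype}"
    if sample_rate < 1.0:
        line += f"|@{sample_rate}"
    return line

def send_metric(line: str, host: str = "localhost", port: int = 8125) -> None:
    """StatsD is UDP: fire-and-forget, no response to check."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(line.encode("utf-8"), (host, port))

# Counter, gauge, and timing ("ms") examples; "ms" maps to the
# "timing" statsd_type in the timer_histogram_mapping above.
send_metric(statsd_payload("test.requests", 1, "c"))
send_metric(statsd_payload("test.queue_depth", 42, "g"))
send_metric(statsd_payload("test.latency_ms", 320, "ms"))
```

After sending a few of these, the metrics should appear in Splunk Observability Cloud within one aggregation_interval (60s in the config above) if the pipeline is wired up correctly.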
The GitHub documentation provides exporter configurations, but I'm unsure how to implement them effectively. The GitHub document shows the following:

exporters:
  file:
    path: ./test.json

service:
  pipelines:
    metrics:
      receivers: [statsd]
      exporters: [file]

Below is the pipeline configuration I am setting in the service section of agent_config.yaml:

service:
  pipelines:
    metrics:
      receivers: [hostmetrics, otlp, signalfx, statsd]
      processors: [memory_limiter, batch, resourcedetection]
      exporters: [signalfx]

When I add statsd as an additional receiver ("receivers: [hostmetrics, otlp, signalfx, statsd]" with "exporters: [signalfx]") as shown above and run "systemctl restart splunk-otel-collector.service", the Splunk OTel Collector agent stops sending any metrics to Splunk Observability Cloud. When I remove statsd ("receivers: [hostmetrics, otlp, signalfx]"), the agent starts sending metrics again.

What is the correct/supported receiver/exporter configuration in the service section for StatsD?

Thanks

1 Solution

bishida
Splunk Employee

Update: We did get this resolved earlier today. The cause was a port conflict: 8125 was already in use. With StatsD this can be tricky to catch because it's UDP, so the usual testing methods for TCP ports don't work. We found that 8127 was available and used that to get it working. If anyone else encounters this, be sure to check the logs (e.g., /var/log/messages or /var/log/syslog) for port-conflict error messages.
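(Not from the original thread: one way to check for exactly this kind of UDP port conflict is to try binding the port yourself. A minimal Python sketch:)

```python
import socket

def udp_port_free(port: int, host: str = "0.0.0.0") -> bool:
    """Return True if the UDP port can be bound, i.e., no other process holds it."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        sock.bind((host, port))
        return True
    except OSError:
        # EADDRINUSE (or a permission error) means something else has the port
        return False
    finally:
        sock.close()

# Check the default StatsD port and a fallback before picking an endpoint
for port in (8125, 8127):
    print(port, "free" if udp_port_free(port) else "in use")
```

On Linux, `ss -lupn` also lists which processes hold UDP ports, which helps identify the conflicting service.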


rahusri2
Path Finder

Hello @bishida,

Thank you for taking the time to look into it and for all your help and support. It's truly appreciated.


PaulPanther
Motivator

Have you checked the logs of the Otel Collector?

Could you please try defining a separate pipeline for the statsd metrics, like:

service:
  pipelines:
    metrics/statsd:
      receivers: 
        - statsd
      exporters: 
        - signalfx
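(Not from the original reply: a fuller sketch of what that could look like alongside the default pipeline, assuming the memory_limiter, batch, and resourcedetection processors are already defined elsewhere in agent_config.yaml, as in the original config.)

```yaml
service:
  pipelines:
    # default Splunk agent metrics pipeline, unchanged
    metrics:
      receivers: [hostmetrics, otlp, signalfx]
      processors: [memory_limiter, batch, resourcedetection]
      exporters: [signalfx]
    # separate pipeline for StatsD metrics; easier to debug in isolation
    metrics/statsd:
      receivers: [statsd]
      processors: [memory_limiter, batch, resourcedetection]
      exporters: [signalfx]
```

Keeping statsd in its own named pipeline means a failure in that receiver (e.g., a port conflict) doesn't take down the default metrics pipeline, and the collector logs will attribute errors to the statsd pipeline specifically.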