Hello Team,
I have successfully set up Splunk Observability Cloud to monitor Amazon Web Services through Amazon CloudWatch and can now observe all AWS services via an IAM role.
Additionally, I have a gRPC application written in Go running on an AWS EC2 instance, which emits custom metrics through a StatsD server. I would like to send these custom metrics, along with the logs the application generates, to Splunk Observability Cloud so I can monitor the health of the gRPC application.
On my AWS Linux machine, I can see that the host monitoring agent is installed and the splunk-otel-collector service is running. Could you please advise how to send the custom metrics and logs generated by the StatsD server from the Go gRPC application to Splunk Observability Cloud for monitoring?
Thank you.
Hello @bishida,
I have gone through the statsdreceiver repo, but I was not able to configure it successfully.
receivers:
  statsd:
  statsd/2:
    endpoint: "localhost:8127"
    aggregation_interval: 70s
    enable_metric_type: true
    is_monotonic_counter: false
    timer_histogram_mapping:
      - statsd_type: "histogram"
        observer_type: "gauge"
      - statsd_type: "timing"
        observer_type: "histogram"
        histogram:
          max_size: 100
      - statsd_type: "distribution"
        observer_type: "summary"
        summary:
          percentiles: [0, 10, 50, 90, 95, 100]
I tried the configuration above, but it was not working. Also, I am not sure how Splunk Observability Cloud will know to listen on port 8127.
Let me explain my use case in detail:
I have a couple of EC2 Linux instances on which a StatsD server is running on UDP port 8125, receiving custom gRPC metrics from a Go application.
Now I want to send these custom gRPC metrics from the Go application (UDP port 8125, StatsD) to Splunk Observability Cloud so that I can monitor them there. For this we need a connection between the EC2 Linux instances and Splunk Observability Cloud, so that Splunk Observability Cloud can receive these custom gRPC metrics. Since we don't have any hostname/IP address for Splunk Observability Cloud, we have to use some agent to do this; I think we can use "splunk-otel-collector.service".
Currently I am able to see predefined metrics such as "aws.ec2.cpu.utilization", "system.filesystem.usage", etc. in Splunk Observability Cloud, but now I also want the custom gRPC metrics to show up there in the same way.
Before this, I had a setup with multiple EC2 Linux instances running the StatsD server and a separate Splunk Enterprise EC2 instance that collected all the metrics. Splunk Enterprise provides commands to connect instances to it ("./splunk enable listen 9997" and "./splunk add <destination_hostname>:9997"), and I was using the configuration below to do so.
"statsd": { "statsd_max_packetsize": 1400, "statsd_server" : "destination_hostname", "statsd_port" : "8125" },
I want to achieve the same thing with Splunk Observability Cloud. Can you please explain in detail how we can connect the EC2 instances to Splunk Observability Cloud to send the custom gRPC metrics from the Go application running on UDP port 8125 (StatsD)? If using https://github.com/open-telemetry/opentelemetry-collector-contrib/tree/main/receiver/statsdreceiver is the only way, then what changes do I need to make in the configuration files for the custom metric collection (and where do those changes need to be added), including the hostnames and port names to mention in any files, etc.?
Thanks
Thanks for the suggestion @bishida
I think you have some options.
You could configure a statsd receiver on your OTel collector and then send your metrics to that receiver's listening endpoint.
https://github.com/open-telemetry/opentelemetry-collector-contrib/tree/main/receiver/statsdreceiver
If it's possible to export OTLP-formatted metrics and/or logs, you could send them to the OTLP receiver's listening endpoint on your OTel collector (port 4317 for gRPC or 4318 for HTTP).
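As a rough sketch of that option (the signalfx exporter name here is an assumption based on the default splunk-otel-collector agent config; match it to whatever exporters your config already defines):

receivers:
  otlp:
    protocols:
      grpc:
        endpoint: "0.0.0.0:4317"   # OTLP over gRPC
      http:
        endpoint: "0.0.0.0:4318"   # OTLP over HTTP

service:
  pipelines:
    metrics:
      receivers: [otlp]
      exporters: [signalfx]        # assumed exporter name; adjust to your existing config

Your Go application would then export OTLP directly to localhost:4317 on the instance instead of emitting StatsD packets.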
For logs that you can collect from disk, you could use a Splunk universal forwarder to get your logs into Splunk Cloud or Splunk Enterprise. Or, you could use an OTel filelog receiver to collect those logs from disk and send them to Splunk Cloud/Enterprise via an HEC (HTTP Event Collector) endpoint.
https://docs.splunk.com/observability/en/gdi/opentelemetry/components/filelog-receiver.html
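For the filelog/HEC route, a minimal sketch might look like the following (the log path, HEC endpoint, and token are placeholders, not values from your environment):

receivers:
  filelog:
    include: [ /var/log/myapp/*.log ]   # placeholder path to the gRPC app's log files
    start_at: beginning                 # read existing file contents on first start

exporters:
  splunk_hec:
    token: "<HEC_TOKEN>"                                        # placeholder HEC token
    endpoint: "https://<splunk-host>:8088/services/collector"   # placeholder HEC endpoint
    sourcetype: "otel"

service:
  pipelines:
    logs:
      receivers: [filelog]
      exporters: [splunk_hec]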
Hello @bishida,
Here is a working example of statsd receiver:
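(The snippet below is a sketch of that example; the signalfx exporter name is assumed from the default splunk-otel-collector agent config, so adjust it to whatever your config defines.)

receivers:
  statsd:
    endpoint: "0.0.0.0:8125"     # listen for StatsD datagrams on UDP 8125
    aggregation_interval: 60s    # flush aggregated metrics every 60 seconds
    enable_metric_type: true     # attach the original StatsD type as a metric label

service:
  pipelines:
    metrics/statsd:
      receivers: [statsd]
      exporters: [signalfx]      # assumed exporter name; match your existing config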
After you restart the collector, it will be listening on UDP port 8125. Since this is UDP and not TCP, you can't test the port like you normally would and get a response. Send a test metric to that port and then search for it in the Metric Finder in O11y Cloud.
echo "statsd.test.metric:42|c|#mykey:#myval" | nc -w 1 -u -4 localhost 8125
Hello @bishida,
Thanks for the reply.