Integrating Golang gRPC Application Custom Metrics and Logs with Splunk Observability Cloud

rahusri2
Path Finder

Hello Team,

I have successfully set up Splunk Observability Cloud to monitor Amazon Web Services through Amazon CloudWatch and can now observe all AWS services via an IAM role.

Additionally, I have a Golang gRPC application running on an AWS EC2 instance that generates custom metrics through a StatsD server. I would like to send these custom metrics to Splunk Observability Cloud to monitor the health of the gRPC application, along with the logs it generates.
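
For reference, the application emits StatsD lines over UDP roughly as in the sketch below (a simplified illustration only, assuming the default StatsD UDP port 8125 and an illustrative metric name; the real application may use a StatsD client library instead):

    // Minimal sketch: emit one StatsD counter over UDP (illustrative only).
    package main

    import (
        "fmt"
        "net"
    )

    func main() {
        // Assumes a StatsD listener on the default UDP port 8125.
        conn, err := net.Dial("udp", "127.0.0.1:8125")
        if err != nil {
            panic(err)
        }
        defer conn.Close()

        // StatsD wire format: <metric.name>:<value>|<type>  (here: a counter)
        fmt.Fprint(conn, "grpc.requests.total:1|c")
    }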

On my AWS Linux machine, I can see that the host monitoring agent is installed and the splunk-otel-collector service is running. Could you please advise on how to send the custom metrics and logs generated by the Golang gRPC application via the StatsD server to Splunk Observability Cloud for monitoring?

Thank you.


rahusri2
Path Finder

Thanks for the suggestion @bishida 


bishida
Splunk Employee

I think you have some options.

You could configure a statsd receiver on your OTel collector and then send your metrics to that receiver's listening endpoint.
https://github.com/open-telemetry/opentelemetry-collector-contrib/tree/main/receiver/statsdreceiver

If it's possible to export OTLP-formatted metrics and/or logs, you could send them to the OTLP receiver's listening endpoint on your OTel collector (port 4317 for gRPC or 4318 for HTTP).
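
For example, here is a minimal sketch of pushing OTLP metrics from a Go application to the local collector, assuming the OpenTelemetry Go SDK and the default gRPC endpoint localhost:4317 (the meter and counter names are illustrative):

    package main

    import (
        "context"
        "log"
        "time"

        "go.opentelemetry.io/otel"
        "go.opentelemetry.io/otel/exporters/otlp/otlpmetric/otlpmetricgrpc"
        sdkmetric "go.opentelemetry.io/otel/sdk/metric"
    )

    func main() {
        ctx := context.Background()

        // OTLP/gRPC exporter pointed at the collector's OTLP receiver (default port 4317).
        exp, err := otlpmetricgrpc.New(ctx,
            otlpmetricgrpc.WithEndpoint("localhost:4317"),
            otlpmetricgrpc.WithInsecure(),
        )
        if err != nil {
            log.Fatal(err)
        }

        // Periodically push recorded metrics to the exporter.
        provider := sdkmetric.NewMeterProvider(
            sdkmetric.WithReader(sdkmetric.NewPeriodicReader(exp, sdkmetric.WithInterval(10*time.Second))),
        )
        defer provider.Shutdown(ctx)
        otel.SetMeterProvider(provider)

        // Record a custom counter; the metric name here is illustrative.
        counter, err := otel.Meter("grpc-app").Int64Counter("grpc.requests.total")
        if err != nil {
            log.Fatal(err)
        }
        counter.Add(ctx, 1)
    }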

For logs that you can collect from disk, you could use a Splunk universal forwarder to get your logs into Splunk Cloud or Splunk Enterprise. Or, you could use an OTel filelog receiver to collect those logs from disk and send them to Splunk Cloud/Enterprise via an HEC (HTTP Event Collector) endpoint.

https://docs.splunk.com/observability/en/gdi/opentelemetry/components/filelog-receiver.html
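
As a rough sketch of the filelog-plus-HEC route, to be merged into the collector config — the log path, HEC token, and endpoint below are placeholders you would replace with your own values:

    receivers:
      filelog:
        include:
          - /var/log/my-grpc-app/*.log        # placeholder: path to your application logs

    exporters:
      splunk_hec:
        token: "${SPLUNK_HEC_TOKEN}"          # placeholder: your HEC token
        endpoint: "https://<your-splunk-host>:8088/services/collector"   # placeholder
        sourcetype: "grpc-app"
        index: "main"

    service:
      pipelines:
        logs:
          receivers: [filelog]
          exporters: [splunk_hec]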

rahusri2
Path Finder

Hello @bishida,

I have gone through the repo for the statsdreceiver but I was not able to configure it successfully.

receivers:
  statsd:
  statsd/2:
    endpoint: "localhost:8127"
    aggregation_interval: 70s
    enable_metric_type: true
    is_monotonic_counter: false
    timer_histogram_mapping:
      - statsd_type: "histogram"
        observer_type: "gauge"
      - statsd_type: "timing"
        observer_type: "histogram"
        histogram:
          max_size: 100
      - statsd_type: "distribution"
        observer_type: "summary"
        summary:
          percentiles: [0, 10, 50, 90, 95, 100]

I tried the configuration above but it was not working; also, I am not sure how Splunk Observability Cloud will know to listen on port 8127.

Let me explain my use case in detail:

I have a couple of EC2 Linux instances on which a StatsD server is running, and it generates custom gRPC metrics from a Golang application on UDP port 8125 (StatsD).

Now, I want to send these custom gRPC metrics from the Golang application running on UDP port 8125 (StatsD) to Splunk Observability Cloud so that I can monitor them there. For this, we need a connection between the EC2 Linux instances and Splunk Observability Cloud so that it can receive these custom gRPC metrics. Since we don't have a hostname/IP address for Splunk Observability Cloud, we have to use some agent for this; I think we can use "splunk-otel-collector.service".

Currently I am able to capture predefined metrics such as "^aws.ec2.cpu.utilization", system.filesystem.usage, etc. in my Splunk Observability Cloud, but now I also want the custom gRPC metrics there in the same way.

Before this setup, I had multiple EC2 Linux instances on which a StatsD server was running, plus a separate Splunk Enterprise EC2 instance that collected all the metrics. Splunk Enterprise provides commands to connect instances to it, "./splunk enable listen 9997" and "./splunk add forward-server <destination_hostname>:9997", and I was using the configuration below to do so.

"statsd": { "statsd_max_packetsize": 1400, "statsd_server" : "destination_hostname", "statsd_port" : "8125" },

I want to achieve the same thing using Splunk Observability Cloud. Can you please explain in detail how we can connect the EC2 instances to Splunk Observability Cloud to send the custom gRPC metrics from the Golang application running on UDP port 8125 (StatsD)? If using https://github.com/open-telemetry/opentelemetry-collector-contrib/tree/main/receiver/statsdreceiver is the only way, then what changes do I need to make in the configuration files for custom metric collection (and where do they need to be added), and which hostnames and ports need to be specified in which files?

Thanks

bishida
Splunk Employee

Here is a working example of the statsd receiver:

[Two screenshots: the statsd receiver defined in the collector configuration, and that receiver added to the metrics pipeline.]
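
In text form, the configuration in those screenshots is roughly along these lines. This is a sketch against the default agent_config.yaml (typically /etc/otel/collector/agent_config.yaml); the other receivers, processors, and exporter listed in the metrics pipeline are the defaults shipped with the Splunk distribution and may differ in your install:

    receivers:
      statsd:
        endpoint: "localhost:8125"      # UDP; match the port your application sends StatsD metrics to
        aggregation_interval: 60s
        enable_metric_type: true

    service:
      pipelines:
        metrics:
          receivers: [hostmetrics, otlp, signalfx, statsd]
          processors: [memory_limiter, batch, resourcedetection]
          exporters: [signalfx]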


After you restart the collector, it will be listening on UDP port 8125. Since this is UDP and not TCP, you can't test the port like you normally would and get a response. Send a test metric to that port and then search for it in the Metric Finder in O11y Cloud.

echo "statsd.test.metric:42|c|#mykey:#myval" | nc -w 1 -u -4 localhost 8125
