
Ingesting HAProxy stats into Splunk

sunka55
New Member

All,

I do not see any mention of HAProxy stats in the add-on documentation. It looks like we can collect great performance stats by enabling the stats page in HAProxy.

ref: https://www.datadoghq.com/blog/how-to-collect-haproxy-metrics/#stats-page

Can anybody shed some light on the right way of doing this?

Thanks,
Sam


mghocke
Path Finder

The way I recently did this was to run a cron job every 10 minutes that pulls the metrics from HAProxy over HTTP in CSV format:

curl -s -u <username:password> 'http://localhost:<stats port>/stats;csv'

It needs a bit of post-processing so you can use the regular CSV data ingest with a universal forwarder. Here is the script I am using:

#!/bin/sh

# Timestamp used both in the file name and as the value of the added time column
TS=$(date +"%Y%m%d%H%M%S")
DIR=<monitored directory>
FILE="$DIR/haproxy-metrics.$TS.csv"

[ -d "$DIR" ] || mkdir -p "$DIR"

# Pull the stats CSV and prepend a time column; the ';' must be escaped
# (or the URL quoted) so the shell does not treat it as a command separator.
/usr/bin/curl -s -u <credentials> http://localhost:<port>/stats\;csv \
  | sed -e 's/^# /# time,/' -e 's/^\([^#]\)/'$TS',\1/' -e 's/# //' > "$FILE"
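For reference, the matching crontab entry might look like this (the script path here is just an example, substitute wherever you save it):

*/10 * * * * /usr/local/bin/haproxy-stats.sh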

The sed commands insert a time column at the front of every row, filled with the current timestamp, and remove the leading '#' from the header line.
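For illustration, assuming the usual HAProxy CSV header and a TS of 20240101120000 (the frontend row values are made up), the first lines change roughly like this:

Before:

# pxname,svname,qcur,qmax,scur,...
http-in,FRONTEND,0,0,12,...

After:

time,pxname,svname,qcur,qmax,scur,...
20240101120000,http-in,FRONTEND,0,0,12,...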

On the forwarder you can configure inputs.conf to monitor that directory and give it the sourcetype csv, or you can roll your own according to this doc page. Splunk will take care of the rest. Make sure you clean out that metrics directory once in a while so the files don't pile up there forever.
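As a minimal sketch, assuming the script writes into /var/log/haproxy-stats (substitute your own directory), the monitor stanza could look like:

[monitor:///var/log/haproxy-stats]
sourcetype = csv
disabled = false

The cleanup can be another cron job, for example deleting metric files older than a day:

0 * * * * find /var/log/haproxy-stats -name 'haproxy-metrics.*.csv' -mmin +1440 -delete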
