All Apps and Add-ons

Ingesting HAProxy stats into Splunk

sunka55
New Member

All,

I do not see any mention of HAProxy stats in the add-on documentation. It looks like HAProxy can expose detailed performance stats once its stats page is enabled.

ref: https://www.datadoghq.com/blog/how-to-collect-haproxy-metrics/#stats-page

Can anybody shed some light on the right way to do this?

Thanks,
Sam


mghocke
Path Finder

The way I recently did this was to run a cron job every 10 minutes that pulls the metrics from HAProxy over HTTP in CSV format:

curl -s -u <username:password> 'http://localhost:<stats port>/stats;csv'

Note that the semicolon has to be quoted or escaped, otherwise the shell treats it as a command separator.

The output needs a bit of post-processing before the regular CSV ingest on a universal forwarder can handle it. Here is the script I am using:

#!/bin/sh
# Snapshot the HAProxy stats page into a timestamped CSV file

TS=$(date +"%Y%m%d%H%M%S")
DIR=<monitored directory>
FILE="$DIR/haproxy-metrics.$TS.csv"

[ -d "$DIR" ] || mkdir -p "$DIR"

# Prepend a time column and strip the '# ' marker from the header row
/usr/bin/curl -s -u <credentials> "http://localhost:<port>/stats;csv" \
  | sed -e 's/^# /# time,/' -e 's/^\([^#]\)/'"$TS"',\1/' -e 's/# //' > "$FILE"
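For the scheduling part, a crontab entry along these lines runs the collector every 10 minutes (the path /usr/local/bin/haproxy-stats.sh is a made-up example; point it at wherever you save the script):

```
*/10 * * * * /usr/local/bin/haproxy-stats.sh
```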

The sed expressions insert a time column containing the current timestamp at the front of every row and remove the '# ' marker from the header.
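To illustrate what that pipeline does, here is the transformation applied to a tiny sample (pxname, svname, and scur are real fields from HAProxy's CSV stats; the values are made up):

```shell
#!/bin/sh
# Feed a fake two-line stats CSV through the same sed pipeline
TS=20240101120000
printf '# pxname,svname,scur\nwww,FRONTEND,42\n' \
  | sed -e 's/^# /# time,/' -e 's/^\([^#]\)/'"$TS"',\1/' -e 's/# //'
# Header row becomes:  time,pxname,svname,scur
# Data row becomes:    20240101120000,www,FRONTEND,42
```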

On the forwarder, configure inputs.conf to monitor that directory and assign it the sourcetype csv, or roll your own sourcetype according to this doc page. Splunk will take care of the rest. Make sure you clean out the metrics directory once in a while so the files don't accumulate forever.
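As a sketch, the monitor stanza in inputs.conf could look like this, assuming /opt/haproxy-metrics as the monitored directory (swap in your own path):

```
[monitor:///opt/haproxy-metrics]
sourcetype = csv
disabled = false
```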

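For the cleanup, a find-based sketch works; both the directory and the 7-day retention period below are assumptions to adjust for your setup:

```shell
#!/bin/sh
# Assumed metrics directory; adjust to match the collector script
DIR=/opt/haproxy-metrics

# Delete snapshot CSVs older than 7 days (retention period is a guess)
if [ -d "$DIR" ]; then
    find "$DIR" -name 'haproxy-metrics.*.csv' -mtime +7 -delete
fi
```

This could run from the same crontab, e.g. once a day.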