I would like to be able to use an API endpoint to get metrics from SignalFx, but the documentation is very confusing.
I have found this API:
https://dev.splunk.com/observability/reference/api/signalflow/latest#endpoint-execute-signalflow-com...
But I am unable to get the SignalFlow query (which works fine in the APM dashboard) to work through the API, and I also see this in the documentation above:
"Text string containing a SignalFlow program that has one or more detect().publish() output streams.", which seems to indicate that this endpoint is only for publishing detectors.
All I want to do is execute SignalFlow queries, which should be pretty straightforward.
What am I missing?
The time parameters should be "start" and "stop" (not "start" and "end"). Here is another command line example to try:
# get cpu.utilization over the past 10 minutes
STOP_MS=$(date -u +%s000)
START_MS=$(( STOP_MS - 10*60*1000 ))
REALM=us1
TOKEN="YOUR-API-TOKEN"
curl -sS -N \
  -X POST "https://stream.${REALM}.signalfx.com/v2/signalflow/execute?start=${START_MS}&stop=${STOP_MS}&resolut..." \
  -H "Content-Type: application/json" \
  -H "X-SF-Token: ${TOKEN}" \
  -d '{"programText":"data(\"cpu.utilization\").mean(over=\"1m\").publish(label=\"cpu\")"}'
I had missed the stop/end part (I first used TimeSeriesWindow, which uses endMs). Thanks!
Anyway, I do get data back now, but the format is different from what a REST API normally returns.
Part of the returned data:
event: control-message
data: {
data: "event" : "STREAM_START",
data: "timestampMs" : 1756827385654,
data: "traceId" : "592efa8f2c37d7c3"
data: }
event: control-message
data: {
data: "event" : "JOB_START",
data: "handle" : "Gz10mHVAEA4",
data: "timestampMs" : 1756827385735
data: }
event: metadata
data: {
data: "properties" : {
data: "cloud.region" : "westeurope",
data: "computationId" : "Gz10mHVAEA4",
data: "k8s.namespace.name" : "XXXXXXXXXX-acc",
data: "sf_isPreQuantized" : true,
data: "sf_key" : [ "sf_originatingMetric", "sf_metric", "computationId" ],
data: "sf_metric" : "_SF_COMP_Gz10mHVAEA4_02-PUBLISH_METRIC",
data: "sf_organizationID" : "hgfhgfhgfhgfhgf",
data: "sf_originatingMetric" : "container_cpu_utilization",
data: "sf_resolutionMs" : 60000,
data: "sf_singletonFixedDimensions" : [ "k8s.namespace.name", "cloud.region", "sf_metric" ],
data: "sf_streamLabel" : "A",
data: "sf_type" : "MetricTimeSeries"
data: },
data: "tsId" : "AAAAAIB-Sgg"
data: }
event: data
id: data-1756811940000
data: {
data: "data" : [ {
data: "tsId" : "AAAAAIB-Sgg",
data: "value" : 0.48333333333333334
data: } ],
data: "logicalTimestampMs" : 1756811940000,
data: "maxDelayMs" : 10000
data: }
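That framing looks like the Server-Sent Events (SSE) wire format: each event is an "event:" line followed by one or more "data:" lines that together form a single JSON document. A minimal sketch to strip the framing and keep only the data events (assuming the raw response was saved to response.txt, and that events are separated by blank lines as in a standard SSE stream — the sample here is a shortened copy of the output above):

```shell
# Sketch: reduce SSE framing to one JSON document per "data" event.
# Assumes the raw stream was saved to response.txt; the sample below is
# a shortened copy of the response shown above.
cat > response.txt <<'EOF'
event: metadata
data: { "tsId" : "AAAAAIB-Sgg" }

event: data
id: data-1756811940000
data: {
data:   "data" : [ { "tsId" : "AAAAAIB-Sgg", "value" : 0.48333333333333334 } ],
data:   "logicalTimestampMs" : 1756811940000
data: }
EOF

awk '
  /^event: /  { type = $2; next }                  # remember the current event type
  /^data: / {
      # buffer only the payload lines of "data" events, prefix stripped
      if (type == "data") { sub(/^data: /, ""); buf = buf $0 }
      next
  }
  /^$/        { if (buf != "") { print buf; buf = "" } }   # blank line ends an event
  END         { if (buf != "") print buf }
' response.txt > data_events.jsonl

cat data_events.jsonl
```

Each output line is then a self-contained JSON document holding the tsId/value pairs and the logical timestamp, which is much closer to what a normal REST response looks like.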
Hi @dmoberg
The SignalFlow “execute computation” API runs any SignalFlow program, but it will only return results for streams you publish().
Are you able to share any examples of what you have tried so far? The example in the docs should be a good starting point and has previously worked for me.
curl -X POST "https://stream.{REALM}.signalfx.com/v2/signalflow/start" \
  -H "Content-Type: application/json" \
  -H "X-SF-Token: <value>" \
  -H "Last-Event-ID: <value>" \
  -d '{
    "programText": "A = data(\"trans.latency\").mean(over=Args[\"ui.dashboard_window\"]).mean().publish(); detect(when(A > threshold(5))).publish(\"detector_name\");",
    "programArgs": { "ui.dashboard_window": "10m" }
  }'
I actually do get the API request to execute, but the strange thing is that I get the same response back no matter which metric I specify (container_cpu_utilization in the example above). Even if I specify something random like "hello" as the metric, I still get data back, which seems very odd. It is almost as if the program text is not being used.
Also, the data does not seem to be limited to the start and stop values passed as query parameters, since I can see timestamps outside that range coming back.
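It might be worth checking whether what comes back for the bogus metric consists of actual data events or just the control-message/metadata framing, since STREAM_START and JOB_START control messages appear to arrive for any job. A sketch that counts the event types in a saved response (bogus.txt is a made-up filename for a hypothetical capture; the sample content mirrors the control messages shown above):

```shell
# Sketch: count event types in a saved SignalFlow SSE response.
# bogus.txt is a hypothetical capture of a response for a made-up metric;
# its content here mirrors the control messages shown earlier in the thread.
cat > bogus.txt <<'EOF'
event: control-message
data: { "event" : "STREAM_START" }

event: control-message
data: { "event" : "JOB_START" }
EOF

# Only "event: data" lines carry actual datapoints.
data_events=$(grep -c '^event: data$' bogus.txt || true)
control_events=$(grep -c '^event: control-message$' bogus.txt || true)
echo "data events: ${data_events}, control messages: ${control_events}"
```

If the bogus-metric response shows zero data events, the stream is returning only framing, not datapoints.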