Dashboards & Visualizations

Service health dashboard

linusconcepcion
Engager

I have several services sending their logs over to Splunk.

I'd like to generate a daily dashboard report that looks like the one at the bottom of this page:
http://status.aws.amazon.com/

Basically, all the rows would be my various services, and the columns would be the last 5-10 days. There would be a green, yellow, or red mark in each cell depending on the number of ERRORs that appear in the logs.

Is a report like this possible?

gkanapathy
Splunk Employee

Yes, though it's slightly easier to do in Splunk with the rows and columns the other way around:

source=mylogs earliest=-5d@d  "ERROR" | timechart span=1d count by ServiceName

Then display the table with the "heatmap" data overlay. To transpose so that the services become the rows:

source=mylogs earliest=-5d@d "ERROR" | timechart span=1d count by ServiceName | fieldformat _time=strftime(_time, "%Y-%m-%d") | transpose
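
If you want explicit green, yellow, and red cells rather than the heatmap shading, one option is to bucket the daily counts with rangemap before charting. This is just a sketch building on the search above: the thresholds (0 errors = green, 1-9 = yellow, 10 or more = red) are assumptions you would tune for your services, and error_count is simply a label chosen here. Services with no matching events at all in the window will not get a row.

source=mylogs earliest=-5d@d "ERROR" | timechart span=1d count by ServiceName | untable _time ServiceName error_count | rangemap field=error_count green=0-0 yellow=1-9 default=red | eval day=strftime(_time, "%Y-%m-%d") | chart values(range) over ServiceName by day

untable turns the timechart back into one row per service per day, rangemap adds a range field holding the color name, and the final chart puts services on the rows and days on the columns, matching the layout on the AWS status page.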

linusconcepcion
Engager

This works fine. Thanks!

linusconcepcion
Engager

Thank you. I'll give this a shot and mark this as the answer if it works.
