Splunk Search

Charting / graphing multiple data sources on one chart

splunkedout
Explorer

Has anyone experimented with showing statistics for the same time slot over multiple time periods?

For example, imagine a chart that shows the number of transactions over a 24-hour period as a line graph. Now imagine a second line (of a different color) that shows yesterday's 24-hour transaction count, and another that shows a weekly average of the 24-hour transaction count.

I appreciate any insights.

1 Solution

carasso
Splunk Employee

Comparing week-over-week results used to be a pain in Splunk, with complex date calculations. No more. Now there is a better way.

I wrote a convenient search command called "timewrap" that does it all, for arbitrary time periods.

... | timechart count span=1d | timewrap w

That's it!

http://apps.splunk.com/app/1645/
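As a quick sketch of how this might be used (the index, sourcetype, and time range here are hypothetical, and the single-letter period argument follows the app's syntax shown above), a week-over-week comparison of daily counts could look something like:

index=web sourcetype=access_combined earliest=-4w@w latest=@h
| timechart count span=1d
| timewrap w

Each week of daily counts becomes its own series on a shared one-week axis, so the current week can be read directly against the previous weeks.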


splunkedout
Explorer

This is very insightful. There are lots of neat 'Splunk moves' here that I didn't know existed. I appreciate you taking the time to write this up.


Stephen_Sorkin
Splunk Employee

For the simple day over day case, the recipe for this solution is pretty well covered in:

http://answers.splunk.com/questions/2712/line-chart-comparing-yesterdays-result-with-todays-result-i...

To add a series for the weekly average (the average over the previous seven days at each time bucket), the search is significantly more involved. To do this, the first task is to | append [] some results that are the computation of the average. The details of the inner search depend on the type of aggregate function you're using; in this case you're looking at count, so it's not too bad:

... | append [search earliest=-7d@d latest=@d ... | eval _time = ((_time - relative_time(now(), "-7d@d")) % 86400) + relative_time(now(), "@d") | bin _time span=15m | chart count by _time | eval metric = count/7 | eval marker = "weekly average"]
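To see what the time-folding eval is doing, here is a minimal, hypothetical sketch (not part of the search above) that runs on its own with makeresults. It takes a timestamp from three days ago at 10:00 and maps it onto today's 10:00, which is how every event from the past week gets folded onto a single 24-hour axis:

| makeresults
| eval _time = relative_time(now(), "-3d@d") + 10*3600
| eval folded = ((_time - relative_time(now(), "-7d@d")) % 86400) + relative_time(now(), "@d")
| eval original_time = strftime(_time, "%F %T"), folded_time = strftime(folded, "%F %T")
| table original_time folded_time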

Now you just have to glue this right before the final | chart in the day_over_day search in the referenced post, and you should have your answer.

Putting it all together:

<data> earliest=-1d@d latest=@h
| timechart span=15m count as metric
| addinfo
| eval marker = if(_time < info_min_time + 86400, "yesterday", "today")
| eval _time = if(_time < info_min_time + 86400, _time + 86400, _time)
| append [search <data> earliest=-7d@d latest=@d
         | eval _time = ((_time - relative_time(now(), "-7d@d")) % 86400) + relative_time(now(), "@d") 
         | bin _time span=15m
         | chart count by _time
         | eval metric = count/7
         | eval marker = "weekly average"]
| chart median(metric) by _time marker
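
The final chart should show three series keyed by marker: today's counts, yesterday's counts (shifted forward 24 hours so they overlay today), and the weekly average folded onto the same 24-hour axis.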