
Charting / graphing multiple data sources on one chart

splunkedout
Explorer

Has anyone experimented with showing statistics for the same time slot over multiple time periods?

For example, imagine a chart that shows the number of transactions over a 24-hour period as a line graph. Now imagine a second line (of a different color) that shows the transactions for the same 24 hours from yesterday, and another that shows a weekly average of transactions over the same 24-hour window.

I appreciate any insights.

1 Solution

carasso
Splunk Employee

Comparing week-over-week results used to be a pain in Splunk, requiring complex date calculations. No more. Now there is a better way.

I wrote a convenient search command called "timewrap" that does it all, for arbitrary time periods.

... | timechart count span=1d | timewrap w

That's it!

http://apps.splunk.com/app/1645/
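
For example, a week-over-week view of daily transaction counts might look like the sketch below. The index, sourcetype, and time range are placeholders for your own data, not something from the original post; the timechart/timewrap pipeline is exactly the one shown above.

index=web sourcetype=access_combined earliest=-4w@w latest=now
| timechart count span=1d
| timewrap w

Each wrapped week becomes its own series, so the chart draws one line per week over a shared time axis.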



splunkedout
Explorer

This is very insightful. There are lots of neat "Splunk moves" here that I didn't know existed. I appreciate you taking the time to write this up.


Stephen_Sorkin
Splunk Employee

For the simple day-over-day case, the recipe for this solution is pretty well covered in:

http://answers.splunk.com/questions/2712/line-chart-comparing-yesterdays-result-with-todays-result-i...
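
For readers who can't follow the link, the core of that day-over-day recipe looks roughly like this (a sketch using the same <data> placeholder as the full search further down):

<data> earliest=-1d@d latest=@h
| timechart span=15m count as metric
| addinfo
| eval marker = if(_time < info_min_time + 86400, "yesterday", "today")
| eval _time = if(_time < info_min_time + 86400, _time + 86400, _time)
| chart median(metric) by _time marker

The addinfo command exposes the search's time bounds as info_min_time, which is used first to label each bucket as "yesterday" or "today" and then to shift yesterday's buckets forward by 86400 seconds so both days line up on the same axis.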

To add a series for the weekly average (the average at each time bucket over the previous seven days), the search is significantly more involved. The first task is to | append [...] a set of results that compute that average. The details of the inner search depend on which aggregate function you're using; in this case you're looking at count, so it's not too bad:

... | append [search earliest=-7d@d latest=@d ... | eval _time = ((_time - relative_time(now(), "-7d@d")) % 86400) + relative_time(now(), "@d") | bin _time span=15m | chart count by _time | eval metric = count/7 | eval marker = "weekly average"]
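
The eval in that subsearch folds all seven days onto a single day: subtracting relative_time(now(), "-7d@d") gives seconds since the start of the 7-day window, the modulo 86400 keeps only the time-of-day offset, and adding relative_time(now(), "@d") re-anchors it to today's midnight. A minimal sketch that demonstrates the folding on one synthetic timestamp (makeresults and the 09:00-three-days-ago offset are just for illustration):

| makeresults
| eval _time = relative_time(now(), "-3d@d") + 9*3600
| eval folded = ((_time - relative_time(now(), "-7d@d")) % 86400) + relative_time(now(), "@d")
| eval original = strftime(_time, "%F %T"), wrapped = strftime(folded, "%F %T")
| table original wrapped

The wrapped column should show today's date at 09:00, which is why chart count by _time in the subsearch piles all seven days into the same set of buckets, and dividing by 7 turns each bucket's total into a per-day average.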

Now you just have to glue this right before the final | chart in the day_over_day search in the referenced post, and you should have your answer.

Putting it all together:

<data> earliest=-1d@d latest=@h
| timechart span=15m count as metric
| addinfo
| eval marker = if(_time < info_min_time + 86400, "yesterday", "today")
| eval _time = if(_time < info_min_time + 86400, _time + 86400, _time)
| append [search <data> earliest=-7d@d latest=@d eventtype=download 
         | eval _time = ((_time - relative_time(now(), "-7d@d")) % 86400) + relative_time(now(), "@d") 
         | bin _time span=15m
         | chart count by _time
         | eval metric = count/7
         | eval marker = "weekly average"]
| chart median(metric) by _time marker
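
Viewed as a line chart, this yields three series plotted against time of day: today, yesterday, and the weekly average. For concreteness, here is the same recipe with a hypothetical web-access source substituted for the <data> placeholder (index=web and sourcetype=access_combined are assumptions, not part of the original answer):

index=web sourcetype=access_combined earliest=-1d@d latest=@h
| timechart span=15m count as metric
| addinfo
| eval marker = if(_time < info_min_time + 86400, "yesterday", "today")
| eval _time = if(_time < info_min_time + 86400, _time + 86400, _time)
| append [search index=web sourcetype=access_combined earliest=-7d@d latest=@d
         | eval _time = ((_time - relative_time(now(), "-7d@d")) % 86400) + relative_time(now(), "@d")
         | bin _time span=15m
         | chart count by _time
         | eval metric = count/7
         | eval marker = "weekly average"]
| chart median(metric) by _time marker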