Splunk Search

Daily average for the past month by field

vonAnden
Explorer

Hey

I have created a graph displaying earned fees with the following code:

Type=trade-confirmed
| rename Properties.DailyStats{}.ConsumptionDay AS ConsumptionDay, Properties.DailyStats{}.DailyFee AS DailyFee
| eval y=mvzip(DailyFee, ConsumptionDay)
| mvexpand y
| eval y=split(y, ",")
| eval DailyFee=mvindex(y,0)
| eval ConsumptionDay=mvindex(y,1)
| eval _time=strptime('ConsumptionDay', "%Y-%m-%dT%H:%M:%S.%N")
| timechart sum(DailyFee) as "Total fee" span=1d

Now I want to make a graph showing the average fee, meaning that each date on the x-axis should show the average fee for that specific day and the 30 days before it.
So if today is the 22nd of September, the 22nd of September on the x-axis should show the average for the past month (back to the 22nd of August), for example:

DATA:
date -> fees
22 Sep -> 50
14 Sep -> 200
22 Aug -> 70

Graph -> values:
22nd of September -> 320/30 = 10.66
23rd of September -> 250/30 = 8.33

I have tried this code:
| timechart sum(DailyFee) as "Total fee" span=1d | timechart span=1mon mean("Total fee")
but it only returns one result for each month, not one for each day, and the result doesn't seem right anyway.

So I hope someone can help me get it right 🙂

vonAnden
Explorer

I have almost solved the problem. The 30-day running average only becomes correct once a full month has passed since the first event. For example, when Splunk calculates the average 1 day after the first event, the average only covers 2 days instead of 30, because the dates in _time start from the date of the first event.

The solution so far:

| timechart sum(DailyFee) as "Total fee" span=1d | fillnull value=0 "Total fee" | streamstats avg("Total fee") as "Running 30 days" window=30
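
If the partial averages during those first 29 days are unwanted, one possible refinement (a sketch only, where day_num is just an illustrative helper field) is to count the daily rows and keep only the dates for which a full 30-day window exists:

| timechart sum(DailyFee) as "Total fee" span=1d
| fillnull value=0 "Total fee"
| streamstats count as day_num
| streamstats avg("Total fee") as "Running 30 days" window=30
| where day_num >= 30
| fields - day_num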

yannK
Splunk Employee

Great, I didn't understand that you wanted a rolling 30 days rather than a monthly average.
streamstats is the right way to do it.

vonAnden
Explorer

Yes, that is my mistake, I didn't know the right word for it before. So I tried to describe it, but apparently not well enough 🙂

vonAnden
Explorer

When I only use the following:

| timechart sum(DailyFee) as "Total fee" span=1d | timechart span=1mon mean("Total fee")

I get these results in statistics:
2014-03-11 -> value (incorrect average)
2014-04-10 -> value (incorrect average)
2014-05-11 -> value (incorrect average)
2014-06-09 -> value (incorrect average)
2014-07-09 -> value (incorrect average)
2014-08-08 -> value (incorrect average)
2014-09-07 -> value (incorrect average)

First of all, the first event I have is from 2014-04-09, so I don't understand why it calculates anything for March, or why it picks these particular dates. Second, the averages are incorrect, the results don't make sense.

If I use "makecontinuous _time span=1d" I get every date from 2014-03-11, so getting all the dates seems to work with that command, but I still only get values on the dates listed above.

So, any ideas how to calculate the average for each day?

I think it has to be some kind of dynamic command, because each day needs to average the values from the 30 days before up to that specific day.
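
One direction that might work, along the lines of the streamstats solution above (a sketch, with illustrative output field names): build a continuous daily series first and then apply a 30-day window:

| timechart sum(DailyFee) as "Total fee" span=1d
| makecontinuous _time span=1d
| fillnull value=0 "Total fee"
| streamstats avg("Total fee") as "Average last 30 days" window=30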

yannK
Splunk Employee
Splunk Employee

Your problem is that you seem to have values for only some days, not a value for every day.

You can try to use

But I have no working example; it may require some eval comparison magic to make the "fee" values continuous.
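
A sketch of one way to make the daily series continuous, assuming days without events should count as 0 towards the average (both commands are used elsewhere in this thread):

| timechart sum(DailyFee) as "Total fee" span=1d
| makecontinuous _time span=1d
| fillnull value=0 "Total fee"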

somesoni2
Revered Legend

Try with span=30d instead of 1mon.
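
Applied to the query from the question, that suggestion would look something like:

| timechart sum(DailyFee) as "Total fee" span=1d
| timechart span=30d mean("Total fee")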
