Splunk Search

## How to find a rolling average for the last 7 days and fill the time gaps where there is no event in the last 7 days

Explorer

Hello,

I have a requirement to find the rolling average and % variance as described below. If there is no event for a given date, we should show 0 events for that missing date so that the dates in our report are continuous.

The "7d Rolling average Daily Event Count" column is the average count of events ingested each day for the last 7 days, NOT including today (yesterday through the previous 6 days).

"Variance" is the difference between today's event count and the 7d rolling average (today's event count minus the 7d rolling average event count).

"% Variance" is the percentage difference between today's event count and the 7d rolling average (Variance divided by the 7d rolling average).

"Average Daily Variance" is the absolute value of the 7d rolling average of the % Variance values, not including today (yesterday through the previous 6 days).
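Assuming the events are already reduced to one row per data source per day and sorted oldest first, the definitions above translate roughly into this SPL sketch (the field names event_count and data_source are placeholders, not from any actual search):

``````| sort 0 data_source _time
| streamstats window=7 current=f avg(event_count) as rolling_avg_7d by data_source
| eval variance=event_count-rolling_avg_7d
| eval pct_variance=round(variance/rolling_avg_7d*100)
| streamstats window=7 current=f avg(pct_variance) as avg_daily_variance by data_source
| eval avg_daily_variance=abs(round(avg_daily_variance))``````

Note that `current=f` excludes the current row from each window, which is what gives "yesterday through the previous 6 days".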

Example:

| data source | Last event time | Event Count | 7d rolling average event count | Variance | % Variance | average Daily Variance |
|---|---|---|---|---|---|---|
| test | 9/3/2021 | 2957 | 2060 | 897 | 44% | 24% |
| test | 9/2/2021 | 1438 | 2064 | -626 | -30% | 24% |
| test | 9/1/2021 | 2906 | 2055 | 851 | 41% | 23% |
| test | 8/31/2021 | 2753 | 2036 | 718 | 35% | 22% |
| test | 8/30/2021 | 2131 | 2036 | 95 | 5% | 22% |
| test | 8/29/2021 | 2235 | 2010 | 225 | 11% | 23% |
| test | 8/28/2021 | 3126 | 1961 | 1165 | 59% | 21% |
| test | 8/27/2021 | 2785 | 1931 | 854 | 44% | 20% |
| test | 8/26/2021 | 1331 | 1939 | -608 | -31% | 20% |
| test | 8/25/2021 | 1685 | 1950 | -265 | -14% | 20% |
| test | 8/24/2021 | 1426 | 1984 | -558 | -28% | 20% |
| test | 8/23/2021 | 1939 | 1965 | -26 | -1% | 21% |
| test | 8/22/2021 | 2467 | 1966 | 501 | 25% | 20% |
| test | 8/21/2021 | 1482 | 2010 | -528 | -26% | 20% |
| test | 8/20/2021 | 2026 | 2016 | 10 | 0% | 20% |

Thanks for your help in advance.

Labels: tstats

1 Solution
SplunkTrust

Concatenate the count and time, then split them up again after the untable.

``````| makeresults
| eval _raw="max_time_each_day	data_source	today_count
Sep 15,2021 07:25:01 AM EDT	ABC	14503
Sep 14,2021 23:59:51 PM EDT	ABC	51570
Sep 13,2021 23:59:57 PM EDT	ABC	56331
Sep 12,2021 23:59:59 PM EDT	ABC	55717
Sep 11,2021 23:59:51 PM EDT	ABC	54480
Sep 10,2021 23:59:49 PM EDT	ABC	65367
Sep 09,2021 23:59:59 PM EDT	ABC	61999
Sep 08,2021 23:59:57 PM EDT	ABC	55405
Sep 07,2021 23:59:51 PM EDT	ABC	62327
Sep 06,2021 23:59:48 PM EDT	ABC	54137
Sep 05,2021 23:59:56 PM EDT	ABC	49224
Sep 04,2021 23:59:54 PM EDT	ABC	47783
Sep 03,2021 23:59:52 PM EDT	ABC	52699
Sep 02,2021 23:59:53 PM EDT	ABC	70145
Sep 01,2021 23:59:57 PM EDT	ABC	79071
Sep 14,2021 10:05:16 AM EDT	XYZ	21
Sep 13,2021 10:32:58 AM EDT	XYZ	23
Sep 10,2021 11:30:07 AM EDT	XYZ	22
Sep 09,2021 09:51:28 AM EDT	XYZ	19
Sep 08,2021 09:56:16 AM EDT	XYZ	19
Sep 05,2021 04:32:44 AM EDT	XYZ	19
Sep 02,2021 10:03:06 AM EDT	XYZ	19
Sep 01,2021 04:32:54 AM EDT	XYZ	19
Sep 15,2021 04:32:00 AM EDT	PQR	229
Sep 14,2021 04:31:59 AM EDT	PQR	268
Sep 13,2021 04:32:03 AM EDT	PQR	302
Sep 12,2021 04:31:59 AM EDT	PQR	302
Sep 11,2021 04:32:15 AM EDT	PQR	297
Sep 10,2021 04:32:00 AM EDT	PQR	305
Sep 09,2021 04:32:04 AM EDT	PQR	267
Sep 08,2021 04:32:02 AM EDT	PQR	267
Sep 07,2021 04:32:12 AM EDT	PQR	305
Sep 06,2021 04:32:01 AM EDT	PQR	305
Sep 05,2021 04:31:53 AM EDT	PQR	195
Sep 04,2021 04:31:52 AM EDT	PQR	157
Sep 03,2021 04:32:01 AM EDT	PQR	267
Sep 02,2021 04:31:59 AM EDT	PQR	157
Sep 01,2021 04:32:53 AM EDT	PQR	305
Sep 14,2021 10:05:15 AM EDT	DST	103
Sep 13,2021 10:33:00 AM EDT	DST	109
Sep 10,2021 11:30:07 AM EDT	DST	106
Sep 09,2021 04:31:55 AM EDT	DST	105
Sep 08,2021 09:51:06 AM EDT	DST	36
Sep 07,2021 15:44:18 PM EDT	DST	71
Sep 02,2021 04:31:59 AM EDT	DST	105
Sep 01,2021 16:44:02 PM EDT	DST	105"
| multikv forceheader=1
| table max* data* today*

| eval day=strptime(max_time_each_day,"%b %d,%Y")
| fieldformat day=strftime(day,"%Y/%m/%d")
| eval today_count=today_count."!".max_time_each_day
| xyseries day data_source today_count
| makecontinuous day
| fillnull value=0
| untable day data_source today_count
| eval today_count=split(today_count,"!")
| eval max_time_each_day=mvindex(today_count,1)
| eval today_count=mvindex(today_count,0)
| eval max_time_each_day=if(isnull(max_time_each_day),strftime(day,"%b %d,%Y"),max_time_each_day)
| fields - day``````
SplunkTrust

Without some idea of your data, it's not easy to give you a precise answer; however, this simulated query will produce some random data over the last month and then create a table like yours. It is streamstats that calculates the rolling averages for you.

``````| makeresults
| eval row=mvrange(1,20000)
| mvexpand row
| eval _time=_time-((random() % 30 + 1) * 86400)
| timechart span=1d count
| eval skip_day=relative_time(now(), "-9d@d")
| eval count=if(_time=skip_day, 0, count)
| fields - skip_day
| streamstats window=7 current=f avg(count) as avg7day
| eval variance=round(count-avg7day), varperc=round(variance/avg7day*100)
| streamstats window=7 current=f avg(varperc) as avgvarperc
| eval avgvarperc=round(avgvarperc), avg7day=round(avg7day)
| table _time count avg7day variance varperc avgvarperc
| rename count as "Event Count", avg7day as "7 Day Rolling Average", variance as Variance, varperc as "% Variance", avgvarperc as "Average Daily Variance"``````

I have introduced a 0-value day so you can see that it has no impact on the final table. I wasn't sure whether the average daily variance calculation is correct based on your description, but hopefully this gives you enough to get what you want.

Explorer

Hello,

Thanks for your prompt reply, but the query you shared is not giving the expected result.
I've achieved the 7d rolling average and variance, but now I have to fill the gaps in the "Last Event Time" column, i.e. I want to show the missing dates with a 0 event count. Could you please suggest how to fill that gap?

| Data Source | Last Event Time | Event Count | 7d rolling average event count | Variance | % Variance |
|---|---|---|---|---|---|
| data_source1 | Sep 13,2021 10:32:58 AM EDT | 23 | 20 | 3 | 15 |
| data_source1 | Sep 10,2021 11:30:07 AM EDT | 22 | 17 | 5 | 30 |
| data_source1 | Sep 09,2021 09:51:28 AM EDT | 19 | 14 | 5 | 36 |
| data_source1 | Sep 08,2021 09:56:16 AM EDT | 19 | 11 | 8 | 73 |
| data_source1 | Sep 05,2021 04:32:44 AM EDT | 19 | 9 | 10 | 112 |
| data_source1 | Sep 02,2021 10:03:06 AM EDT | 19 | 6 | 13 | 217 |
| data_source1 | Sep 01,2021 04:32:54 AM EDT | 19 | 3 | 16 | 534 |
| data_source1 | Aug 31,2021 10:13:22 AM EDT | 19 | 0 | 19 | 0 |
SplunkTrust
``````| makeresults
| eval _raw="Data Source	Last Event Time	Event Count	7d rolling average event count	Variance	Percent_Variance
data_source1	Sep 13,2021 10:32:58 AM EDT	23	20	3	15
data_source1	Sep 10,2021 11:30:07 AM EDT	22	17	5	30
data_source1	Sep 09,2021 09:51:28 AM EDT	19	14	5	36
data_source1	Sep 08,2021 09:56:16 AM EDT	19	11	8	73
data_source1	Sep 05,2021 04:32:44 AM EDT	19	9	10	112
data_source1	Sep 02,2021 10:03:06 AM EDT	19	6	13	217
data_source1	Sep 01,2021 04:32:54 AM EDT	19	3	16	534
data_source1	Aug 31,2021 10:13:22 AM EDT	19	0	19	0"
| multikv forceheader=1
| fields - _* linecount
| table Data_Source Last_Event_Time Event_Count * Percent_Variance

| eval day=strptime(Last_Event_Time,"%b %d,%Y")
| fieldformat day=strftime(day,"%Y/%m/%d")
| makecontinuous day
| fillnull value=0 Event_Count
| filldown Data_Source``````
SplunkTrust

Please share your query

Explorer

Hello,

Thanks for your response. My query is below:

``````| tstats count as per_day_count latest(_time) as LTime_per_day where (index=xxxx earliest=-14d@d latest=now) groupby source _time
| bin span=1d _time
| lookup data_sources.csv source OUTPUTNEW data_source, Monitor
| search Monitor=Yes
| eval Last_event_time_per_day=strftime(LTime_per_day,"%b %d,%Y %H:%M:%S %p %Z")
| join type=outer data_source
    [ tstats count latest(_time) as LTime values(sourcetype) as sourcetype where ((index=yyyy* OR (index=zzzz sourcetype=*amp*)) earliest=-d@d latest=now) groupby source _time
    | bin span=1d _time
    | lookup data_sources.csv source OUTPUTNEW data_source Monitor
    | eval Last_event_time=strftime(LTime,"%b %d,%Y %H:%M:%S %p %Z")
    | eval status=if(count > 0, "Data Received", "NO Data Received")
    ]
| eval Last_event_time=strftime(LTime,"%b %d,%Y %H:%M:%S %p %Z")
| table data_source sourcetype count Last_event_time Last_event_time_per_day status per_day_count _time
| fillnull value="NO Data Received" status
| sort - status
| rename count as "Event Count"
| sort + sourcetype
| fillnull value="-"
| streamstats window=7 current=f list(per_day_count) as count1 by data_source
| stats latest(per_day_count) as today_count, max(Last_event_time_per_day) as max_time_each_day by _time data_source
| sort data_source, -_time``````

Below is a sample subset returned by the above query:

| max_time_each_day | data_source | today_count |
|---|---|---|
| Sep 15,2021 07:25:01 AM EDT | ABC | 14503 |
| Sep 14,2021 23:59:51 PM EDT | ABC | 51570 |
| Sep 13,2021 23:59:57 PM EDT | ABC | 56331 |
| Sep 12,2021 23:59:59 PM EDT | ABC | 55717 |
| Sep 11,2021 23:59:51 PM EDT | ABC | 54480 |
| Sep 10,2021 23:59:49 PM EDT | ABC | 65367 |
| Sep 09,2021 23:59:59 PM EDT | ABC | 61999 |
| Sep 08,2021 23:59:57 PM EDT | ABC | 55405 |
| Sep 07,2021 23:59:51 PM EDT | ABC | 62327 |
| Sep 06,2021 23:59:48 PM EDT | ABC | 54137 |
| Sep 05,2021 23:59:56 PM EDT | ABC | 49224 |
| Sep 04,2021 23:59:54 PM EDT | ABC | 47783 |
| Sep 03,2021 23:59:52 PM EDT | ABC | 52699 |
| Sep 02,2021 23:59:53 PM EDT | ABC | 70145 |
| Sep 01,2021 23:59:57 PM EDT | ABC | 79071 |
| Sep 14,2021 10:05:16 AM EDT | XYZ | 21 |
| Sep 13,2021 10:32:58 AM EDT | XYZ | 23 |
| Sep 10,2021 11:30:07 AM EDT | XYZ | 22 |
| Sep 09,2021 09:51:28 AM EDT | XYZ | 19 |
| Sep 08,2021 09:56:16 AM EDT | XYZ | 19 |
| Sep 05,2021 04:32:44 AM EDT | XYZ | 19 |
| Sep 02,2021 10:03:06 AM EDT | XYZ | 19 |
| Sep 01,2021 04:32:54 AM EDT | XYZ | 19 |
| Sep 15,2021 04:32:00 AM EDT | PQR | 229 |
| Sep 14,2021 04:31:59 AM EDT | PQR | 268 |
| Sep 13,2021 04:32:03 AM EDT | PQR | 302 |
| Sep 12,2021 04:31:59 AM EDT | PQR | 302 |
| Sep 11,2021 04:32:15 AM EDT | PQR | 297 |
| Sep 10,2021 04:32:00 AM EDT | PQR | 305 |
| Sep 09,2021 04:32:04 AM EDT | PQR | 267 |
| Sep 08,2021 04:32:02 AM EDT | PQR | 267 |
| Sep 07,2021 04:32:12 AM EDT | PQR | 305 |
| Sep 06,2021 04:32:01 AM EDT | PQR | 305 |
| Sep 05,2021 04:31:53 AM EDT | PQR | 195 |
| Sep 04,2021 04:31:52 AM EDT | PQR | 157 |
| Sep 03,2021 04:32:01 AM EDT | PQR | 267 |
| Sep 02,2021 04:31:59 AM EDT | PQR | 157 |
| Sep 01,2021 04:32:53 AM EDT | PQR | 305 |
| Sep 14,2021 10:05:15 AM EDT | DST | 103 |
| Sep 13,2021 10:33:00 AM EDT | DST | 109 |
| Sep 10,2021 11:30:07 AM EDT | DST | 106 |
| Sep 09,2021 04:31:55 AM EDT | DST | 105 |
| Sep 08,2021 09:51:06 AM EDT | DST | 36 |
| Sep 07,2021 15:44:18 PM EDT | DST | 71 |
| Sep 02,2021 04:31:59 AM EDT | DST | 105 |
| Sep 01,2021 16:44:02 PM EDT | DST | 105 |

My requirement is to fill the time gaps for data sources "XYZ" and "DST", and for those gaps the two data sources should have today_count=0.

Thanks in advance.

Regards,

SplunkTrust

Oh dear... I have come up with a horrible piece of SPL to get from your final table to the result you want - but I have to believe there is another way 😁

``````| makeresults
| eval _raw="max_time_each_day	data_source	today_count
Sep 15,2021 07:25:01 AM EDT	ABC	14503
Sep 14,2021 23:59:51 PM EDT	ABC	51570
Sep 13,2021 23:59:57 PM EDT	ABC	56331
Sep 12,2021 23:59:59 PM EDT	ABC	55717
Sep 11,2021 23:59:51 PM EDT	ABC	54480
Sep 10,2021 23:59:49 PM EDT	ABC	65367
Sep 09,2021 23:59:59 PM EDT	ABC	61999
Sep 08,2021 23:59:57 PM EDT	ABC	55405
Sep 07,2021 23:59:51 PM EDT	ABC	62327
Sep 06,2021 23:59:48 PM EDT	ABC	54137
Sep 05,2021 23:59:56 PM EDT	ABC	49224
Sep 04,2021 23:59:54 PM EDT	ABC	47783
Sep 03,2021 23:59:52 PM EDT	ABC	52699
Sep 02,2021 23:59:53 PM EDT	ABC	70145
Sep 01,2021 23:59:57 PM EDT	ABC	79071
Sep 14,2021 10:05:16 AM EDT	XYZ	21
Sep 13,2021 10:32:58 AM EDT	XYZ	23
Sep 10,2021 11:30:07 AM EDT	XYZ	22
Sep 09,2021 09:51:28 AM EDT	XYZ	19
Sep 08,2021 09:56:16 AM EDT	XYZ	19
Sep 05,2021 04:32:44 AM EDT	XYZ	19
Sep 02,2021 10:03:06 AM EDT	XYZ	19
Sep 01,2021 04:32:54 AM EDT	XYZ	19
Sep 15,2021 04:32:00 AM EDT	PQR	229
Sep 14,2021 04:31:59 AM EDT	PQR	268
Sep 13,2021 04:32:03 AM EDT	PQR	302
Sep 12,2021 04:31:59 AM EDT	PQR	302
Sep 11,2021 04:32:15 AM EDT	PQR	297
Sep 10,2021 04:32:00 AM EDT	PQR	305
Sep 09,2021 04:32:04 AM EDT	PQR	267
Sep 08,2021 04:32:02 AM EDT	PQR	267
Sep 07,2021 04:32:12 AM EDT	PQR	305
Sep 06,2021 04:32:01 AM EDT	PQR	305
Sep 05,2021 04:31:53 AM EDT	PQR	195
Sep 04,2021 04:31:52 AM EDT	PQR	157
Sep 03,2021 04:32:01 AM EDT	PQR	267
Sep 02,2021 04:31:59 AM EDT	PQR	157
Sep 01,2021 04:32:53 AM EDT	PQR	305
Sep 14,2021 10:05:15 AM EDT	DST	103
Sep 13,2021 10:33:00 AM EDT	DST	109
Sep 10,2021 11:30:07 AM EDT	DST	106
Sep 09,2021 04:31:55 AM EDT	DST	105
Sep 08,2021 09:51:06 AM EDT	DST	36
Sep 07,2021 15:44:18 PM EDT	DST	71
Sep 02,2021 04:31:59 AM EDT	DST	105
Sep 01,2021 16:44:02 PM EDT	DST	105"
| multikv forceheader=1
| table max* data* today*
| eval _time=strptime(max_time_each_day, "%b %d,%Y"), first_day=relative_time(now(), "@d")
| eval day=round(first_day-_time)
| eval d=mvrange(86400,(16*86400),86400)
| eventstats values(day) as day by data_source
| eval missing=mvmap(d,if(isnull(mvfind(day,d)),d,null()))
| streamstats c by data_source
| fields - d day
| eval missing=case(c=1 AND mvcount(missing)>0,mvappend(missing,_time),c=1 AND isnull(missing),_time, 1==1, _time)
| mvexpand missing
| eval t2=first_day-(missing)
| eval today_count=if(_time!=missing,0,today_count), max_time_each_day=if(_time!=missing,strftime(t2,"%b %d,%Y %H:%M:%S %p %Z"), max_time_each_day), _time=if(_time!=missing,t2,_time)
| fields - missing t2 first_day
| sort data_source -_time``````

In essence, what this does is find the missing dates by:

• Working out which day each row relates to (day)
• Using mvrange to create an MV field (d) of the 15-day range of expected values
• Getting the first entry for each data source (streamstats c)
• Using mvmap (Splunk 8.0+) to create another MV field of the missing days (missing)
• Making sure that field also contains info for the CURRENT row
• Expanding the missing field to create the extra rows
• Then setting up the values of the empty entries, making sure the original row data is not overwritten (_time!=missing)

Bit of a hack, but I'm not sure makecontinuous can be used with the grouping. Another way to create the rows is to use an append or join to build the date-range rows and then aggregate back, but that would mean repeating the search in the subsearch to find the data sources that exist, and it would still probably involve some mvexpand.

@ITWhisperer I am sure you can find a better way to do this...

SplunkTrust

@bowesmana Thanks for the vote of confidence 😀

@mnj1809 If you are not too concerned about the actual times (which I guess you aren't, since you use bin span=1d), you could try something like this:

``````| makeresults
| eval _raw="max_time_each_day	data_source	today_count
Sep 15,2021 07:25:01 AM EDT	ABC	14503
Sep 14,2021 23:59:51 PM EDT	ABC	51570
Sep 13,2021 23:59:57 PM EDT	ABC	56331
Sep 12,2021 23:59:59 PM EDT	ABC	55717
Sep 11,2021 23:59:51 PM EDT	ABC	54480
Sep 10,2021 23:59:49 PM EDT	ABC	65367
Sep 09,2021 23:59:59 PM EDT	ABC	61999
Sep 08,2021 23:59:57 PM EDT	ABC	55405
Sep 07,2021 23:59:51 PM EDT	ABC	62327
Sep 06,2021 23:59:48 PM EDT	ABC	54137
Sep 05,2021 23:59:56 PM EDT	ABC	49224
Sep 04,2021 23:59:54 PM EDT	ABC	47783
Sep 03,2021 23:59:52 PM EDT	ABC	52699
Sep 02,2021 23:59:53 PM EDT	ABC	70145
Sep 01,2021 23:59:57 PM EDT	ABC	79071
Sep 14,2021 10:05:16 AM EDT	XYZ	21
Sep 13,2021 10:32:58 AM EDT	XYZ	23
Sep 10,2021 11:30:07 AM EDT	XYZ	22
Sep 09,2021 09:51:28 AM EDT	XYZ	19
Sep 08,2021 09:56:16 AM EDT	XYZ	19
Sep 05,2021 04:32:44 AM EDT	XYZ	19
Sep 02,2021 10:03:06 AM EDT	XYZ	19
Sep 01,2021 04:32:54 AM EDT	XYZ	19
Sep 15,2021 04:32:00 AM EDT	PQR	229
Sep 14,2021 04:31:59 AM EDT	PQR	268
Sep 13,2021 04:32:03 AM EDT	PQR	302
Sep 12,2021 04:31:59 AM EDT	PQR	302
Sep 11,2021 04:32:15 AM EDT	PQR	297
Sep 10,2021 04:32:00 AM EDT	PQR	305
Sep 09,2021 04:32:04 AM EDT	PQR	267
Sep 08,2021 04:32:02 AM EDT	PQR	267
Sep 07,2021 04:32:12 AM EDT	PQR	305
Sep 06,2021 04:32:01 AM EDT	PQR	305
Sep 05,2021 04:31:53 AM EDT	PQR	195
Sep 04,2021 04:31:52 AM EDT	PQR	157
Sep 03,2021 04:32:01 AM EDT	PQR	267
Sep 02,2021 04:31:59 AM EDT	PQR	157
Sep 01,2021 04:32:53 AM EDT	PQR	305
Sep 14,2021 10:05:15 AM EDT	DST	103
Sep 13,2021 10:33:00 AM EDT	DST	109
Sep 10,2021 11:30:07 AM EDT	DST	106
Sep 09,2021 04:31:55 AM EDT	DST	105
Sep 08,2021 09:51:06 AM EDT	DST	36
Sep 07,2021 15:44:18 PM EDT	DST	71
Sep 02,2021 04:31:59 AM EDT	DST	105
Sep 01,2021 16:44:02 PM EDT	DST	105"
| multikv forceheader=1
| table max* data* today*

| eval day=strptime(max_time_each_day,"%b %d,%Y")
| fieldformat day=strftime(day,"%Y/%m/%d")
| xyseries day data_source today_count
| makecontinuous day
| fillnull value=0
| untable day data_source today_count``````
SplunkTrust

Knew it!

xyseries/makecontinuous/untable - neat - solves the issue of the split clause. Good technique

Explorer

Thanks for making my day, Champ 🙂... Basically I am concerned about the actual times, so your previous solution works perfectly. Only one last thing is left: if I execute the SPL query today and a data source has not sent data today, I need event_count=0 for today as well, and as soon as I get data for that data source, event_count should show the actual count.

SplunkTrust

Concatenate the count and time, then split them up again after the untable - see the accepted solution at the top of this thread.
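One caveat on the "today_count=0 until data arrives" request: makecontinuous only fills gaps between the earliest and latest day present in the result set, so if no data source has reported yet today, no row is generated for today at all. An untested sketch of one way to handle this, building on the accepted xyseries/makecontinuous solution, is to append a dummy "0!" row per data source for today (after the `eval today_count=today_count."!".max_time_each_day` step and before the xyseries):

``````| appendpipe
    [| stats count by data_source
    | eval day=relative_time(now(),"@d")
    | eval today_count="0!".strftime(day,"%b %d,%Y")
    | table day data_source today_count ]
| stats max(today_count) as today_count by day data_source``````

The `stats max` works here only because the concatenated string for any real count starts with a digit 1-9 and so sorts above the dummy's leading "0"; once real data for a source arrives today, its row wins and the dummy disappears.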
Explorer

@ITWhisperer Thank you so much for your efforts. 🙂
