<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: How to find rolling average for last 7 days and fill the time gaps where there is no event in last 7 days in Splunk Search</title>
    <link>https://community.splunk.com/t5/Splunk-Search/How-to-find-rolling-average-for-last7-days-and-fill-the-time/m-p/567575#M197814</link>
    <description>&lt;P&gt;&lt;a href="https://community.splunk.com/t5/user/viewprofilepage/user-id/225168"&gt;@ITWhisperer&lt;/a&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Knew it!&lt;/P&gt;&lt;P&gt;xyseries/makecontinuous/untable - neat - solves the issue of the split clause. Good technique&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
    <pubDate>Mon, 20 Sep 2021 03:54:08 GMT</pubDate>
    <dc:creator>bowesmana</dc:creator>
    <dc:date>2021-09-20T03:54:08Z</dc:date>
    <item>
      <title>How to find rolling average for last 7 days and fill the time gaps where there is no event in last 7 days</title>
      <link>https://community.splunk.com/t5/Splunk-Search/How-to-find-rolling-average-for-last7-days-and-fill-the-time/m-p/566899#M197555</link>
      <description>&lt;P&gt;Hello,&lt;/P&gt;&lt;P&gt;I have a requirement to find the rolling average and variance % as per the requirement below. If there is no event for a given date, then we should show 0 events for that missing date so that we have continuous dates in our report.&lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P&gt;The "7d Rolling average Daily Event Count" column is the average count of events ingested each day for the last 7 days NOT including today&amp;nbsp;(yesterday through the previous 6 days).&lt;/P&gt;&lt;P&gt;"Variance" is the difference in &lt;EM&gt;count of events&lt;/EM&gt; between today's event count and the 7d rolling average&amp;nbsp;(today's event count minus the 7d rolling average event count).&lt;/P&gt;&lt;P&gt;"% Variance" is the &lt;EM&gt;percentage&lt;/EM&gt; difference between today's event count and the 7d rolling average (Variance divided by the 7d rolling average).&lt;/P&gt;&lt;P&gt;"average Daily Variance" is the &lt;EM&gt;absolute value&lt;/EM&gt; of the 7d rolling average of the % Variance values, not including today (yesterday through the previous 6 days).&lt;/P&gt;&lt;P&gt;Example:&lt;/P&gt;&lt;TABLE width="400"&gt;&lt;TBODY&gt;&lt;TR&gt;&lt;TD width="124.203px" height="113px"&gt;data source&lt;/TD&gt;&lt;TD width="132.844px" height="113px"&gt;Last event time&lt;/TD&gt;&lt;TD width="54.9062px" height="113px"&gt;Event Count&lt;/TD&gt;&lt;TD width="67.8125px" height="113px"&gt;7d rolling average event count&lt;/TD&gt;&lt;TD width="73.7969px" height="113px"&gt;Variance&lt;/TD&gt;&lt;TD width="73.7969px" height="113px"&gt;% Variance&lt;/TD&gt;&lt;TD width="73.7969px" height="113px"&gt;average Daily Variance&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD width="124.203px" height="25px"&gt;test&lt;/TD&gt;&lt;TD width="132.844px" height="25px"&gt;9/3/2021&lt;/TD&gt;&lt;TD width="54.9062px" height="25px"&gt;2957&lt;/TD&gt;&lt;TD width="67.8125px" height="25px"&gt;2060&lt;/TD&gt;&lt;TD width="73.7969px" height="25px"&gt;897&lt;/TD&gt;&lt;TD width="73.7969px" 
height="25px"&gt;44%&lt;/TD&gt;&lt;TD width="73.7969px" height="25px"&gt;24%&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD width="124.203px" height="25px"&gt;test&lt;/TD&gt;&lt;TD width="132.844px" height="25px"&gt;9/2/2021&lt;/TD&gt;&lt;TD width="54.9062px" height="25px"&gt;1438&lt;/TD&gt;&lt;TD width="67.8125px" height="25px"&gt;2064&lt;/TD&gt;&lt;TD width="73.7969px" height="25px"&gt;-626&lt;/TD&gt;&lt;TD width="73.7969px" height="25px"&gt;-30%&lt;/TD&gt;&lt;TD width="73.7969px" height="25px"&gt;24%&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD width="124.203px" height="25px"&gt;test&lt;/TD&gt;&lt;TD width="132.844px" height="25px"&gt;9/1/2021&lt;/TD&gt;&lt;TD width="54.9062px" height="25px"&gt;2906&lt;/TD&gt;&lt;TD width="67.8125px" height="25px"&gt;2055&lt;/TD&gt;&lt;TD width="73.7969px" height="25px"&gt;851&lt;/TD&gt;&lt;TD width="73.7969px" height="25px"&gt;41%&lt;/TD&gt;&lt;TD width="73.7969px" height="25px"&gt;23%&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD width="124.203px" height="25px"&gt;test&lt;/TD&gt;&lt;TD width="132.844px" height="25px"&gt;8/31/2021&lt;/TD&gt;&lt;TD width="54.9062px" height="25px"&gt;2753&lt;/TD&gt;&lt;TD width="67.8125px" height="25px"&gt;2036&lt;/TD&gt;&lt;TD width="73.7969px" height="25px"&gt;718&lt;/TD&gt;&lt;TD width="73.7969px" height="25px"&gt;35%&lt;/TD&gt;&lt;TD width="73.7969px" height="25px"&gt;22%&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD width="124.203px" height="25px"&gt;test&lt;/TD&gt;&lt;TD width="132.844px" height="25px"&gt;8/30/2021&lt;/TD&gt;&lt;TD width="54.9062px" height="25px"&gt;2131&lt;/TD&gt;&lt;TD width="67.8125px" height="25px"&gt;2036&lt;/TD&gt;&lt;TD width="73.7969px" height="25px"&gt;95&lt;/TD&gt;&lt;TD width="73.7969px" height="25px"&gt;5%&lt;/TD&gt;&lt;TD width="73.7969px" height="25px"&gt;22%&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD width="124.203px" height="25px"&gt;test&lt;/TD&gt;&lt;TD width="132.844px" height="25px"&gt;8/29/2021&lt;/TD&gt;&lt;TD width="54.9062px" height="25px"&gt;2235&lt;/TD&gt;&lt;TD width="67.8125px" 
height="25px"&gt;2010&lt;/TD&gt;&lt;TD width="73.7969px" height="25px"&gt;225&lt;/TD&gt;&lt;TD width="73.7969px" height="25px"&gt;11%&lt;/TD&gt;&lt;TD width="73.7969px" height="25px"&gt;23%&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD width="124.203px" height="25px"&gt;test&lt;/TD&gt;&lt;TD width="132.844px" height="25px"&gt;8/28/2021&lt;/TD&gt;&lt;TD width="54.9062px" height="25px"&gt;3126&lt;/TD&gt;&lt;TD width="67.8125px" height="25px"&gt;1961&lt;/TD&gt;&lt;TD width="73.7969px" height="25px"&gt;1165&lt;/TD&gt;&lt;TD width="73.7969px" height="25px"&gt;59%&lt;/TD&gt;&lt;TD width="73.7969px" height="25px"&gt;21%&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD width="124.203px" height="25px"&gt;test&lt;/TD&gt;&lt;TD width="132.844px" height="25px"&gt;8/27/2021&lt;/TD&gt;&lt;TD width="54.9062px" height="25px"&gt;2785&lt;/TD&gt;&lt;TD width="67.8125px" height="25px"&gt;1931&lt;/TD&gt;&lt;TD width="73.7969px" height="25px"&gt;854&lt;/TD&gt;&lt;TD width="73.7969px" height="25px"&gt;44%&lt;/TD&gt;&lt;TD width="73.7969px" height="25px"&gt;20%&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD width="124.203px" height="25px"&gt;test&lt;/TD&gt;&lt;TD width="132.844px" height="25px"&gt;8/26/2021&lt;/TD&gt;&lt;TD width="54.9062px" height="25px"&gt;1331&lt;/TD&gt;&lt;TD width="67.8125px" height="25px"&gt;1939&lt;/TD&gt;&lt;TD width="73.7969px" height="25px"&gt;-608&lt;/TD&gt;&lt;TD width="73.7969px" height="25px"&gt;-31%&lt;/TD&gt;&lt;TD width="73.7969px" height="25px"&gt;20%&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD width="124.203px" height="25px"&gt;test&lt;/TD&gt;&lt;TD width="132.844px" height="25px"&gt;8/25/2021&lt;/TD&gt;&lt;TD width="54.9062px" height="25px"&gt;1685&lt;/TD&gt;&lt;TD width="67.8125px" height="25px"&gt;1950&lt;/TD&gt;&lt;TD width="73.7969px" height="25px"&gt;-265&lt;/TD&gt;&lt;TD width="73.7969px" height="25px"&gt;-14%&lt;/TD&gt;&lt;TD width="73.7969px" height="25px"&gt;20%&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD width="124.203px" height="25px"&gt;test&lt;/TD&gt;&lt;TD width="132.844px" 
height="25px"&gt;8/24/2021&lt;/TD&gt;&lt;TD width="54.9062px" height="25px"&gt;1426&lt;/TD&gt;&lt;TD width="67.8125px" height="25px"&gt;1984&lt;/TD&gt;&lt;TD width="73.7969px" height="25px"&gt;-558&lt;/TD&gt;&lt;TD width="73.7969px" height="25px"&gt;-28%&lt;/TD&gt;&lt;TD width="73.7969px" height="25px"&gt;20%&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD width="124.203px" height="25px"&gt;test&lt;/TD&gt;&lt;TD width="132.844px" height="25px"&gt;8/23/2021&lt;/TD&gt;&lt;TD width="54.9062px" height="25px"&gt;1939&lt;/TD&gt;&lt;TD width="67.8125px" height="25px"&gt;1965&lt;/TD&gt;&lt;TD width="73.7969px" height="25px"&gt;-26&lt;/TD&gt;&lt;TD width="73.7969px" height="25px"&gt;-1%&lt;/TD&gt;&lt;TD width="73.7969px" height="25px"&gt;21%&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD width="124.203px" height="25px"&gt;test&lt;/TD&gt;&lt;TD width="132.844px" height="25px"&gt;8/22/2021&lt;/TD&gt;&lt;TD width="54.9062px" height="25px"&gt;2467&lt;/TD&gt;&lt;TD width="67.8125px" height="25px"&gt;1966&lt;/TD&gt;&lt;TD width="73.7969px" height="25px"&gt;501&lt;/TD&gt;&lt;TD width="73.7969px" height="25px"&gt;25%&lt;/TD&gt;&lt;TD width="73.7969px" height="25px"&gt;20%&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD width="124.203px" height="25px"&gt;test&lt;/TD&gt;&lt;TD width="132.844px" height="25px"&gt;8/21/2021&lt;/TD&gt;&lt;TD width="54.9062px" height="25px"&gt;1482&lt;/TD&gt;&lt;TD width="67.8125px" height="25px"&gt;2010&lt;/TD&gt;&lt;TD width="73.7969px" height="25px"&gt;-528&lt;/TD&gt;&lt;TD width="73.7969px" height="25px"&gt;-26%&lt;/TD&gt;&lt;TD width="73.7969px" height="25px"&gt;20%&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD width="124.203px" height="25px"&gt;test&lt;/TD&gt;&lt;TD width="132.844px" height="25px"&gt;8/20/2021&lt;/TD&gt;&lt;TD width="54.9062px" height="25px"&gt;2026&lt;/TD&gt;&lt;TD width="67.8125px" height="25px"&gt;2016&lt;/TD&gt;&lt;TD width="73.7969px" height="25px"&gt;10&lt;/TD&gt;&lt;TD width="73.7969px" height="25px"&gt;0%&lt;/TD&gt;&lt;TD width="73.7969px" 
height="25px"&gt;20%&lt;/TD&gt;&lt;/TR&gt;&lt;/TBODY&gt;&lt;/TABLE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thanks for your help in advance.&lt;/P&gt;</description>
      <pubDate>Mon, 13 Sep 2021 17:20:56 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/How-to-find-rolling-average-for-last7-days-and-fill-the-time/m-p/566899#M197555</guid>
      <dc:creator>mnj1809</dc:creator>
      <dc:date>2021-09-13T17:20:56Z</dc:date>
    </item>
    <item>
      <title>Re: How to find rolling average for last 7 days and fill the time gaps where there is no event in last 7 days</title>
      <link>https://community.splunk.com/t5/Splunk-Search/How-to-find-rolling-average-for-last7-days-and-fill-the-time/m-p/566921#M197562</link>
      <description>&lt;P&gt;Without some idea of your data, it's not easy to give you a precise answer; however, the following simulated query produces some random data over the last month and then creates a table like yours. It is the use of streamstats that calculates the rolling averages for you.&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;| makeresults
| eval row=mvrange(1,20000)
| mvexpand row
| eval _time=_time-((random() % 30 + 1) * 86400)
| timechart span=1d count
| eval skip_day=relative_time(now(), "-9d@d")
| eval count=if(_time=skip_day, 0, count)
| fields - skip_day
| streamstats window=8 current=f avg(count) as avg7day
| eval variance=round(count-avg7day), varperc=round(variance/avg7day*100)
| streamstats window=8 current=f avg(varperc) as avgvarperc
| eval avgvarperc=round(avgvarperc), avg7day=round(avg7day)
| table _time count avg7day variance varperc avgvarperc
| rename count as "Event Count", avg7day as "7 Day Rolling Average", variance as Variance, varperc as "% Variance", avgvarperc as "Average Daily Variance"&lt;/LI-CODE&gt;&lt;P&gt;I have introduced a 0-value day so you can see that it has no impact on the final table. I wasn't sure whether the average daily variance calculation is correct based on your description, but hopefully this gives you enough to get what you want.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Mon, 13 Sep 2021 22:59:41 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/How-to-find-rolling-average-for-last7-days-and-fill-the-time/m-p/566921#M197562</guid>
      <dc:creator>bowesmana</dc:creator>
      <dc:date>2021-09-13T22:59:41Z</dc:date>
    </item>
    <item>
      <title>Re: How to find rolling average for last 7 days and fill the time gaps where there is no event in last 7 days</title>
      <link>https://community.splunk.com/t5/Splunk-Search/How-to-find-rolling-average-for-last7-days-and-fill-the-time/m-p/566948#M197574</link>
      <description>&lt;P&gt;Hello,&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;Thanks for your prompt reply, but the query that you shared is not giving the expected result.&lt;BR /&gt;I've achieved the 7d rolling average and variance, but now I have to fill the gaps in the "Last Event Time" column, i.e. I want to show missing dates with a 0 event count. Could you please suggest how to fill that gap?&lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;&lt;TABLE width="700"&gt;&lt;TBODY&gt;&lt;TR&gt;&lt;TD width="157"&gt;Data Source&lt;/TD&gt;&lt;TD width="187"&gt;Last Event Time&lt;/TD&gt;&lt;TD width="82"&gt;Event Count&lt;/TD&gt;&lt;TD width="198"&gt;7d rolling average event count&lt;/TD&gt;&lt;TD width="61"&gt;Variance&lt;/TD&gt;&lt;TD width="72"&gt;%Variance&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;data_source1&lt;/TD&gt;&lt;TD&gt;Sep 13,2021 10:32:58 AM EDT&lt;/TD&gt;&lt;TD&gt;23&lt;/TD&gt;&lt;TD&gt;20&lt;/TD&gt;&lt;TD&gt;3&lt;/TD&gt;&lt;TD&gt;15&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;data_source1&lt;/TD&gt;&lt;TD&gt;Sep 10,2021 11:30:07 AM EDT&lt;/TD&gt;&lt;TD&gt;22&lt;/TD&gt;&lt;TD&gt;17&lt;/TD&gt;&lt;TD&gt;5&lt;/TD&gt;&lt;TD&gt;30&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;data_source1&lt;/TD&gt;&lt;TD&gt;Sep 09,2021 09:51:28 AM EDT&lt;/TD&gt;&lt;TD&gt;19&lt;/TD&gt;&lt;TD&gt;14&lt;/TD&gt;&lt;TD&gt;5&lt;/TD&gt;&lt;TD&gt;36&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;data_source1&lt;/TD&gt;&lt;TD&gt;Sep 08,2021 09:56:16 AM EDT&lt;/TD&gt;&lt;TD&gt;19&lt;/TD&gt;&lt;TD&gt;11&lt;/TD&gt;&lt;TD&gt;8&lt;/TD&gt;&lt;TD&gt;73&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;data_source1&lt;/TD&gt;&lt;TD&gt;Sep 05,2021 04:32:44 AM EDT&lt;/TD&gt;&lt;TD&gt;19&lt;/TD&gt;&lt;TD&gt;9&lt;/TD&gt;&lt;TD&gt;10&lt;/TD&gt;&lt;TD&gt;112&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;data_source1&lt;/TD&gt;&lt;TD&gt;Sep 02,2021 10:03:06 AM EDT&lt;/TD&gt;&lt;TD&gt;19&lt;/TD&gt;&lt;TD&gt;6&lt;/TD&gt;&lt;TD&gt;13&lt;/TD&gt;&lt;TD&gt;217&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;data_source1&lt;/TD&gt;&lt;TD&gt;Sep 01,2021 04:32:54 AM 
EDT&lt;/TD&gt;&lt;TD&gt;19&lt;/TD&gt;&lt;TD&gt;3&lt;/TD&gt;&lt;TD&gt;16&lt;/TD&gt;&lt;TD&gt;534&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;data_source1&lt;/TD&gt;&lt;TD&gt;Aug 31,2021 10:13:22 AM EDT&lt;/TD&gt;&lt;TD&gt;19&lt;/TD&gt;&lt;TD&gt;0&lt;/TD&gt;&lt;TD&gt;19&lt;/TD&gt;&lt;TD&gt;0&lt;/TD&gt;&lt;/TR&gt;&lt;/TBODY&gt;&lt;/TABLE&gt;</description>
      <pubDate>Tue, 14 Sep 2021 07:52:44 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/How-to-find-rolling-average-for-last7-days-and-fill-the-time/m-p/566948#M197574</guid>
      <dc:creator>mnj1809</dc:creator>
      <dc:date>2021-09-14T07:52:44Z</dc:date>
    </item>
    <item>
      <title>Re: How to find rolling average for last 7 days and fill the time gaps where there is no event in last 7 days</title>
      <link>https://community.splunk.com/t5/Splunk-Search/How-to-find-rolling-average-for-last7-days-and-fill-the-time/m-p/567039#M197597</link>
      <description>&lt;P&gt;Please share your query&lt;/P&gt;</description>
      <pubDate>Tue, 14 Sep 2021 22:50:23 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/How-to-find-rolling-average-for-last7-days-and-fill-the-time/m-p/567039#M197597</guid>
      <dc:creator>bowesmana</dc:creator>
      <dc:date>2021-09-14T22:50:23Z</dc:date>
    </item>
    <item>
      <title>Re: How to find rolling average for last 7 days and fill the time gaps where there is no event in last 7 days</title>
      <link>https://community.splunk.com/t5/Splunk-Search/How-to-find-rolling-average-for-last7-days-and-fill-the-time/m-p/567068#M197605</link>
      <description>&lt;LI-CODE lang="markup"&gt;| makeresults
| eval _raw="Data Source	Last Event Time	Event Count	7d rolling average event count	Variance	Percent_Variance
data_source1	Sep 13,2021 10:32:58 AM EDT	23	20	3	15
data_source1	Sep 10,2021 11:30:07 AM EDT	22	17	5	30
data_source1	Sep 09,2021 09:51:28 AM EDT	19	14	5	36
data_source1	Sep 08,2021 09:56:16 AM EDT	19	11	8	73
data_source1	Sep 05,2021 04:32:44 AM EDT	19	9	10	112
data_source1	Sep 02,2021 10:03:06 AM EDT	19	6	13	217
data_source1	Sep 01,2021 04:32:54 AM EDT	19	3	16	534
data_source1	Aug 31,2021 10:13:22 AM EDT	19	0	19	0"
| multikv forceheader=1
| fields - _* linecount
| table Data_Source Last_Event_Time Event_Count * Percent_Variance



| eval day=strptime(Last_Event_Time,"%b %d,%Y")
| fieldformat day=strftime(day,"%Y/%m/%d")
| makecontinuous day span=86400
| fillnull value=0 Event_Count
| filldown Data_Source&lt;/LI-CODE&gt;</description>
      <pubDate>Wed, 15 Sep 2021 07:54:50 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/How-to-find-rolling-average-for-last7-days-and-fill-the-time/m-p/567068#M197605</guid>
      <dc:creator>ITWhisperer</dc:creator>
      <dc:date>2021-09-15T07:54:50Z</dc:date>
    </item>
    <item>
      <title>Re: How to find rolling average for last 7 days and fill the time gaps where there is no event in last 7 days</title>
      <link>https://community.splunk.com/t5/Splunk-Search/How-to-find-rolling-average-for-last7-days-and-fill-the-time/m-p/567098#M197618</link>
      <description>&lt;P&gt;Hello,&lt;/P&gt;&lt;P&gt;Thanks for your response. Here below is my query.&lt;/P&gt;&lt;P&gt;| tstats count as per_day_count latest(_time) as LTime_per_day where (index=xxxx earliest=-14d@d latest=now) groupby source _time&lt;BR /&gt;| bin span=1d _time&lt;BR /&gt;| lookup data_sources.csv source OUTPUTNEW data_source, Monitor&lt;BR /&gt;| search Monitor=Yes&lt;BR /&gt;| eval Last_event_time_per_day=strftime(LTime_per_day,"%b %d,%Y %H:%M:%S %p %Z")&lt;BR /&gt;&lt;BR /&gt;| join type=outer data_source&lt;BR /&gt;[ tstats count latest(_time) as LTime values(sourcetype) as sourcetype where (index=yyyy* OR (index=zzzz sourcetype=*amp*)earliest=-d@d latest=now) groupby source _time&lt;BR /&gt;| bin span=1d _time&lt;BR /&gt;| lookup data_sources.csv source OUTPUTNEW data_source Monitor&lt;BR /&gt;| eval Last_event_time=strftime(LTime,"%b %d,%Y %H:%M:%S %p %Z")&lt;BR /&gt;| eval status= if(count &amp;gt; 0, "Data Received","NO Data Received")&lt;BR /&gt;]&lt;BR /&gt;| eval Last_event_time=strftime(LTime,"%b %d,%Y %H:%M:%S %p %Z")&lt;BR /&gt;| table data_source sourcetype count Last_event_time Last_event_time_per_day status per_day_count _time&lt;BR /&gt;| fillnull value="NO Data Received" status&lt;BR /&gt;| sort - status&lt;BR /&gt;| rename count as "Event Count"&lt;BR /&gt;| sort + sourcetype&lt;BR /&gt;| fillnull value="-"&lt;BR /&gt;| streamstats window=7 current=f list(per_day_count) as count1 by data_source&lt;BR /&gt;| stats latest(per_day_count) as today_count, max(Last_event_time_per_day) as max_time_each_day by _time data_source&lt;BR /&gt;| sort data_source, -_time&lt;BR /&gt;&lt;BR /&gt;and here below is a sample subset returned by the above query:&lt;BR /&gt;&lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;&lt;TABLE width="600"&gt;&lt;TBODY&gt;&lt;TR&gt;&lt;TD width="181.453px" height="25px"&gt;max_time_each_day&lt;/TD&gt;&lt;TD width="165.641px" height="25px"&gt;data_source&lt;/TD&gt;&lt;TD width="102.906px" 
height="25px"&gt;today_count&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD width="181.453px" height="47px"&gt;Sep 15,2021 07:25:01 AM EDT&lt;/TD&gt;&lt;TD width="165.641px" height="47px"&gt;ABC&lt;/TD&gt;&lt;TD width="102.906px" height="47px"&gt;14503&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD width="181.453px" height="47px"&gt;Sep 14,2021 23:59:51 PM EDT&lt;/TD&gt;&lt;TD width="165.641px" height="47px"&gt;ABC&lt;/TD&gt;&lt;TD width="102.906px" height="47px"&gt;51570&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD width="181.453px" height="47px"&gt;Sep 13,2021 23:59:57 PM EDT&lt;/TD&gt;&lt;TD width="165.641px" height="47px"&gt;ABC&lt;/TD&gt;&lt;TD width="102.906px" height="47px"&gt;56331&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD width="181.453px" height="47px"&gt;Sep 12,2021 23:59:59 PM EDT&lt;/TD&gt;&lt;TD width="165.641px" height="47px"&gt;ABC&lt;/TD&gt;&lt;TD width="102.906px" height="47px"&gt;55717&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD width="181.453px" height="47px"&gt;Sep 11,2021 23:59:51 PM EDT&lt;/TD&gt;&lt;TD width="165.641px" height="47px"&gt;ABC&lt;/TD&gt;&lt;TD width="102.906px" height="47px"&gt;54480&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD width="181.453px" height="47px"&gt;Sep 10,2021 23:59:49 PM EDT&lt;/TD&gt;&lt;TD width="165.641px" height="47px"&gt;ABC&lt;/TD&gt;&lt;TD width="102.906px" height="47px"&gt;65367&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD width="181.453px" height="47px"&gt;Sep 09,2021 23:59:59 PM EDT&lt;/TD&gt;&lt;TD width="165.641px" height="47px"&gt;ABC&lt;/TD&gt;&lt;TD width="102.906px" height="47px"&gt;61999&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD width="181.453px" height="47px"&gt;Sep 08,2021 23:59:57 PM EDT&lt;/TD&gt;&lt;TD width="165.641px" height="47px"&gt;ABC&lt;/TD&gt;&lt;TD width="102.906px" height="47px"&gt;55405&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD width="181.453px" height="47px"&gt;Sep 07,2021 23:59:51 PM EDT&lt;/TD&gt;&lt;TD width="165.641px" height="47px"&gt;ABC&lt;/TD&gt;&lt;TD width="102.906px" 
height="47px"&gt;62327&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD width="181.453px" height="47px"&gt;Sep 06,2021 23:59:48 PM EDT&lt;/TD&gt;&lt;TD width="165.641px" height="47px"&gt;ABC&lt;/TD&gt;&lt;TD width="102.906px" height="47px"&gt;54137&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD width="181.453px" height="47px"&gt;Sep 05,2021 23:59:56 PM EDT&lt;/TD&gt;&lt;TD width="165.641px" height="47px"&gt;ABC&lt;/TD&gt;&lt;TD width="102.906px" height="47px"&gt;49224&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD width="181.453px" height="47px"&gt;Sep 04,2021 23:59:54 PM EDT&lt;/TD&gt;&lt;TD width="165.641px" height="47px"&gt;ABC&lt;/TD&gt;&lt;TD width="102.906px" height="47px"&gt;47783&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD width="181.453px" height="47px"&gt;Sep 03,2021 23:59:52 PM EDT&lt;/TD&gt;&lt;TD width="165.641px" height="47px"&gt;ABC&lt;/TD&gt;&lt;TD width="102.906px" height="47px"&gt;52699&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD width="181.453px" height="47px"&gt;Sep 02,2021 23:59:53 PM EDT&lt;/TD&gt;&lt;TD width="165.641px" height="47px"&gt;ABC&lt;/TD&gt;&lt;TD width="102.906px" height="47px"&gt;70145&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD width="181.453px" height="47px"&gt;Sep 01,2021 23:59:57 PM EDT&lt;/TD&gt;&lt;TD width="165.641px" height="47px"&gt;ABC&lt;/TD&gt;&lt;TD width="102.906px" height="47px"&gt;79071&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD width="181.453px" height="47px"&gt;Sep 14,2021 10:05:16 AM EDT&lt;/TD&gt;&lt;TD width="165.641px" height="47px"&gt;XYZ&lt;/TD&gt;&lt;TD width="102.906px" height="47px"&gt;21&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD width="181.453px" height="47px"&gt;Sep 13,2021 10:32:58 AM EDT&lt;/TD&gt;&lt;TD width="165.641px" height="47px"&gt;XYZ&lt;/TD&gt;&lt;TD width="102.906px" height="47px"&gt;23&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD width="181.453px" height="47px"&gt;Sep 10,2021 11:30:07 AM EDT&lt;/TD&gt;&lt;TD width="165.641px" height="47px"&gt;XYZ&lt;/TD&gt;&lt;TD width="102.906px" height="47px"&gt;22&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD 
width="181.453px" height="47px"&gt;Sep 09,2021 09:51:28 AM EDT&lt;/TD&gt;&lt;TD width="165.641px" height="47px"&gt;XYZ&lt;/TD&gt;&lt;TD width="102.906px" height="47px"&gt;19&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD width="181.453px" height="47px"&gt;Sep 08,2021 09:56:16 AM EDT&lt;/TD&gt;&lt;TD width="165.641px" height="47px"&gt;XYZ&lt;/TD&gt;&lt;TD width="102.906px" height="47px"&gt;19&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD width="181.453px" height="47px"&gt;Sep 05,2021 04:32:44 AM EDT&lt;/TD&gt;&lt;TD width="165.641px" height="47px"&gt;XYZ&lt;/TD&gt;&lt;TD width="102.906px" height="47px"&gt;19&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD width="181.453px" height="47px"&gt;Sep 02,2021 10:03:06 AM EDT&lt;/TD&gt;&lt;TD width="165.641px" height="47px"&gt;XYZ&lt;/TD&gt;&lt;TD width="102.906px" height="47px"&gt;19&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD width="181.453px" height="47px"&gt;Sep 01,2021 04:32:54 AM EDT&lt;/TD&gt;&lt;TD width="165.641px" height="47px"&gt;XYZ&lt;/TD&gt;&lt;TD width="102.906px" height="47px"&gt;19&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD width="181.453px" height="47px"&gt;Sep 15,2021 04:32:00 AM EDT&lt;/TD&gt;&lt;TD width="165.641px" height="47px"&gt;PQR&lt;/TD&gt;&lt;TD width="102.906px" height="47px"&gt;229&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD width="181.453px" height="47px"&gt;Sep 14,2021 04:31:59 AM EDT&lt;/TD&gt;&lt;TD width="165.641px" height="47px"&gt;PQR&lt;/TD&gt;&lt;TD width="102.906px" height="47px"&gt;268&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD width="181.453px" height="47px"&gt;Sep 13,2021 04:32:03 AM EDT&lt;/TD&gt;&lt;TD width="165.641px" height="47px"&gt;PQR&lt;/TD&gt;&lt;TD width="102.906px" height="47px"&gt;302&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD width="181.453px" height="47px"&gt;Sep 12,2021 04:31:59 AM EDT&lt;/TD&gt;&lt;TD width="165.641px" height="47px"&gt;PQR&lt;/TD&gt;&lt;TD width="102.906px" height="47px"&gt;302&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD width="181.453px" height="47px"&gt;Sep 11,2021 04:32:15 AM EDT&lt;/TD&gt;&lt;TD 
width="165.641px" height="47px"&gt;PQR&lt;/TD&gt;&lt;TD width="102.906px" height="47px"&gt;297&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD width="181.453px" height="47px"&gt;Sep 10,2021 04:32:00 AM EDT&lt;/TD&gt;&lt;TD width="165.641px" height="47px"&gt;PQR&lt;/TD&gt;&lt;TD width="102.906px" height="47px"&gt;305&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD width="181.453px" height="47px"&gt;Sep 09,2021 04:32:04 AM EDT&lt;/TD&gt;&lt;TD width="165.641px" height="47px"&gt;PQR&lt;/TD&gt;&lt;TD width="102.906px" height="47px"&gt;267&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD width="181.453px" height="47px"&gt;Sep 08,2021 04:32:02 AM EDT&lt;/TD&gt;&lt;TD width="165.641px" height="47px"&gt;PQR&lt;/TD&gt;&lt;TD width="102.906px" height="47px"&gt;267&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD width="181.453px" height="47px"&gt;Sep 07,2021 04:32:12 AM EDT&lt;/TD&gt;&lt;TD width="165.641px" height="47px"&gt;PQR&lt;/TD&gt;&lt;TD width="102.906px" height="47px"&gt;305&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD width="181.453px" height="47px"&gt;Sep 06,2021 04:32:01 AM EDT&lt;/TD&gt;&lt;TD width="165.641px" height="47px"&gt;PQR&lt;/TD&gt;&lt;TD width="102.906px" height="47px"&gt;305&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD width="181.453px" height="47px"&gt;Sep 05,2021 04:31:53 AM EDT&lt;/TD&gt;&lt;TD width="165.641px" height="47px"&gt;PQR&lt;/TD&gt;&lt;TD width="102.906px" height="47px"&gt;195&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD width="181.453px" height="47px"&gt;Sep 04,2021 04:31:52 AM EDT&lt;/TD&gt;&lt;TD width="165.641px" height="47px"&gt;PQR&lt;/TD&gt;&lt;TD width="102.906px" height="47px"&gt;157&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD width="181.453px" height="47px"&gt;Sep 03,2021 04:32:01 AM EDT&lt;/TD&gt;&lt;TD width="165.641px" height="47px"&gt;PQR&lt;/TD&gt;&lt;TD width="102.906px" height="47px"&gt;267&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD width="181.453px" height="47px"&gt;Sep 02,2021 04:31:59 AM EDT&lt;/TD&gt;&lt;TD width="165.641px" height="47px"&gt;PQR&lt;/TD&gt;&lt;TD width="102.906px" 
height="47px"&gt;157&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD width="181.453px" height="47px"&gt;Sep 01,2021 04:32:53 AM EDT&lt;/TD&gt;&lt;TD width="165.641px" height="47px"&gt;PQR&lt;/TD&gt;&lt;TD width="102.906px" height="47px"&gt;305&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD width="181.453px" height="47px"&gt;Sep 14,2021 10:05:15 AM EDT&lt;/TD&gt;&lt;TD width="165.641px" height="47px"&gt;DST&lt;/TD&gt;&lt;TD width="102.906px" height="47px"&gt;103&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD width="181.453px" height="47px"&gt;Sep 13,2021 10:33:00 AM EDT&lt;/TD&gt;&lt;TD width="165.641px" height="47px"&gt;DST&lt;/TD&gt;&lt;TD width="102.906px" height="47px"&gt;109&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD width="181.453px" height="47px"&gt;Sep 10,2021 11:30:07 AM EDT&lt;/TD&gt;&lt;TD width="165.641px" height="47px"&gt;DST&lt;/TD&gt;&lt;TD width="102.906px" height="47px"&gt;106&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD width="181.453px" height="47px"&gt;Sep 09,2021 04:31:55 AM EDT&lt;/TD&gt;&lt;TD width="165.641px" height="47px"&gt;DST&lt;/TD&gt;&lt;TD width="102.906px" height="47px"&gt;105&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD width="181.453px" height="47px"&gt;Sep 08,2021 09:51:06 AM EDT&lt;/TD&gt;&lt;TD width="165.641px" height="47px"&gt;DST&lt;/TD&gt;&lt;TD width="102.906px" height="47px"&gt;36&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD width="181.453px" height="47px"&gt;Sep 07,2021 15:44:18 PM EDT&lt;/TD&gt;&lt;TD width="165.641px" height="47px"&gt;DST&lt;/TD&gt;&lt;TD width="102.906px" height="47px"&gt;71&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD width="181.453px" height="47px"&gt;Sep 02,2021 04:31:59 AM EDT&lt;/TD&gt;&lt;TD width="165.641px" height="47px"&gt;DST&lt;/TD&gt;&lt;TD width="102.906px" height="47px"&gt;105&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD width="181.453px" height="47px"&gt;Sep 01,2021 16:44:02 PM EDT&lt;/TD&gt;&lt;TD width="165.641px" height="47px"&gt;DST&lt;/TD&gt;&lt;TD width="102.906px" 
height="47px"&gt;105&lt;/TD&gt;&lt;/TR&gt;&lt;/TBODY&gt;&lt;/TABLE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;My requirement is that to fill the time gaps for data_source "XYZ" and "DST" and for these two data sources I should have today_count=0 for those gaps.&lt;/P&gt;&lt;P&gt;Thanks in advance.&lt;/P&gt;&lt;P&gt;Regards,&lt;/P&gt;</description>
      <pubDate>Wed, 15 Sep 2021 11:42:01 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/How-to-find-rolling-average-for-last7-days-and-fill-the-time/m-p/567098#M197618</guid>
      <dc:creator>mnj1809</dc:creator>
      <dc:date>2021-09-15T11:42:01Z</dc:date>
    </item>
    <item>
      <title>Re: How to find rolling average for last 7 days and fill the time gaps where there is no event in last 7 days</title>
      <link>https://community.splunk.com/t5/Splunk-Search/How-to-find-rolling-average-for-last7-days-and-fill-the-time/m-p/567167#M197640</link>
      <description>&lt;P&gt;&lt;a href="https://community.splunk.com/t5/user/viewprofilepage/user-id/189754"&gt;@mnj1809&lt;/a&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Oh dear... I have come up with a horrible piece of SPL to get from your final table to the result you want - but I have to believe that it is possible another way&amp;nbsp;&lt;span class="lia-unicode-emoji" title=":beaming_face_with_smiling_eyes:"&gt;😁&lt;/span&gt;&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;| makeresults
| eval _raw="max_time_each_day	data_source	today_count
Sep 15,2021 07:25:01 AM EDT	ABC	14503
Sep 14,2021 23:59:51 PM EDT	ABC	51570
Sep 13,2021 23:59:57 PM EDT	ABC	56331
Sep 12,2021 23:59:59 PM EDT	ABC	55717
Sep 11,2021 23:59:51 PM EDT	ABC	54480
Sep 10,2021 23:59:49 PM EDT	ABC	65367
Sep 09,2021 23:59:59 PM EDT	ABC	61999
Sep 08,2021 23:59:57 PM EDT	ABC	55405
Sep 07,2021 23:59:51 PM EDT	ABC	62327
Sep 06,2021 23:59:48 PM EDT	ABC	54137
Sep 05,2021 23:59:56 PM EDT	ABC	49224
Sep 04,2021 23:59:54 PM EDT	ABC	47783
Sep 03,2021 23:59:52 PM EDT	ABC	52699
Sep 02,2021 23:59:53 PM EDT	ABC	70145
Sep 01,2021 23:59:57 PM EDT	ABC	79071
Sep 14,2021 10:05:16 AM EDT	XYZ	21
Sep 13,2021 10:32:58 AM EDT	XYZ	23
Sep 10,2021 11:30:07 AM EDT	XYZ	22
Sep 09,2021 09:51:28 AM EDT	XYZ	19
Sep 08,2021 09:56:16 AM EDT	XYZ	19
Sep 05,2021 04:32:44 AM EDT	XYZ	19
Sep 02,2021 10:03:06 AM EDT	XYZ	19
Sep 01,2021 04:32:54 AM EDT	XYZ	19
Sep 15,2021 04:32:00 AM EDT	PQR	229
Sep 14,2021 04:31:59 AM EDT	PQR	268
Sep 13,2021 04:32:03 AM EDT	PQR	302
Sep 12,2021 04:31:59 AM EDT	PQR	302
Sep 11,2021 04:32:15 AM EDT	PQR	297
Sep 10,2021 04:32:00 AM EDT	PQR	305
Sep 09,2021 04:32:04 AM EDT	PQR	267
Sep 08,2021 04:32:02 AM EDT	PQR	267
Sep 07,2021 04:32:12 AM EDT	PQR	305
Sep 06,2021 04:32:01 AM EDT	PQR	305
Sep 05,2021 04:31:53 AM EDT	PQR	195
Sep 04,2021 04:31:52 AM EDT	PQR	157
Sep 03,2021 04:32:01 AM EDT	PQR	267
Sep 02,2021 04:31:59 AM EDT	PQR	157
Sep 01,2021 04:32:53 AM EDT	PQR	305
Sep 14,2021 10:05:15 AM EDT	DST	103
Sep 13,2021 10:33:00 AM EDT	DST	109
Sep 10,2021 11:30:07 AM EDT	DST	106
Sep 09,2021 04:31:55 AM EDT	DST	105
Sep 08,2021 09:51:06 AM EDT	DST	36
Sep 07,2021 15:44:18 PM EDT	DST	71
Sep 02,2021 04:31:59 AM EDT	DST	105
Sep 01,2021 16:44:02 PM EDT	DST	105"
| multikv forceheader=1
| table max* data* today*
| eval _time=strptime(max_time_each_day, "%b %d,%Y"), first_day=relative_time(now(), "@d")
| eval day=round(first_day-_time)
| eval d=mvrange(86400,(16*86400),86400)
| eventstats values(day) as day by data_source
| eval missing=mvmap(d,if(isnull(mvfind(day,d)),d,null()))
| streamstats c by data_source
| fields - d day
| eval missing=case(c=1 AND mvcount(missing)&amp;gt;0,mvappend(missing,_time),c=1 AND isnull(missing),_time, 1==1, _time)
| mvexpand missing
| eval t2=first_day-(missing)
| eval today_count=if(_time!=missing,0,today_count), max_time_each_day=if(_time!=missing,strftime(t2,"%b %d,%Y %H:%M:%S %p %Z"), max_time_each_day), _time=if(_time!=missing,t2,_time)
| fields - missing t2 first_day
| sort data_source -_time&lt;/LI-CODE&gt;&lt;P&gt;In essence, what this does is find out which dates are missing by:&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;Working out which day this row relates to (day)&lt;/LI&gt;&lt;LI&gt;Creating an MV field of the expected days by using mvrange to calculate the 15-day range of expected values (d)&lt;/LI&gt;&lt;LI&gt;Getting the first entry for each data source (streamstats c)&lt;/LI&gt;&lt;LI&gt;Using mvmap (Splunk 8) to create another MV field of the missing days (missing)&lt;/LI&gt;&lt;LI&gt;Making sure that the field contains info for the CURRENT row&lt;/LI&gt;&lt;LI&gt;Expanding the missing field to create the extra rows&lt;/LI&gt;&lt;LI&gt;Then setting up the values of the empty entries, ensuring that the original row data is not overwritten (_time!=missing)&lt;/LI&gt;&lt;/UL&gt;&lt;P&gt;It's a bit of a hack, but I'm not sure if makecontinuous can be used with the grouping. Another way to create the rows is to use an append or join to build the date range rows and then aggregate back, but that would mean repeating the search in the subsearch to find out which data sources exist, and it would still probably involve some mvexpand.&lt;/P&gt;&lt;P&gt;&lt;a href="https://community.splunk.com/t5/user/viewprofilepage/user-id/225168"&gt;@ITWhisperer&lt;/a&gt;&amp;nbsp;I am sure you can find a better way to do this...&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Wed, 15 Sep 2021 23:30:35 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/How-to-find-rolling-average-for-last7-days-and-fill-the-time/m-p/567167#M197640</guid>
      <dc:creator>bowesmana</dc:creator>
      <dc:date>2021-09-15T23:30:35Z</dc:date>
    </item>
    <item>
      <title>Re: How to find rolling average for last7 days and fill the time gaps where there is no event in last 7 days</title>
      <link>https://community.splunk.com/t5/Splunk-Search/How-to-find-rolling-average-for-last7-days-and-fill-the-time/m-p/567226#M197655</link>
      <description>&lt;P&gt;&lt;a href="https://community.splunk.com/t5/user/viewprofilepage/user-id/6367"&gt;@bowesmana&lt;/a&gt;&amp;nbsp;Thanks for the vote of confidence&amp;nbsp;&lt;span class="lia-unicode-emoji" title=":grinning_face:"&gt;😀&lt;/span&gt;&lt;/P&gt;&lt;P&gt;&lt;a href="https://community.splunk.com/t5/user/viewprofilepage/user-id/189754"&gt;@mnj1809&lt;/a&gt;&amp;nbsp;If you are not too concerned about the actual times (which I guess you aren't, since you have bin span=1d), you could try something like this:&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;| makeresults
| eval _raw="max_time_each_day	data_source	today_count
Sep 15,2021 07:25:01 AM EDT	ABC	14503
Sep 14,2021 23:59:51 PM EDT	ABC	51570
Sep 13,2021 23:59:57 PM EDT	ABC	56331
Sep 12,2021 23:59:59 PM EDT	ABC	55717
Sep 11,2021 23:59:51 PM EDT	ABC	54480
Sep 10,2021 23:59:49 PM EDT	ABC	65367
Sep 09,2021 23:59:59 PM EDT	ABC	61999
Sep 08,2021 23:59:57 PM EDT	ABC	55405
Sep 07,2021 23:59:51 PM EDT	ABC	62327
Sep 06,2021 23:59:48 PM EDT	ABC	54137
Sep 05,2021 23:59:56 PM EDT	ABC	49224
Sep 04,2021 23:59:54 PM EDT	ABC	47783
Sep 03,2021 23:59:52 PM EDT	ABC	52699
Sep 02,2021 23:59:53 PM EDT	ABC	70145
Sep 01,2021 23:59:57 PM EDT	ABC	79071
Sep 14,2021 10:05:16 AM EDT	XYZ	21
Sep 13,2021 10:32:58 AM EDT	XYZ	23
Sep 10,2021 11:30:07 AM EDT	XYZ	22
Sep 09,2021 09:51:28 AM EDT	XYZ	19
Sep 08,2021 09:56:16 AM EDT	XYZ	19
Sep 05,2021 04:32:44 AM EDT	XYZ	19
Sep 02,2021 10:03:06 AM EDT	XYZ	19
Sep 01,2021 04:32:54 AM EDT	XYZ	19
Sep 15,2021 04:32:00 AM EDT	PQR	229
Sep 14,2021 04:31:59 AM EDT	PQR	268
Sep 13,2021 04:32:03 AM EDT	PQR	302
Sep 12,2021 04:31:59 AM EDT	PQR	302
Sep 11,2021 04:32:15 AM EDT	PQR	297
Sep 10,2021 04:32:00 AM EDT	PQR	305
Sep 09,2021 04:32:04 AM EDT	PQR	267
Sep 08,2021 04:32:02 AM EDT	PQR	267
Sep 07,2021 04:32:12 AM EDT	PQR	305
Sep 06,2021 04:32:01 AM EDT	PQR	305
Sep 05,2021 04:31:53 AM EDT	PQR	195
Sep 04,2021 04:31:52 AM EDT	PQR	157
Sep 03,2021 04:32:01 AM EDT	PQR	267
Sep 02,2021 04:31:59 AM EDT	PQR	157
Sep 01,2021 04:32:53 AM EDT	PQR	305
Sep 14,2021 10:05:15 AM EDT	DST	103
Sep 13,2021 10:33:00 AM EDT	DST	109
Sep 10,2021 11:30:07 AM EDT	DST	106
Sep 09,2021 04:31:55 AM EDT	DST	105
Sep 08,2021 09:51:06 AM EDT	DST	36
Sep 07,2021 15:44:18 PM EDT	DST	71
Sep 02,2021 04:31:59 AM EDT	DST	105
Sep 01,2021 16:44:02 PM EDT	DST	105"
| multikv forceheader=1
| table max* data* today*


| eval day=strptime(max_time_each_day,"%b %d,%Y")
| fieldformat day=strftime(day,"%Y/%m/%d")
| xyseries day data_source today_count
| makecontinuous day
| fillnull value=0
| untable day data_source today_count&lt;/LI-CODE&gt;</description>
      <pubDate>Thu, 16 Sep 2021 08:32:31 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/How-to-find-rolling-average-for-last7-days-and-fill-the-time/m-p/567226#M197655</guid>
      <dc:creator>ITWhisperer</dc:creator>
      <dc:date>2021-09-16T08:32:31Z</dc:date>
    </item>
    <item>
      <title>Re: How to find rolling average for last7 days and fill the time gaps where there is no event in last 7 days</title>
      <link>https://community.splunk.com/t5/Splunk-Search/How-to-find-rolling-average-for-last7-days-and-fill-the-time/m-p/567239#M197661</link>
      <description>&lt;P&gt;Thanks for making my day, Champ :)... Basically, I am concerned about the actual times, so your previous solution works perfectly. Only one last thing is left: suppose I execute the SPL query today and a data source has not sent any data today. In this case I need event_count=0 for today as well, and as soon as I get the data for that data source, event_count should be the actual count.&lt;/P&gt;</description>
      <pubDate>Thu, 16 Sep 2021 09:50:56 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/How-to-find-rolling-average-for-last7-days-and-fill-the-time/m-p/567239#M197661</guid>
      <dc:creator>mnj1809</dc:creator>
      <dc:date>2021-09-16T09:50:56Z</dc:date>
    </item>
    <item>
      <title>Re: How to find rolling average for last7 days and fill the time gaps where there is no event in last 7 days</title>
      <link>https://community.splunk.com/t5/Splunk-Search/How-to-find-rolling-average-for-last7-days-and-fill-the-time/m-p/567246#M197664</link>
      <description>&lt;P&gt;Concatenate the count and time, then split them up again after the untable.&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;| makeresults
| eval _raw="max_time_each_day	data_source	today_count
Sep 15,2021 07:25:01 AM EDT	ABC	14503
Sep 14,2021 23:59:51 PM EDT	ABC	51570
Sep 13,2021 23:59:57 PM EDT	ABC	56331
Sep 12,2021 23:59:59 PM EDT	ABC	55717
Sep 11,2021 23:59:51 PM EDT	ABC	54480
Sep 10,2021 23:59:49 PM EDT	ABC	65367
Sep 09,2021 23:59:59 PM EDT	ABC	61999
Sep 08,2021 23:59:57 PM EDT	ABC	55405
Sep 07,2021 23:59:51 PM EDT	ABC	62327
Sep 06,2021 23:59:48 PM EDT	ABC	54137
Sep 05,2021 23:59:56 PM EDT	ABC	49224
Sep 04,2021 23:59:54 PM EDT	ABC	47783
Sep 03,2021 23:59:52 PM EDT	ABC	52699
Sep 02,2021 23:59:53 PM EDT	ABC	70145
Sep 01,2021 23:59:57 PM EDT	ABC	79071
Sep 14,2021 10:05:16 AM EDT	XYZ	21
Sep 13,2021 10:32:58 AM EDT	XYZ	23
Sep 10,2021 11:30:07 AM EDT	XYZ	22
Sep 09,2021 09:51:28 AM EDT	XYZ	19
Sep 08,2021 09:56:16 AM EDT	XYZ	19
Sep 05,2021 04:32:44 AM EDT	XYZ	19
Sep 02,2021 10:03:06 AM EDT	XYZ	19
Sep 01,2021 04:32:54 AM EDT	XYZ	19
Sep 15,2021 04:32:00 AM EDT	PQR	229
Sep 14,2021 04:31:59 AM EDT	PQR	268
Sep 13,2021 04:32:03 AM EDT	PQR	302
Sep 12,2021 04:31:59 AM EDT	PQR	302
Sep 11,2021 04:32:15 AM EDT	PQR	297
Sep 10,2021 04:32:00 AM EDT	PQR	305
Sep 09,2021 04:32:04 AM EDT	PQR	267
Sep 08,2021 04:32:02 AM EDT	PQR	267
Sep 07,2021 04:32:12 AM EDT	PQR	305
Sep 06,2021 04:32:01 AM EDT	PQR	305
Sep 05,2021 04:31:53 AM EDT	PQR	195
Sep 04,2021 04:31:52 AM EDT	PQR	157
Sep 03,2021 04:32:01 AM EDT	PQR	267
Sep 02,2021 04:31:59 AM EDT	PQR	157
Sep 01,2021 04:32:53 AM EDT	PQR	305
Sep 14,2021 10:05:15 AM EDT	DST	103
Sep 13,2021 10:33:00 AM EDT	DST	109
Sep 10,2021 11:30:07 AM EDT	DST	106
Sep 09,2021 04:31:55 AM EDT	DST	105
Sep 08,2021 09:51:06 AM EDT	DST	36
Sep 07,2021 15:44:18 PM EDT	DST	71
Sep 02,2021 04:31:59 AM EDT	DST	105
Sep 01,2021 16:44:02 PM EDT	DST	105"
| multikv forceheader=1
| table max* data* today*


| eval day=strptime(max_time_each_day,"%b %d,%Y")
| fieldformat day=strftime(day,"%Y/%m/%d")
| eval today_count=today_count."!".max_time_each_day
| xyseries day data_source today_count
| makecontinuous day
| fillnull value=0
| untable day data_source today_count
| eval today_count=split(today_count,"!")
| eval max_time_each_day=mvindex(today_count,1)
| eval today_count=mvindex(today_count,0)
| eval max_time_each_day=if(isnull(max_time_each_day),strftime(day,"%b %d,%Y"),max_time_each_day)
| fields - day&lt;/LI-CODE&gt;</description>
      <pubDate>Thu, 16 Sep 2021 11:47:28 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/How-to-find-rolling-average-for-last7-days-and-fill-the-time/m-p/567246#M197664</guid>
      <dc:creator>ITWhisperer</dc:creator>
      <dc:date>2021-09-16T11:47:28Z</dc:date>
    </item>
    <item>
      <title>Re: How to find rolling average for last7 days and fill the time gaps where there is no event in last 7 days</title>
      <link>https://community.splunk.com/t5/Splunk-Search/How-to-find-rolling-average-for-last7-days-and-fill-the-time/m-p/567254#M197670</link>
      <description>&lt;P&gt;&lt;SPAN&gt;&lt;a href="https://community.splunk.com/t5/user/viewprofilepage/user-id/225168"&gt;@ITWhisperer&lt;/a&gt;&amp;nbsp;Thank you so much for your efforts. &lt;span class="lia-unicode-emoji" title=":slightly_smiling_face:"&gt;🙂&lt;/span&gt;&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Thu, 16 Sep 2021 12:52:17 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/How-to-find-rolling-average-for-last7-days-and-fill-the-time/m-p/567254#M197670</guid>
      <dc:creator>mnj1809</dc:creator>
      <dc:date>2021-09-16T12:52:17Z</dc:date>
    </item>
    <item>
      <title>Re: How to find rolling average for last7 days and fill the time gaps where there is no event in last 7 days</title>
      <link>https://community.splunk.com/t5/Splunk-Search/How-to-find-rolling-average-for-last7-days-and-fill-the-time/m-p/567575#M197814</link>
      <description>&lt;P&gt;&lt;a href="https://community.splunk.com/t5/user/viewprofilepage/user-id/225168"&gt;@ITWhisperer&lt;/a&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Knew it!&lt;/P&gt;&lt;P&gt;xyseries/makecontinuous/untable - neat - solves the issue of the split clause. Good technique&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Mon, 20 Sep 2021 03:54:08 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/How-to-find-rolling-average-for-last7-days-and-fill-the-time/m-p/567575#M197814</guid>
      <dc:creator>bowesmana</dc:creator>
      <dc:date>2021-09-20T03:54:08Z</dc:date>
    </item>
  </channel>
</rss>

