<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Timechart duration, with values rolling into next span in Splunk Search</title>
    <link>https://community.splunk.com/t5/Splunk-Search/Timechart-duration-with-values-rolling-into-next-span/m-p/412915#M119012</link>
    <description>&lt;P&gt;Yeah, I called out that it's limited to two buckets.  Can you identify the maximum duration you have observed?  Knowing the maximum number of carryover buckets to deal with will help determine what you need to do.&lt;/P&gt;</description>
    <pubDate>Wed, 24 Jul 2019 14:52:15 GMT</pubDate>
    <dc:creator>dmarling</dc:creator>
    <dc:date>2019-07-24T14:52:15Z</dc:date>
    <item>
      <title>Timechart duration, with values rolling into next span</title>
      <link>https://community.splunk.com/t5/Splunk-Search/Timechart-duration-with-values-rolling-into-next-span/m-p/412909#M119006</link>
      <description>&lt;P&gt;I am running the below search to get a sum of starvation per 15-minute period. The problem I am having is that the duration is always attributed to the start time of the event, so if the starvation runs over more than one 15-minute period, it is still attributed back to the starting time slice. Ideally, I need it to roll seconds over into the next span if they exceed 900 seconds.&lt;/P&gt;

&lt;P&gt;Here is the search:&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;index=idx_sems source="sems_north" sourcetype=SEMSSmartLaneEvents Tag="XRY_STVD" EquipmentID=3
| transaction EquipmentID  startswith=TagBitData="1" endswith=TagBitData="0"
| timechart span=15m sum(duration) as total
&lt;/CODE&gt;&lt;/PRE&gt;

&lt;P&gt;Here is an example of the result&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;_time                  sum
2019-07-22 02:15:00 98.893
2019-07-22 02:30:00 937.92
2019-07-22 02:45:00 1009.674
2019-07-22 03:00:00 2593.638
2019-07-22 03:15:00  
2019-07-22 03:30:00  
2019-07-22 03:45:00 706.153
&lt;/CODE&gt;&lt;/PRE&gt;

&lt;P&gt;I need it to show a maximum of 900 in the sum and roll any difference down. &lt;/P&gt;</description>
      <pubDate>Wed, 24 Jul 2019 12:19:38 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/Timechart-duration-with-values-rolling-into-next-span/m-p/412909#M119006</guid>
      <dc:creator>ALXWBR</dc:creator>
      <dc:date>2019-07-24T12:19:38Z</dc:date>
    </item>
    <item>
      <title>Re: Timechart duration, with values rolling into next span</title>
      <link>https://community.splunk.com/t5/Splunk-Search/Timechart-duration-with-values-rolling-into-next-span/m-p/412910#M119007</link>
      <description>&lt;P&gt;Hi ALXWBR,&lt;BR /&gt;
have you already seen the "keeporphans" option of the transaction command (at &lt;A href="https://docs.splunk.com/Documentation/Splunk/7.3.0/SearchReference/Transaction"&gt;https://docs.splunk.com/Documentation/Splunk/7.3.0/SearchReference/Transaction&lt;/A&gt; )?&lt;/P&gt;

&lt;P&gt;Bye.&lt;BR /&gt;
Giuseppe&lt;/P&gt;</description>
      <pubDate>Wed, 24 Jul 2019 12:48:55 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/Timechart-duration-with-values-rolling-into-next-span/m-p/412910#M119007</guid>
      <dc:creator>gcusello</dc:creator>
      <dc:date>2019-07-24T12:48:55Z</dc:date>
    </item>
    <item>
      <title>Re: Timechart duration, with values rolling into next span</title>
      <link>https://community.splunk.com/t5/Splunk-Search/Timechart-duration-with-values-rolling-into-next-span/m-p/412911#M119008</link>
      <description>&lt;P&gt;Hi Giuseppe&lt;/P&gt;

&lt;P&gt;There are no orphans; there is only ever a start event and an end event, which is very consistent. &lt;/P&gt;</description>
      <pubDate>Wed, 24 Jul 2019 12:54:32 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/Timechart-duration-with-values-rolling-into-next-span/m-p/412911#M119008</guid>
      <dc:creator>ALXWBR</dc:creator>
      <dc:date>2019-07-24T12:54:32Z</dc:date>
    </item>
    <item>
      <title>Re: Timechart duration, with values rolling into next span</title>
      <link>https://community.splunk.com/t5/Splunk-Search/Timechart-duration-with-values-rolling-into-next-span/m-p/412912#M119009</link>
      <description>&lt;P&gt;You can use streamstats to carry the overflow over to the next 15-minute bucket and then perform some evals to make sure you are not overflowing the next one.  The query below assumes that your overflow won't exceed two 15-minute buckets.  If that happens, you'll need to add another streamstats and another round of adjusted totals/overflows.  Hopefully there's a better way to do this, but this is the only way I could think of.  Here's a run-anywhere example using the sample data you provided:&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;| makeresults count=1 
| eval data=" 2019-07-22 02:15:00    98.893
 2019-07-22 02:30:00    937.92
 2019-07-22 02:45:00    1009.674
 2019-07-22 03:00:00    2593.638
 2019-07-22 03:15:00     
 2019-07-22 03:30:00     
 2019-07-22 03:45:00    706.153" 
| rex max_match=0 field=data "(?&amp;lt;data&amp;gt;\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}[^\n]+)" 
| mvexpand data 
| rex field=data "(?&amp;lt;time&amp;gt;\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2})" 
| rex field=data "(?&amp;lt;time&amp;gt;\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2})\s+(?&amp;lt;total&amp;gt;[^\s]+)" 
| eval _time=strptime(time, "%Y-%m-%d %H:%M:%S") 
| fields - time data
| eval overflow=if(total&amp;gt;900, total-900, null())
| streamstats window=1 current=false values(overflow) as previousoverflow
| eval adjustedtotal=coalesce(previousoverflow,0)+coalesce(total, 0)
| eval adjustedoverflow=if(adjustedtotal&amp;gt;900, adjustedtotal-900, 0)
| streamstats window=1 current=false values(adjustedoverflow) as previousadjustedoverflow
| eval finalTotal=case(adjustedtotal=0, previousadjustedoverflow, adjustedtotal&amp;gt;900, 900, NOT adjustedtotal=0 AND NOT adjustedtotal&amp;gt;900, adjustedtotal)
| rename total as oldTotal
| table _time oldTotal overflow previousoverflow adjustedtotal adjustedoverflow  previousadjustedoverflow finalTotal
&lt;/CODE&gt;&lt;/PRE&gt;

&lt;P&gt;Here it is with the query you provided:&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;index=idx_sems source="sems_north" sourcetype=SEMSSmartLaneEvents Tag="XRY_STVD" EquipmentID=3 
| transaction EquipmentID startswith=TagBitData="1" endswith=TagBitData="0" 
| timechart span=15m sum(duration) as total 
| eval overflow=if(total&amp;gt;900, total-900, null()) 
| streamstats window=1 current=false values(overflow) as previousoverflow 
| eval adjustedtotal=coalesce(previousoverflow,0)+coalesce(total, 0) 
| eval adjustedoverflow=if(adjustedtotal&amp;gt;900, adjustedtotal-900, 0) 
| streamstats window=1 current=false values(adjustedoverflow) as previousadjustedoverflow 
| eval finalTotal=case(adjustedtotal=0, previousadjustedoverflow, adjustedtotal&amp;gt;900, 900, NOT adjustedtotal=0 AND NOT adjustedtotal&amp;gt;900, adjustedtotal) 
| rename total as oldTotal 
| table _time oldTotal overflow previousoverflow adjustedtotal adjustedoverflow previousadjustedoverflow finalTotal
&lt;/CODE&gt;&lt;/PRE&gt;</description>
      <pubDate>Wed, 24 Jul 2019 14:14:16 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/Timechart-duration-with-values-rolling-into-next-span/m-p/412912#M119009</guid>
      <dc:creator>dmarling</dc:creator>
      <dc:date>2019-07-24T14:14:16Z</dc:date>
    </item>
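The roll-forward behaviour the SPL above is building with streamstats evals can be sketched outside Splunk. This is a minimal Python illustration (the helper name `roll_overflow` is hypothetical, not from the thread), assuming the input is the ordered list of per-bucket duration sums that timechart would produce, with empty buckets as `None`:

```python
def roll_overflow(totals, cap=900.0):
    """Cap each bucket at `cap` seconds, carrying any excess
    forward into the following buckets until it is used up."""
    out = []
    carry = 0.0
    for t in totals:
        v = (t or 0.0) + carry   # empty buckets contribute 0
        if v > cap:
            out.append(cap)      # bucket is saturated
            carry = v - cap      # remainder rolls into the next span
        else:
            out.append(v)
            carry = 0.0
    return out
```

Applied to the sample data in the question, the 2593.638-second bucket saturates its own span plus the two empty spans after it, and the leftover 41.232 seconds lands in the 03:45 bucket; the grand total is conserved.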
    <item>
      <title>Re: Timechart duration, with values rolling into next span</title>
      <link>https://community.splunk.com/t5/Splunk-Search/Timechart-duration-with-values-rolling-into-next-span/m-p/412913#M119010</link>
      <description>&lt;P&gt;If the query wasn't clear, the "finalTotal" column is what you would use on your report.  I included all of the fields as a way to show the work that the evals are doing.&lt;/P&gt;</description>
      <pubDate>Wed, 24 Jul 2019 14:18:23 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/Timechart-duration-with-values-rolling-into-next-span/m-p/412913#M119010</guid>
      <dc:creator>dmarling</dc:creator>
      <dc:date>2019-07-24T14:18:23Z</dc:date>
    </item>
    <item>
      <title>Re: Timechart duration, with values rolling into next span</title>
      <link>https://community.splunk.com/t5/Splunk-Search/Timechart-duration-with-values-rolling-into-next-span/m-p/412914#M119011</link>
      <description>&lt;P&gt;Thanks for this. However, it's not quite doing the trick. It's only rolling over twice, so if the number is in excess of 2700, it doesn't keep rolling down until the seconds have all been accounted for.&lt;/P&gt;</description>
      <pubDate>Wed, 24 Jul 2019 14:48:05 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/Timechart-duration-with-values-rolling-into-next-span/m-p/412914#M119011</guid>
      <dc:creator>ALXWBR</dc:creator>
      <dc:date>2019-07-24T14:48:05Z</dc:date>
    </item>
    <item>
      <title>Re: Timechart duration, with values rolling into next span</title>
      <link>https://community.splunk.com/t5/Splunk-Search/Timechart-duration-with-values-rolling-into-next-span/m-p/412915#M119012</link>
      <description>&lt;P&gt;Yeah, I called out that it's limited to two buckets.  Can you identify the maximum duration you have observed?  Knowing the maximum number of carryover buckets to deal with will help determine what you need to do.&lt;/P&gt;</description>
      <pubDate>Wed, 24 Jul 2019 14:52:15 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/Timechart-duration-with-values-rolling-into-next-span/m-p/412915#M119012</guid>
      <dc:creator>dmarling</dc:creator>
      <dc:date>2019-07-24T14:52:15Z</dc:date>
    </item>
    <item>
      <title>Re: Timechart duration, with values rolling into next span</title>
      <link>https://community.splunk.com/t5/Splunk-Search/Timechart-duration-with-values-rolling-into-next-span/m-p/412916#M119013</link>
      <description>&lt;P&gt;Oh yeah, sorry! Well, the machines can be out of use for a few hours at times; for example, the day I am looking at right now has a transaction over 10,000 seconds long, which would roll over into 12 buckets.&lt;/P&gt;</description>
      <pubDate>Wed, 24 Jul 2019 14:57:34 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/Timechart-duration-with-values-rolling-into-next-span/m-p/412916#M119013</guid>
      <dc:creator>ALXWBR</dc:creator>
      <dc:date>2019-07-24T14:57:34Z</dc:date>
    </item>
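The bucket count quoted above follows directly from the 900-second span: a sketch of that arithmetic in Python (the helper name `buckets_spanned` is hypothetical, and it assumes the duration starts on a bucket boundary so it is a simplified analogue of the SPL bookkeeping, not a transcription of it):

```python
import math

def buckets_spanned(total_seconds, span=900.0):
    """Total number of 15-minute buckets a duration fills once its
    overflow is rolled forward, assuming a bucket-aligned start."""
    if total_seconds <= 0:
        return 0
    return math.ceil(total_seconds / span)
```

So the 2593.638-second transaction from the sample data fills 3 buckets, and a 10,000-second transaction fills 12, matching the estimate in the post above.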
    <item>
      <title>Re: Timechart duration, with values rolling into next span</title>
      <link>https://community.splunk.com/t5/Splunk-Search/Timechart-duration-with-values-rolling-into-next-span/m-p/412917#M119014</link>
      <description>&lt;P&gt;Okay, yeah, that's doable, but it gets irritatingly complicated the more buckets you go down.  I'll need to find something that iterates the process automatically without needing 12 sets of streamstats, as that's not really sustainable.&lt;/P&gt;</description>
      <pubDate>Wed, 24 Jul 2019 15:00:58 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/Timechart-duration-with-values-rolling-into-next-span/m-p/412917#M119014</guid>
      <dc:creator>dmarling</dc:creator>
      <dc:date>2019-07-24T15:00:58Z</dc:date>
    </item>
    <item>
      <title>Re: Timechart duration, with values rolling into next span</title>
      <link>https://community.splunk.com/t5/Splunk-Search/Timechart-duration-with-values-rolling-into-next-span/m-p/412918#M119015</link>
      <description>&lt;P&gt;I've been racking my brain with streamstats&amp;gt;eval&amp;gt;if combos for two days now and just can't seem to find the right combination.&lt;/P&gt;</description>
      <pubDate>Wed, 24 Jul 2019 15:03:50 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/Timechart-duration-with-values-rolling-into-next-span/m-p/412918#M119015</guid>
      <dc:creator>ALXWBR</dc:creator>
      <dc:date>2019-07-24T15:03:50Z</dc:date>
    </item>
    <item>
      <title>Re: Timechart duration, with values rolling into next span</title>
      <link>https://community.splunk.com/t5/Splunk-Search/Timechart-duration-with-values-rolling-into-next-span/m-p/412919#M119016</link>
      <description>&lt;P&gt;Okay, I took this back to the drawing board.  Due to concurrency, we really just need to identify groups where a duration causes a spill-over into another 15-minute bucket and determine how many 15-minute buckets it needs.  If something else in between has multiple buckets that spill over but needs fewer buckets than one of those 10k-second ones you found, then we don't really care about it, as it will show as 900 seconds in this timechart due to that concurrency.&lt;/P&gt;

&lt;P&gt;With that in mind, if we create a subsearch that assigns groups based on how many buckets are needed and then build some rules around that, it should accomplish what we need.  I took your example and added some data to it for testing different use cases.  Here's the run-anywhere example:&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;| makeresults count=1 
| eval data=" 2019-07-22 02:00:00    
2019-07-22 02:15:00    98.893
  2019-07-22 02:30:00    937.92
  2019-07-22 02:45:00    1009.674
  2019-07-22 03:00:00    2593.638
  2019-07-22 03:15:00      
  2019-07-22 03:30:00     
  2019-07-22 03:45:00    706.153
  2019-07-22 04:00:00    2800.153
  2019-07-22 04:15:00     
  2019-07-22 04:30:00     
  2019-07-22 04:45:00     
  2019-07-22 05:00:00     
  2019-07-22 05:15:00     
  2019-07-22 05:30:00     " 
| rex max_match=0 field=data "(?&amp;lt;data&amp;gt;\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}[^\n]+)" 
| mvexpand data 
| rex field=data "(?&amp;lt;time&amp;gt;\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2})" 
| rex field=data "(?&amp;lt;time&amp;gt;\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2})\s+(?&amp;lt;total&amp;gt;[^\s]+)" 
| eval _time=strptime(time, "%Y-%m-%d %H:%M:%S") 
| fields - time data 
| eval amount=coalesce(case((total/900)&amp;gt;1 AND round(total/900,0)&amp;gt;(total/900), round(total/900,0)-1, (total/900)&amp;gt;1 AND round(total/900,0)&amp;lt;(total/900), round(total/900,0)), 0) 
| streamstats current=f count as counter 
| eval carryover=if(total&amp;gt;900, total-900, 0) 
| eval future=if(carryover&amp;gt;0, counter+amount, 0) 
| eval group=case( 
    [| makeresults count=1 
    | eval data=" 2019-07-22 02:00:00   
2019-07-22 02:15:00    98.893
  2019-07-22 02:30:00    937.92
  2019-07-22 02:45:00    1009.674
  2019-07-22 03:00:00    2593.638
  2019-07-22 03:15:00      
  2019-07-22 03:30:00     
  2019-07-22 03:45:00    706.153
  2019-07-22 04:00:00    2800.153
  2019-07-22 04:15:00     
  2019-07-22 04:30:00     
  2019-07-22 04:45:00     
  2019-07-22 05:00:00     
  2019-07-22 05:15:00     
  2019-07-22 05:30:00     " 
    | rex max_match=0 field=data "(?&amp;lt;data&amp;gt;\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}[^\n]+)" 
    | mvexpand data 
    | rex field=data "(?&amp;lt;time&amp;gt;\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2})" 
    | rex field=data "(?&amp;lt;time&amp;gt;\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2})\s+(?&amp;lt;total&amp;gt;[^\s]+)" 
    | eval _time=strptime(time, "%Y-%m-%d %H:%M:%S") 
    | fields - time data 
    | eval amount=coalesce(case((total/900)&amp;gt;1 AND round(total/900,0)&amp;gt;(total/900), round(total/900,0)-1, (total/900)&amp;gt;1 AND round(total/900,0)&amp;lt;(total/900), round(total/900,0)), 0) 
    | streamstats current=f count as counter 
    | eval carryover=if(total&amp;gt;900, total-900, 0) 
    | eval future=if(carryover&amp;gt;0, counter+amount, 0) 
    | search NOT future=0 
    | stats min(counter) as counter by future 
    | sort 0 - future 
    | eval future="counter&amp;lt;=".future 
    | eval counter="counter&amp;gt;=".counter 
    | streamstats count as group 
    | eval search="(".counter." AND ".future."), \"".group."\"" 
    | stats list(search) as search 
    | eval search=mvjoin(search, ", ")]) 
| eventstats max(amount) as maxamount max(future) as maxfuture max(total) as maxcarryover count as groupcount by group 
| eval finalTotal=case(isnotnull(group) AND (NOT counter=maxfuture OR groupcount=1), 900, isnotnull(group) AND counter=maxfuture, maxcarryover-(maxamount*900), isnull(group), total)
| rename total as oldTotal
| table _time oldTotal finalTotal
&lt;/CODE&gt;&lt;/PRE&gt;

&lt;P&gt;If you add a larger number, it will automatically be accounted for, since the groups are mapped based on how many buckets the largest duration requires.  Here it is mapped to your specific use case:&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;index=idx_sems source="sems_north" sourcetype=SEMSSmartLaneEvents Tag="XRY_STVD" EquipmentID=3 
| transaction EquipmentID startswith=TagBitData="1" endswith=TagBitData="0" 
| timechart span=15m sum(duration) as total 
| eval amount=coalesce(case((total/900)&amp;gt;1 AND round(total/900,0)&amp;gt;(total/900), round(total/900,0)-1, (total/900)&amp;gt;1 AND round(total/900,0)&amp;lt;(total/900), round(total/900,0)), 0) 
| streamstats current=f count as counter 
| eval carryover=if(total&amp;gt;900, total-900, 0) 
| eval future=if(carryover&amp;gt;0, counter+amount, 0) 
| eval group=case( 
    [ search index=idx_sems source="sems_north" sourcetype=SEMSSmartLaneEvents Tag="XRY_STVD" EquipmentID=3 
    | transaction EquipmentID startswith=TagBitData="1" endswith=TagBitData="0" 
    | timechart span=15m sum(duration) as total 
    | eval amount=coalesce(case((total/900)&amp;gt;1 AND round(total/900,0)&amp;gt;(total/900), round(total/900,0)-1, (total/900)&amp;gt;1 AND round(total/900,0)&amp;lt;(total/900), round(total/900,0)), 0) 
    | streamstats current=f count as counter 
    | eval carryover=if(total&amp;gt;900, total-900, 0) 
    | eval future=if(carryover&amp;gt;0, counter+amount, 0) 
    | search NOT future=0 
    | stats min(counter) as counter by future 
    | sort 0 - future 
    | eval future="counter&amp;lt;=".future 
    | eval counter="counter&amp;gt;=".counter 
    | streamstats count as group 
    | eval search="(".counter." AND ".future."), \"".group."\"" 
    | stats list(search) as search 
    | eval search=mvjoin(search, ", ")]) 
| eventstats max(amount) as maxamount max(future) as maxfuture max(total) as maxcarryover count as groupcount by group 
| eval finalTotal=case(isnotnull(group) AND (NOT counter=maxfuture OR groupcount=1), 900, isnotnull(group) AND counter=maxfuture, maxcarryover-(maxamount*900), isnull(group), total) 
| rename total as oldTotal 
| table _time oldTotal finalTotal
&lt;/CODE&gt;&lt;/PRE&gt;

&lt;P&gt;Your only problem at this point would be if the subsearch that generates the groups runs longer than 30 seconds, which will cause it to time out if you have the default timeouts in your confs.&lt;/P&gt;</description>
      <pubDate>Wed, 24 Jul 2019 19:28:29 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/Timechart-duration-with-values-rolling-into-next-span/m-p/412919#M119016</guid>
      <dc:creator>dmarling</dc:creator>
      <dc:date>2019-07-24T19:28:29Z</dc:date>
    </item>
    <item>
      <title>Re: Timechart duration, with values rolling into next span</title>
      <link>https://community.splunk.com/t5/Splunk-Search/Timechart-duration-with-values-rolling-into-next-span/m-p/412920#M119017</link>
      <description>&lt;P&gt;Sorry, but still not there. That period between 03:00 and 04:00 isn't right.&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;_time   oldTotal    finalTotal
2019-07-22 00:00:00      
2019-07-22 00:15:00      
2019-07-22 00:30:00      
2019-07-22 00:45:00      
2019-07-22 01:00:00      
2019-07-22 01:15:00      
2019-07-22 01:30:00      
2019-07-22 01:45:00      
2019-07-22 02:00:00      
2019-07-22 02:15:00 98.893  98.893
2019-07-22 02:30:00 937.92  900
2019-07-22 02:45:00 1009.674    900
2019-07-22 03:00:00 2593.638    900
2019-07-22 03:15:00     900
2019-07-22 03:30:00     793.638
2019-07-22 03:45:00 706.153 706.153
2019-07-22 04:00:00 1015.469    900
2019-07-22 04:15:00 451.892 115.469
2019-07-22 04:30:00 329.411 329.411
2019-07-22 04:45:00 319.438 319.438
&lt;/CODE&gt;&lt;/PRE&gt;</description>
      <pubDate>Thu, 25 Jul 2019 09:09:37 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/Timechart-duration-with-values-rolling-into-next-span/m-p/412920#M119017</guid>
      <dc:creator>ALXWBR</dc:creator>
      <dc:date>2019-07-25T09:09:37Z</dc:date>
    </item>
    <item>
      <title>Re: Timechart duration, with values rolling into next span</title>
      <link>https://community.splunk.com/t5/Splunk-Search/Timechart-duration-with-values-rolling-into-next-span/m-p/412921#M119018</link>
      <description>&lt;P&gt;Thanks for the additional use cases.  I believe I have them accounted for now with this iteration:&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;| makeresults count=1 
| eval data=" 2019-07-22 00:00:00          
 2019-07-22 00:15:00          
 2019-07-22 00:30:00          
 2019-07-22 00:45:00          
 2019-07-22 01:00:00          
 2019-07-22 01:15:00          
 2019-07-22 01:30:00          
 2019-07-22 01:45:00          
 2019-07-22 02:00:00          
 2019-07-22 02:15:00    98.893
   2019-07-22 02:30:00    937.92
   2019-07-22 02:45:00    1009.674
   2019-07-22 03:00:00    2593.638
   2019-07-22 03:15:00      
   2019-07-22 03:30:00     
   2019-07-22 03:45:00    706.153
   2019-07-22 04:00:00    1015.469
   2019-07-22 04:15:00    451.892
   2019-07-22 04:30:00    329.411
   2019-07-22 04:45:00    319.438
   2019-07-22 05:00:00     
   2019-07-22 05:15:00     
   2019-07-22 05:30:00     " 
| rex max_match=0 field=data "(?&amp;lt;data&amp;gt;\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}[^\n]+)" 
| mvexpand data 
| rex field=data "(?&amp;lt;time&amp;gt;\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2})" 
| rex field=data "(?&amp;lt;time&amp;gt;\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2})\s+(?&amp;lt;total&amp;gt;[^\s]+)" 
| eval _time=strptime(time, "%Y-%m-%d %H:%M:%S") 
| fields - time data 
| eval amount=coalesce(case((total/900)&amp;gt;1 AND round(total/900,0)&amp;gt;(total/900), round(total/900,0), (total/900)&amp;gt;1 AND round(total/900,0)&amp;lt;(total/900), round(total/900,0)+1), 0) 
| streamstats current=f count as counter 
| eval carryover=if(total&amp;gt;900, total-900, 0) 
| eval future=if(carryover&amp;gt;0, counter+amount, 0) 
| eval group=case( 
    [| makeresults count=1 
    | eval data=" 2019-07-22 00:00:00          
 2019-07-22 00:15:00          
 2019-07-22 00:30:00          
 2019-07-22 00:45:00          
 2019-07-22 01:00:00          
 2019-07-22 01:15:00          
 2019-07-22 01:30:00          
 2019-07-22 01:45:00          
 2019-07-22 02:00:00          
 2019-07-22 02:15:00    98.893
   2019-07-22 02:30:00    937.92
   2019-07-22 02:45:00    1009.674
   2019-07-22 03:00:00    2593.638
   2019-07-22 03:15:00      
   2019-07-22 03:30:00     
   2019-07-22 03:45:00    706.153
   2019-07-22 04:00:00    1015.469
   2019-07-22 04:15:00    451.892
   2019-07-22 04:30:00    329.411
   2019-07-22 04:45:00    319.438
   2019-07-22 05:00:00     
   2019-07-22 05:15:00     
   2019-07-22 05:30:00     " 
    | rex max_match=0 field=data "(?&amp;lt;data&amp;gt;\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}[^\n]+)" 
    | mvexpand data 
    | rex field=data "(?&amp;lt;time&amp;gt;\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2})" 
    | rex field=data "(?&amp;lt;time&amp;gt;\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2})\s+(?&amp;lt;total&amp;gt;[^\s]+)" 
    | eval _time=strptime(time, "%Y-%m-%d %H:%M:%S") 
    | fields - time data 
    | eval amount=coalesce(case((total/900)&amp;gt;1 AND round(total/900,0)&amp;gt;(total/900), round(total/900,0), (total/900)&amp;gt;1 AND round(total/900,0)&amp;lt;(total/900), round(total/900,0)+1), 0) 
    | streamstats current=f count as counter 
    | eval carryover=if(total&amp;gt;900, total-900, 0) 
    | eval future=if(carryover&amp;gt;0, counter+amount, 0) 
    | search NOT future=0 
    | stats min(counter) as counter by future 
    | sort 0 - future 
    | eval future="counter&amp;lt;=".future 
    | eval counter="counter&amp;gt;=".counter 
    | streamstats count as group 
    | eval search="(".counter." AND ".future."), \"".group."\"" 
    | stats list(search) as search 
    | eval search=mvjoin(search, ", ")]) 
| eventstats max(amount) as maxamount max(future) as maxfuture max(total) as maxcarryover count as groupcount by group 
| eval finalTotal=case(isnotnull(group) AND (NOT counter=maxfuture OR groupcount=1), 900, isnotnull(group) AND counter=maxfuture, (maxcarryover-((maxamount-1)*900))+coalesce(total,0), isnull(group), total) 
| eval finalcarryover=if(finalTotal&amp;gt;900, finalTotal-900, null()) 
| streamstats window=1 current=f values(finalcarryover) as lastfinalcarryover 
| eval finaltotal=case(finalTotal&amp;gt;=900, 900, finalTotal&amp;lt;900 AND (lastfinalcarryover+finalTotal)&amp;lt;900, lastfinalcarryover+finalTotal, finalTotal&amp;lt;900 AND (lastfinalcarryover+finalTotal)&amp;gt;900, 900, finalTotal&amp;lt;900 AND NOT (lastfinalcarryover+finalTotal)&amp;lt;900 AND NOT (lastfinalcarryover+finalTotal)&amp;gt;900, finalTotal) 
| rename total as oldTotal 
| table _time oldTotal finaltotal
&lt;/CODE&gt;&lt;/PRE&gt;

&lt;P&gt;This returns these results:&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;_time   oldTotal    finaltotal
2019-07-22 00:00:00      
2019-07-22 00:15:00      
2019-07-22 00:30:00      
2019-07-22 00:45:00      
2019-07-22 01:00:00      
2019-07-22 01:15:00      
2019-07-22 01:30:00      
2019-07-22 01:45:00      
2019-07-22 02:00:00      
2019-07-22 02:15:00 98.893  98.893
2019-07-22 02:30:00 937.92  900
2019-07-22 02:45:00 1009.674    900
2019-07-22 03:00:00 2593.638    900
2019-07-22 03:15:00     900
2019-07-22 03:30:00     900
2019-07-22 03:45:00 706.153 900
2019-07-22 04:00:00 1015.469    900
2019-07-22 04:15:00 451.892 900
2019-07-22 04:30:00 329.411 444.880
2019-07-22 04:45:00 319.438 319.438
2019-07-22 05:00:00      
2019-07-22 05:15:00      
2019-07-22 05:30:00      
&lt;/CODE&gt;&lt;/PRE&gt;</description>
      <pubDate>Thu, 25 Jul 2019 13:15:05 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/Timechart-duration-with-values-rolling-into-next-span/m-p/412921#M119018</guid>
      <dc:creator>dmarling</dc:creator>
      <dc:date>2019-07-25T13:15:05Z</dc:date>
    </item>
    <item>
      <title>Re: Timechart duration, with values rolling into next span</title>
      <link>https://community.splunk.com/t5/Splunk-Search/Timechart-duration-with-values-rolling-into-next-span/m-p/412922#M119019</link>
      <description>&lt;P&gt;I'm getting an error in the | eval group=case(.....&lt;/P&gt;

&lt;P&gt;It only works with makeresults, not with search.&lt;/P&gt;

&lt;P&gt;&lt;span class="lia-unicode-emoji" title=":disappointed_face:"&gt;😞&lt;/span&gt;&lt;/P&gt;</description>
      <pubDate>Thu, 25 Jul 2019 13:32:30 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/Timechart-duration-with-values-rolling-into-next-span/m-p/412922#M119019</guid>
      <dc:creator>ALXWBR</dc:creator>
      <dc:date>2019-07-25T13:32:30Z</dc:date>
    </item>
    <item>
      <title>Re: Timechart duration, with values rolling into next span</title>
      <link>https://community.splunk.com/t5/Splunk-Search/Timechart-duration-with-values-rolling-into-next-span/m-p/412923#M119020</link>
      <description>&lt;P&gt;What's the error?&lt;/P&gt;</description>
      <pubDate>Thu, 25 Jul 2019 13:35:24 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/Timechart-duration-with-values-rolling-into-next-span/m-p/412923#M119020</guid>
      <dc:creator>dmarling</dc:creator>
      <dc:date>2019-07-25T13:35:24Z</dc:date>
    </item>
    <item>
      <title>Re: Timechart duration, with values rolling into next span</title>
      <link>https://community.splunk.com/t5/Splunk-Search/Timechart-duration-with-values-rolling-into-next-span/m-p/412924#M119021</link>
      <description>&lt;P&gt;Error in 'eval' command: The expression is malformed. An unexpected character is reached at ') )'.&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;index=idx_sems source="sems_north" sourcetype=SEMSSmartLaneEvents Tag="XRY_STVD" EquipmentID=3 
 | transaction EquipmentID startswith=TagBitData="1" endswith=TagBitData="0" 
 | timechart span=15m sum(duration) as total 
 | eval amount=coalesce(case((total/900)&amp;gt;1 AND round(total/900,0)&amp;gt;(total/900), round(total/900,0), (total/900)&amp;gt;1 AND round(total/900,0)&amp;lt;(total/900), round(total/900,0)+1), 0) 
 | streamstats current=f count as counter 
 | eval carryover=if(total&amp;gt;900, total-900, 0) 
 | eval future=if(carryover&amp;gt;0, counter+amount, 0) 
 | eval group=case( 
     [| search index=idx_sems source="sems_north" sourcetype=SEMSSmartLaneEvents Tag="XRY_STVD" EquipmentID=3 
 | transaction EquipmentID startswith=TagBitData="1" endswith=TagBitData="0" 
 | timechart span=15m sum(duration) as total  
     | eval amount=coalesce(case((total/900)&amp;gt;1 AND round(total/900,0)&amp;gt;(total/900), round(total/900,0), (total/900)&amp;gt;1 AND round(total/900,0)&amp;lt;(total/900), round(total/900,0)+1), 0) 
     | streamstats current=f count as counter 
     | eval carryover=if(total&amp;gt;900, total-900, 0) 
     | eval future=if(carryover&amp;gt;0, counter+amount, 0) 
     | search NOT future=0 
     | stats min(counter) as counter by future 
     | sort 0 - future 
     | eval future="counter&amp;lt;=".future 
     | eval counter="counter&amp;gt;=".counter 
     | streamstats count as group 
     | eval search="(".counter." AND ".future."), \"".group."\"" 
     | stats list(search) as search 
     | eval search=mvjoin(search, ", ")]) 
 | eventstats max(amount) as maxamount max(future) as maxfuture max(total) as maxcarryover count as groupcount by group 
 | eval finalTotal=case(isnotnull(group) AND (NOT counter=maxfuture OR groupcount=1), 900, isnotnull(group) AND counter=maxfuture, (maxcarryover-((maxamount-1)*900))+coalesce(total,0), isnull(group), total) 
 | eval finalcarryover=if(finalTotal&amp;gt;900, finalTotal-900, null()) 
 | streamstats window=1 current=f values(finalcarryover) as lastfinalcarryover 
 | eval finaltotal=case(finalTotal&amp;gt;=900, 900, finalTotal&amp;lt;900 AND (lastfinalcarryover+finalTotal)&amp;lt;900, lastfinalcarryover+finalTotal, finalTotal&amp;lt;900 AND (lastfinalcarryover+finalTotal)&amp;gt;900, 900, finalTotal&amp;lt;900 AND NOT (lastfinalcarryover+finalTotal)&amp;lt;900 AND NOT (lastfinalcarryover+finalTotal)&amp;gt;900, finalTotal) 
 | rename total as oldTotal 
 | table _time oldTotal finaltotal
&lt;/CODE&gt;&lt;/PRE&gt;</description>
      <pubDate>Thu, 25 Jul 2019 14:47:38 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/Timechart-duration-with-values-rolling-into-next-span/m-p/412924#M119021</guid>
      <dc:creator>ALXWBR</dc:creator>
      <dc:date>2019-07-25T14:47:38Z</dc:date>
    </item>
    <item>
      <title>Re: Timechart duration, with values rolling into next span</title>
      <link>https://community.splunk.com/t5/Splunk-Search/Timechart-duration-with-values-rolling-into-next-span/m-p/412925#M119022</link>
      <description>&lt;P&gt;Thank you.  I recreated it on my install as well.  This happens when the subsearch that generates the groups returns no results because nothing exceeds 900 seconds.  This version accounts for that by making the field null when that use case occurs:&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;index=idx_sems source="sems_north" sourcetype=SEMSSmartLaneEvents Tag="XRY_STVD" EquipmentID=3 
| transaction EquipmentID startswith=TagBitData="1" endswith=TagBitData="0" 
| timechart span=15m sum(duration) as total 
| eval amount=coalesce(case((total/900)&amp;gt;1 AND round(total/900,0)&amp;gt;(total/900), round(total/900,0), (total/900)&amp;gt;1 AND round(total/900,0)&amp;lt;(total/900), round(total/900,0)+1), 0) 
| streamstats current=f count as counter 
| eval carryover=if(total&amp;gt;900, total-900, 0) 
| eval future=if(carryover&amp;gt;0, counter+amount, 0) 
| eval group=case( 
    [ search index=idx_sems source="sems_north" sourcetype=SEMSSmartLaneEvents Tag="XRY_STVD" EquipmentID=3 
    | transaction EquipmentID startswith=TagBitData="1" endswith=TagBitData="0" 
    | timechart span=15m sum(duration) as total 
    | eval amount=coalesce(case((total/900)&amp;gt;1 AND round(total/900,0)&amp;gt;(total/900), round(total/900,0), (total/900)&amp;gt;1 AND round(total/900,0)&amp;lt;(total/900), round(total/900,0)+1), 0) 
    | streamstats current=f count as counter 
    | eval carryover=if(total&amp;gt;900, total-900, 0) 
    | eval future=if(carryover&amp;gt;0, counter+amount, 0) 
    | search NOT future=0 
    | stats min(counter) as counter by future 
    | sort 0 - future 
    | eval future="counter&amp;lt;=".future 
    | eval counter="counter&amp;gt;=".counter 
    | streamstats count as group 
    | eval search="(".counter." AND ".future."), \"".group."\"" 
    | stats list(search) as search 
    | eval search=mvjoin(search, ", ") 
    | append 
        [| makeresults count=1 
        | eval search="\"ERROR\"=\"ERROR\", null()" 
        | table search] 
    | eventstats count 
    | search NOT (count&amp;gt;1 search="\"ERROR\"=\"ERROR\", null()") 
    | fields - count)]) 
| eventstats max(amount) as maxamount max(future) as maxfuture max(total) as maxcarryover count as groupcount by group 
| eval finalTotal=case(isnotnull(group) AND (NOT counter=maxfuture OR groupcount=1), 900, isnotnull(group) AND counter=maxfuture, (maxcarryover-((maxamount-1)*900))+coalesce(total,0), isnull(group), total) 
| eval finalcarryover=if(finalTotal&amp;gt;900, finalTotal-900, null()) 
| streamstats window=1 current=f values(finalcarryover) as lastfinalcarryover 
| eval finaltotal=case(finalTotal&amp;gt;=900, 900, finalTotal&amp;lt;900 AND (lastfinalcarryover+finalTotal)&amp;lt;900, lastfinalcarryover+finalTotal, finalTotal&amp;lt;900 AND (lastfinalcarryover+finalTotal)&amp;gt;900, 900, finalTotal&amp;lt;900 AND NOT (lastfinalcarryover+finalTotal)&amp;lt;900 AND NOT (lastfinalcarryover+finalTotal)&amp;gt;900, finalTotal) 
| rename total as oldTotal 
| table _time oldTotal finaltotal
&lt;/CODE&gt;&lt;/PRE&gt;</description>
      <pubDate>Thu, 25 Jul 2019 15:30:52 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/Timechart-duration-with-values-rolling-into-next-span/m-p/412925#M119022</guid>
      <dc:creator>dmarling</dc:creator>
      <dc:date>2019-07-25T15:30:52Z</dc:date>
    </item>
    <item>
      <title>Re: Timechart duration, with values rolling into next span</title>
      <link>https://community.splunk.com/t5/Splunk-Search/Timechart-duration-with-values-rolling-into-next-span/m-p/412926#M119023</link>
      <description>&lt;P&gt;Looks good, thank you &lt;span class="lia-unicode-emoji" title=":slightly_smiling_face:"&gt;🙂&lt;/span&gt;&lt;/P&gt;</description>
      <pubDate>Fri, 09 Aug 2019 10:32:30 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/Timechart-duration-with-values-rolling-into-next-span/m-p/412926#M119023</guid>
      <dc:creator>ALXWBR</dc:creator>
      <dc:date>2019-08-09T10:32:30Z</dc:date>
    </item>
  </channel>
</rss>