@dtburrows3 Did I set this up correctly? Note: I posted 5 days to simplify the use case, but I need 28-day sums.

| inputlookup direct_deposit_changes_v4_1_since_01012020.csv
| eval _time = strptime(_time,"%Y-%m-%d")
| sort 0 _time
| streamstats count as daily_count by _time
| streamstats window=28 list(daily_count) as values_added_together, sum(daily_count) as sum_daily_count
| table _time, daily_count, values_added_together, sum_daily_count

I ended up with over 3 million rows when it should have been around 1,460, because the counts were not grouped by _time (%Y-%m-%d). However, the foreach code below produced the results I was looking for.

| inputlookup direct_deposit_changes_v4_1_since_01012020.csv
| eval _time = strptime(_time,"%Y-%m-%d")
| stats count as daily_count by _time
| mvcombine daily_count
| eval cnt=0
| foreach mode=multivalue daily_count
    [| eval summation_json=if(
            mvcount(mvindex(daily_count,cnt,cnt+27))==28,
            mvappend(
                'summation_json',
                json_object(
                    "set", mvindex(daily_count,cnt,cnt+27),
                    "sum", sum(mvindex(daily_count,cnt,cnt+27))
                )
            ),
            'summation_json'
        ),
        cnt='cnt'+1
    ]
| rex field="summation_json" "sum\"\:(?<sum_daily_count>\d+)\}"
| fields sum_daily_count
| mvexpand sum_daily_count

I confirmed these were correct using Excel. Now I must add _time (%Y-%m-%d) to the results.

Thanks and God bless,
Genesius
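For anyone following along, the 28-day windowed sum that the foreach loop computes can be sketched outside Splunk. This is a minimal Python illustration, not SPL; the per-day counts here are made-up stand-ins for what "stats count as daily_count by _time" would produce. It also shows how each sum keeps its ending date, which is the _time association the results above are still missing:

```python
# Minimal sketch of a 28-day rolling sum over daily counts.
# The daily_count values are hypothetical; in the real search they
# come from "| stats count as daily_count by _time".
from datetime import date, timedelta

WINDOW = 28

# Hypothetical per-day counts keyed by date (one row per day).
start = date(2020, 1, 1)
daily_counts = {start + timedelta(days=i): (i % 5) + 1 for i in range(60)}

days = sorted(daily_counts)
rolling = {}
for i in range(WINDOW - 1, len(days)):
    # The 28 days ending on days[i], inclusive.
    window_days = days[i - WINDOW + 1 : i + 1]
    # Keying each sum by its ending date is the _time linkage
    # the foreach output still needs.
    rolling[days[i]] = sum(daily_counts[d] for d in window_days)
```

With N daily rows this yields N - 27 window sums, so roughly four years of daily data (about 1,461 rows) gives the ~1,460 sums expected in the post.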