Comment by user 2448579 (MEMBER) on 2023-04-13T15:56:51Z
https://github.com/pydata/xarray/issues/4325#issuecomment-1507213204

Over in https://github.com/pydata/xarray/issues/7344#issuecomment-1336299057, @shoyer wrote:

> That said -- we could also switch to smarter NumPy-based algorithms to implement most moving window calculations, e.g., using np.nancumsum for moving window means.

After some digging: this would involve using ["summed area tables"](https://en.wikipedia.org/wiki/Summed-area_table), which have been generalized to nD and can be used to compute all of our built-in reductions (except median). Basically, we'd store the summed area table (repeated `np.cumsum`) and then calculate each reduction using binary ops (mostly subtraction) on those tables.

This would be an intermediate-level project, but we could implement it incrementally (starting with `sum`, for example). One downside is the potential for floating-point inaccuracies, because we're taking differences of potentially large numbers.

cc @aulemahal