issues: 712052219
id: 712052219
node_id: MDU6SXNzdWU3MTIwNTIyMTk=
number: 4474
title: Implement rolling_exp for dask arrays
user: 2560426
state: open
locked: 0
assignee:
milestone:
comments: 7
created_at: 2020-09-30T15:31:50Z
updated_at: 2020-10-15T16:32:03Z
closed_at:
author_association: NONE
active_lock_reason:
draft:
pull_request:
reactions: { "url": "https://api.github.com/repos/pydata/xarray/issues/4474/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
performed_via_github_app:
state_reason:
repo: 13221727
type: issue

body:

**Is your feature request related to a problem? Please describe.**
I use dask-based chunking on my arrays regularly and would like to leverage the efficient `rolling_exp` on them as well.

**Describe the solution you'd like**
It's possible to compute a rolling exp mean as a function of the rolling exp means of contiguous, non-overlapping subsets (chunks). You just need to first "un-normalize" the `rolling_exp` of each chunk in order to split it into its corresponding numerator and denominator series (see […]). Then, scale each chunk's numerator and denominator series (derived from their […]).

**Describe alternatives you've considered**
I implemented my own inefficient weighted rolling mean using xarray's […].
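The chunked scheme the issue describes can be sketched in plain NumPy (a minimal illustration, not the xarray/dask API; the helper names `ewm_num_den` and `chunked_ewm` are hypothetical, and it assumes pandas-style `adjust=True` weighting with smoothing factor `alpha`):

```python
import numpy as np


def ewm_num_den(x, alpha):
    # "Un-normalized" exponentially weighted rolling mean of one chunk:
    # return the numerator and denominator series separately, so chunks
    # can be stitched together afterwards.
    num = np.empty(len(x))
    den = np.empty(len(x))
    acc_n = acc_d = 0.0
    for j, v in enumerate(x):
        acc_n = (1 - alpha) * acc_n + v    # running sum of (1-alpha)^(t-i) * x_i
        acc_d = (1 - alpha) * acc_d + 1.0  # running sum of (1-alpha)^(t-i)
        num[j], den[j] = acc_n, acc_d
    return num, den


def chunked_ewm(chunks, alpha):
    # Combine per-chunk numerators/denominators: the totals carried over
    # from earlier chunks decay by (1-alpha)^(j+1) at position j of the
    # current chunk; add them in, then re-normalize.
    out, carry_n, carry_d = [], 0.0, 0.0
    for x in chunks:
        num, den = ewm_num_den(x, alpha)
        decay = (1 - alpha) ** np.arange(1, len(x) + 1)
        num = num + carry_n * decay
        den = den + carry_d * decay
        out.append(num / den)
        carry_n, carry_d = num[-1], den[-1]
    return np.concatenate(out)
```

Because only two scalars (`carry_n`, `carry_d`) flow between chunks, each chunk's numerator/denominator series can be computed independently and combined in a cheap sequential pass, which is what would make a dask implementation feasible.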