issues: 496460488
field | value
---|---
id | 496460488
node_id | MDU6SXNzdWU0OTY0NjA0ODg=
number | 3326
title | quantile with Dask arrays
user | 6475152
state | closed
locked | 0
assignee | |
milestone | |
comments | 0
created_at | 2019-09-20T17:14:59Z
updated_at | 2019-11-25T15:57:49Z
closed_at | 2019-11-25T15:57:49Z
author_association | NONE
active_lock_reason | |
draft | |
pull_request | |
reactions | { "url": "https://api.github.com/repos/pydata/xarray/issues/3326/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
performed_via_github_app | |
state_reason | completed
repo | 13221727
type | issue

**body**

Currently the `quantile` method raises an exception when applied to a Dask-backed array, suggesting that the data be loaded into memory first. The problem with following the suggestion of the exception (loading the array into memory) is that "wide and shallow" arrays are too big to load into memory, yet each chunk is statistically independent if the quantile dimension is the "shallow" dimension. I'm not necessarily proposing delegating to Dask's quantile (unless it's super easy), but I wanted to explore the special case described above.

Related links:

Thank you!

EDIT: added stackoverflow link
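A minimal sketch of the special case the issue describes, assuming the quantile dimension fits entirely within each chunk: `xarray.apply_ufunc` with `dask="parallelized"` can map `np.quantile` over the chunks independently, avoiding the load-into-memory step the exception suggests. The array shape, chunking, and names below are illustrative, not taken from the issue.

```python
import dask.array as da
import numpy as np
import xarray as xr

# Illustrative "wide and shallow" array: the quantile dimension ("time")
# is short and contained in a single chunk, while "x" and "y" are wide
# and split across many chunks.
arr = xr.DataArray(
    da.random.random((4, 1000, 1000), chunks=(4, 250, 250)),
    dims=("time", "x", "y"),
    name="wide_and_shallow",
)

# Because no chunk boundary crosses "time", each chunk's quantile is
# statistically independent and np.quantile can run per chunk.
median = xr.apply_ufunc(
    np.quantile,
    arr,
    input_core_dims=[["time"]],     # reduce over the shallow dimension
    kwargs={"q": 0.5, "axis": -1},  # apply_ufunc moves core dims last
    dask="parallelized",
    output_dtypes=[arr.dtype],
)

print(median.compute())  # lazy until here; result has dims ("x", "y")
```

The `completed` state and the 2019-11-25 closing date line up with xarray gaining native Dask support in `quantile`, so on current versions `arr.quantile(0.5, dim="time")` should work directly on chunked data; the workaround above is only needed on versions that still raise the error.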