issue_comments: 705068971

html_url: https://github.com/pydata/xarray/issues/3332#issuecomment-705068971
issue_url: https://api.github.com/repos/pydata/xarray/issues/3332
id: 705068971
node_id: MDEyOklzc3VlQ29tbWVudDcwNTA2ODk3MQ==
user: 29147682
created_at: 2020-10-07T17:00:35Z
updated_at: 2020-10-07T17:00:35Z
author_association: NONE
issue: 496809167

Is there any way to get around this? The window dimension combined with the `For window size x, every chunk should be larger than x//2` requirement means that for a large moving window I get O(100 GB) chunks that do not fit in memory at compute time. I can, of course, rechunk along other dimensions, but that is expensive and substantially slower, and I suspect it becomes practically infeasible on machines with little memory. Regardless, mandatory O(n^2) memory usage in the window size seems less than ideal.
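To make the constraint concrete, here is a minimal sketch. The sizes are illustrative only, and the behaviour shown is that of the xarray/dask versions current when this comment was written (late 2020):

```python
import numpy as np
import xarray as xr

# Illustrative sizes; the real arrays are far larger.
da = xr.DataArray(np.random.rand(1_000_000), dims="time").chunk({"time": 10_000})

window = 100_000  # a large moving window

# Every chunk along "time" (10_000 elements) is smaller than
# window // 2 (50_000), so building the rolling reduction fails:
try:
    da.rolling(time=window).mean()
except ValueError as err:
    print(err)  # "For window size 100000, every chunk should be larger than 50000, ..."

# Satisfying the requirement means rechunking "time" into chunks of at
# least window // 2 elements, which is what produces the huge chunks
# described above.
```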

My workaround has been to implement my own slicing via a for loop and then call reduction operations on the resulting dask arrays as normal (sketched below)... Perhaps I missed something along the way, but I couldn't find anything in open or past issues that resolves this. Thanks!
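One way to read that workaround, as a minimal sketch: the function name, the `step` parameter, and the plain-mean reduction are illustrative assumptions, not taken from the comment itself.

```python
import numpy as np
import xarray as xr

def rolling_mean_via_slices(da, dim, window, step=1):
    # Slice each window out of the lazily chunked array in an ordinary
    # Python loop, reduce it with the usual dask-aware .mean(), and
    # stitch the still-lazy results back together. No chunk along `dim`
    # ever has to satisfy the window // 2 requirement.
    n = da.sizes[dim]
    means = [
        da.isel({dim: slice(start, start + window)}).mean(dim)
        for start in range(0, n - window + 1, step)
    ]
    # Each element is a lazy, dask-backed DataArray; concatenating
    # yields one lazy array of window means that computes like any
    # other dask reduction.
    return xr.concat(means, dim=dim)

# Small, fully computable example:
da = xr.DataArray(np.arange(20.0), dims="time").chunk({"time": 4})
out = rolling_mean_via_slices(da, "time", window=8)
print(out.compute())
```

The trade-off is that the task graph grows with the number of windows, so for very large arrays a `step` greater than 1, or looping over blocks of windows, keeps the graph tractable.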

reactions:
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}