issue_comments: 286181363
html_url | issue_url | id | node_id | user | created_at | updated_at | author_association | body | reactions | performed_via_github_app | issue |
---|---|---|---|---|---|---|---|---|---|---|---|
https://github.com/pydata/xarray/issues/1026#issuecomment-286181363 | https://api.github.com/repos/pydata/xarray/issues/1026 | 286181363 | MDEyOklzc3VlQ29tbWVudDI4NjE4MTM2Mw== | 1217238 | 2017-03-13T17:28:40Z | 2017-03-13T17:28:40Z | MEMBER | This is what I was looking for: So in this case (where the chunk size is already 1), dask.array.reshape could actually work fine and the error is unnecessary (we don't have the exploding-task issue). So this could potentially be fixed upstream in dask. For now, the best workaround (because you don't have any memory concerns) is to "rechunk" into a single block along the last axis before reshaping, e.g., | { "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } | | 180516114 |
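The code example that followed "e.g.," appears to have been truncated from the comment body. A minimal sketch of the described workaround, assuming an illustrative `(2, 3, 4)` array chunked with size 1 along the axes being merged (the array shape and chunking here are hypothetical, not from the original comment):

```python
import numpy as np
import dask.array as da

# A small dask array with chunk size 1 along the axes we want to merge.
x = da.from_array(np.arange(24).reshape(2, 3, 4), chunks=(1, 1, 2))

# Workaround: rechunk into a single block along the last axis before reshaping,
# so reshape does not have to split or combine chunks across that axis.
x = x.rechunk(x.chunks[:-1] + (x.shape[-1],))

# Now the reshape merging the last two axes proceeds without a chunk explosion.
y = x.reshape(2, 12)
print(y.compute().shape)  # (2, 12)
```

Because each block now spans the entire last axis, reshaping only merges whole blocks and the task graph stays proportional to the number of chunks.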