issue_comments: 522353487
html_url | issue_url | id | node_id | user | created_at | updated_at | author_association | body | reactions | performed_via_github_app | issue |
---|---|---|---|---|---|---|---|---|---|---|---|
https://github.com/pydata/xarray/issues/3225#issuecomment-522353487 | https://api.github.com/repos/pydata/xarray/issues/3225 | 522353487 | MDEyOklzc3VlQ29tbWVudDUyMjM1MzQ4Nw== | 1217238 | 2019-08-18T20:38:40Z | 2019-08-18T20:38:40Z | MEMBER | There isn't really a notion of "deep copying" a dask array. Dask assumes that everything you apply to a dask array is a pure function (though this isn't directly enforced), so if you map a mutating function over the blocks of a dask array you could potentially get undefined behavior (especially likely in the context of distributed computing). So when you tell xarray to deep copy a dask array, it currently just makes a normal copy. I agree this is a little counterintuitive, but it isn't obvious to me exactly what the right fix would look like. Perhaps we could start raising an error or warning in this case? For your specific problem, the fix is to do the copy inside the mapped function, e.g., | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |  | 481866516 |
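
The code example the comment trails off with was not captured in this export. Based on the advice given ("do the copy inside the mapped function"), it presumably resembled the following sketch; the function name `safe_mutate` and the `+ 1` mutation are illustrative assumptions, not the original code:

```python
import dask.array as da

def safe_mutate(block):
    # Copy inside the mapped function so the caller's block is never
    # mutated in place; dask assumes mapped functions are pure.
    block = block.copy()
    block += 1  # hypothetical in-place mutation, now safe on the copy
    return block

arr = da.ones((4, 4), chunks=(2, 2))
result = arr.map_blocks(safe_mutate)
```

Because each block is copied before being modified, computing `result` leaves the blocks backing `arr` untouched, which is the behavior the commenter is recommending instead of relying on `deep=True` copies of the dask-backed array.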