issues: 703881154
id | node_id | number | title | user | state | locked | assignee | milestone | comments | created_at | updated_at | closed_at | author_association | active_lock_reason | draft | pull_request | body | reactions | performed_via_github_app | state_reason | repo | type |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
703881154 | MDExOlB1bGxSZXF1ZXN0NDg4OTA4MTI5 | 4432 | Fix optimize for chunked DataArray | 1312546 | closed | 0 |  |  | 8 | 2020-09-17T20:16:08Z | 2020-09-18T13:20:45Z | 2020-09-17T23:19:23Z | MEMBER |  | 0 | pydata/xarray/pulls/4432 | Previously we generated an invalid Dask task graph, because the lines removed here dropped keys that were referenced elsewhere in the task graph. The original implementation had a comment indicating that this was to cull: https://github.com/pydata/xarray/blob/502a988ad5b87b9f3aeec3033bf55c71272e1053/xarray/core/variable.py#L384 Just spot-checking things, I think we're OK here though. Something like the sketch below. | { "url": "https://api.github.com/repos/pydata/xarray/issues/4432/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |  |  | 13221727 | pull |
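
The PR body trails off before its example. Below is a minimal sketch of the kind of spot-check it gestures at, assuming xarray's standard dask collection interface (so `dask.optimize` accepts a chunked `DataArray`); the array values, chunk size, and the `optimized` name are illustrative and not taken from the PR.

```python
# Hypothetical spot-check: optimize a chunked DataArray's task graph, then
# compute it. Per the PR body, the previously generated graph could reference
# keys that had been dropped, so computing the optimized object would fail.
import dask
import numpy as np
import xarray as xr

da = xr.DataArray(np.arange(12.0), dims="x").chunk({"x": 4}) + 1

# dask.optimize returns a tuple of optimized collections.
(optimized,) = dask.optimize(da)

# With a valid graph, the optimized object computes to the same result
# as the unoptimized one.
np.testing.assert_array_equal(optimized.compute(), da.compute())
```

If the graph still contained references to culled keys, the final `compute()` would raise a missing-key error from the dask scheduler; a clean run is the "we're OK here" check the body describes.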