issues: 621968474
| id | node_id | number | title | user | state | locked | assignee | milestone | comments | created_at | updated_at | closed_at | author_association | active_lock_reason | draft | pull_request | body | reactions | performed_via_github_app | state_reason | repo | type |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 621968474 | MDU6SXNzdWU2MjE5Njg0NzQ= | 4085 | lazy evaluation of large arrays fails | 2599958 | closed | 0 | | | 4 | 2020-05-20T17:51:02Z | 2020-05-20T19:11:57Z | 2020-05-20T19:11:56Z | NONE | | | | I have a large Dataset, including a 4D DataArray (temp) and a 3D DataArray (zeta). (The coordinates and attributes are excluded for brevity, but they match in the right ways.) When I do math operations with the 4D DataArray (temp) and the 3D DataArray (zeta), there is no problem: the operation returns an object instantly, and the result is lazily evaluated. However, if I just try to add temp to itself, it fails (eventually) as my medium-sized computer runs out of memory, since it starts to evaluate the numbers eagerly. Why can't such simple math operations between two large arrays also be lazily evaluated? | {"url": "https://api.github.com/repos/pydata/xarray/issues/4085/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | | completed | 13221727 | issue |
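Below is a minimal sketch of the behaviour described in the issue body, assuming xarray with dask installed; the dimension names, shapes, and variable construction are illustrative and not taken from the issue. Arithmetic between DataArrays stays lazy only when the underlying data are dask arrays (for example, data opened with `chunks=` or converted via `.chunk()`); operations on plain numpy-backed DataArrays are evaluated immediately.

```python
import numpy as np
import xarray as xr

# Illustrative Dataset with a 4D "temp" and a 3D "zeta" variable
# (tiny shapes here; the issue concerns arrays too large for memory).
ds = xr.Dataset(
    {
        "temp": (("time", "depth", "y", "x"), np.zeros((4, 3, 5, 5))),
        "zeta": (("time", "y", "x"), np.zeros((4, 5, 5))),
    }
)

# With plain numpy-backed variables, arithmetic is evaluated immediately.
eager = ds["temp"] + ds["temp"]
print(type(eager.data))  # <class 'numpy.ndarray'>

# Chunking swaps the numpy arrays for lazy dask arrays; the same
# arithmetic now returns instantly and is only computed on demand.
lazy = ds.chunk({"time": 1})
lazy_sum = lazy["temp"] + lazy["temp"]
print(type(lazy_sum.data))   # <class 'dask.array.core.Array'>
result = lazy_sum.compute()  # triggers the actual computation
print(type(result.data))     # <class 'numpy.ndarray'>
```

Whether the addition in the issue stays lazy therefore depends on how the Dataset was opened; for example, `xr.open_dataset(path, chunks={...})` keeps the variables dask-backed so that later arithmetic is deferred.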