issues: 321553778
| id | node_id | number | title | user | state | locked | comments | created_at | updated_at | closed_at | author_association | state_reason | repo | type |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 321553778 | MDU6SXNzdWUzMjE1NTM3Nzg= | 2109 | Dataset.expand_dims() not lazy | 206773 | closed | 0 | 2 | 2018-05-09T12:39:44Z | 2018-05-09T15:45:31Z | 2018-05-09T15:45:31Z | NONE | completed | 13221727 | issue |

**body**

The following won't come back for a very long time, or will fail with an out-of-memory error:

```python
# (code sample truncated in the export)
```

#### Problem description

When I call `Dataset.expand_dims('time')` on one of my ~2 GB (compressed) datasets, it seems to load all data into memory; memory consumption eventually climbs past 12 GB and ends in an out-of-memory exception. (Sorry for the German UI.)

#### Expected Output

#### Output of

**reactions**

```json
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2109/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
```
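For context, the laziness the reporter expected can be sketched at the NumPy level: `numpy.expand_dims` adds a length-1 axis as a zero-copy view rather than materializing new data. This is a minimal illustration (the array shape here is arbitrary), not the xarray code path from the issue:

```python
import numpy as np

# A moderately large array; expanding its dims should not copy it.
a = np.zeros((1000, 1000))

# Insert a new leading axis of length 1.
b = np.expand_dims(a, axis=0)

assert b.shape == (1, 1000, 1000)
# b is a view onto a's buffer, not a copy: no new data was allocated.
assert b.base is a

# Writes through the view are visible in the original array.
b[0, 0, 0] = 42.0
assert a[0, 0] == 42.0
```

In xarray, out-of-core behavior is typically obtained by opening the dataset with dask chunks (e.g. `xr.open_dataset(path, chunks={})`) so that operations like `expand_dims` stay lazy; per the table above, this issue was closed as completed the same day it was filed.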
