issues: 277538485
field | value
---|---
id | 277538485
node_id | MDU6SXNzdWUyNzc1Mzg0ODU=
number | 1745
title | open_mfdataset() memory error in v0.10
user | 22665917
state | closed
locked | 0
comments | 24
created_at | 2017-11-28T21:08:23Z
updated_at | 2019-01-13T01:51:43Z
closed_at | 2019-01-13T01:51:43Z
author_association | NONE
state_reason | completed
repo | 13221727
type | issue
reactions | { "url": "https://api.github.com/repos/pydata/xarray/issues/1745/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }

body:

**Code Sample**

```python
import xarray

ncfiles = '/example/path/to/wrf/netcdfs/*'
dropvars = ['list', 'of', 'many', 'vars', 'to', 'drop']
dset = xarray.open_mfdataset(ncfiles, drop_variables=dropvars,
                             concat_dim='Time')
```

**Problem description**

I am trying to load 73 model (WRF) output files using `open_mfdataset()`. When I run the above code with v0.9.6, it completes in roughly 7 seconds. But with v0.10, it crashes with the following error:

(error traceback not preserved in this export)

which, as I understand, means I'm exceeding my memory allocation. Any thoughts on what could be the source of this issue?

**Output of `xr.show_versions()`**

(not preserved in this export)
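For context on the workflow in this issue: `open_mfdataset` opens many netCDF files, optionally drops variables from each, and concatenates the rest along a dimension. A minimal, self-contained sketch of that same drop-then-concatenate pattern, built from in-memory datasets via `xarray.concat` so it runs without any WRF files on disk (the variable names `T2` and `QVAPOR` here are hypothetical stand-ins; `drop_vars` is the modern spelling of the older `drop` method):

```python
import numpy as np
import xarray as xr

def make_dataset(t0):
    # One tiny in-memory "file" standing in for a single WRF output file.
    return xr.Dataset(
        {
            "T2": (("Time", "y", "x"), np.random.rand(1, 4, 4)),
            "QVAPOR": (("Time", "y", "x"), np.random.rand(1, 4, 4)),
        },
        coords={"Time": [t0]},
    )

datasets = [make_dataset(t) for t in range(3)]

# Drop unneeded variables per dataset *before* concatenating along Time,
# which mirrors what drop_variables= in open_mfdataset does per file.
dropvars = ["QVAPOR"]
combined = xr.concat([ds.drop_vars(dropvars) for ds in datasets], dim="Time")

print(combined.sizes["Time"])   # 3
print("QVAPOR" in combined)     # False
```

Dropping variables before concatenation keeps the combined dataset small; with on-disk files, `open_mfdataset` additionally loads each file lazily through dask, so memory behavior also depends on chunking.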