issues: 326553877
id | node_id | number | title | user | state | locked | assignee | milestone | comments | created_at | updated_at | closed_at | author_association | active_lock_reason | draft | pull_request | body | reactions | performed_via_github_app | state_reason | repo | type |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
326553877 | MDU6SXNzdWUzMjY1NTM4Nzc= | 2187 | open_dataset crash with long filenames | 16655388 | closed | 0 | 2 | 2018-05-25T14:47:31Z | 2018-05-29T14:43:50Z | 2018-05-29T14:42:35Z | NONE | #### Code Sample

```python
import xarray as xr
import shutil
import numpy as np

# create a netcdf file
data = np.random.rand(4, 3)
foo = xr.DataArray(data)
foo.to_netcdf('test.nc')

f_nc = 'a.nc'
shutil.copy('test.nc', f_nc)

while 1:
    print('{:05n} characters'.format(len(f_nc)))
    ds1 = xr.open_dataset(f_nc)
    ds1.close()
    # grow the filename by one character and try again
    nf_nc = 'a' + f_nc
    shutil.move(f_nc, nf_nc)
    f_nc = nf_nc
```

#### Problem description

On my Linux machine (CentOS) this code crashes (memory corruption) when the filename length hits 32 characters. On my OSX machine it runs fine until 255 characters and then stops with an IOError.

Output of
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/2187/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | 13221727 | issue |
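The 255-character point where the OSX run stops with an IOError matches the per-component filename limit most filesystems enforce, which suggests that failure mode is the OS rejecting the name rather than a crash in the library. A minimal sketch of querying that limit from Python, assuming a POSIX system (`os.pathconf` is not available on Windows):

```python
import os

# PC_NAME_MAX is the maximum length of a single path component
# (one filename) on the filesystem containing the given path.
# Commonly 255 bytes on ext4, XFS, APFS, and HFS+.
name_max = os.pathconf('.', 'PC_NAME_MAX')
print(name_max)
```

A filename longer than this limit will make `open()` (and anything built on it) fail with an OSError/IOError before any file I/O happens, which is consistent with the clean stop on OSX; the crash at 32 characters on CentOS points instead at a fixed-size buffer somewhere in the netCDF stack.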