html_url,issue_url,id,node_id,user,created_at,updated_at,author_association,body,reactions,performed_via_github_app,issue
https://github.com/pydata/xarray/pull/1198#issuecomment-287099757,https://api.github.com/repos/pydata/xarray/issues/1198,287099757,MDEyOklzc3VlQ29tbWVudDI4NzA5OTc1Nw==,731499,2017-03-16T15:46:14Z,2017-03-16T15:46:14Z,CONTRIBUTOR,"With that keyword, `open_mfdataset` goes through (sketched below)!
My code crashes later because the data is too big to fit in memory, but that's another problem ;-)","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,199900056
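The comment above does not name the keyword; a minimal sketch follows, assuming it is the `autoclose` option this PR adds, with a made-up file pattern and variable name.

```python
# Minimal sketch, assuming the keyword discussed above is autoclose=True
# (the option added by this PR); "output_*.nc" and "temperature" are
# placeholder names, not taken from the comment.
import xarray as xr

# With autoclose=True, each file is closed again after it is read, so the
# number of simultaneously open file handles stays small even when
# combining well over a thousand files.
ds = xr.open_mfdataset("output_*.nc", autoclose=True)

# Reduce lazily (via dask) before pulling data into memory, since the full
# concatenated dataset may be too large to fit in RAM.
time_mean = ds["temperature"].mean(dim="time")
print(time_mean.load())
```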
https://github.com/pydata/xarray/pull/1198#issuecomment-287058651,https://api.github.com/repos/pydata/xarray/issues/1198,287058651,MDEyOklzc3VlQ29tbWVudDI4NzA1ODY1MQ==,731499,2017-03-16T13:37:05Z,2017-03-16T13:37:05Z,CONTRIBUTOR,"Hey @pwolfram, I installed your branch and tried `open_mfdataset` on 1474 HDF5 files.
I got the following traceback:
```
File ""/users/noel/.conda/envs/python3/lib/python3.6/site-packages/xarray/backends/api.py"", line 524, in open_mfdataset
File ""/users/noel/.conda/envs/python3/lib/python3.6/site-packages/xarray/backends/api.py"", line 524, in
File ""/users/noel/.conda/envs/python3/lib/python3.6/site-packages/xarray/backends/api.py"", line 299, in open_dataset
File ""/users/noel/.conda/envs/python3/lib/python3.6/site-packages/xarray/backends/netCDF4_.py"", line 203, in __init__
File ""/users/noel/.conda/envs/python3/lib/python3.6/site-packages/xarray/backends/netCDF4_.py"", line 178, in _open_netcdf4_group
File ""netCDF4/_netCDF4.pyx"", line 1848, in netCDF4._netCDF4.Dataset.__init__ (netCDF4/_netCDF4.c:13992)
OSError: Too many open files
```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,199900056
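The ""Too many open files"" error above comes from the per-process limit on open file descriptors, which opening 1474 files at once can exceed. A small standard-library diagnostic sketch (not part of the PR, and assumed to run on the same machine as the failing job) for checking that limit:

```python
# Diagnostic sketch: check the per-process limit on open file descriptors
# that the traceback above runs into.
import resource

soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
print(f"open-file descriptor limits: soft={soft}, hard={hard}")
# On many Linux systems the soft limit defaults to 1024, which is fewer
# than the 1474 files being opened simultaneously here.
```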
https://github.com/pydata/xarray/pull/1198#issuecomment-277276242,https://api.github.com/repos/pydata/xarray/issues/1198,277276242,MDEyOklzc3VlQ29tbWVudDI3NzI3NjI0Mg==,731499,2017-02-03T15:27:04Z,2017-02-03T15:27:04Z,CONTRIBUTOR,"I'm just chiming in to register my interest in seeing this issue solved. I have just hit ""OSError: Too many open files"". The data itself is not even huge, but it is scattered across many files, and it's a PITA to fall back to manual concatenation (roughly sketched below) -- I've grown used to dask doing the work for me ;-)","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,199900056
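For illustration, a rough sketch of the ""manual concatenation"" fallback mentioned in the comment above; the glob pattern and concatenation dimension are assumptions, not taken from the comment.

```python
# Rough sketch of the manual-concatenation fallback mentioned above
# (illustrative only; "output_*.nc" and dim="time" are assumed names).
import glob
import xarray as xr

datasets = []
for path in sorted(glob.glob("output_*.nc")):
    # Open each file, load it eagerly, and close it right away so that
    # only one file handle is open at any given time.
    with xr.open_dataset(path) as ds:
        datasets.append(ds.load())

# Stitch the per-file datasets back together along the record dimension.
combined = xr.concat(datasets, dim="time")
```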