html_url,issue_url,id,node_id,user,created_at,updated_at,author_association,body,reactions,performed_via_github_app,issue
https://github.com/pydata/xarray/issues/2995#issuecomment-518869785,https://api.github.com/repos/pydata/xarray/issues/2995,518869785,MDEyOklzc3VlQ29tbWVudDUxODg2OTc4NQ==,1117224,2019-08-06T22:39:07Z,2019-08-06T22:39:07Z,NONE,Is it possible to read multiple netCDF files on S3 using open_mfdataset?,"{""total_count"": 3, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 3}",,449706080
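One pattern that appears to work for this is to open the S3 objects with s3fs and hand the resulting file-like objects to open_mfdataset via the h5netcdf engine. A minimal sketch, assuming a publicly readable bucket and that h5netcdf is installed; the bucket name and key pattern are placeholders:
```python
import s3fs
import xarray as xr

# Anonymous access to a public bucket; pass credentials otherwise.
fs = s3fs.S3FileSystem(anon=True)

# Hypothetical bucket/prefix; replace with the real object keys.
keys = fs.glob("my-bucket/data/*.nc")

# open_mfdataset forwards each file-like object to open_dataset,
# and the h5netcdf engine can read file-like objects directly.
ds = xr.open_mfdataset(
    [fs.open(k) for k in keys],
    engine="h5netcdf",
    combine="by_coords",
)
```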
https://github.com/pydata/xarray/issues/2273#issuecomment-409349569,https://api.github.com/repos/pydata/xarray/issues/2273,409349569,MDEyOklzc3VlQ29tbWVudDQwOTM0OTU2OQ==,1117224,2018-07-31T20:02:57Z,2018-07-31T20:03:41Z,NONE,"Ah, thanks @jhamman! (Updated to 10.8 and can confirm warnings are suppressed)","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,339611449
https://github.com/pydata/xarray/issues/2273#issuecomment-409342489,https://api.github.com/repos/pydata/xarray/issues/2273,409342489,MDEyOklzc3VlQ29tbWVudDQwOTM0MjQ4OQ==,1117224,2018-07-31T19:38:09Z,2018-07-31T19:38:09Z,NONE,"For anyone else looking for a temporary fix to hide these warnings (they were spamming my output, making debugging difficult):
`import warnings`
`warnings.simplefilter(action='ignore', category=UserWarning) `","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,339611449
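The filter above silences every UserWarning for the rest of the session. If the goal is only to quiet the noisy open_mfdataset call, a scoped variant using warnings.catch_warnings() keeps unrelated warnings visible; a small sketch with a hypothetical glob path:
```python
import warnings
import xarray as xr

# Suppress UserWarnings only around the call that emits them,
# instead of globally for the whole session.
with warnings.catch_warnings():
    warnings.simplefilter("ignore", category=UserWarning)
    ds = xr.open_mfdataset("data/*.nc")  # hypothetical path
```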
https://github.com/pydata/xarray/issues/1856#issuecomment-398481880,https://api.github.com/repos/pydata/xarray/issues/1856,398481880,MDEyOklzc3VlQ29tbWVudDM5ODQ4MTg4MA==,1117224,2018-06-19T17:33:03Z,2018-06-19T17:33:03Z,NONE,Also hitting this issue. (Use case: formatting netCDF files for some R code that does not have labeled indexing... ugh). Thanks @phausamann for the workaround. Default transposing of coords makes sense to me.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,291485366
https://github.com/pydata/xarray/pull/1983#issuecomment-382071801,https://api.github.com/repos/pydata/xarray/issues/1983,382071801,MDEyOklzc3VlQ29tbWVudDM4MjA3MTgwMQ==,1117224,2018-04-17T17:14:33Z,2018-04-17T17:38:42Z,NONE,"Thanks @jhamman for working on this! I did a test on my real-world data (1202 ~3 MB files) on my local computer and am not getting the results I expected:
1) No speed-up with parallel=True
2) _Slow down_ when using distributed (processes=16, cores=16).
Am I missing something?
```python
nc_files = glob.glob(E.obs['NSIDC_0081']['sipn_nc']+'/*.nc')
print(len(nc_files))
1202
# Parallel False
%time ds = xr.open_mfdataset(nc_files, concat_dim='time', parallel=False, autoclose=True)
CPU times: user 57.8 s, sys: 3.2 s, total: 1min 1s
Wall time: 1min
# Parallel True with default scheduler
%time ds = xr.open_mfdataset(nc_files, concat_dim='time', parallel=True, autoclose=True)
CPU times: user 1min 16s, sys: 9.82 s, total: 1min 26s
Wall time: 1min 16s
# Parallel True with distributed
from dask.distributed import Client
client = Client()
print(client)
%time ds = xr.open_mfdataset(nc_files, concat_dim='time', parallel=True, autoclose=True)
CPU times: user 2min 17s, sys: 12.3 s, total: 2min 29s
Wall time: 3min 48s
```
On feature/parallel_open_netcdf commit 280a46f13426a462fb3e983cfd5ac7a0565d1826","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,304589831
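With 1202 files of only ~3 MB each, the per-task scheduling and serialization overhead can easily outweigh any parallel I/O gain, which is consistent with the slowdown reported above. One tuning step worth trying is to set the distributed Client's worker layout explicitly rather than accepting the defaults. A sketch under that assumption (the path and worker counts are illustrative, and current xarray requires combine="nested" when concat_dim is given explicitly):
```python
import glob
import xarray as xr
from dask.distributed import Client

# Fewer processes with a few threads each can cut per-task
# serialization overhead when every file is small.
client = Client(n_workers=4, threads_per_worker=4)

nc_files = sorted(glob.glob("obs/*.nc"))  # hypothetical path

ds = xr.open_mfdataset(
    nc_files,
    combine="nested",
    concat_dim="time",
    parallel=True,
)
```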
https://github.com/pydata/xarray/pull/1070#issuecomment-279016156,https://api.github.com/repos/pydata/xarray/issues/1070,279016156,MDEyOklzc3VlQ29tbWVudDI3OTAxNjE1Ng==,1117224,2017-02-10T17:54:13Z,2017-02-10T17:54:13Z,NONE,"Hi @fmaussion, no objections here. I just barely got it working for my project, and won't have time in the near future to devote to wrapping this up.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,186326698
https://github.com/pydata/xarray/pull/961#issuecomment-271420374,https://api.github.com/repos/pydata/xarray/issues/961,271420374,MDEyOklzc3VlQ29tbWVudDI3MTQyMDM3NA==,1117224,2017-01-09T21:57:13Z,2017-01-09T21:57:13Z,NONE,"NumPy's datetime64 dtype, currently used by xarray, does not store time zone information, as mentioned in #552. To prevent users from making time zone errors upon dataset creation, I think the implied assumption that UTC is used should be made more apparent in the readthedocs. Hopefully in the future it can be added to datetime64?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,170688064
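One way to honor the implicit UTC assumption at dataset-creation time is to normalize timezone-aware timestamps with pandas before assigning them as a coordinate. A small illustrative sketch (the timezone and dates are made up):
```python
import pandas as pd
import xarray as xr

# Timestamps recorded in a local timezone (illustrative).
times = pd.date_range("2017-01-01", periods=3, freq="h", tz="US/Pacific")

# Convert to UTC, then drop the tz info so the values fit
# numpy's tz-naive datetime64 used by xarray.
times_utc = times.tz_convert("UTC").tz_localize(None)

ds = xr.Dataset(coords={"time": times_utc})
```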
https://github.com/pydata/xarray/pull/1070#issuecomment-257401393,https://api.github.com/repos/pydata/xarray/issues/1070,257401393,MDEyOklzc3VlQ29tbWVudDI1NzQwMTM5Mw==,1117224,2016-10-31T19:52:56Z,2016-10-31T19:52:56Z,NONE,"Any idea why segmentation faults occur for 3.4 and 4.5?
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,186326698
https://github.com/pydata/xarray/pull/1070#issuecomment-257385375,https://api.github.com/repos/pydata/xarray/issues/1070,257385375,MDEyOklzc3VlQ29tbWVudDI1NzM4NTM3NQ==,1117224,2016-10-31T18:51:46Z,2016-10-31T18:51:46Z,NONE,"The Travis-ci failure is because it can't find rasterio, which comes through the conda-forge channel (https://github.com/conda-forge/rasterio-feedstock). I think it needs to be added as described here (http://conda.pydata.org/docs/travis.html#additional-steps). But I am new to Travis-ci, so I don't want to mess up the current .travis.yml file.
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,186326698
https://github.com/pydata/xarray/pull/1070#issuecomment-257373530,https://api.github.com/repos/pydata/xarray/issues/1070,257373530,MDEyOklzc3VlQ29tbWVudDI1NzM3MzUzMA==,1117224,2016-10-31T18:10:52Z,2016-10-31T18:10:52Z,NONE,"Tested open_mfdataset() on 100+ GeoTIFFs, and lazy loading with rasterio does appear to be working.
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,186326698
https://github.com/pydata/xarray/issues/970#issuecomment-240249797,https://api.github.com/repos/pydata/xarray/issues/970,240249797,MDEyOklzc3VlQ29tbWVudDI0MDI0OTc5Nw==,1117224,2016-08-16T21:46:43Z,2016-08-16T21:46:43Z,NONE,"Yes, that is a perfect solution, thank you!
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,171504099