html_url,issue_url,id,node_id,user,created_at,updated_at,author_association,body,reactions,performed_via_github_app,issue
https://github.com/pydata/xarray/issues/919#issuecomment-235742069,https://api.github.com/repos/pydata/xarray/issues/919,235742069,MDEyOklzc3VlQ29tbWVudDIzNTc0MjA2OQ==,17951292,2016-07-27T22:32:36Z,2016-07-27T22:32:36Z,NONE,"Yes indeed. I'm embarrassed I even posted this! It looks like the nbnds and time_bnds variables were added to the yearly files at some point. Accordingly, a simple conditional checking for their existence and deleting them prior to concatenation did the trick. Too bad there is absolutely no consistency in these files at all.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,167684282
https://github.com/pydata/xarray/issues/900#issuecomment-233504857,https://api.github.com/repos/pydata/xarray/issues/900,233504857,MDEyOklzc3VlQ29tbWVudDIzMzUwNDg1Nw==,17951292,2016-07-19T01:17:43Z,2016-07-19T01:17:43Z,NONE,"Thanks, Stephan - that was very helpful! I was able to carry out the necessary computations with no problems after seeing your reply.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,166195300
https://github.com/pydata/xarray/issues/894#issuecomment-230284327,https://api.github.com/repos/pydata/xarray/issues/894,230284327,MDEyOklzc3VlQ29tbWVudDIzMDI4NDMyNw==,17951292,2016-07-04T12:56:10Z,2016-07-04T12:56:10Z,NONE,"That did the trick!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,163414759
https://github.com/pydata/xarray/issues/891#issuecomment-229762102,https://api.github.com/repos/pydata/xarray/issues/891,229762102,MDEyOklzc3VlQ29tbWVudDIyOTc2MjEwMg==,17951292,2016-06-30T19:22:43Z,2016-06-30T19:22:43Z,NONE,"Thanks, Stephan! Enjoy your vacation!
For now I am processing multiple files over an OPeNDAP connection using netCDF4.MFDataset and then building a DataArray or Dataset from the variables stored within the resulting MFDataset object. This appears much faster than the process recommended here when many large files are involved: http://xarray.pydata.org/en/stable/io.html (i.e., using read_netcdfs). Once a bug fix is implemented, I'll try using the dask-optimized open_mfdataset().","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,162726984