html_url,issue_url,id,node_id,user,created_at,updated_at,author_association,body,reactions,performed_via_github_app,issue
https://github.com/pydata/xarray/pull/1038#issuecomment-290601925,https://api.github.com/repos/pydata/xarray/issues/1038,290601925,MDEyOklzc3VlQ29tbWVudDI5MDYwMTkyNQ==,4295853,2017-03-31T02:53:30Z,2017-03-31T02:53:30Z,CONTRIBUTOR,"@shoyer, the tests should be restarted following the merge of #1336, and this PR should then be ready to merge.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,181033674
https://github.com/pydata/xarray/pull/1038#issuecomment-289873207,https://api.github.com/repos/pydata/xarray/issues/1038,289873207,MDEyOklzc3VlQ29tbWVudDI4OTg3MzIwNw==,4295853,2017-03-28T19:07:50Z,2017-03-28T19:07:50Z,CONTRIBUTOR,See #1336 for a fix that disables these tests that have been acting up because of resource issues.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,181033674
https://github.com/pydata/xarray/pull/1038#issuecomment-289116553,https://api.github.com/repos/pydata/xarray/issues/1038,289116553,MDEyOklzc3VlQ29tbWVudDI4OTExNjU1Mw==,4295853,2017-03-24T19:04:42Z,2017-03-24T19:04:42Z,CONTRIBUTOR,"Crash in the same place... but when I restarted it via a force push earlier it passed, which would imply we are running out of resources on travis.
Maybe the thing to do is just to reset the open file limit, as @rabernat suggested; that way it provides a factor of safety on Travis.
Thoughts on this idea @shoyer and @fmaussion?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,181033674
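A minimal sketch of the open-file-limit reset discussed in the comment above, assuming the standard-library `resource` module on Linux/macOS; the target value of 4096 is illustrative and not taken from the PR:
```python
# Sketch only: raise the soft limit on open file descriptors toward the hard
# limit so tests that open many files have some headroom on CI.
import resource

soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
# 4096 is an illustrative target; never exceed the hard limit.
target = 4096 if hard == resource.RLIM_INFINITY else min(4096, hard)
if soft < target:
    resource.setrlimit(resource.RLIMIT_NOFILE, (target, hard))
```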
https://github.com/pydata/xarray/pull/1038#issuecomment-289114015,https://api.github.com/repos/pydata/xarray/issues/1038,289114015,MDEyOklzc3VlQ29tbWVudDI4OTExNDAxNQ==,4295853,2017-03-24T18:54:45Z,2017-03-24T18:54:45Z,CONTRIBUTOR,Is it possible that the test fails when more than one instance is run simultaneously on the same node? Could you restart the other tests to verify (restarting them at the same time if possible)?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,181033674
https://github.com/pydata/xarray/pull/1038#issuecomment-289113304,https://api.github.com/repos/pydata/xarray/issues/1038,289113304,MDEyOklzc3VlQ29tbWVudDI4OTExMzMwNA==,4295853,2017-03-24T18:52:07Z,2017-03-24T18:52:07Z,CONTRIBUTOR,"Still passing locally...
```
xarray/tests/test_backends.py::OpenMFDatasetManyFilesTest::test_1_autoclose_netcdf4 PASSED
xarray/tests/test_backends.py::OpenMFDatasetManyFilesTest::test_1_open_large_num_files_netcdf4 PASSED
xarray/tests/test_backends.py::OpenMFDatasetManyFilesTest::test_2_autoclose_scipy PASSED
xarray/tests/test_backends.py::OpenMFDatasetManyFilesTest::test_2_open_large_num_files_scipy PASSED
xarray/tests/test_backends.py::OpenMFDatasetManyFilesTest::test_3_autoclose_pynio PASSED
xarray/tests/test_backends.py::OpenMFDatasetManyFilesTest::test_3_open_large_num_files_pynio PASSED
```
The test passes even when I run it multiple times.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,181033674
https://github.com/pydata/xarray/pull/1038#issuecomment-289111963,https://api.github.com/repos/pydata/xarray/issues/1038,289111963,MDEyOklzc3VlQ29tbWVudDI4OTExMTk2Mw==,4295853,2017-03-24T18:46:49Z,2017-03-24T18:47:26Z,CONTRIBUTOR,I'm continuing to take a look; my tests were not fully set up locally on this branch and I'll see if I can reproduce the sporadic error on macOS.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,181033674
https://github.com/pydata/xarray/pull/1038#issuecomment-289109435,https://api.github.com/repos/pydata/xarray/issues/1038,289109435,MDEyOklzc3VlQ29tbWVudDI4OTEwOTQzNQ==,4295853,2017-03-24T18:36:35Z,2017-03-24T18:36:35Z,CONTRIBUTOR,"@shoyer, should I do a quick ""hot fix"" and then try to sort out the problem?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,181033674
https://github.com/pydata/xarray/pull/1038#issuecomment-289108876,https://api.github.com/repos/pydata/xarray/issues/1038,289108876,MDEyOklzc3VlQ29tbWVudDI4OTEwODg3Ng==,4295853,2017-03-24T18:34:13Z,2017-03-24T18:34:13Z,CONTRIBUTOR,"It happened here too... I just tried it out on my local machine via `conda env create -f ci/requirements-py27-cdat+pynio.yml` and wasn't able to get an error... are any of the crashes more informative than a ""seg fault""?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,181033674
https://github.com/pydata/xarray/pull/1038#issuecomment-289080885,https://api.github.com/repos/pydata/xarray/issues/1038,289080885,MDEyOklzc3VlQ29tbWVudDI4OTA4MDg4NQ==,4295853,2017-03-24T16:58:43Z,2017-03-24T16:58:43Z,CONTRIBUTOR,"@shoyer, added a test as requested.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,181033674
https://github.com/pydata/xarray/pull/1038#issuecomment-288433091,https://api.github.com/repos/pydata/xarray/issues/1038,288433091,MDEyOklzc3VlQ29tbWVudDI4ODQzMzA5MQ==,4295853,2017-03-22T15:21:07Z,2017-03-22T15:21:07Z,CONTRIBUTOR,"Provided the checks pass, this should be ready to merge, @fmaussion, unless @shoyer has any additional recommended changes.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,181033674
https://github.com/pydata/xarray/pull/1038#issuecomment-288432628,https://api.github.com/repos/pydata/xarray/issues/1038,288432628,MDEyOklzc3VlQ29tbWVudDI4ODQzMjYyOA==,4295853,2017-03-22T15:19:45Z,2017-03-22T15:19:45Z,CONTRIBUTOR,"Note: I would say that `open_mfdataset` is no longer experimental, given its widespread use.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,181033674
https://github.com/pydata/xarray/pull/1038#issuecomment-288423681,https://api.github.com/repos/pydata/xarray/issues/1038,288423681,MDEyOklzc3VlQ29tbWVudDI4ODQyMzY4MQ==,4295853,2017-03-22T14:52:34Z,2017-03-22T14:52:34Z,CONTRIBUTOR,"@fmaussion and @shoyer, I'd like to close out this PR if possible. I'm not 100% sure it is worthwhile to complete in a general fashion because of the ambiguity in how best to handle this issue. My current take would be to go with whatever is simplest / cleanest, at least in the short term, which is @fmaussion's suggestion above. Does this work for you both?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,181033674
https://github.com/pydata/xarray/pull/1038#issuecomment-251571258,https://api.github.com/repos/pydata/xarray/issues/1038,251571258,MDEyOklzc3VlQ29tbWVudDI1MTU3MTI1OA==,4295853,2016-10-05T03:09:38Z,2016-10-05T03:09:38Z,CONTRIBUTOR,"@shoyer, I did some more digging and can see some of the potential issues: some of the concatenation / merging is done quasi-automatically, which reduces the number of objects that must be merged (e.g., https://github.com/pydata/xarray/blob/master/xarray/core/combine.py#L391). I'm assuming this is done for performance / simplicity. Is that true?
The more I look at this, the more it looks like a much larger piece of work, because the information has already been compressed by the time `merge` is called (i.e., `len(dict_like_objects)` is not necessarily equal to the number of input files; see https://github.com/pydata/xarray/blob/master/xarray/core/merge.py#L531).
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,181033674
https://github.com/pydata/xarray/pull/1038#issuecomment-251565041,https://api.github.com/repos/pydata/xarray/issues/1038,251565041,MDEyOklzc3VlQ29tbWVudDI1MTU2NTA0MQ==,4295853,2016-10-05T02:14:53Z,2016-10-05T02:14:53Z,CONTRIBUTOR,"@shoyer, it sounds like data provenance is an outstanding long-term problem. I'm happy to just copy attributes from the first dataset, but am wondering what it would take to do this correctly, i.e., the ""overhaul"". Any information you have on this would be really helpful. At a minimum, we can do as you suggest to fix the lack of attributes (#1037).
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,181033674