html_url,issue_url,id,node_id,user,created_at,updated_at,author_association,body,reactions,performed_via_github_app,issue
https://github.com/pydata/xarray/pull/1038#issuecomment-290604158,https://api.github.com/repos/pydata/xarray/issues/1038,290604158,MDEyOklzc3VlQ29tbWVudDI5MDYwNDE1OA==,1217238,2017-03-31T03:11:00Z,2017-03-31T03:11:00Z,MEMBER,"OK, going to merge this anyways... the failing tests will be fixed by #1366","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,181033674
https://github.com/pydata/xarray/pull/1038#issuecomment-290601925,https://api.github.com/repos/pydata/xarray/issues/1038,290601925,MDEyOklzc3VlQ29tbWVudDI5MDYwMTkyNQ==,4295853,2017-03-31T02:53:30Z,2017-03-31T02:53:30Z,CONTRIBUTOR,"@shoyer, the tests should be restarted following the merge of #1336, and this PR should be ready to merge.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,181033674
https://github.com/pydata/xarray/pull/1038#issuecomment-289873207,https://api.github.com/repos/pydata/xarray/issues/1038,289873207,MDEyOklzc3VlQ29tbWVudDI4OTg3MzIwNw==,4295853,2017-03-28T19:07:50Z,2017-03-28T19:07:50Z,CONTRIBUTOR,See #1336 for a fix that disables the tests that have been acting up because of resource issues.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,181033674
https://github.com/pydata/xarray/pull/1038#issuecomment-289116553,https://api.github.com/repos/pydata/xarray/issues/1038,289116553,MDEyOklzc3VlQ29tbWVudDI4OTExNjU1Mw==,4295853,2017-03-24T19:04:42Z,2017-03-24T19:04:42Z,CONTRIBUTOR,"Crash in the same place... but when I restarted it via a force push earlier it passed, which would imply we are running out of resources on Travis. Maybe the thing to do is just to reset the open file limit as @rabernat suggested; that way it provides a factor of safety on Travis. Thoughts on this idea, @shoyer and @fmaussion?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,181033674
https://github.com/pydata/xarray/pull/1038#issuecomment-289114272,https://api.github.com/repos/pydata/xarray/issues/1038,289114272,MDEyOklzc3VlQ29tbWVudDI4OTExNDI3Mg==,1217238,2017-03-24T18:55:41Z,2017-03-24T18:55:41Z,MEMBER,"Just restarted, let's see...","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,181033674
https://github.com/pydata/xarray/pull/1038#issuecomment-289114015,https://api.github.com/repos/pydata/xarray/issues/1038,289114015,MDEyOklzc3VlQ29tbWVudDI4OTExNDAxNQ==,4295853,2017-03-24T18:54:45Z,2017-03-24T18:54:45Z,CONTRIBUTOR,Is it possible that the test fails if more than one is simultaneously run on the same node? Could you restart the other tests to verify (restarting at the same time if possible)?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,181033674
https://github.com/pydata/xarray/pull/1038#issuecomment-289113931,https://api.github.com/repos/pydata/xarray/issues/1038,289113931,MDEyOklzc3VlQ29tbWVudDI4OTExMzkzMQ==,1217238,2017-03-24T18:54:24Z,2017-03-24T18:54:24Z,MEMBER,"Travis is a shared environment that runs multiple tests concurrently.
It's possible that we're running out of files due to other users or even other variants of our same build.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,181033674
https://github.com/pydata/xarray/pull/1038#issuecomment-289113304,https://api.github.com/repos/pydata/xarray/issues/1038,289113304,MDEyOklzc3VlQ29tbWVudDI4OTExMzMwNA==,4295853,2017-03-24T18:52:07Z,2017-03-24T18:52:07Z,CONTRIBUTOR,"Still passing locally...
```
xarray/tests/test_backends.py::OpenMFDatasetManyFilesTest::test_1_autoclose_netcdf4 PASSED
xarray/tests/test_backends.py::OpenMFDatasetManyFilesTest::test_1_open_large_num_files_netcdf4 PASSED
xarray/tests/test_backends.py::OpenMFDatasetManyFilesTest::test_2_autoclose_scipy PASSED
xarray/tests/test_backends.py::OpenMFDatasetManyFilesTest::test_2_open_large_num_files_scipy PASSED
xarray/tests/test_backends.py::OpenMFDatasetManyFilesTest::test_3_autoclose_pynio PASSED
xarray/tests/test_backends.py::OpenMFDatasetManyFilesTest::test_3_open_large_num_files_pynio PASSED
```
The test passes even if I run it multiple times, too.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,181033674
https://github.com/pydata/xarray/pull/1038#issuecomment-289111963,https://api.github.com/repos/pydata/xarray/issues/1038,289111963,MDEyOklzc3VlQ29tbWVudDI4OTExMTk2Mw==,4295853,2017-03-24T18:46:49Z,2017-03-24T18:47:26Z,CONTRIBUTOR,I'm continuing to take a look; my tests were not fully set up locally on this branch and I'll see if I can reproduce the sporadic error on macOS.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,181033674
https://github.com/pydata/xarray/pull/1038#issuecomment-289109435,https://api.github.com/repos/pydata/xarray/issues/1038,289109435,MDEyOklzc3VlQ29tbWVudDI4OTEwOTQzNQ==,4295853,2017-03-24T18:36:35Z,2017-03-24T18:36:35Z,CONTRIBUTOR,"@shoyer, should I do a quick ""hot fix"" and then try to sort out the problem?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,181033674
https://github.com/pydata/xarray/pull/1038#issuecomment-289109287,https://api.github.com/repos/pydata/xarray/issues/1038,289109287,MDEyOklzc3VlQ29tbWVudDI4OTEwOTI4Nw==,1217238,2017-03-24T18:35:56Z,2017-03-24T18:35:56Z,MEMBER,"@pwolfram, if we're getting sporadic failures on Travis, it's probably better to skip the test by default. It's important for the test suite not to be flaky.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,181033674
https://github.com/pydata/xarray/pull/1038#issuecomment-289108876,https://api.github.com/repos/pydata/xarray/issues/1038,289108876,MDEyOklzc3VlQ29tbWVudDI4OTEwODg3Ng==,4295853,2017-03-24T18:34:13Z,2017-03-24T18:34:13Z,CONTRIBUTOR,"It happened here too... I just tried it out on my local machine via `conda env create -f ci/requirements-py27-cdat+pynio.yml` and wasn't able to get an error...
Are any of the crashes better than a ""seg fault""?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,181033674
https://github.com/pydata/xarray/pull/1038#issuecomment-289108562,https://api.github.com/repos/pydata/xarray/issues/1038,289108562,MDEyOklzc3VlQ29tbWVudDI4OTEwODU2Mg==,10050469,2017-03-24T18:33:01Z,2017-03-24T18:33:01Z,MEMBER,"Yes, it also happened on this PR: https://github.com/pydata/xarray/pull/1328","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,181033674
https://github.com/pydata/xarray/pull/1038#issuecomment-289108229,https://api.github.com/repos/pydata/xarray/issues/1038,289108229,MDEyOklzc3VlQ29tbWVudDI4OTEwODIyOQ==,1217238,2017-03-24T18:31:36Z,2017-03-24T18:31:36Z,MEMBER,"It looks like one of the new many-files tests is crashing:

xarray/tests/test_backends.py::OpenMFDatasetManyFilesTest::test_3_open_large_num_files_pynio
/home/travis/build.sh: line 62: 1561 Segmentation fault (core dumped) py.test xarray --cov=xarray --cov-report term-missing --verbose

https://travis-ci.org/pydata/xarray/jobs/214722901","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,181033674
https://github.com/pydata/xarray/pull/1038#issuecomment-289080885,https://api.github.com/repos/pydata/xarray/issues/1038,289080885,MDEyOklzc3VlQ29tbWVudDI4OTA4MDg4NQ==,4295853,2017-03-24T16:58:43Z,2017-03-24T16:58:43Z,CONTRIBUTOR,"@shoyer, added a test as requested.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,181033674
https://github.com/pydata/xarray/pull/1038#issuecomment-288456001,https://api.github.com/repos/pydata/xarray/issues/1038,288456001,MDEyOklzc3VlQ29tbWVudDI4ODQ1NjAwMQ==,1217238,2017-03-22T16:26:02Z,2017-03-22T16:26:02Z,MEMBER,"Yes, this works for me. Can you add a test case that covers this?","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,181033674
https://github.com/pydata/xarray/pull/1038#issuecomment-288434764,https://api.github.com/repos/pydata/xarray/issues/1038,288434764,MDEyOklzc3VlQ29tbWVudDI4ODQzNDc2NA==,10050469,2017-03-22T15:25:45Z,2017-03-22T15:25:45Z,MEMBER,"> Note, I would say that open_mfdataset is no longer experimental because of its widespread use.

Yes, I also recently updated the IO docs in this respect and removed the experimental part: http://xarray.pydata.org/en/latest/io.html#id6","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,181033674
https://github.com/pydata/xarray/pull/1038#issuecomment-288433091,https://api.github.com/repos/pydata/xarray/issues/1038,288433091,MDEyOklzc3VlQ29tbWVudDI4ODQzMzA5MQ==,4295853,2017-03-22T15:21:07Z,2017-03-22T15:21:07Z,CONTRIBUTOR,"Provided the checks pass, this should be ready to merge, @fmaussion, unless @shoyer has any additional recommended changes.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,181033674
https://github.com/pydata/xarray/pull/1038#issuecomment-288432628,https://api.github.com/repos/pydata/xarray/issues/1038,288432628,MDEyOklzc3VlQ29tbWVudDI4ODQzMjYyOA==,4295853,2017-03-22T15:19:45Z,2017-03-22T15:19:45Z,CONTRIBUTOR,"Note, I would say that `open_mfdataset` is no longer experimental because of its widespread use.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,181033674
https://github.com/pydata/xarray/pull/1038#issuecomment-288425486,https://api.github.com/repos/pydata/xarray/issues/1038,288425486,MDEyOklzc3VlQ29tbWVudDI4ODQyNTQ4Ng==,10050469,2017-03-22T14:57:54Z,2017-03-22T14:57:54Z,MEMBER,"Yes, that's good for me. I would mention it somewhere in the docstring, though.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,181033674
https://github.com/pydata/xarray/pull/1038#issuecomment-288423681,https://api.github.com/repos/pydata/xarray/issues/1038,288423681,MDEyOklzc3VlQ29tbWVudDI4ODQyMzY4MQ==,4295853,2017-03-22T14:52:34Z,2017-03-22T14:52:34Z,CONTRIBUTOR,"@fmaussion and @shoyer, I'd like to close this PR out if possible. I'm not 100% sure this PR is worthwhile to complete in a general fashion because of the ambiguity in how best to handle this issue. My current take on this would be to go with whatever is simplest / cleanest, at least in the short term, which is @fmaussion's suggestion above. Does this work for you both?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,181033674
https://github.com/pydata/xarray/pull/1038#issuecomment-267683615,https://api.github.com/repos/pydata/xarray/issues/1038,267683615,MDEyOklzc3VlQ29tbWVudDI2NzY4MzYxNQ==,10050469,2016-12-16T20:03:10Z,2016-12-16T20:03:10Z,MEMBER,"AFAIC I'd be happy with a ``combined.attrs = datasets[0].attrs`` added [before returning the combined dataset](https://github.com/pydata/xarray/blob/master/xarray/backends/api.py#L520), which is already better than the current situation...
Do you have time to get back to this, @pwolfram?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,181033674
https://github.com/pydata/xarray/pull/1038#issuecomment-251715380,https://api.github.com/repos/pydata/xarray/issues/1038,251715380,MDEyOklzc3VlQ29tbWVudDI1MTcxNTM4MA==,1217238,2016-10-05T15:50:06Z,2016-10-05T15:50:06Z,MEMBER,"> I did some more digging and see some of the potential issues because some of the concatenation / merging is done quasi-automatically, which reduces the number of objects that must be merged (e.g., https://github.com/pydata/xarray/blob/master/xarray/core/combine.py#L391). I'm assuming this is done for performance / simplicity. Is that true?

We have two primitive combine operations, `concat` (same variables, different coordinate values) and `merge` (different variables, same coordinate values). `auto_combine` needs to do both in some order.

You're right that the order of `grouped` is not deterministic (it uses a dict). Sorting by key for input into the list comprehension could fix that.

The comprehensive fix would be to pick a merge strategy for attributes, and apply it uniformly in each place where xarray merges variables or datasets (basically, in `concat` and all the `merge` variations). Possibly several merge strategies, with a keyword argument to switch between them.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,181033674
https://github.com/pydata/xarray/pull/1038#issuecomment-251571258,https://api.github.com/repos/pydata/xarray/issues/1038,251571258,MDEyOklzc3VlQ29tbWVudDI1MTU3MTI1OA==,4295853,2016-10-05T03:09:38Z,2016-10-05T03:09:38Z,CONTRIBUTOR,"@shoyer, I did some more digging and see some of the potential issues, because some of the concatenation / merging is done quasi-automatically, which reduces the number of objects that must be merged (e.g., https://github.com/pydata/xarray/blob/master/xarray/core/combine.py#L391). I'm assuming this is done for performance / simplicity. Is that true?

This is looking like a much larger piece of work as I look at this further, because the information has already been compressed by the time `merge` is called (i.e., `len(dict_like_objects)` is not necessarily equal to the number of input files: https://github.com/pydata/xarray/blob/master/xarray/core/merge.py#L531).","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,181033674
https://github.com/pydata/xarray/pull/1038#issuecomment-251565041,https://api.github.com/repos/pydata/xarray/issues/1038,251565041,MDEyOklzc3VlQ29tbWVudDI1MTU2NTA0MQ==,4295853,2016-10-05T02:14:53Z,2016-10-05T02:14:53Z,CONTRIBUTOR,"@shoyer, it sounds like data provenance is an outstanding long-term problem. I'm happy to just copy attributes from the first dataset, but am wondering what it would take to do this correctly, i.e., the ""overhaul"". Any information you have on this would be really helpful. At a minimum, we can do as you suggest to fix the lack of attributes (#1037).
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,181033674 https://github.com/pydata/xarray/pull/1038#issuecomment-251547619,https://api.github.com/repos/pydata/xarray/issues/1038,251547619,MDEyOklzc3VlQ29tbWVudDI1MTU0NzYxOQ==,1217238,2016-10-05T00:00:30Z,2016-10-05T00:00:30Z,MEMBER,"Merge logic for attributes opens a whole big can of worms. I would probably just copy attributes from the first dataset (similar to what we do in `concat`), unless you want to overhaul the whole thing in a more comprehensive fashion. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,181033674