html_url,issue_url,id,node_id,user,created_at,updated_at,author_association,body,reactions,performed_via_github_app,issue https://github.com/pydata/xarray/pull/1038#issuecomment-290604158,https://api.github.com/repos/pydata/xarray/issues/1038,290604158,MDEyOklzc3VlQ29tbWVudDI5MDYwNDE1OA==,1217238,2017-03-31T03:11:00Z,2017-03-31T03:11:00Z,MEMBER,"OK, going to merge this anyways... the failing tests will be fixed by #1366","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,181033674 https://github.com/pydata/xarray/pull/1038#issuecomment-289114272,https://api.github.com/repos/pydata/xarray/issues/1038,289114272,MDEyOklzc3VlQ29tbWVudDI4OTExNDI3Mg==,1217238,2017-03-24T18:55:41Z,2017-03-24T18:55:41Z,MEMBER,"Just restarted, let's see...","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,181033674 https://github.com/pydata/xarray/pull/1038#issuecomment-289113931,https://api.github.com/repos/pydata/xarray/issues/1038,289113931,MDEyOklzc3VlQ29tbWVudDI4OTExMzkzMQ==,1217238,2017-03-24T18:54:24Z,2017-03-24T18:54:24Z,MEMBER,Travis is a shared environment that runs multiple tests concurrently. It's possible that we're running out of files due to other users or even other variants of our same build.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,181033674 https://github.com/pydata/xarray/pull/1038#issuecomment-289109287,https://api.github.com/repos/pydata/xarray/issues/1038,289109287,MDEyOklzc3VlQ29tbWVudDI4OTEwOTI4Nw==,1217238,2017-03-24T18:35:56Z,2017-03-24T18:35:56Z,MEMBER,"@pwolfram if we're getting sporadic failures on Travis, it's probably better to skip the test by default. 
It's important for the test suite not to be flaky.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,181033674 https://github.com/pydata/xarray/pull/1038#issuecomment-289108229,https://api.github.com/repos/pydata/xarray/issues/1038,289108229,MDEyOklzc3VlQ29tbWVudDI4OTEwODIyOQ==,1217238,2017-03-24T18:31:36Z,2017-03-24T18:31:36Z,MEMBER,"It looks like one of the new many files tests is crashing: xarray/tests/test_backends.py::OpenMFDatasetManyFilesTest::test_3_open_large_num_files_pynio /home/travis/build.sh: line 62: 1561 Segmentation fault (core dumped) py.test xarray --cov=xarray --cov-report term-missing --verbose https://travis-ci.org/pydata/xarray/jobs/214722901","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,181033674 https://github.com/pydata/xarray/pull/1038#issuecomment-288456001,https://api.github.com/repos/pydata/xarray/issues/1038,288456001,MDEyOklzc3VlQ29tbWVudDI4ODQ1NjAwMQ==,1217238,2017-03-22T16:26:02Z,2017-03-22T16:26:02Z,MEMBER,"Yes, this works for me. Can you add a test case that covers this?","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,181033674 https://github.com/pydata/xarray/pull/1038#issuecomment-251715380,https://api.github.com/repos/pydata/xarray/issues/1038,251715380,MDEyOklzc3VlQ29tbWVudDI1MTcxNTM4MA==,1217238,2016-10-05T15:50:06Z,2016-10-05T15:50:06Z,MEMBER,"> I did some more digging and see some of the potential issues because some of the concatenation / merging is done quasi-automatically, which reduces the number of objects that must be merged (e.g., https://github.com/pydata/xarray/blob/master/xarray/core/combine.py#L391). I'm assuming this is done for performance / simplicity. Is that true? 
We have two primitive combine operations, `concat` (same variables, different coordinate values) and `merge` (different variables, same coordinate values). `auto_combine` needs to do both in some order. You're right that the order of `grouped` is not deterministic (it uses a dict). Sorting by key for input into the list comprehension could fix that. The comprehensive fix would be to pick a merge strategy for attributes, and apply it uniformly in each place where xarray merges variables or datasets (basically, in `concat` and all the `merge` variations). Possibly several merge strategies, with a keyword argument to switch between them. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,181033674 https://github.com/pydata/xarray/pull/1038#issuecomment-251547619,https://api.github.com/repos/pydata/xarray/issues/1038,251547619,MDEyOklzc3VlQ29tbWVudDI1MTU0NzYxOQ==,1217238,2016-10-05T00:00:30Z,2016-10-05T00:00:30Z,MEMBER,"Merge logic for attributes opens a whole big can of worms. I would probably just copy attributes from the first dataset (similar to what we do in `concat`), unless you want to overhaul the whole thing in a more comprehensive fashion. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,181033674