html_url,issue_url,id,node_id,user,created_at,updated_at,author_association,body,reactions,performed_via_github_app,issue
https://github.com/pydata/xarray/pull/3070#issuecomment-507519092,https://api.github.com/repos/pydata/xarray/issues/3070,507519092,MDEyOklzc3VlQ29tbWVudDUwNzUxOTA5Mg==,24736507,2019-07-02T05:02:47Z,2019-07-04T20:02:40Z,NONE,"Hello @shoyer! Thanks for updating this PR. We checked the lines you've touched for [PEP 8](https://www.python.org/dev/peps/pep-0008) issues, and found:
There are currently no PEP 8 issues detected in this Pull Request. Cheers! :beers:
##### Comment last updated at 2019-07-04 20:02:40 UTC","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,463021297
https://github.com/pydata/xarray/pull/3070#issuecomment-507521709,https://api.github.com/repos/pydata/xarray/issues/3070,507521709,MDEyOklzc3VlQ29tbWVudDUwNzUyMTcwOQ==,22429695,2019-07-02T05:18:13Z,2019-07-04T20:02:24Z,NONE,"# [Codecov](https://codecov.io/gh/pydata/xarray/pull/3070?src=pr&el=h1) Report
> :exclamation: No coverage uploaded for pull request base (`master@681ec0e`). [Click here to learn what that means](https://docs.codecov.io/docs/error-reference#section-missing-base-commit).
> The diff coverage is `n/a`.
```diff
@@ Coverage Diff @@
## master #3070 +/- ##
========================================
Coverage ? 94.8%
========================================
Files ? 67
Lines ? 12876
Branches ? 0
========================================
Hits ? 12207
Misses ? 669
Partials ? 0
```
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,463021297
https://github.com/pydata/xarray/pull/3070#issuecomment-508201730,https://api.github.com/repos/pydata/xarray/issues/3070,508201730,MDEyOklzc3VlQ29tbWVudDUwODIwMTczMA==,1217238,2019-07-03T18:14:49Z,2019-07-03T18:14:49Z,MEMBER,"I'm referring to the tests running on CI, e.g., ""Linux py36-flakey"" on Azure. If you look at the [full logs](https://dev.azure.com/xarray/xarray/_build/results?buildId=110) from this PR (which I made verbose) and scroll down you see:
```
xarray/tests/test_backends.py::TestPydapOnline::test_cmp_local_file SKIPPED [ 13%]
xarray/tests/test_backends.py::TestPydapOnline::test_compatible_to_netcdf SKIPPED [ 13%]
xarray/tests/test_backends.py::TestPydapOnline::test_dask SKIPPED [ 13%]
xarray/tests/test_backends.py::TestPydapOnline::test_session SKIPPED [ 13%]
```
So for some reason it's not running the flaky tests, even though in theory I'm passing the right flags for it:
https://github.com/pydata/xarray/blob/681ec0e84af7c155b16ef46122e2a6cea64b585d/ci/azure/unit-tests.yml#L14
Actually, I definitely screwed this up. I should be using `$PYTEST_EXTRA_FLAGS`, not `$EXTRA_FLAGS`. Let me try fixing that...","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,463021297
https://github.com/pydata/xarray/pull/3070#issuecomment-508168192,https://api.github.com/repos/pydata/xarray/issues/3070,508168192,MDEyOklzc3VlQ29tbWVudDUwODE2ODE5Mg==,5635139,2019-07-03T16:39:07Z,2019-07-03T16:39:07Z,MEMBER,"Somewhat weirdly, I get different results; could you confirm yours?
```
pytest --run-network-tests
---
Results (175.99s):
7266 passed
5 xpassed
19 xfailed
1092 skipped
```
```
pytest
---
Results (165.58s):
7264 passed
5 xpassed
19 xfailed
1094 skipped
```
Also of note: `--run-network` also works, as does `--run-n`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,463021297
https://github.com/pydata/xarray/pull/3070#issuecomment-508026785,https://api.github.com/repos/pydata/xarray/issues/3070,508026785,MDEyOklzc3VlQ29tbWVudDUwODAyNjc4NQ==,5635139,2019-07-03T09:56:13Z,2019-07-03T09:56:13Z,MEMBER,"That does sound suspicious! I'm on vacation with bad WiFi but let me confirm in the next few days, if that's ok","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,463021297
https://github.com/pydata/xarray/pull/3070#issuecomment-507717430,https://api.github.com/repos/pydata/xarray/issues/3070,507717430,MDEyOklzc3VlQ29tbWVudDUwNzcxNzQzMA==,1217238,2019-07-02T15:03:53Z,2019-07-02T15:03:53Z,MEMBER,"Pytest reports the exact same number of tests passed, both with and without `--run-network-tests`. That seems suspicious to me.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,463021297
https://github.com/pydata/xarray/pull/3070#issuecomment-507549388,https://api.github.com/repos/pydata/xarray/issues/3070,507549388,MDEyOklzc3VlQ29tbWVudDUwNzU0OTM4OA==,5635139,2019-07-02T07:07:56Z,2019-07-02T07:07:56Z,MEMBER,"Let's def change to the most reliable & standard approach here. I think I originally got this from https://docs.pytest.org/en/latest/example/simple.html#control-skipping-of-tests-according-to-command-line-option but if this is the minority approach, let's make the switch
What makes us think the existing approach wasn't working? (not saying that it did work in CI! It does seem to work locally though)","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,463021297
https://github.com/pydata/xarray/pull/3070#issuecomment-507544537,https://api.github.com/repos/pydata/xarray/issues/3070,507544537,MDEyOklzc3VlQ29tbWVudDUwNzU0NDUzNw==,5635139,2019-07-02T06:50:52Z,2019-07-02T06:50:52Z,MEMBER,"`--run-network` works fine
```
I test ~/w/xarray * pytest --run-network xarray/tests/test_tutorial.py 192ms Tue Jul 2 02:45:19 2019
Test session starts (platform: darwin, Python 3.7.3, pytest 4.6.3, pytest-sugar 0.9.2)
rootdir: /Users/maximilian/workspace/xarray, inifile: setup.cfg
plugins: xdist-1.29.0, forked-1.0.2, sugar-0.9.2, regtest-1.4.1, testmon-0.9.16, pycharm-0.5.0, celery-4.3.0
collecting ...
xarray/tests/test_tutorial.py ✓✓ 100% ██████████
=============================================================================================== warnings summary ===============================================================================================
xarray/tests/test_tutorial.py::TestLoadDataset::test_download_from_github
/usr/local/Cellar/python/3.7.3/Frameworks/Python.framework/Versions/3.7/lib/python3.7/importlib/_bootstrap.py:219: RuntimeWarning: numpy.ufunc size changed, may indicate binary incompatibility. Expected 192 from C header, got 216 from PyObject
return f(*args, **kwds)
xarray/tests/test_tutorial.py::TestLoadDataset::test_download_from_github
xarray/tests/test_tutorial.py::TestLoadDataset::test_download_from_github_load_without_cache
/usr/local/lib/python3.7/site-packages/_pytest/python.py:174: RuntimeWarning: deallocating CachingFileManager(, '/Users/maximilian/.xarray_tutorial_data/tiny.nc', mode='r', kwargs={'mmap': None, 'version': 2}), but file is not already closed. This may indicate a bug.
testfunction(**testargs)
-- Docs: https://docs.pytest.org/en/latest/warnings.html
Results (15.92s):
2 passed # <-- runs
N test ~/w/xarray * pytest xarray/tests/test_tutorial.py 17.6s Tue Jul 2 02:49:10 2019
Test session starts (platform: darwin, Python 3.7.3, pytest 4.6.3, pytest-sugar 0.9.2)
rootdir: /Users/maximilian/workspace/xarray, inifile: setup.cfg
plugins: xdist-1.29.0, forked-1.0.2, sugar-0.9.2, regtest-1.4.1, testmon-0.9.16, pycharm-0.5.0, celery-4.3.0
collecting ...
xarray/tests/test_tutorial.py ss 100% ██████████
Results (3.06s):
2 skipped # <-- skips
```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,463021297
https://github.com/pydata/xarray/pull/3070#issuecomment-507537223,https://api.github.com/repos/pydata/xarray/issues/3070,507537223,MDEyOklzc3VlQ29tbWVudDUwNzUzNzIyMw==,1217238,2019-07-02T06:24:15Z,2019-07-02T06:24:15Z,MEMBER,"Hmm... maybe it does work, at least in this case. But we definitely aren't running the network tests, even with these changes (e.g., look at the `TestPydapOnline` tests). I'm not quite sure why.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,463021297
https://github.com/pydata/xarray/pull/3070#issuecomment-507536039,https://api.github.com/repos/pydata/xarray/issues/3070,507536039,MDEyOklzc3VlQ29tbWVudDUwNzUzNjAzOQ==,5635139,2019-07-02T06:19:55Z,2019-07-02T06:19:55Z,MEMBER,"Hmmm, this is my code. Checking what I missed. We use similar code internally and it works (I think...)
Do we have any flaky tests? I only see one that's marked, and that's skipped anyway (maybe because of this problem):
```
rg flaky
azure-pipelines.yml
24: pytest_extra_flags: --run-flaky --run-network-tests
setup.cfg
13: flaky: flaky tests
conftest.py
8: parser.addoption(""--run-flaky"", action=""store_true"",
9: help=""runs flaky tests"")
16: if not config.getoption(""--run-flaky""):
17: skip_flaky = pytest.mark.skip(
18: reason=""set --run-flaky option to run flaky tests"")
20: if ""flaky"" in item.keywords:
21: item.add_marker(skip_flaky)
doc/whats-new.rst
1847:- Enhanced tests suite by use of ``@slow`` and ``@flaky`` decorators, which are
1848: controlled via ``--run-flaky`` and ``--skip-slow`` command line arguments
xarray/tests/__init__.py
112:flaky = pytest.mark.flaky
xarray/tests/test_plot.py
30:@pytest.mark.flaky
31:@pytest.mark.skip(reason='maybe flaky')
```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,463021297