issues


49 rows where state = "closed", type = "issue" and user = 14808389 sorted by updated_at descending

Columns: id, node_id, number, title, user, state, locked, assignee, milestone, comments, created_at, updated_at ▲ (sort key), closed_at, author_association, active_lock_reason, draft, pull_request, body, reactions, performed_via_github_app, state_reason, repo, type
590630281 MDU6SXNzdWU1OTA2MzAyODE= 3921 issues discovered by the all-but-dask CI keewis 14808389 closed 0     4 2020-03-30T22:08:46Z 2024-04-25T14:48:15Z 2024-02-10T02:57:34Z MEMBER      

The py38-all-but-dask CI added in #3919 discovered a few backend issues:

- zarr:
  - [x] open_zarr with chunks="auto" always tries to chunk, even if dask is not available (fixed in #3919; see the sketch below)
  - [x] ZarrArrayWrapper.__getitem__ incorrectly passes the indexer's tuple attribute to _arrayize_vectorized_indexer (this only happens if dask is not available) (fixed in #3919)
  - [x] slice indexers with negative steps get transformed incorrectly if dask is not available (https://github.com/pydata/xarray/pull/8674)
- rasterio:
  - ~calling pickle.dumps on a Dataset object returned by open_rasterio fails because a non-serializable lock was used (if dask is installed, a serializable lock is used instead)~
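
For context on the first zarr item, a minimal sketch of the kind of guard such code needs, using a hypothetical helper (this is not xarray's actual fix):

```python
def maybe_chunk(variable, chunks):
    """Chunk an xarray variable only when dask is importable (hypothetical helper)."""
    try:
        import dask.array  # noqa: F401
    except ImportError:
        # no dask installed: return the data unchanged instead of failing
        return variable
    return variable.chunk(chunks)
```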

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3921/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
1730451312 I_kwDOAMm_X85nJJdw 7879 occasional segfaults on CI keewis 14808389 closed 0     3 2023-05-29T09:52:01Z 2023-11-06T22:03:43Z 2023-11-06T22:03:42Z MEMBER      

The upstream-dev CI currently fails sometimes due to a segfault (the normal CI crashes, too, but since we use pytest-xdist we only get a message stating "worker x crashed").

I'm not sure why, and I can't reproduce locally, either. Given that dask's local scheduler is in the traceback and the failing test is test_open_mfdataset_manyfiles, I assume there's some issue with parallel disk access or the temporary file creation.
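
A rough local reproduction attempt under that hypothesis might look like the following (the file paths are placeholders; this just mirrors the open_mfdataset code path the traceback below points at, and is not a confirmed reproducer):

```python
import xarray as xr

# open many small files in parallel through dask's threaded scheduler,
# the same code path the segfaulting traceback shows
files = [f"/tmp/manyfiles/file_{i}.nc" for i in range(100)]  # placeholder paths
ds = xr.open_mfdataset(
    files, parallel=True, engine="netcdf4", combine="nested", concat_dim="time"
)
```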

log of the segfaulting CI job ``` ============================= test session starts ============================== platform linux -- Python 3.10.11, pytest-7.3.1, pluggy-1.0.0 rootdir: /home/runner/work/xarray/xarray configfile: setup.cfg testpaths: xarray/tests, properties plugins: env-0.8.1, xdist-3.3.1, timeout-2.1.0, cov-4.1.0, reportlog-0.1.2, hypothesis-6.75.6 timeout: 60.0s timeout method: signal timeout func_only: False collected 16723 items / 2 skipped xarray/tests/test_accessor_dt.py ....................................... [ 0%] ........................................................................ [ 0%] ........................................................................ [ 1%] ........................................................................ [ 1%] ............................... [ 1%] xarray/tests/test_accessor_str.py ...................................... [ 1%] ........................................................................ [ 2%] ........................................................................ [ 2%] ............................................. [ 3%] xarray/tests/test_array_api.py ........... [ 3%] xarray/tests/test_backends.py ........................X........x........ [ 3%] ...................................s.........................X........x. [ 3%] .........................................s.........................X.... [ 4%] ....x.......................................X........................... [ 4%] ....................................X........x.......................... [ 5%] .............X.......................................................... [ 5%] ....X........x....................x.x................X.................. [ 5%] x..x..x..x...................................X........x................. [ 6%] ...x.x................X..................x..x..x..x..................... [ 6%] ..............X........x....................x.x................X........ [ 7%] ..........x..x..x..x.......................................X........x... [ 7%] ...........................................X........x................... [ 8%] ...ss........................X........x................................. [ 8%] .................X........x............................................. [ 8%] ..X........x.............................................X........x..... [ 9%] ............................................X........x.................. [ 9%] .........................................................X........x..... [ 10%] ......................................................................X. [ 10%] .......x................................................................ 
[ 11%] Fatal Python error: Segmentation fault Thread 0x00007f9c7b8ff640 (most recent call first): File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/concurrent/futures/thread.py", line 81 in _worker File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/threading.py", line 953 in run File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/threading.py", line 1016 in _bootstrap_inner File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/threading.py", line 973 in _bootstrap Current thread 0x00007f9c81f1d640 (most recent call first): File "/home/runner/work/xarray/xarray/xarray/backends/file_manager.py", line 216 in _acquire_with_cache_info File "/home/runner/work/xarray/xarray/xarray/backends/file_manager.py", line 198 in acquire_context File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/contextlib.py", line 135 in __enter__ File "/home/runner/work/xarray/xarray/xarray/backends/netCDF4_.py", line 392 in _acquire File "/home/runner/work/xarray/xarray/xarray/backends/netCDF4_.py", line 398 in ds File "/home/runner/work/xarray/xarray/xarray/backends/netCDF4_.py", line 336 in __init__ File "/home/runner/work/xarray/xarray/xarray/backends/netCDF4_.py", line 389 in open File "/home/runner/work/xarray/xarray/xarray/backends/netCDF4_.py", line 588 in open_dataset File "/home/runner/work/xarray/xarray/xarray/backends/api.py", line 566 in open_dataset File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/dask/utils.py", line 73 in apply File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/dask/core.py", line 121 in _execute_task File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/dask/local.py", line 224 in execute_task File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/dask/local.py", line 238 in <listcomp> File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/dask/local.py", line 238 in batch_execute_tasks File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/concurrent/futures/thread.py", line 58 in run File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/concurrent/futures/thread.py", line 83 in _worker File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/threading.py", line 953 in run File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/threading.py", line 1016 in _bootstrap_inner File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/threading.py", line 973 in _bootstrap Thread 0x00007f9c82f1e640 (most recent call first): File "/home/runner/work/xarray/xarray/xarray/backends/file_manager.py", line 216 in _acquire_with_cache_info File "/home/runner/work/xarray/xarray/xarray/backends/file_manager.py", line 198 in acquire_context File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/contextlib.py", line 135 in __enter__ File "/home/runner/work/xarray/xarray/xarray/backends/netCDF4_.py", line 392 in _acquire File "/home/runner/work/xarray/xarray/xarray/backends/netCDF4_.py", line 398 in ds File "/home/runner/work/xarray/xarray/xarray/backends/netCDF4_.py", line 336 in __init__ File "/home/runner/work/xarray/xarray/xarray/backends/netCDF4_.py", line 389 in open File "/home/runner/work/xarray/xarray/xarray/backends/netCDF4_.py", line 588 in open_dataset File "/home/runner/work/xarray/xarray/xarray/backends/api.py", line 566 in open_dataset File 
"/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/dask/utils.py", line 73 in apply File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/dask/core.py", line 121 in _execute_task File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/dask/local.py", line 224 in execute_task File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/dask/local.py", line 238 in <listcomp> File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/dask/local.py", line 238 in batch_execute_tasks File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/concurrent/futures/thread.py", line 58 in run File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/concurrent/futures/thread.py", line 83 in _worker File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/threading.py", line 953 in run File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/threading.py", line 1016 in _bootstrap_inner File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/threading.py", line 973 in _bootstrap Thread 0x00007f9ca575e740 (most recent call first): File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/threading.py", line 320 in wait File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/queue.py", line 171 in get File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/dask/local.py", line 137 in queue_get File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/dask/local.py", line 500 in get_async File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/dask/threaded.py", line 89 in get File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/dask/base.py", line 595 in compute File "/home/runner/work/xarray/xarray/xarray/backends/api.py", line 1046 in open_mfdataset File "/home/runner/work/xarray/xarray/xarray/tests/test_backends.py", line 3295 in test_open_mfdataset_manyfiles File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/_pytest/python.py", line 194 in pytest_pyfunc_call File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/pluggy/_callers.py", line 39 in _multicall File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/pluggy/_manager.py", line 80 in _hookexec File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/pluggy/_hooks.py", line 265 in __call__ File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/_pytest/python.py", line 1799 in runtest File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/_pytest/runner.py", line 169 in pytest_runtest_call File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/pluggy/_callers.py", line 39 in _multicall File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/pluggy/_manager.py", line 80 in _hookexec File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/pluggy/_hooks.py", line 265 in __call__ File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/_pytest/runner.py", line 262 in <lambda> File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/_pytest/runner.py", line 341 in from_call File 
"/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/_pytest/runner.py", line 261 in call_runtest_hook File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/_pytest/runner.py", line 222 in call_and_report File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/_pytest/runner.py", line 133 in runtestprotocol File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/_pytest/runner.py", line 114 in pytest_runtest_protocol File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/pluggy/_callers.py", line 39 in _multicall File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/pluggy/_manager.py", line 80 in _hookexec File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/pluggy/_hooks.py", line 265 in __call__ File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/_pytest/main.py", line 348 in pytest_runtestloop File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/pluggy/_callers.py", line 39 in _multicall File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/pluggy/_manager.py", line 80 in _hookexec File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/pluggy/_hooks.py", line 265 in __call__ File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/_pytest/main.py", line 323 in _main File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/_pytest/main.py", line 269 in wrap_session File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/_pytest/main.py", line 316 in pytest_cmdline_main File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/pluggy/_callers.py", line 39 in _multicall File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/pluggy/_manager.py", line 80 in _hookexec File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/pluggy/_hooks.py", line 265 in __call__ File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/_pytest/config/__init__.py", line 166 in main File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/_pytest/config/__init__.py", line 189 in console_main File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/pytest/__main__.py", line 5 in <module> File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/runpy.py", line 86 in _run_code File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/runpy.py", line 196 in _run_module_as_main Extension modules: numpy.core._multiarray_umath, numpy.core._multiarray_tests, numpy.linalg._umath_linalg, numpy.fft._pocketfft_internal, numpy.random._common, numpy.random.bit_generator, numpy.random._bounded_integers, numpy.random._mt19937, numpy.random.mtrand, numpy.random._philox, numpy.random._pcg64, numpy.random._sfc64, numpy.random._generator, pandas._libs.tslibs.np_datetime, pandas._libs.tslibs.dtypes, pandas._libs.tslibs.base, pandas._libs.tslibs.nattype, pandas._libs.tslibs.timezones, pandas._libs.tslibs.ccalendar, pandas._libs.tslibs.fields, pandas._libs.tslibs.timedeltas, pandas._libs.tslibs.tzconversion, pandas._libs.tslibs.timestamps, pandas._libs.properties, pandas._libs.tslibs.offsets, pandas._libs.tslibs.strptime, pandas._libs.tslibs.parsing, pandas._libs.tslibs.conversion, 
pandas._libs.tslibs.period, pandas._libs.tslibs.vectorized, pandas._libs.ops_dispatch, pandas._libs.missing, pandas._libs.hashtable, pandas._libs.algos, pandas._libs.interval, pandas._libs.lib, pandas._libs.ops, numexpr.interpreter, bottleneck.move, bottleneck.nonreduce, bottleneck.nonreduce_axis, bottleneck.reduce, pandas._libs.arrays, pandas._libs.tslib, pandas._libs.sparse, pandas._libs.indexing, pandas._libs.index, pandas._libs.internals, pandas._libs.join, pandas._libs.writers, pandas._libs.window.aggregations, pandas._libs.window.indexers, pandas._libs.reshape, pandas._libs.groupby, pandas._libs.json, pandas._libs.parsers, pandas._libs.testing, cftime._cftime, yaml._yaml, cytoolz.utils, cytoolz.itertoolz, cytoolz.functoolz, cytoolz.dicttoolz, cytoolz.recipes, xxhash._xxhash, psutil._psutil_linux, psutil._psutil_posix, markupsafe._speedups, numpy.linalg.lapack_lite, matplotlib._c_internal_utils, PIL._imaging, matplotlib._path, kiwisolver._cext, scipy._lib._ccallback_c, _cffi_backend, unicodedata2, netCDF4._netCDF4, h5py._errors, h5py.defs, h5py._objects, h5py.h5, h5py.h5r, h5py.utils, h5py.h5s, h5py.h5ac, h5py.h5p, h5py.h5t, h5py._conv, h5py.h5z, h5py._proxy, h5py.h5a, h5py.h5d, h5py.h5ds, h5py.h5g, h5py.h5i, h5py.h5f, h5py.h5fd, h5py.h5pl, h5py.h5o, h5py.h5l, h5py._selector, pyproj._compat, pyproj._datadir, pyproj._network, pyproj._geod, pyproj.list, pyproj._crs, pyproj.database, pyproj._transformer, pyproj._sync, matplotlib._image, rasterio._version, rasterio._err, rasterio._filepath, rasterio._env, rasterio._transform, rasterio._base, rasterio.crs, rasterio._features, rasterio._warp, rasterio._io, numcodecs.compat_ext, numcodecs.blosc, numcodecs.zstd, numcodecs.lz4, numcodecs._shuffle, msgpack._cmsgpack, numcodecs.vlen, zstandard.backend_c, scipy.sparse._sparsetools, _csparsetools, scipy.sparse._csparsetools, scipy.sparse.linalg._isolve._iterative, scipy.linalg._fblas, scipy.linalg._flapack, scipy.linalg.cython_lapack, scipy.linalg._cythonized_array_utils, scipy.linalg._solve_toeplitz, scipy.linalg._flinalg, scipy.linalg._matfuncs_sqrtm_triu, scipy.linalg.cython_blas, scipy.linalg._matfuncs_expm, scipy.linalg._decomp_update, scipy.sparse.linalg._dsolve._superlu, scipy.sparse.linalg._eigen.arpack._arpack, scipy.sparse.csgraph._tools, scipy.sparse.csgraph._shortest_path, scipy.sparse.csgraph._traversal, scipy.sparse.csgraph._min_spanning_tree, scipy.sparse.csgraph._flow, scipy.sparse.csgraph._matching, scipy.sparse.csgraph._reordering, scipy.spatial._ckdtree, scipy._lib.messagestream, scipy.spatial._qhull, scipy.spatial._voronoi, scipy.spatial._distance_wrap, scipy.spatial._hausdorff, scipy.special._ufuncs_cxx, scipy.special._ufuncs, scipy.special._specfun, scipy.special._comb, scipy.special._ellip_harm_2, scipy.spatial.transform._rotation, scipy.ndimage._nd_image, _ni_label, scipy.ndimage._ni_label, scipy.optimize._minpack2, scipy.optimize._group_columns, scipy.optimize._trlib._trlib, scipy.optimize._lbfgsb, _moduleTNC, scipy.optimize._moduleTNC, scipy.optimize._cobyla, scipy.optimize._slsqp, scipy.optimize._minpack, scipy.optimize._lsq.givens_elimination, scipy.optimize._zeros, scipy.optimize.__nnls, scipy.optimize._highs.cython.src._highs_wrapper, scipy.optimize._highs._highs_wrapper, scipy.optimize._highs.cython.src._highs_constants, scipy.optimize._highs._highs_constants, scipy.linalg._interpolative, scipy.optimize._bglu_dense, scipy.optimize._lsap, scipy.optimize._direct, scipy.integrate._odepack, scipy.integrate._quadpack, scipy.integrate._vode, scipy.integrate._dop, 
scipy.integrate._lsoda, scipy.special.cython_special, scipy.stats._stats, scipy.stats.beta_ufunc, scipy.stats._boost.beta_ufunc, scipy.stats.binom_ufunc, scipy.stats._boost.binom_ufunc, scipy.stats.nbinom_ufunc, scipy.stats._boost.nbinom_ufunc, scipy.stats.hypergeom_ufunc, scipy.stats._boost.hypergeom_ufunc, scipy.stats.ncf_ufunc, scipy.stats._boost.ncf_ufunc, scipy.stats.ncx2_ufunc, scipy.stats._boost.ncx2_ufunc, scipy.stats.nct_ufunc, scipy.stats._boost.nct_ufunc, scipy.stats.skewnorm_ufunc, scipy.stats._boost.skewnorm_ufunc, scipy.stats.invgauss_ufunc, scipy.stats._boost.invgauss_ufunc, scipy.interpolate._fitpack, scipy.interpolate.dfitpack, scipy.interpolate._bspl, scipy.interpolate._ppoly, scipy.interpolate.interpnd, scipy.interpolate._rbfinterp_pythran, scipy.interpolate._rgi_cython, scipy.stats._biasedurn, scipy.stats._levy_stable.levyst, scipy.stats._stats_pythran, scipy._lib._uarray._uarray, scipy.stats._statlib, scipy.stats._sobol, scipy.stats._qmc_cy, scipy.stats._mvn, scipy.stats._rcont.rcont, scipy.cluster._vq, scipy.cluster._hierarchy, scipy.cluster._optimal_leaf_ordering, shapely.lib, shapely._geos, shapely._geometry_helpers, cartopy.trace, scipy.fftpack.convolve, tornado.speedups, cf_units._udunits2, scipy.io.matlab._mio_utils, scipy.io.matlab._streams, scipy.io.matlab._mio5_utils (total: 241) /home/runner/work/_temp/b3f3888c-5349-4d19-80f6-41d140b86db5.sh: line 3: 6114 Segmentation fault (core dumped) python -m pytest --timeout=60 -rf --report-log output-3.10-log.jsonl ```
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/7879/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  not_planned xarray 13221727 issue
1440494247 I_kwDOAMm_X85V3DKn 7270 type checking CI is failing keewis 14808389 closed 0     3 2022-11-08T16:10:24Z 2023-04-15T18:31:59Z 2023-04-15T18:31:59Z MEMBER      

The most recent runs of the type checking CI have started to fail with a segfault:

```
/home/runner/work/_temp/dac0c060-b19a-435a-8063-bbc5b8ffbf24.sh: line 1:  2945 Segmentation fault (core dumped) python -m mypy --install-types --non-interactive --cobertura-xml-report mypy_report
```

This seems to be due to the release of mypy=0.990.

#7269 pinned mypy to mypy<0.990; this should be undone once we have figured out how to fix the segfault (most likely by waiting for the next release) and addressed any complaints the new version has (see this workflow run).

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/7270/reactions",
    "total_count": 2,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 1,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
760574919 MDU6SXNzdWU3NjA1NzQ5MTk= 4670 increase the visibility of the upstream-dev PR CI keewis 14808389 closed 0     3 2020-12-09T18:37:57Z 2022-12-29T21:15:05Z 2021-01-19T15:27:26Z MEMBER      

We currently have two upstream-dev PR CIs: the old pipelines CI and the new GitHub Actions CI, added together with the scheduled upstream-dev ("nightly") CI. Since we don't need both, I think we should disable one of them, presumably the old pipelines CI.

There's an issue with the CI result, though: since GitHub doesn't have an icon for "passed with issues", we have to choose between "passed" or "failed" as the CI result (neither of which is optimal).

The advantage of using "failed" is that a failure is easily visible, but the total CI result on PRs is then often set to "failed" simply because we didn't get around to fixing bugs introduced by recent changes to dependencies (which is confusing for contributors).

In #4584 I switched the pipelines upstream-dev CI to "allowed failure" so we get a warning instead of a failure. However, GitHub doesn't print the number of warnings in the summary line, which means that if the icon is green nobody checks the status, and upstream-dev CI failures are easily missed.

Our new scheduled nightly CI improves the situation quite a bit since we automatically get an issue containing the failures, but that means we aren't able to catch these failures before actually merging. As pointed out in https://github.com/pydata/xarray/issues/4574#issuecomment-725795622, that might be acceptable, though.

If we still want to fix this, we could have the PR CI automatically add a comment to the PR, which would contain a summary of the failures but also state that these failures can be ignored as long as they get the approval of a maintainer. This would increase the noise on PRs, though.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4670/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
1250755592 I_kwDOAMm_X85KjQQI 6645 pre-release for v2022.06.0 keewis 14808389 closed 0     14 2022-05-27T13:14:06Z 2022-07-22T15:44:59Z 2022-07-22T15:44:59Z MEMBER      

There are a few unreleased and potentially breaking changes in main, most importantly the index refactor and the new groupby using numpy-groupies and flox. During the meeting on Wednesday we decided to release a preview version to get feedback before releasing a full version, especially from those who don't run their tests against our main branch.

I am planning to create the pre-release tomorrow, but if there are any big changes that should be included, please post here.

cc @TomNicholas

Edit: the version will be called 2022.05.0.dev0, which will ensure that e.g. pip will require the --pre flag to install it.
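
As a quick sanity check of that claim, using the third-party packaging library (which implements the PEP 440 rules pip follows):

```python
from packaging.version import Version

# .dev releases count as pre-releases under PEP 440, so pip skips them
# unless --pre is passed explicitly
print(Version("2022.05.0.dev0").is_prerelease)  # True
```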

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/6645/reactions",
    "total_count": 5,
    "+1": 5,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
818957820 MDU6SXNzdWU4MTg5NTc4MjA= 4976 reported version in the docs is misleading keewis 14808389 closed 0     3 2021-03-01T15:08:12Z 2022-07-10T13:00:46Z 2022-07-10T13:00:46Z MEMBER      

The editable install on RTD is reported to have the version 0.17.1.dev0+g835a53e6.d20210226 (which technically is correct but it would be better to have a clean version on tags).

This is not something I can reproduce with python -m pip install -e ., so it is either some RTD weirdness or happens because we get mamba to do the editable install for us.

We should try to get the version right.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4976/reactions",
    "total_count": 3,
    "+1": 3,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
1280027449 I_kwDOAMm_X85MS6s5 6714 `mypy` CI is failing on `main` keewis 14808389 closed 0     1 2022-06-22T11:54:16Z 2022-06-22T16:01:45Z 2022-06-22T16:01:45Z MEMBER      

The most recent runs of the mypy CI on main are failing with:

```
xarray/core/dataset.py:6934: error: Incompatible types in assignment (expression has type "str", variable has type "Optional[Optional[Literal['Y', 'M', 'W', 'D', 'h', 'm', 's', 'ms', 'us', 'ns', 'ps', 'fs', 'as']]]")  [assignment]
```

I'm not sure what changed since the last pass, though.
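
The error distills to the following pattern (the names below are illustrative, not the actual dataset.py code): a value typed as plain str assigned to a variable annotated with a Literal of datetime units.

```python
from typing import Literal, Optional

Unit = Optional[Literal["Y", "M", "W", "D", "h", "m", "s", "ms", "us", "ns"]]

unit: Unit = None
# mypy flags this: the expression has type "str", not one of the literals
unit = str("ns")
```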

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/6714/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
628436420 MDU6SXNzdWU2Mjg0MzY0MjA= 4116 xarray ufuncs keewis 14808389 closed 0     5 2020-06-01T13:25:54Z 2022-04-19T03:26:53Z 2022-04-19T03:26:53Z MEMBER      

The documentation warns that the universal functions in xarray.ufuncs should not be used unless compatibility with numpy < 1.13 is required.

Since we only support numpy >= 1.15, is it time to remove that (already deprecated) module?

Since there are also functions that are not true ufuncs (e.g. np.angle and np.median) and need __array_function__ (or something similar, see #3917), we could also keep those and just remove the ones that are dispatched using __array_ufunc__.
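
As background for that distinction, a toy sketch of the two numpy dispatch protocols (generic NEP 13/NEP 18 protocol code, not xarray's implementation): true ufuncs like np.sqrt go through __array_ufunc__, while regular functions like np.median go through __array_function__.

```python
import numpy as np

class Wrapped:
    def __init__(self, data):
        self.data = np.asarray(data)

    def __array_ufunc__(self, ufunc, method, *inputs, **kwargs):
        # called for true ufuncs: np.sqrt, np.add, ...
        inputs = tuple(x.data if isinstance(x, Wrapped) else x for x in inputs)
        return Wrapped(getattr(ufunc, method)(*inputs, **kwargs))

    def __array_function__(self, func, types, args, kwargs):
        # called for regular numpy functions: np.median, np.angle, ...
        args = tuple(x.data if isinstance(x, Wrapped) else x for x in args)
        return Wrapped(func(*args, **kwargs))

w = Wrapped([1.0, 4.0, 9.0])
print(np.sqrt(w).data)    # [1. 2. 3.], dispatched via __array_ufunc__
print(np.median(w).data)  # 4.0, dispatched via __array_function__
```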

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4116/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
1174386963 I_kwDOAMm_X85F_7kT 6382 `reindex` drops attrs keewis 14808389 closed 0     1 2022-03-19T22:37:46Z 2022-03-21T07:53:05Z 2022-03-21T07:53:04Z MEMBER      

What happened?

reindex stopped propagating attrs (detected in xarray-contrib/pint-xarray#159).

As far as I can tell, the new reindexing code in Aligner does not handle attrs yet?

Minimal Complete Verifiable Example

Before #5692:

```python
In [1]: import xarray as xr
   ...: xr.set_options(keep_attrs=True)
   ...: ds = xr.tutorial.open_dataset("air_temperature")

In [2]: ds.reindex(lat=range(10, 80, 5)).lat
Out[2]:
<xarray.DataArray 'lat' (lat: 14)>
array([10, 15, 20, 25, 30, 35, 40, 45, 50, 55, 60, 65, 70, 75])
Coordinates:
  * lat      (lat) int64 10 15 20 25 30 35 40 45 50 55 60 65 70 75
Attributes:
    standard_name:  latitude
    long_name:      Latitude
    units:          degrees_north
    axis:           Y

In [3]: ds.reindex(lat=xr.DataArray(range(10, 80, 5), attrs={"attr": "value"}, dims="lat")).lat
Out[3]:
<xarray.DataArray 'lat' (lat: 14)>
array([10, 15, 20, 25, 30, 35, 40, 45, 50, 55, 60, 65, 70, 75])
Coordinates:
  * lat      (lat) int64 10 15 20 25 30 35 40 45 50 55 60 65 70 75
Attributes:
    standard_name:  latitude
    long_name:      Latitude
    units:          degrees_north
    axis:           Y
```

After #5692:

```python
In [3]: import xarray as xr
   ...: xr.set_options(keep_attrs=True)
   ...: ds = xr.tutorial.open_dataset("air_temperature")

In [4]: ds.reindex(lat=range(10, 80, 5)).lat
Out[4]:
<xarray.DataArray 'lat' (lat: 14)>
array([10, 15, 20, 25, 30, 35, 40, 45, 50, 55, 60, 65, 70, 75])
Coordinates:
  * lat      (lat) int64 10 15 20 25 30 35 40 45 50 55 60 65 70 75

In [5]: ds.reindex(lat=xr.DataArray(range(10, 80, 5), attrs={"attr": "value"}, dims="lat")).lat
Out[5]:
<xarray.DataArray 'lat' (lat: 14)>
array([10, 15, 20, 25, 30, 35, 40, 45, 50, 55, 60, 65, 70, 75])
Coordinates:
  * lat      (lat) int64 10 15 20 25 30 35 40 45 50 55 60 65 70 75
```
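
A self-contained variant of the reproducer that avoids the tutorial dataset's network access (synthetic data; same behaviour as far as attrs handling is concerned):

```python
import xarray as xr

xr.set_options(keep_attrs=True)
da = xr.DataArray(
    [1, 2, 3],
    coords={"lat": ("lat", [10, 20, 30], {"units": "degrees_north"})},
    dims="lat",
)
# before #5692 this printed the units attribute; afterwards it prints {}
print(da.reindex(lat=[10, 15, 20]).lat.attrs)
```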

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/6382/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
1125877338 I_kwDOAMm_X85DG4Za 6250 failing docs builds because the `scipy` intersphinx registry is unreachable keewis 14808389 closed 0     6 2022-02-07T11:44:44Z 2022-02-08T21:49:48Z 2022-02-08T21:49:47Z MEMBER      

What happened?

scipy seems to have some trouble with their documentation host setup, which means that trying to fetch its intersphinx registry returns a 404.

There's nothing we can do to really fix this, but we can try to avoid the docs build failures by disabling that intersphinx entry (not sure if that results in other errors, though).
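
What disabling that entry could look like in doc/conf.py (a sketch; the real mapping contains more entries than shown here):

```python
# doc/conf.py (excerpt)
intersphinx_mapping = {
    "numpy": ("https://numpy.org/doc/stable", None),
    "pandas": ("https://pandas.pydata.org/pandas-docs/stable", None),
    # temporarily disabled: the inventory currently returns a 404
    # "scipy": ("https://docs.scipy.org/doc/scipy", None),
}
```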

What did you expect to happen?

No response

Minimal Complete Verifiable Example

No response

Relevant log output

```
WARNING: failed to reach any of the inventories with the following issues:
intersphinx inventory 'https://docs.scipy.org/doc/scipy/objects.inv' not fetchable due to <class 'requests.exceptions.HTTPError'>: 404 Client Error: Not Found for url: https://docs.scipy.org/doc/scipy/objects.inv
```

Anything else we need to know?

No response

Environment

See RTD

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/6250/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
861860833 MDU6SXNzdWU4NjE4NjA4MzM= 5190 PRs cancel CI on push keewis 14808389 closed 0     1 2021-04-19T20:33:52Z 2022-01-31T16:59:27Z 2022-01-31T16:59:27Z MEMBER      

The cancel step doesn't seem to be configured properly: a push to a pull request shouldn't cancel CI on master. For reference, here are the logs for two examples: CI Additional, CI.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/5190/reactions",
    "total_count": 2,
    "+1": 2,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
928448281 MDU6SXNzdWU5Mjg0NDgyODE= 5517 mypy issues on `main` keewis 14808389 closed 0     3 2021-06-23T16:38:24Z 2021-06-24T08:58:06Z 2021-06-24T08:58:06Z MEMBER      

The pre-commit hooks are reporting some mypy issues on the main branch:

```
xarray/core/formatting.py:193: error: No overload variant of "max" matches argument types "signedinteger[Any]", "int"  [call-overload]
xarray/core/formatting.py:193: note: Possible overload variant:
xarray/core/formatting.py:193: note:     def [SupportsLessThanT <: SupportsLessThan] max(__arg1: SupportsLessThanT, __arg2: SupportsLessThanT, *_args: SupportsLessThanT, key: None = ...) -> SupportsLessThanT
xarray/core/formatting.py:193: note:     <5 more non-matching overloads not shown>
xarray/core/indexes.py:166: error: Incompatible types in assignment (expression has type "Union[dtype[object_], dtype[Any], dtype[void]]", variable has type "dtype[object_]")  [assignment]
xarray/core/computation.py:607: error: Argument 1 to "append" of "list" has incompatible type "None"; expected "slice"  [arg-type]
xarray/core/accessor_str.py:277: error: <nothing> not callable  [misc]
Found 4 errors in 4 files (checked 143 source files)
```

does anyone know how to fix those?
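
For the formatting.py error, one plausible fix is casting the numpy integer to a plain int before calling max (illustrative only; not necessarily the patch that was applied):

```python
import numpy as np

max_name_length = np.int64(12)  # a signedinteger[Any] as far as mypy is concerned
col_width = max(int(max_name_length) + 3, 7)  # the int() cast resolves the overload
```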

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/5517/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
895470157 MDU6SXNzdWU4OTU0NzAxNTc= 5344 failing cftime plot tests keewis 14808389 closed 0     2 2021-05-19T13:48:49Z 2021-06-12T12:57:52Z 2021-06-12T12:57:52Z MEMBER      

This was reported in #5077 and seems to be an upstream issue (see https://github.com/pydata/xarray/issues/5077#issuecomment-808605313):

```
FAILED xarray/tests/test_plot.py::TestCFDatetimePlot::test_cfdatetime_line_plot
FAILED xarray/tests/test_plot.py::TestCFDatetimePlot::test_cfdatetime_pcolormesh_plot
FAILED xarray/tests/test_plot.py::TestCFDatetimePlot::test_cfdatetime_contour_plot
```

So that the upstream-dev CI can open new issues for other failing tests, I've xfail'ed these in #5343. This should be undone once the problem is fixed in either cftime or nc-time-axis.
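
The markers added in #5343 presumably follow the standard pytest pattern, something like this sketch (not the exact diff):

```python
import pytest

class TestCFDatetimePlot:
    @pytest.mark.xfail(reason="upstream cftime / nc-time-axis issue, see GH5077")
    def test_cfdatetime_line_plot(self):
        ...
```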

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/5344/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
889162918 MDU6SXNzdWU4ODkxNjI5MTg= 5291 ds = xr.tutorial.load_dataset("air_temperature") with 0.18 needs engine argument keewis 14808389 closed 0     15 2021-05-11T23:42:58Z 2021-05-18T21:23:00Z 2021-05-18T21:23:00Z MEMBER      

From xarray-contrib/xarray-tutorial#43 by @scottyhq:

Many notebooks out there start with the line ds = xr.tutorial.load_dataset("air_temperature"). That now gives an error traceback with xarray>=0.18:

```pytb
Traceback (most recent call last):
  File "/Users/scott/GitHub/zarrdata/./create_zarr.py", line 6, in <module>
    ds = xr.tutorial.load_dataset("air_temperature")
  File "/Users/scott/miniconda3/envs/zarrdata/lib/python3.9/site-packages/xarray/tutorial.py", line 179, in load_dataset
    with open_dataset(*args, **kwargs) as ds:
  File "/Users/scott/miniconda3/envs/zarrdata/lib/python3.9/site-packages/xarray/tutorial.py", line 100, in open_dataset
    ds = _open_dataset(filepath, **kws)
  File "/Users/scott/miniconda3/envs/zarrdata/lib/python3.9/site-packages/xarray/backends/api.py", line 485, in open_dataset
    engine = plugins.guess_engine(filename_or_obj)
  File "/Users/scott/miniconda3/envs/zarrdata/lib/python3.9/site-packages/xarray/backends/plugins.py", line 112, in guess_engine
    raise ValueError("cannot guess the engine, try passing one explicitly")
ValueError: cannot guess the engine, try passing one explicitly
```

It's an easy fix though: just use ds = xr.tutorial.load_dataset("air_temperature", engine="netcdf4"). New users might be thrown by that, though. Also a note that unless the netcdf4 library is explicitly put into the software environment, even adding engine="netcdf4" can result in an error ("ValueError: unrecognized engine netcdf4 must be one of: ['store', 'zarr']"), so I think a minimal environment definition to run would be:

```yaml
name: xarray-tutorial
channels:
  - conda-forge
dependencies:
  - xarray=0.18
  - pooch=1.3
  - netcdf4=1.5
  - zarr=2.8
```
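
The workaround as a standalone snippet (requires the netcdf4 and pooch packages to be installed in the environment):

```python
import xarray as xr

# passing the engine explicitly side-steps the failing guess_engine call
ds = xr.tutorial.load_dataset("air_temperature", engine="netcdf4")
print(ds)
```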

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/5291/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
873751431 MDU6SXNzdWU4NzM3NTE0MzE= 5241 linters are not run after pre-commit autoupdate keewis 14808389 closed 0     0 2021-05-01T19:33:47Z 2021-05-16T11:21:40Z 2021-05-16T11:21:40Z MEMBER      

#4906 added an autoupdate CI for the pre-commit hooks. However, once the hook versions are updated, the pre-commit CI is not run, so potential errors will only become visible after merging the PR.

The documentation of create-pr-action states that this is because GITHUB_TOKEN does not allow it to trigger those workflows, so we'd have to either create a personal access token or run pre-commit run --all-files immediately after the autoupdate.

The second option is definitely easier but would miss the separate mypy job (not sure how much of an issue that is).

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/5241/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
855330062 MDU6SXNzdWU4NTUzMzAwNjI= 5138 failing CI keewis 14808389 closed 0     13 2021-04-11T15:08:08Z 2021-04-25T21:08:43Z 2021-04-12T18:17:50Z MEMBER      

It seems cfgrib (or one of its dependencies) is breaking the import of xarray on both upstream-dev and some of the normal CI. Interestingly, the broken environments installed eccodes=2.19.1 while the ones that are passing have eccodes=2.21.0.

Is this a mamba / conda caching issue?

cc @andersy005 @alexamici

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/5138/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
856831600 MDU6SXNzdWU4NTY4MzE2MDA= 5147 failing CI due to CFTimeIndex errors keewis 14808389 closed 0     0 2021-04-13T10:40:17Z 2021-04-14T13:27:10Z 2021-04-14T13:27:10Z MEMBER      

The normal CI started to fail:

```
FAILED xarray/tests/test_cftimeindex.py::test_distant_cftime_datetime_sub_cftimeindex[365_day]
FAILED xarray/tests/test_cftimeindex.py::test_distant_cftime_datetime_sub_cftimeindex[360_day]
FAILED xarray/tests/test_cftimeindex.py::test_distant_cftime_datetime_sub_cftimeindex[julian]
FAILED xarray/tests/test_cftimeindex.py::test_distant_cftime_datetime_sub_cftimeindex[all_leap]
FAILED xarray/tests/test_cftimeindex.py::test_distant_cftime_datetime_sub_cftimeindex[366_day]
FAILED xarray/tests/test_cftimeindex.py::test_distant_cftime_datetime_sub_cftimeindex[gregorian]
FAILED xarray/tests/test_cftimeindex.py::test_distant_cftime_datetime_sub_cftimeindex[proleptic_gregorian]
```

traceback ```pytb [gw2] linux -- Python 3.9.2 /usr/share/miniconda/envs/xarray-tests/bin/python > ??? E TypeError: Expected unicode, got datetime.timedelta pandas/_libs/tslibs/timedeltas.pyx:264: TypeError During handling of the above exception, another exception occurred: calendar = 'proleptic_gregorian' @requires_cftime @pytest.mark.parametrize("calendar", _CFTIME_CALENDARS) def test_distant_cftime_datetime_sub_cftimeindex(calendar): a = xr.cftime_range("2000", periods=5, calendar=calendar) with pytest.raises(ValueError, match="difference exceeds"): > a.date_type(1, 1, 1) - a /home/runner/work/xarray/xarray/xarray/tests/test_cftimeindex.py:833: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /home/runner/work/xarray/xarray/xarray/coding/cftimeindex.py:581: in __rsub__ return pd.TimedeltaIndex(other - np.array(self)) /usr/share/miniconda/envs/xarray-tests/lib/python3.9/site-packages/pandas/core/indexes/timedeltas.py:155: in __new__ tdarr = TimedeltaArray._from_sequence_not_strict( /usr/share/miniconda/envs/xarray-tests/lib/python3.9/site-packages/pandas/core/arrays/timedeltas.py:250: in _from_sequence_not_strict data, inferred_freq = sequence_to_td64ns(data, copy=copy, unit=unit) /usr/share/miniconda/envs/xarray-tests/lib/python3.9/site-packages/pandas/core/arrays/timedeltas.py:957: in sequence_to_td64ns data = objects_to_td64ns(data, unit=unit, errors=errors) /usr/share/miniconda/envs/xarray-tests/lib/python3.9/site-packages/pandas/core/arrays/timedeltas.py:1067: in objects_to_td64ns result = array_to_timedelta64(values, unit=unit, errors=errors) pandas/_libs/tslibs/timedeltas.pyx:269: in pandas._libs.tslibs.timedeltas.array_to_timedelta64 ??? pandas/_libs/tslibs/timedeltas.pyx:222: in pandas._libs.tslibs.timedeltas.convert_to_timedelta64 ??? _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ > ??? E OverflowError: Python int too large to convert to C long pandas/_libs/tslibs/timedeltas.pyx:167: OverflowError ```

This seems to coincide with the release of pandas=1.2.4, and this is indeed the only difference in the environment (I didn't try to reproduce, though).

cc @spencerkclark

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/5147/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
603497896 MDU6SXNzdWU2MDM0OTc4OTY= 3986 building the visualization gallery is slow keewis 14808389 closed 0     2 2020-04-20T20:00:51Z 2021-03-24T17:56:49Z 2021-03-24T17:56:49Z MEMBER      

When running sphinx to build the documentation, it frequently times out when trying to build the visualization gallery. Running

```bash
/usr/bin/time -v python -c 'import xarray as xr; xr.open_rasterio("https://github.com/mapbox/rasterio/raw/master/tests/data/RGB.byte.tif")'
```

reports that it takes at least 5 minutes (or times out after 10 minutes) if the file is opened from the URL. Subsequent calls use the cache, so the second rasterio example is fast.

If instead I download the file manually and then load from disk, the whole notebook completes in about 10 seconds. Also, directly calling rasterio.open completes in a few seconds, so the bug should be in open_rasterio.

I do think we should try to fix this in the backend, but maybe we could also cache RGB.byte.tif in the same directory as the xarray.tutorial data and open the cached file in the gallery?

Edit: this is really flaky; I can't reliably reproduce it.

Edit2: for now, I'm using an extra cell containing

```python
import pathlib
import shutil

import requests

cache_dir = pathlib.Path.home() / ".xarray_tutorial_data"
path = cache_dir / "RGB.byte.tif"
url = "https://github.com/mapbox/rasterio/raw/master/tests/data/RGB.byte.tif"

if not path.exists() or path.stat().st_size == 0:
    cache_dir.mkdir(parents=True, exist_ok=True)  # make sure the cache directory exists
    # stream the response so copyfileobj can read from r.raw
    with requests.get(url, stream=True) as r, path.open(mode="wb") as f:
        if r.status_code == requests.codes.ok:
            shutil.copyfileobj(r.raw, f)
        else:
            print(f"download failed: {r.status_code}")
            r.raise_for_status()

url = path
```

and modify both examples to use the new url.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3986/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
657230088 MDU6SXNzdWU2NTcyMzAwODg= 4226 failing upstream-dev CI: matplotlib keewis 14808389 closed 0     1 2020-07-15T10:08:04Z 2021-03-07T13:31:37Z 2021-03-07T13:31:37Z MEMBER      

Recent changes in matplotlib (my guess is matplotlib/matplotlib#17830) seem to have broken a few of our tests:

tracebacks ``` ________________________ TestContour.test_single_level _________________________ self = <xarray.tests.test_plot.TestContour object at 0x7fc60f7c3e20> # add_colorbar defaults to false > self.plotmethod(levels=[0.1]) xarray/tests/test_plot.py:1561: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ xarray/plot/plot.py:827: in plotmethod return newplotfunc(**allargs) xarray/plot/plot.py:699: in newplotfunc cmap_params, cbar_kwargs = _process_cmap_cbar_kwargs( xarray/plot/utils.py:819: in _process_cmap_cbar_kwargs cmap_params = _determine_cmap_params(**cmap_kwargs) xarray/plot/utils.py:291: in _determine_cmap_params cmap, newnorm = _build_discrete_cmap(cmap, levels, extend, filled) xarray/plot/utils.py:77: in _build_discrete_cmap new_cmap, cnorm = mpl.colors.from_levels_and_colors(levels, pal, extend=extend) /usr/share/miniconda/envs/xarray-tests/lib/python3.8/site-packages/matplotlib/colors.py:2200: in from_levels_and_colors norm = BoundaryNorm(levels, ncolors=n_data_colors) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <matplotlib.colors.BoundaryNorm object at 0x7fc60f5048e0> boundaries = [0.1], ncolors = 0, clip = False def __init__(self, boundaries, ncolors, clip=False, *, extend='neither'): """ Parameters ---------- boundaries : array-like Monotonically increasing sequence of at least 2 boundaries. ncolors : int Number of colors in the colormap to be used. clip : bool, optional """ if clip and extend != 'neither': raise ValueError("'clip=True' is not compatible with 'extend'") self.clip = clip self.vmin = boundaries[0] self.vmax = boundaries[-1] self.boundaries = np.asarray(boundaries) self.N = len(self.boundaries) if self.N < 2: > raise ValueError("You must provide at least 2 boundaries " f"(1 region) but you passed in {boundaries!r}") E ValueError: You must provide at least 2 boundaries (1 region) but you passed in [0.1] /usr/share/miniconda/envs/xarray-tests/lib/python3.8/site-packages/matplotlib/colors.py:1508: ValueError ________________________ test_facetgrid_single_contour _________________________ @requires_matplotlib def test_facetgrid_single_contour(): @requires_matplotlib def test_facetgrid_single_contour(): # regression test for GH3569 x, y = np.meshgrid(np.arange(12), np.arange(12)) z = xr.DataArray(np.sqrt(x ** 2 + y ** 2)) z2 = xr.DataArray(np.sqrt(x ** 2 + y ** 2) + 1) ds = xr.concat([z, z2], dim="time") ds["time"] = [0, 1] > ds.plot.contour(col="time", levels=[4], colors=["k"]) xarray/tests/test_plot.py:2409: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ xarray/plot/plot.py:827: in plotmethod return newplotfunc(**allargs) xarray/plot/plot.py:638: in newplotfunc return _easy_facetgrid(darray, kind="dataarray", **allargs) xarray/plot/facetgrid.py:644: in _easy_facetgrid return g.map_dataarray(plotfunc, x, y, **kwargs) xarray/plot/facetgrid.py:248: in map_dataarray cmap_params, cbar_kwargs = _process_cmap_cbar_kwargs( xarray/plot/utils.py:819: in _process_cmap_cbar_kwargs cmap_params = _determine_cmap_params(**cmap_kwargs) xarray/plot/utils.py:291: in _determine_cmap_params cmap, newnorm = _build_discrete_cmap(cmap, levels, extend, filled) xarray/plot/utils.py:77: in _build_discrete_cmap new_cmap, cnorm = mpl.colors.from_levels_and_colors(levels, pal, extend=extend) /usr/share/miniconda/envs/xarray-tests/lib/python3.8/site-packages/matplotlib/colors.py:2200: in from_levels_and_colors norm = BoundaryNorm(levels, ncolors=n_data_colors) _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <matplotlib.colors.BoundaryNorm object at 0x7fc60d47b280> boundaries = [4], ncolors = 0, clip = False def __init__(self, boundaries, ncolors, clip=False, *, extend='neither'): """ Parameters ---------- boundaries : array-like Monotonically increasing sequence of at least 2 boundaries. ncolors : int Number of colors in the colormap to be used. clip : bool, optional If clip is ``True``, out of range values are mapped to 0 if they are below ``boundaries[0]`` or mapped to ``ncolors - 1`` if they are above ``boundaries[-1]``. If clip is ``False``, out of range values are mapped to -1 if they are below ``boundaries[0]`` or mapped to *ncolors* if they are above ``boundaries[-1]``. These are then converted to valid indices by `Colormap.__call__`. extend : {'neither', 'both', 'min', 'max'}, default: 'neither' Extend the number of bins to include one or both of the regions beyond the boundaries. For example, if ``extend`` is 'min', then the color to which the region between the first pair of boundaries is mapped will be distinct from the first color in the colormap, and by default a `~matplotlib.colorbar.Colorbar` will be drawn with the triangle extension on the left or lower end. Returns ------- int16 scalar or array Notes ----- *boundaries* defines the edges of bins, and data falling within a bin is mapped to the color with the same index. If the number of bins, including any extensions, is less than *ncolors*, the color index is chosen by linear interpolation, mapping the ``[0, nbins - 1]`` range onto the ``[0, ncolors - 1]`` range. """ if clip and extend != 'neither': raise ValueError("'clip=True' is not compatible with 'extend'") self.clip = clip self.vmin = boundaries[0] self.vmax = boundaries[-1] self.boundaries = np.asarray(boundaries) self.N = len(self.boundaries) if self.N < 2: > raise ValueError("You must provide at least 2 boundaries " f"(1 region) but you passed in {boundaries!r}") E ValueError: You must provide at least 2 boundaries (1 region) but you passed in [4] /usr/share/miniconda/envs/xarray-tests/lib/python3.8/site-packages/matplotlib/colors.py:1508: ValueError ```
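
A minimal reproduction of the first failure (assuming a matplotlib version containing the linked change, where BoundaryNorm started requiring at least two boundaries):

```python
import numpy as np
import xarray as xr

da = xr.DataArray(np.random.rand(4, 5), dims=("y", "x"))
# on affected matplotlib versions this raises
# ValueError: You must provide at least 2 boundaries (1 region) ...
da.plot.contour(levels=[0.1])
```
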
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4226/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
550335922 MDU6SXNzdWU1NTAzMzU5MjI= 3697 documentation build issues on RTD keewis 14808389 closed 0     8 2020-01-15T17:46:13Z 2021-02-25T13:52:03Z 2021-02-21T22:21:55Z MEMBER      

It seems we have (seemingly) random failures on RTD.

Some of these are due to the known memory issue: recreating my doc environment used about 1.4 GB of RAM, which might be too much for RTD, even with the extended memory.

Much more frequent are timeouts when building the docs, but I can't reproduce them locally. Any ideas? Edit: this really is random; I tried rerunning and the build passed.

Also, a warning: proj_create: init=epsg:/init=IGNF: syntax not supported in non-PROJ4 emulation mode

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3697/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
784628736 MDU6SXNzdWU3ODQ2Mjg3MzY= 4801 scheduled upstream-dev CI is skipped keewis 14808389 closed 0     5 2021-01-12T22:08:19Z 2021-01-14T00:09:43Z 2021-01-13T21:35:44Z MEMBER      

The scheduled CI has been skipped since about 6 days ago (which means this is probably due to the merge of #4729, see this run before the merge and this run after the merge).

This is really strange because workflow_dispatch events (for which the CI detection job is also skipped) still work perfectly fine.

Edit: it seems to be because of https://github.com/pydata/xarray/blob/f52a95cbe694336fe47bc5a42c713bee8ad74d64/.github/workflows/upstream-dev-ci.yaml#L34-L37; if I remove that, the scheduled CI runs.

cc @andersy005

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4801/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
765675044 MDU6SXNzdWU3NjU2NzUwNDQ= 4688 drop python 3.6 support keewis 14808389 closed 0     0 2020-12-13T22:11:15Z 2021-01-07T18:15:18Z 2021-01-07T18:15:18Z MEMBER      

NEP 29 states that as of Jun 23, 2020 we can drop python 3.6 support. pandas will do so in their upcoming 1.2 release, so we might eventually be forced to do that, too.
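
The user-facing side of such a bump, sketched as setup() metadata (xarray's actual packaging may declare this elsewhere, e.g. in setup.cfg):

```python
from setuptools import setup

setup(
    name="xarray",
    python_requires=">=3.7",  # drop 3.6 per NEP 29
)
```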

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4688/reactions",
    "total_count": 4,
    "+1": 4,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
749035078 MDU6SXNzdWU3NDkwMzUwNzg= 4603 failing nightly CI keewis 14808389 closed 0     1 2020-11-23T18:35:25Z 2020-12-05T00:30:11Z 2020-12-05T00:30:11Z MEMBER      

It seems the scheduled CI failed because no artifacts were uploaded (i.e. the CI passed). Is it possible to skip the report job if the upstream-dev job passed, or to at least skip all following jobs after we found out that there are no artifacts to download? If not or if that turns out to be too complex we can just ignore the failure.

cc @andersy005

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4603/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
739281259 MDU6SXNzdWU3MzkyODEyNTk= 4571 upgrading mypy to 0.790 fails keewis 14808389 closed 0     3 2020-11-09T18:56:43Z 2020-11-13T19:38:05Z 2020-11-13T19:38:05Z MEMBER      

Continuing from #4567, trying to upgrade mypy to 0.790 fails with:

```
xarray/core/utils.py:466: error: Value of type variable "_LT" of "sorted" cannot be "K"
```

See also https://github.com/pydata/xarray/pull/4567#issuecomment-723546797 and https://github.com/pydata/xarray/pull/4567#issuecomment-724016427.
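
The error distilled: mypy 0.790 no longer accepts sorted() on a bare TypeVar, since nothing guarantees it supports "<". One way out is sorting via a key function with an orderable return type (illustrative; not necessarily the change that was made to utils.py):

```python
from typing import Callable, Iterable, List, TypeVar

K = TypeVar("K")

def ordered_keys(keys: Iterable[K], key: Callable[[K], str]) -> List[K]:
    # this type-checks because the sort key (str) is orderable,
    # even though K itself is unconstrained
    return sorted(keys, key=key)
```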

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4571/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
736272283 MDU6SXNzdWU3MzYyNzIyODM= 4565 failing upstream-dev polyfit warning test keewis 14808389 closed 0     1 2020-11-04T16:56:26Z 2020-11-09T19:08:35Z 2020-11-09T19:08:35Z MEMBER      

In xarray/tests/test_dataset.py::TestDataset::test_polyfit_warnings, the second call to polyfit:

```python
ds.var1.polyfit("dim2", 10, full=True)
```

emits a warning with the current upstream-dev pandas:

```
xarray/tests/test_dataset.py::TestDataset::test_polyfit_warnings
  .../xarray/core/alignment.py:307: FutureWarning: Index.__or__ operating as a set operation is deprecated, in the future this will be a logical operation matching Series.__or__. Use index.union(other) instead
    index = joiner(matching_indexes)
```

Since the test makes sure only the first call emits a warning (a np.RankWarning), the test fails.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4565/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
638211949 MDU6SXNzdWU2MzgyMTE5NDk= 4152 signature in accessor methods keewis 14808389 closed 0     5 2020-06-13T18:49:16Z 2020-09-06T23:05:04Z 2020-09-06T23:05:04Z MEMBER      

With the merge of #3988 we're now properly documenting the str, dt and plot accessors, but the signatures of the plotting methods are missing a few parameters. For example, DataArray.plot.contour is missing the parameter x (it's properly documented in the parameters section, though).

Also, we need to remove the str and dt accessors from Computing and try to figure out how to fix the summary of DataArray.plot (the plotting method).

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4152/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
642039530 MDU6SXNzdWU2NDIwMzk1MzA= 4165 allow specifying a fill value per variable keewis 14808389 closed 0     1 2020-06-19T15:08:16Z 2020-08-24T22:03:15Z 2020-08-24T22:03:15Z MEMBER      

While working on #4163 I noticed that the fill value parameter for align (but maybe also reindex, concat, merge and combine_*?) will be used for all variables (except dimension coordinates), which is obviously not ideal when working with quantities. Would it make sense to optionally allow fill_value to be a dict which maps a fill value to a variable name?

Consider this:

```python
In [2]: a = xr.Dataset(
   ...:     data_vars={"a": ("x", [12, 14, 13, 10, 8])},
   ...:     coords={"x": [-2, -1, 0, 1, 2], "u": ("x", [-20, -10, 0, 10, 20])},
   ...: )
   ...: b = xr.Dataset(
   ...:     data_vars={"b": ("x", [7, 9, 3])},
   ...:     coords={"x": [0, 3, 4], "u": ("x", [0, 30, 40])},
   ...: )
   ...:
   ...: xr.align(a, b, join="outer", fill_value=-50)
Out[2]:
(<xarray.Dataset>
 Dimensions:  (x: 7)
 Coordinates:
   * x        (x) int64 -2 -1 0 1 2 3 4
     u        (x) int64 -20 -10 0 10 20 -50 -50
 Data variables:
     a        (x) int64 12 14 13 10 8 -50 -50,
 <xarray.Dataset>
 Dimensions:  (x: 7)
 Coordinates:
   * x        (x) int64 -2 -1 0 1 2 3 4
     u        (x) int64 -50 -50 0 -50 -50 30 40
 Data variables:
     b        (x) int64 -50 -50 7 -50 -50 9 3)
```

I'd like to be able to do something like this instead:

```python
In [3]: xr.align(a, b, join="outer", fill_value={"a": -30, "b": -40, "u": -50})
Out[3]:
(<xarray.Dataset>
 Dimensions:  (x: 7)
 Coordinates:
   * x        (x) int64 -2 -1 0 1 2 3 4
     u        (x) int64 -20 -10 0 10 20 -50 -50
 Data variables:
     a        (x) int64 12 14 13 10 8 -30 -30,
 <xarray.Dataset>
 Dimensions:  (x: 7)
 Coordinates:
   * x        (x) int64 -2 -1 0 1 2 3 4
     u        (x) int64 -40 -40 0 -40 -40 30 40
 Data variables:
     b        (x) int64 -50 -50 7 -50 -50 9 3)
```

I could get there by passing the default (dtypes.NA) and then using fillna, but that only seems to work with data variables, so coordinates would need to go through a reset_coords / set_coords cycle. Also, with this the dtype is changed to float.
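For reference, a minimal sketch of that workaround, using the a / b datasets from above (the reset_coords / set_coords round-trip is only needed for the coordinate u):

```python
# sketch of the workaround: align with the default fill value (NaN), then
# fill per variable; note that the NaN step already casts to float
aligned_a, aligned_b = xr.align(a, b, join="outer")
filled_b = (
    aligned_b.reset_coords("u")     # make "u" a data variable so fillna sees it
    .fillna({"b": -40, "u": -50})   # per-variable fill values
    .set_coords("u")                # turn "u" back into a coordinate
)
```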

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4165/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
676827750 MDU6SXNzdWU2NzY4Mjc3NTA= 4334 missing parameter in DataArray.str.get keewis 14808389 closed 0     1 2020-08-11T12:13:58Z 2020-08-15T10:28:05Z 2020-08-15T10:28:05Z MEMBER      

While working on #4286 I noticed that the docstring of DataArray.str.get claims to allow passing a default value in addition to the index, but the Python code doesn't have that parameter at all. I think the default value is a good idea and that we should make the code match the docstring.
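To illustrate the mismatch (a sketch; the exact error may differ):

```python
import xarray as xr

da = xr.DataArray(["abc", "de"])
da.str.get(1)  # works: selects the character at index 1 of each element

# the docstring advertises a ``default`` for out-of-range indices, but since
# the parameter doesn't exist in the code yet this call fails with a TypeError
da.str.get(5, default="")
```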

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4334/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
675602229 MDU6SXNzdWU2NzU2MDIyMjk= 4328 failing docs CI keewis 14808389 closed 0     1 2020-08-08T23:09:48Z 2020-08-09T11:57:38Z 2020-08-09T11:57:38Z MEMBER      

The RTD builds are timing out again. With my own setup I get this error instead:

Traceback:

```pytb
AttributeError                            Traceback (most recent call last)
~/checkouts/readthedocs.org/user_builds/xarray-keewis/conda/latest/lib/python3.8/site-packages/IPython/core/formatters.py in __call__(self, obj)
    700                 type_pprinters=self.type_printers,
    701                 deferred_pprinters=self.deferred_printers)
--> 702             printer.pretty(obj)
    703             printer.flush()
    704             return stream.getvalue()

~/checkouts/readthedocs.org/user_builds/xarray-keewis/conda/latest/lib/python3.8/site-packages/IPython/lib/pretty.py in pretty(self, obj)
    392             if cls is not object \
    393                     and callable(cls.__dict__.get('__repr__')):
--> 394                 return _repr_pprint(obj, self, cycle)
    395
    396         return _default_pprint(obj, self, cycle)

~/checkouts/readthedocs.org/user_builds/xarray-keewis/conda/latest/lib/python3.8/site-packages/IPython/lib/pretty.py in _repr_pprint(obj, p, cycle)
    698     """A pprint that just redirects to the normal repr function."""
    699     # Find newlines and replace them with p.break_()
--> 700     output = repr(obj)
    701     lines = output.splitlines()
    702     with p.group():

~/checkouts/readthedocs.org/user_builds/xarray-keewis/checkouts/latest/xarray/core/rolling.py in __repr__(self)
     99         """provide a nice str repr of our rolling object"""
    100
--> 101         attrs = [
    102             "{k}->{v}".format(k=k, v=getattr(self, k))
    103             for k in list(self.dim) + self.window + self.center + [self.min_periods]

~/checkouts/readthedocs.org/user_builds/xarray-keewis/checkouts/latest/xarray/core/rolling.py in <listcomp>(.0)
    100
    101         attrs = [
--> 102             "{k}->{v}".format(k=k, v=getattr(self, k))
    103             for k in list(self.dim) + self.window + self.center + [self.min_periods]
    104         ]

AttributeError: 'DataArrayRolling' object has no attribute 'y'
```

I think that was introduced in #4219. cc @fujiisoup
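Judging from the traceback, the list comprehension iterates attribute values (e.g. the dimension names, such as 'y') where attribute names are needed; a hedged sketch of what the repr probably intends, not necessarily the fix that was merged:

```python
# sketch: iterate over attribute *names* so that getattr() resolves them
attrs = [
    "{k}->{v}".format(k=k, v=getattr(self, k))
    for k in ["dim", "window", "center", "min_periods"]
]
```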

Also, we should definitely ask RTD support why the two setups behave differently. Edit: see readthedocs/readthedocs.org#7371

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4328/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
668166816 MDU6SXNzdWU2NjgxNjY4MTY= 4287 failing docs CI keewis 14808389 closed 0     4 2020-07-29T21:19:30Z 2020-08-05T21:31:46Z 2020-08-05T21:31:46Z MEMBER      

I'm not quite sure why (maybe a pandas or matplotlib release?), but the docs CI raises an exception in plotting.rst: https://github.com/pydata/xarray/blob/a081d01df11610adea7a48acee5a71d9eb5ffd16/doc/plotting.rst#L589-L590

there are also sphinx warnings about malformed rst:

```
/home/docs/checkouts/readthedocs.org/user_builds/xray/conda/4286/lib/python3.8/site-packages/pandas/core/base.py:docstring of xarray.CFTimeIndex.max:7: WARNING: Inline strong start-string without end-string.
/home/docs/checkouts/readthedocs.org/user_builds/xray/conda/4286/lib/python3.8/site-packages/pandas/core/base.py:docstring of xarray.CFTimeIndex.min:7: WARNING: Inline strong start-string without end-string.
```

so I guess there was a pandas release a few days ago?

Edit: pandas 1.1.0 was released yesterday

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4287/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
650929228 MDU6SXNzdWU2NTA5MjkyMjg= 4202 isort flags keewis 14808389 closed 0     0 2020-07-04T17:39:17Z 2020-07-16T19:13:57Z 2020-07-16T19:13:57Z MEMBER      

Because I've been hit by this elsewhere: isort has released a new version today that removes the -rc / --recursive flag (without deprecation, I think). Once conda-forge has been updated, we will start seeing a failing isort CI.

Not sure if we should pin isort for now or simply remove the flag once the builds start failing (or remove the flag now and live with a pretty much useless isort CI until the feedstock has been merged).
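If we do pin, it would be a one-line change in the CI environment files; a sketch (the file name is an assumption):

```yaml
# ci/requirements/py38.yml (name assumed) -- pin to the last major version
# that still accepts the --recursive flag
dependencies:
  - isort<5
```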

Update: isort relaxed the error into a warning, so this is not that urgent

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4202/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
657223618 MDU6SXNzdWU2NTcyMjM2MTg= 4225 failing type checking CI keewis 14808389 closed 0     2 2020-07-15T09:57:59Z 2020-07-15T12:24:46Z 2020-07-15T12:24:46Z MEMBER      

Due to the update of pytest, mypy reports errors:

```
xarray/tests/test_cftimeindex_resample.py:57: error: List or tuple expected as variable arguments
xarray/tests/test_dataset.py:2407: error: No overload variant of "__call__" of "_XfailMarkDecorator" matches argument type "Type[AssertionError]"
xarray/tests/test_dataset.py:2407: note: Possible overload variant:
xarray/tests/test_dataset.py:2407: note:     def __call__(self, condition: Union[str, bool] = ..., *conditions: Union[str, bool], reason: str = ..., run: bool = ..., raises: Union[BaseException, Tuple[BaseException, ...]] = ..., strict: bool = ...) -> MarkDecorator
xarray/tests/test_dataset.py:2407: note:     <1 more non-matching overload not shown>
xarray/tests/test_dask.py:767: error: No overload variant of "__call__" of "_XfailMarkDecorator" matches argument type "Type[NotImplementedError]"
xarray/tests/test_dask.py:767: note: Possible overload variant:
xarray/tests/test_dask.py:767: note:     def __call__(self, condition: Union[str, bool] = ..., *conditions: Union[str, bool], reason: str = ..., run: bool = ..., raises: Union[BaseException, Tuple[BaseException, ...]] = ..., strict: bool = ...) -> MarkDecorator
xarray/tests/test_dask.py:767: note:     <1 more non-matching overload not shown>
xarray/tests/test_dataarray.py:3953: error: No overload variant of "__call__" of "_XfailMarkDecorator" matches argument type "Type[AssertionError]"
xarray/tests/test_dataarray.py:3953: note: Possible overload variant:
xarray/tests/test_dataarray.py:3953: note:     def __call__(self, condition: Union[str, bool] = ..., *conditions: Union[str, bool], reason: str = ..., run: bool = ..., raises: Union[BaseException, Tuple[BaseException, ...]] = ..., strict: bool = ...) -> MarkDecorator
xarray/tests/test_dataarray.py:3953: note:     <1 more non-matching overload not shown>
Found 4 errors in 4 files (checked 124 source files)
```

Since that particular pytest version is a release candidate, should we pin pytest for now?

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4225/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
650884355 MDU6SXNzdWU2NTA4ODQzNTU= 4200 Upstream-dev matplotlib failure keewis 14808389 closed 0     0 2020-07-04T12:37:02Z 2020-07-04T17:24:15Z 2020-07-04T17:24:15Z MEMBER      

The upstream-dev CI currently fails because matplotlib upstream seems to have removed / renamed Colorbar._label. We definitely need to change that since it's a private attribute, but I'm not quite sure what we can use instead.
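A possible public-API replacement, assuming we only need to read the label back (variable names are invented for illustration; this is a guess, not necessarily what was merged):

```python
# sketch: for a vertical colorbar the label is the y-axis label of the
# colorbar's Axes, so the private Colorbar._label can be avoided
cbar = primitive.colorbar  # the colorbar attached to the plotted mappable
label = cbar.ax.get_ylabel() or cbar.ax.get_xlabel()
```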

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4200/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
642990666 MDU6SXNzdWU2NDI5OTA2NjY= 4168 upstream-dev failures due to numpy dtype deprecations keewis 14808389 closed 0     0 2020-06-22T11:30:07Z 2020-06-22T22:51:57Z 2020-06-22T22:51:57Z MEMBER      

It seems numpy recently deprecated some of its dtype aliases, and we now get warnings for np.bool, np.int and np.long. Instead, the warnings suggest using the builtin Python types or np.compat.long.

Since we have a few tests that require zero warnings, this means our test suite fails (and more than 13000 warnings are not a good thing, anyways).
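The mechanical part of the fix is simple; a sketch of the replacements the warnings ask for:

```python
import numpy as np

arr = np.array([0, 1, 2])

# np.int / np.bool are the deprecated aliases; accessing them now emits a
# DeprecationWarning, so use the builtin types instead:
arr.astype(int)    # instead of arr.astype(np.int)
arr.astype(bool)   # instead of arr.astype(np.bool)
```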

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4168/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
537156661 MDU6SXNzdWU1MzcxNTY2NjE= 3617 diff formatting for assert_allclose keewis 14808389 closed 0     0 2019-12-12T18:59:52Z 2020-06-13T17:53:03Z 2020-06-13T17:53:03Z MEMBER      

At the moment, the output of a failing assert_allclose is not really useful, it simply prints the values it compared:

```python
In [1]: import numpy as np
   ...: import xarray as xr
   ...: from xarray.testing import assert_allclose, assert_identical

In [2]: da1 = xr.DataArray(data=[2, 5, 8], coords={"x": [0, 1, 2]}, dims="x")
   ...: da2 = xr.DataArray(data=[1, 5, 8], coords={"x": [0, 1, 3]}, dims="x")

In [3]: assert_allclose(da1, da2)
AssertionError                            Traceback (most recent call last)
<ipython-input-6-55cbcc1eeb58> in <module>
----> 1 assert_allclose(da1, da2)

~/Documents/Programming/xarray/xarray/testing.py in assert_allclose(a, b, rtol, atol, decode_bytes)
    123         assert allclose, f"{a.values}\n{b.values}"
    124     elif isinstance(a, DataArray):
--> 125         assert_allclose(a.variable, b.variable, **kwargs)
    126         assert set(a.coords) == set(b.coords)
    127         for v in a.coords.variables:

~/Documents/Programming/xarray/xarray/testing.py in assert_allclose(a, b, rtol, atol, decode_bytes)
    121         assert a.dims == b.dims
    122         allclose = _data_allclose_or_equiv(a.values, b.values, **kwargs)
--> 123         assert allclose, f"{a.values}\n{b.values}"
    124     elif isinstance(a, DataArray):
    125         assert_allclose(a.variable, b.variable, **kwargs)

AssertionError: [2 5 8]
[1 5 8]

In [4]: assert_identical(da1, da2)
AssertionError                            Traceback (most recent call last)
<ipython-input-7-127e2b1ca0dc> in <module>
----> 1 assert_identical(da1, da2)

~/Documents/Programming/xarray/xarray/testing.py in assert_identical(a, b)
     83     elif isinstance(a, DataArray):
     84         assert a.name == b.name
---> 85         assert a.identical(b), formatting.diff_array_repr(a, b, "identical")
     86     elif isinstance(a, (Dataset, Variable)):
     87         assert a.identical(b), formatting.diff_dataset_repr(a, b, "identical")

AssertionError: Left and right DataArray objects are not identical

Differing values:
L
    array([2, 5, 8])
R
    array([1, 5, 8])
Differing coordinates:
L * x        (x) int64 0 1 2
R * x        (x) int64 0 1 3
```

It would be good to somehow extend formatting.diff_array_repr and formatting.diff_dataset_repr and to then use those to print a diff for assert_allclose.

I found a reference to this: https://github.com/pydata/xarray/pull/1690#issuecomment-455540813, but it's not being actively worked on.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3617/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
538092307 MDU6SXNzdWU1MzgwOTIzMDc= 3625 Documentation for injected methods keewis 14808389 closed 0     4 2019-12-15T19:09:47Z 2020-06-13T17:52:46Z 2020-06-13T17:52:46Z MEMBER      

At the moment, the documentation for methods introduced by accessors is broken. We work around this issue by directly referencing the corresponding accessor method or the underlying function. Of course, this is not how we recommend calling them, so we should try to fix this.

It seems this is mostly a known sphinx issue (see #3333 and #3602), but we haven't found a way to resolve it yet. I'd like to use this issue to collect ideas for solutions.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3625/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
637227979 MDU6SXNzdWU2MzcyMjc5Nzk= 4147 readthedocs build / documentation build time keewis 14808389 closed 0     1 2020-06-11T18:17:53Z 2020-06-12T15:03:20Z 2020-06-12T15:03:20Z MEMBER      

It seems that after the merge of #3818, the RTD builds started to time out, while the docs CI takes about 37 minutes.

As a reference, before #3818 our docs CI completed in about 6 minutes.

I'm not sure if that is due to #3818 or because of updated dependencies (dask?), but I think we should try to get the build time back down.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4147/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
634979933 MDU6SXNzdWU2MzQ5Nzk5MzM= 4133 upstream-dev failure when installing pandas keewis 14808389 closed 0     3 2020-06-08T22:39:01Z 2020-06-11T02:14:49Z 2020-06-11T02:14:49Z MEMBER      

So after #4124 has been fixed upstream, we now have problems with installing numpy / pandas:

I'm not sure if I read the logs correctly, but it seems pandas (or the process to build their wheel, see their pyproject.toml) depends on numpy==1.15.4, which doesn't have a wheel for py38, so the source distribution is chosen. I'm not sure why building the wheel from the sdist doesn't work anymore, though.

For reference, see the logs of the last passing build.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4133/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
631860723 MDU6SXNzdWU2MzE4NjA3MjM= 4124 upstream-dev CI fails on import of dask.array keewis 14808389 closed 0     9 2020-06-05T19:08:31Z 2020-06-10T12:41:44Z 2020-06-10T12:41:44Z MEMBER      

I'm not sure why, but our upstream-dev CI fails when importing dask.array: https://dev.azure.com/xarray/xarray/_build/results?buildId=2996&view=logs&j=2280efed-fda1-53bd-9213-1fa8ec9b4fa8&t=aa9ddc6e-3e6c-56cb-4312-111c01d6f735

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4124/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
618234617 MDU6SXNzdWU2MTgyMzQ2MTc= 4062 clean up of the branches on the main repository keewis 14808389 closed 0     3 2020-05-14T13:31:57Z 2020-05-18T23:24:47Z 2020-05-18T14:03:19Z MEMBER      

We have several branches on our main repository that appear to be either outdated or used to build documentation on RTD (these are no longer necessary since RTD fixed the conda out-of-memory issues for everyone: building on personal RTD setups should work). Since they already lead to confusion, I think we should try to clean them up. Here's a list of the branches I think we can remove:
- [x] 0.11.x: seems to be for the 0.11 releases which are no longer maintained (points to the same commit as the v0.11.3 tag)
- [ ] accessor-documentation: I intended to use this for building the documentation on RTD, but I accidentally opened #3988 from it instead of from the branch on my own repository. It will be removed once that PR is merged / closed.
- [x] fix-docs is from more than a year ago. @shoyer, is there anything that was not already merged into master?
- [x] fix-rtd was an attempt to fix RTD when it continuously failed about 1-2 months ago
- [x] revert-3128-drop-keyword-support: I think that was used to give a contributor the possibility to amend an already merged PR, but in the end a follow-up PR was used instead.
- [x] scipy19-docs and scipy19-docs-backup: these were merged last year (but it turns out there are a few pending PRs that try to merge into scipy19-docs)

Edit: I'm going to delete those branches on Monday

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4062/reactions",
    "total_count": 4,
    "+1": 4,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
603594157 MDU6SXNzdWU2MDM1OTQxNTc= 3987 failing distributed tests on upstream-dev keewis 14808389 closed 0     0 2020-04-20T23:18:08Z 2020-04-21T12:53:45Z 2020-04-21T12:53:45Z MEMBER      

Since dask/distributed#3706, gen_cluster requires functions decorated with it to be coroutines. We use it twice in test_distributed.py: on test_async and on test_serializable_locks, which are both pre-3.5 style coroutines (yield / yield from instead of async / await) but not marked as such (I'm not sure what pytest does with coroutines).

I think this should be a pretty straightforward fix: replace def with async def and yield / yield from with await.
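A sketch of the pattern (the test body is invented for illustration; gen_cluster comes from distributed.utils_test):

```python
from distributed.utils_test import gen_cluster

# before: pre-3.5 style coroutine
@gen_cluster(client=True)
def test_serializable_locks(c, s, a, b):
    future = c.submit(lambda x: x + 1, 1)
    result = yield future

# after: native coroutine, as the new gen_cluster requires
@gen_cluster(client=True)
async def test_serializable_locks(c, s, a, b):
    future = c.submit(lambda x: x + 1, 1)
    result = await future
```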

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3987/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
569789152 MDU6SXNzdWU1Njk3ODkxNTI= 3794 all-dependencies-but-dask CI keewis 14808389 closed 0     1 2020-02-24T11:15:54Z 2020-04-03T19:48:19Z 2020-04-03T19:48:19Z MEMBER      

We recently got a few reports about tests failing with most optional dependencies installed but not dask (#3777, #3778, #3779).

Since dask is pretty tightly coupled in our code, I think it might be worth adding a CI job that runs the tests in an environment with all optional dependencies except dask. Thoughts?
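A minimal sketch of what the environment for such a job could look like (the file name and package selection are assumptions):

```yaml
# ci/requirements/py38-all-but-dask.yml (name assumed)
name: xarray-tests
channels:
  - conda-forge
dependencies:
  - python=3.8
  - numpy
  - pandas
  - netcdf4
  - scipy
  - zarr
  # deliberately absent: dask, distributed
```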

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3794/reactions",
    "total_count": 3,
    "+1": 3,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
592823921 MDU6SXNzdWU1OTI4MjM5MjE= 3928 failing cftime upstream-dev tests keewis 14808389 closed 0     0 2020-04-02T18:00:54Z 2020-04-03T19:35:18Z 2020-04-03T19:35:18Z MEMBER      

Another set of failing cftime tests, this time because of warnings while calling open_dataset (I think?):

```
FAILED xarray/tests/test_backends.py::test_use_cftime_standard_calendar_default_in_range[gregorian]
FAILED xarray/tests/test_backends.py::test_use_cftime_standard_calendar_default_in_range[proleptic_gregorian]
FAILED xarray/tests/test_backends.py::test_use_cftime_standard_calendar_default_in_range[standard]
FAILED xarray/tests/test_backends.py::test_use_cftime_true[1500-360_day]
FAILED xarray/tests/test_backends.py::test_use_cftime_true[1500-standard]
FAILED xarray/tests/test_backends.py::test_use_cftime_true[2000-360_day]
FAILED xarray/tests/test_backends.py::test_use_cftime_true[2000-365_day]
FAILED xarray/tests/test_backends.py::test_use_cftime_true[2000-366_day]
FAILED xarray/tests/test_backends.py::test_use_cftime_true[2000-all_leap]
FAILED xarray/tests/test_backends.py::test_use_cftime_true[2000-gregorian]
FAILED xarray/tests/test_backends.py::test_use_cftime_true[2000-julian]
FAILED xarray/tests/test_backends.py::test_use_cftime_true[2000-noleap]
FAILED xarray/tests/test_backends.py::test_use_cftime_true[2000-proleptic_gregorian]
FAILED xarray/tests/test_backends.py::test_use_cftime_true[2000-standard]
FAILED xarray/tests/test_backends.py::test_use_cftime_true[2500-360_day]
FAILED xarray/tests/test_backends.py::test_use_cftime_true[2500-365_day]
FAILED xarray/tests/test_backends.py::test_use_cftime_true[2500-366_day]
FAILED xarray/tests/test_backends.py::test_use_cftime_true[2500-all_leap]
FAILED xarray/tests/test_backends.py::test_use_cftime_true[2500-gregorian]
FAILED xarray/tests/test_backends.py::test_use_cftime_true[2500-julian]
FAILED xarray/tests/test_backends.py::test_use_cftime_true[2500-noleap]
FAILED xarray/tests/test_backends.py::test_use_cftime_true[2500-proleptic_gregorian]
FAILED xarray/tests/test_backends.py::test_use_cftime_true[2500-standard]
FAILED xarray/tests/test_backends.py::test_use_cftime_false_standard_calendar_in_range[gregorian]
FAILED xarray/tests/test_backends.py::test_use_cftime_false_standard_calendar_in_range[proleptic_gregorian]
FAILED xarray/tests/test_backends.py::test_use_cftime_false_standard_calendar_in_range[standard]
```

```pytb
________ test_use_cftime_standard_calendar_default_in_range[gregorian] ________

calendar = 'gregorian'

    for v in ["x", "time"]:
        original[v].attrs["units"] = units
        original[v].attrs["calendar"] = calendar

    x_timedeltas = np.array(x).astype("timedelta64[D]")
    time_timedeltas = np.array(time).astype("timedelta64[D]")
    decoded_x = np.datetime64(units_date, "ns") + x_timedeltas
    decoded_time = np.datetime64(units_date, "ns") + time_timedeltas
    expected_x = DataArray(decoded_x, [("time", decoded_time)], name="x")
    expected_time = DataArray(decoded_time, [("time", decoded_time)], name="time")

    with create_tmp_file() as tmp_file:
        original.to_netcdf(tmp_file)
        with pytest.warns(None) as record:
            with open_dataset(tmp_file, use_cftime=False) as ds:
                assert_identical(expected_x, ds.x)
                assert_identical(expected_time, ds.time)
          assert not record

E assert not WarningsChecker(record=True)

xarray/tests/test_backends.py:4378: AssertionError

```

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3928/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
572789017 MDU6SXNzdWU1NzI3ODkwMTc= 3809 warning in the MacOSX CI keewis 14808389 closed 0     0 2020-02-28T14:26:31Z 2020-03-07T04:49:22Z 2020-03-07T04:49:22Z MEMBER      

Running the MacOSX py38 CI job gives us this warning:

```
[warning]This pipeline uses a Microsoft-hosted agent image that will be removed on March 23, 2020 (MacOS-10.13). You must make changes to your pipeline before that date, or else your pipeline will fail. Learn more (https://aka.ms/removing-older-images-hosted-pools).
```

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3809/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
538065011 MDU6SXNzdWU1MzgwNjUwMTE= 3623 RTD configuration file version keewis 14808389 closed 0     0 2019-12-15T15:18:42Z 2019-12-17T23:22:17Z 2019-12-17T23:22:17Z MEMBER      

RTD released a new version of the configuration file format a bit more than a year ago and deprecated the old version. I think we should try to migrate our setup to the new format. Thoughts?
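For reference, a minimal sketch of the v2 format (the paths are assumptions, not necessarily our actual setup):

```yaml
# .readthedocs.yml -- version 2 of the configuration format
version: 2

sphinx:
  configuration: doc/conf.py

conda:
  environment: ci/requirements/doc.yml
```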

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3623/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
537152819 MDU6SXNzdWU1MzcxNTI4MTk= 3616 formatting of nep-18 compatible arrays with assert_* keewis 14808389 closed 0     2 2019-12-12T18:51:33Z 2019-12-13T11:27:55Z 2019-12-13T11:27:54Z MEMBER      

At the moment, the formatting.diff_*_repr functions that provide the pretty-printing for assert_* use repr to format NEP-18 compatible arrays, truncating the result if it is too long. In the case of pint's quantities, this makes the pretty printing useless since only a few values are visible and the unit is in the truncated part.

What should we do about this? Does pint have to change its repr?
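To illustrate (a sketch): pint puts the unit at the end of the repr, so right-truncation drops exactly the part we care about:

```python
import numpy as np
import pint

ureg = pint.UnitRegistry()
arr = np.linspace(0, 1, 1000) * ureg.m

full = repr(arr)       # ends with ..., 'meter')> -- the unit comes last
truncated = full[:80]  # right-truncated, as in the diff output: unit is gone
print(truncated)
```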

xref #3611

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3616/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
527755536 MDU6SXNzdWU1Mjc3NTU1MzY= 3567 version collisions on readthedocs keewis 14808389 closed 0     3 2019-11-24T20:58:24Z 2019-12-06T13:36:03Z 2019-12-03T18:59:41Z MEMBER      

In #3199 the documentation on readthedocs did not build because the newest packaged version of xarray was pulled in by a dependency -- in this case cfgrib. I'm not sure why, but this shadows the versions installed by at least python setup.py install --force and python -m pip install -e . (not python -m pip install ., I think, but readthedocs passes --upgrade-strategy=eager when pip-installing, which effectively deactivates version pinning).

Fortunately, cfgrib does not have a hard dependency on xarray (it's in the extras_require section), so this issue was bypassed in #3557 by using pip to install it. Should we ever want to introduce a dependency that requires xarray (or to use conda to install cfgrib), this is bound to resurface.

It might be that this is a problem with how versioneer constructs versions based on git commit hashes, in which case a fix for #2853 would also fix this, but it certainly needs more investigation.

#3369 is related, but is more about catching these sorts of issues before merging.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3567/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
522498720 MDU6SXNzdWU1MjI0OTg3MjA= 3525 DatasetGroupBy does not implement quantile keewis 14808389 closed 0     6 2019-11-13T22:02:55Z 2019-11-15T19:58:02Z 2019-11-15T19:58:02Z MEMBER      

The docs claim quantile works on grouped datasets, but that does not seem to be the case:

```python
>>> import xarray as xr
>>> ds = xr.Dataset(data_vars={"a": ("x", list("abcd"))}, coords={"x": range(4)})
>>> ds.a.groupby(ds.x % 2 == 0).quantile
<bound method DataArrayGroupBy.quantile of DataArrayGroupBy, grouped over 'x'
2 groups with labels False, True.>
>>> ds.groupby(ds.x % 2 == 0).quantile
AttributeError: 'DatasetGroupBy' object has no attribute 'quantile'
```

This was found while trying to silence the nit-picky sphinx warnings in #3516.
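Until DatasetGroupBy grows a quantile method, a possible workaround is to push the computation down to the member datasets (a sketch, assuming numeric variables since quantile needs them):

```python
# sketch of a workaround: compute the quantile per group explicitly
ds_num = xr.Dataset(
    data_vars={"a": ("x", [0.0, 1.0, 2.0, 3.0])}, coords={"x": range(4)}
)
ds_num.groupby(ds_num.x % 2 == 0).apply(lambda grp: grp.quantile(0.5))
```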

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3525/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
484097190 MDU6SXNzdWU0ODQwOTcxOTA= 3241 aggregation functions treat duck arrays differently depending on dtype keewis 14808389 closed 0     6 2019-08-22T16:30:56Z 2019-09-02T22:52:47Z 2019-09-02T22:52:47Z MEMBER      

While working on #3238, I tried replacing np.arange with np.linspace to create test arrays:

```python
>>> ureg = pint.UnitRegistry()

>>> # with int values
>>> array = np.arange(10).astype(int) * ureg.m
>>> np.max(array)
<Quantity(9, 'meter')>
>>> np.max(xr.DataArray(data=array))  # works as expected
<xarray.DataArray ()>
<Quantity(9, 'meter')>

>>> # now with floats
>>> array = np.arange(10).astype(float) * ureg.m
>>> np.max(array)
<Quantity(9.0, 'meter')>
>>> np.max(xr.DataArray(data=array))  # unit information is lost
<xarray.DataArray ()>
array(9.)
```

Judging by the build logs of #3238, this seems to be the case for all aggregation functions except for np.median and of course those that return booleans or indices.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3241/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
