
issues


306 rows where repo = 13221727 (xarray), state = "closed" and user = 14808389 (keewis), sorted by updated_at descending


Facets:
  • type: pull 257, issue 49
  • state: closed 306
  • repo: xarray 306
Columns per row: id · node_id · number · title · user · state · locked · assignee · milestone · comments · created_at · updated_at (sorted descending) · closed_at · author_association · active_lock_reason · draft · pull_request · body · reactions · performed_via_github_app · state_reason · repo · type
#8993 call `np.cross` with 3D vectors only · keewis (14808389) · closed · 1 comment · MEMBER · created 2024-05-02T12:21:30Z · updated 2024-05-03T15:56:49Z · closed 2024-05-03T15:22:26Z · pydata/xarray/pulls/8993 (id 2275404926, node_id PR_kwDOAMm_X85uWjVP)
  • [x] towards #8844

In the tests, we've been calling `np.cross` with vectors of 2 or 3 dimensions, but numpy>=2 will deprecate 2D vectors (plus, we're now raising on warnings). Thus, we 0-pad the inputs before generating the expected result (which generally should not change the outcome of the tests).

For a later PR: add tests to check if xr.cross works if more than a single dimension is present, and pre-compute the expected result. Also, for property-based testing: the cross-product of two vectors is perpendicular to both input vectors (use the dot product to check that), and its length (l2-norm) is the product of the lengths of the input vectors.
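The 0-padding trick described above can be sketched as a minimal standalone example (this is illustrative, not the actual test code; `pad_to_3d` is a made-up name):

```python
import numpy as np

def pad_to_3d(vector):
    """0-pad a 2-element vector to 3 elements.

    numpy>=2 deprecates calling `np.cross` with 2-element vectors, and
    appending a zero z-component does not change the cross product.
    """
    vector = np.asarray(vector)
    if vector.shape[-1] == 2:
        pad_width = [(0, 0)] * (vector.ndim - 1) + [(0, 1)]
        vector = np.pad(vector, pad_width)
    return vector

a = pad_to_3d([1, 0])
b = pad_to_3d([0, 1])
result = np.cross(a, b)

# the property mentioned above: the cross product is perpendicular to
# both input vectors, so the dot products vanish
perpendicular = np.dot(result, a) == 0 and np.dot(result, b) == 0
```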

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/8993/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
xarray (13221727) · pull
#8939 avoid a couple of warnings in `polyfit` · keewis (14808389) · closed · 14 comments · MEMBER · created 2024-04-13T11:49:13Z · updated 2024-05-01T16:42:06Z · closed 2024-05-01T15:34:20Z · pydata/xarray/pulls/8939 (id 2241526039, node_id PR_kwDOAMm_X85skMs0)

- [x] towards #8844

  • replace numpy.core.finfo with numpy.finfo
  • add dtype and copy parameters to all definitions of __array__
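A hedged sketch of what such an `__array__` definition looks like (the `Wrapper` class is illustrative, not xarray's actual code):

```python
import numpy as np

class Wrapper:
    """Toy duck array illustrating the `__array__(dtype, copy)` signature."""

    def __init__(self, data):
        self._data = np.asarray(data)

    def __array__(self, dtype=None, copy=None):
        # numpy>=2 may pass `copy` (from np.array(..., copy=...));
        # older numpy versions only ever passed `dtype`
        result = np.asarray(self._data, dtype=dtype)
        if copy:
            result = result.copy()
        return result

converted = np.asarray(Wrapper([1, 2, 3]))
```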
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/8939/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
xarray (13221727) · pull
#8986 clean up the upstream-dev setup script · keewis (14808389) · closed · 1 comment · MEMBER · created 2024-04-30T09:34:04Z · updated 2024-04-30T23:26:13Z · closed 2024-04-30T20:59:56Z · pydata/xarray/pulls/8986 (id 2270984193, node_id PR_kwDOAMm_X85uHk70)

In trying to install packages that are compatible with numpy>=2 I added several projects that are built in CI without build isolation (so that they will be built with the nightly version of numpy). That was a temporary workaround, so we should start thinking about cleaning this up.

Since numcodecs now seems to be compatible (or uses less of numpy in compiled code, I'm not sure), this is an attempt to see if CI works when we use the version from conda-forge.

bottleneck and cftime now build against numpy>=2.0.0rc1, so we can stop building them without build isolation.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/8986/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
xarray (13221727) · pull
#8978 more engine environment tricks in preparation for `numpy>=2` · keewis (14808389) · closed · 7 comments · MEMBER · created 2024-04-28T17:54:38Z · updated 2024-04-29T14:56:22Z · closed 2024-04-29T14:56:21Z · pydata/xarray/pulls/8978 (id 2267711587, node_id PR_kwDOAMm_X85t8VWy)

It turns out pydap also needs to be built against numpy>=2. Until it is, we should remove it from the upstream-dev environment. Also, numcodecs build-depends on setuptools-scm.

And finally, the h5py nightlies might support numpy>=2 (h5py>=3.11 supposedly is numpy>=2 compatible), so once again I'll try and see if CI passes.

  • [x] towards #8844
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/8978/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
xarray (13221727) · pull
#3921 issues discovered by the all-but-dask CI · keewis (14808389) · closed · 4 comments · MEMBER · created 2020-03-30T22:08:46Z · updated 2024-04-25T14:48:15Z · closed 2024-02-10T02:57:34Z (id 590630281, node_id MDU6SXNzdWU1OTA2MzAyODE=)

After adding the py38-all-but-dask CI in #3919, it discovered a few backend issues:

- zarr:
  • [x] open_zarr with chunks="auto" always tries to chunk, even if dask is not available (fixed in #3919)
  • [x] ZarrArrayWrapper.__getitem__ incorrectly passes the indexer's tuple attribute to _arrayize_vectorized_indexer (this only happens if dask is not available) (fixed in #3919)
  • [x] slice indexers with negative steps get transformed incorrectly if dask is not available https://github.com/pydata/xarray/pull/8674
- rasterio:
  • ~calling pickle.dumps on a Dataset object returned by open_rasterio fails because a non-serializable lock was used (if dask is installed, a serializable lock is used instead)~

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3921/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
state_reason: completed · xarray (13221727) · issue
#8961 use `nan` instead of `NaN` · keewis (14808389) · closed · 0 comments · MEMBER · created 2024-04-21T21:26:18Z · updated 2024-04-21T22:01:04Z · closed 2024-04-21T22:01:03Z · pydata/xarray/pulls/8961 (id 2255271332, node_id PR_kwDOAMm_X85tSKJs)

FYI @aulemahal, numpy.NaN will be removed in the upcoming numpy=2.0 release.

  • [x] follow-up to #8603
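The change itself is a one-token rename; a minimal illustration:

```python
import numpy as np

# `np.NaN` is removed in numpy 2.0; the lowercase `np.nan` alias
# works on both 1.x and 2.x
x = np.array([1.0, np.nan, 3.0])
mask = np.isnan(x)
```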
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/8961/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
xarray (13221727) · pull
#8940 adapt more tests to the copy-on-write behavior of pandas · keewis (14808389) · closed · 1 comment · MEMBER · created 2024-04-13T11:57:10Z · updated 2024-04-13T19:36:30Z · closed 2024-04-13T14:44:50Z · pydata/xarray/pulls/8940 (id 2241528898, node_id PR_kwDOAMm_X85skNON)
  • [x] follow-up to #8846
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/8940/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
xarray (13221727) · pull
#8938 use `pd.to_timedelta` instead of `TimedeltaIndex` · keewis (14808389) · closed · 0 comments · MEMBER · created 2024-04-13T10:38:12Z · updated 2024-04-13T12:32:14Z · closed 2024-04-13T12:32:13Z · pydata/xarray/pulls/8938 (id 2241499231, node_id PR_kwDOAMm_X85skHW9)

pandas recently removed the deprecated `unit` kwarg of `TimedeltaIndex`.

  • [x] towards #8844
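A minimal sketch of the replacement (the values are illustrative):

```python
import pandas as pd

# `TimedeltaIndex(values, unit=...)` lost its deprecated `unit` keyword;
# `pd.to_timedelta` takes the unit directly and returns a TimedeltaIndex
# for list-like input
deltas = pd.to_timedelta([1, 2, 3], unit="h")
seconds = deltas.total_seconds()
```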
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/8938/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
xarray (13221727) · pull
#8823 try to get the `upstream-dev` CI to complete again · keewis (14808389) · closed · 2 comments · MEMBER · created 2024-03-12T13:36:20Z · updated 2024-03-12T16:59:12Z · closed 2024-03-12T16:04:53Z · pydata/xarray/pulls/8823 (id 2181644595, node_id PR_kwDOAMm_X85pYPWY)

There are a couple of accumulated failures now, including a crash because pandas apparently depends on pyarrow now, which on conda-forge is not built for numpy>=2.0.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/8823/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
xarray (13221727) · pull
#8767 new whats-new section · keewis (14808389) · closed · 0 comments · MEMBER · created 2024-02-19T00:39:59Z · updated 2024-02-20T10:35:35Z · closed 2024-02-20T10:35:35Z · pydata/xarray/pulls/8767 (id 2141273710, node_id PR_kwDOAMm_X85nOs6t)
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/8767/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
xarray (13221727) · pull
#8766 release v2024.02.0 · keewis (14808389) · closed · 0 comments · MEMBER · created 2024-02-18T23:06:06Z · updated 2024-02-18T23:06:22Z · closed 2024-02-18T23:06:22Z · pydata/xarray/pulls/8766 (id 2141229636, node_id PR_kwDOAMm_X85nOjve)
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/8766/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
xarray (13221727) · pull
#8764 release summary for 2024.02.0 · keewis (14808389) · closed · 3 comments · MEMBER · created 2024-02-18T17:45:01Z · updated 2024-02-18T23:00:26Z · closed 2024-02-18T22:52:14Z · pydata/xarray/pulls/8764 (id 2141111970, node_id PR_kwDOAMm_X85nOMmO)
  • [x] closes #8748
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/8764/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
xarray (13221727) · pull
#7395 implement `isnull` using `full_like` instead of `zeros_like` · keewis (14808389) · closed · 2 comments · MEMBER · created 2022-12-20T22:07:30Z · updated 2024-01-28T14:19:26Z · closed 2024-01-23T18:29:14Z · pydata/xarray/pulls/7395 (id 1505375386, node_id PR_kwDOAMm_X85F6MBQ)

After changing the behavior of the implementation of `*_like` in pint, it seems comparisons fail now. As it turns out, this is because we're using `zeros_like` to return all-False arrays from `isnull` for input with non-nullable dtypes.

I'd argue that `full_like(data, dtype=bool, fill_value=False)` is a little easier to understand than `zeros_like(data, dtype=bool)`, as the latter requires knowledge of the bit representation of `False`, so the change is not only about getting pint to work properly.

  • [x] Tests added
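The two spellings compared, as a small standalone sketch (not the actual xarray implementation):

```python
import numpy as np

data = np.array([1, 2, 3])

# the explicit spelling: an array full of False
explicit = np.full_like(data, fill_value=False, dtype=bool)

# the old spelling: relies on False being all-zero bits
implicit = np.zeros_like(data, dtype=bool)
```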
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/7395/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
xarray (13221727) · pull
#8483 import from the new location of `normalize_axis_index` if possible · keewis (14808389) · closed · 13 comments · MEMBER · created 2023-11-25T12:19:32Z · updated 2024-01-18T16:52:02Z · closed 2024-01-18T15:34:57Z · pydata/xarray/pulls/8483 (id 2010594399, node_id PR_kwDOAMm_X85gWlAz)

Another one of the numpy=2.0 fixes: this time, numpy.core.multiarray.normalize_axis_index has been moved to numpy.lib.array_utils (and apparently this is the first time it has been officially exposed as public API).

Since, as far as I remember, numpy is working on removing numpy.core entirely, we might also want to change our usage of defchararray (in the formatting tests). Not sure how, though.

  • [x] Towards #8091
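The version-dependent import can be sketched as a try/except (this mirrors the PR description; the exact xarray code may differ):

```python
try:
    # numpy>=2: the first officially public location
    from numpy.lib.array_utils import normalize_axis_index
except ImportError:
    # older numpy: the private location
    from numpy.core.multiarray import normalize_axis_index

# normalize a negative axis against a 3-dimensional array
axis = normalize_axis_index(-1, 3)
```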
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/8483/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
xarray (13221727) · pull
#8605 run CI on `python=3.12` · keewis (14808389) · closed · 7 comments · MEMBER · created 2024-01-12T10:47:18Z · updated 2024-01-17T21:54:13Z · closed 2024-01-17T21:54:12Z · pydata/xarray/pulls/8605 (id 2078559800, node_id PR_kwDOAMm_X85j6NsC)
  • [x] Closes #8580
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/8605/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
xarray (13221727) · pull
#8416 migrate the other CI to python 3.11 · keewis (14808389) · closed · 3 comments · MEMBER · created 2023-11-05T15:26:31Z · updated 2024-01-03T20:17:11Z · closed 2023-11-17T15:27:21Z · pydata/xarray/pulls/8416 (id 1977836822, node_id PR_kwDOAMm_X85enruo)

(namely, additional CI and upstream-dev CI)

python=3.11 was released more than a year ago and python=3.12 is out as well, which means it is a good idea to migrate sooner rather than later.

Regarding python=3.12: usually it is numba that keeps us from testing on a new python version for some time, and numbagg and sparse are the only dependencies that would use it. Should we create an environment without those two dependencies and switch back to the normal one once numba supports the new python version?

We still have the special environment files for python>=3.11 because the normal ones still include cdms2. We deprecated that back in May (not sure which release that ended up in), but since cdms2 will be abandoned at the end of this year, that's when we're free to drop support and merge both environments (though maybe we can justify dropping support earlier?)

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/8416/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
xarray (13221727) · pull
#8576 ignore a `DeprecationWarning` emitted by `seaborn` · keewis (14808389) · closed · 0 comments · MEMBER · created 2023-12-30T17:30:28Z · updated 2023-12-30T22:10:08Z · closed 2023-12-30T22:10:08Z · pydata/xarray/pulls/8576 (id 2060807644, node_id PR_kwDOAMm_X85i-Lpn)

Not sure if this is something that we'll only see on main after the next release of pandas (if ever), though.

I also moved the hdf5 warning to xarray/tests/__init__.py, as that is usually the source of these warnings.
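Conceptually, such an ignore entry behaves like a `warnings` filter; a small self-contained sketch (the helper name is illustrative, and this is not the actual filter entry xarray uses):

```python
import warnings

def run_quietly(func):
    """Run `func` while ignoring DeprecationWarning; return the warnings
    that still got through (a sketch of what a `filterwarnings` ignore
    entry does during a test run)."""
    with warnings.catch_warnings(record=True) as caught:
        warnings.simplefilter("always")
        warnings.filterwarnings("ignore", category=DeprecationWarning)
        func()
    return caught

leaked = run_quietly(lambda: warnings.warn("old API", DeprecationWarning))
```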

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/8576/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
xarray (13221727) · pull
#8526 explicitly skip using `__array_namespace__` for `numpy.ndarray` · keewis (14808389) · closed · 3 comments · MEMBER · created 2023-12-06T10:09:48Z · updated 2023-12-07T09:18:05Z · closed 2023-12-06T17:58:46Z · pydata/xarray/pulls/8526 (id 2028193332, node_id PR_kwDOAMm_X85hSQNW)
  • [x] towards #8091

numpy recently added `__array_namespace__` to the `ndarray` class, which returns the main numpy module. However, that does not yet provide a couple of functions; in this case we need `concat`. This adds an additional condition to `duck_array_ops.concat` that disables using `__array_namespace__` for `ndarray` objects.
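The guard can be sketched like this (`choose_namespace` is an illustrative name, not the actual helper):

```python
import numpy as np

def choose_namespace(array):
    """Pick the array-API namespace for `array`, but keep using the plain
    numpy module for `ndarray`, whose `__array_namespace__` (at the time)
    returned a namespace missing functions such as `concat`."""
    if hasattr(array, "__array_namespace__") and not isinstance(array, np.ndarray):
        return array.__array_namespace__()
    return np

xp = choose_namespace(np.arange(3))
```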

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/8526/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
xarray (13221727) · pull
#8441 remove `cdms2` · keewis (14808389) · closed · 4 comments · MEMBER · created 2023-11-10T17:25:50Z · updated 2023-11-14T17:34:57Z · closed 2023-11-14T17:15:49Z · pydata/xarray/pulls/8441 (id 1988047821, node_id PR_kwDOAMm_X85fKfB6)

cdms2 is going to be abandoned at the end of this year; the recommended replacement is xcdat. We deprecated our conversion functions in 2023.06.0 (released on June 23), which was less than 6 months ago, but given that cdms2 is already not installable on some architectures, it makes sense to remove them earlier.

This also appears to allow us to remove the special python=3.11 environment files.

  • [x] Follow-up to #7876, closes #8419
  • [x] User visible changes (including notable bug fixes) are documented in whats-new.rst
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/8441/reactions",
    "total_count": 3,
    "+1": 3,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
xarray (13221727) · pull
#7879 occasional segfaults on CI · keewis (14808389) · closed · 3 comments · MEMBER · created 2023-05-29T09:52:01Z · updated 2023-11-06T22:03:43Z · closed 2023-11-06T22:03:42Z (id 1730451312, node_id I_kwDOAMm_X85nJJdw)

The upstream-dev CI currently fails sometimes due to a segfault (the normal CI crashes, too, but since we use pytest-xdist we only get a message stating "worker x crashed").

I'm not sure why, and I can't reproduce locally, either. Given that dask's local scheduler is in the traceback and the failing test is test_open_mfdataset_manyfiles, I assume there's some issue with parallel disk access or the temporary file creation.

log of the segfaulting CI job ``` ============================= test session starts ============================== platform linux -- Python 3.10.11, pytest-7.3.1, pluggy-1.0.0 rootdir: /home/runner/work/xarray/xarray configfile: setup.cfg testpaths: xarray/tests, properties plugins: env-0.8.1, xdist-3.3.1, timeout-2.1.0, cov-4.1.0, reportlog-0.1.2, hypothesis-6.75.6 timeout: 60.0s timeout method: signal timeout func_only: False collected 16723 items / 2 skipped xarray/tests/test_accessor_dt.py ....................................... [ 0%] ........................................................................ [ 0%] ........................................................................ [ 1%] ........................................................................ [ 1%] ............................... [ 1%] xarray/tests/test_accessor_str.py ...................................... [ 1%] ........................................................................ [ 2%] ........................................................................ [ 2%] ............................................. [ 3%] xarray/tests/test_array_api.py ........... [ 3%] xarray/tests/test_backends.py ........................X........x........ [ 3%] ...................................s.........................X........x. [ 3%] .........................................s.........................X.... [ 4%] ....x.......................................X........................... [ 4%] ....................................X........x.......................... [ 5%] .............X.......................................................... [ 5%] ....X........x....................x.x................X.................. [ 5%] x..x..x..x...................................X........x................. [ 6%] ...x.x................X..................x..x..x..x..................... [ 6%] ..............X........x....................x.x................X........ 
[ 7%] ..........x..x..x..x.......................................X........x... [ 7%] ...........................................X........x................... [ 8%] ...ss........................X........x................................. [ 8%] .................X........x............................................. [ 8%] ..X........x.............................................X........x..... [ 9%] ............................................X........x.................. [ 9%] .........................................................X........x..... [ 10%] ......................................................................X. [ 10%] .......x................................................................ [ 11%] Fatal Python error: Segmentation fault Thread 0x00007f9c7b8ff640 (most recent call first): File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/concurrent/futures/thread.py", line 81 in _worker File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/threading.py", line 953 in run File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/threading.py", line 1016 in _bootstrap_inner File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/threading.py", line 973 in _bootstrap Current thread 0x00007f9c81f1d640 (most recent call first): File "/home/runner/work/xarray/xarray/xarray/backends/file_manager.py", line 216 in _acquire_with_cache_info File "/home/runner/work/xarray/xarray/xarray/backends/file_manager.py", line 198 in acquire_context File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/contextlib.py", line 135 in __enter__ File "/home/runner/work/xarray/xarray/xarray/backends/netCDF4_.py", line 392 in _acquire File "/home/runner/work/xarray/xarray/xarray/backends/netCDF4_.py", line 398 in ds File "/home/runner/work/xarray/xarray/xarray/backends/netCDF4_.py", line 336 in __init__ File "/home/runner/work/xarray/xarray/xarray/backends/netCDF4_.py", line 389 in open File 
"/home/runner/work/xarray/xarray/xarray/backends/netCDF4_.py", line 588 in open_dataset File "/home/runner/work/xarray/xarray/xarray/backends/api.py", line 566 in open_dataset File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/dask/utils.py", line 73 in apply File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/dask/core.py", line 121 in _execute_task File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/dask/local.py", line 224 in execute_task File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/dask/local.py", line 238 in <listcomp> File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/dask/local.py", line 238 in batch_execute_tasks File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/concurrent/futures/thread.py", line 58 in run File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/concurrent/futures/thread.py", line 83 in _worker File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/threading.py", line 953 in run File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/threading.py", line 1016 in _bootstrap_inner File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/threading.py", line 973 in _bootstrap Thread 0x00007f9c82f1e640 (most recent call first): File "/home/runner/work/xarray/xarray/xarray/backends/file_manager.py", line 216 in _acquire_with_cache_info File "/home/runner/work/xarray/xarray/xarray/backends/file_manager.py", line 198 in acquire_context File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/contextlib.py", line 135 in __enter__ File "/home/runner/work/xarray/xarray/xarray/backends/netCDF4_.py", line 392 in _acquire File "/home/runner/work/xarray/xarray/xarray/backends/netCDF4_.py", line 398 in ds File "/home/runner/work/xarray/xarray/xarray/backends/netCDF4_.py", line 336 in __init__ File 
"/home/runner/work/xarray/xarray/xarray/backends/netCDF4_.py", line 389 in open File "/home/runner/work/xarray/xarray/xarray/backends/netCDF4_.py", line 588 in open_dataset File "/home/runner/work/xarray/xarray/xarray/backends/api.py", line 566 in open_dataset File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/dask/utils.py", line 73 in apply File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/dask/core.py", line 121 in _execute_task File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/dask/local.py", line 224 in execute_task File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/dask/local.py", line 238 in <listcomp> File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/dask/local.py", line 238 in batch_execute_tasks File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/concurrent/futures/thread.py", line 58 in run File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/concurrent/futures/thread.py", line 83 in _worker File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/threading.py", line 953 in run File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/threading.py", line 1016 in _bootstrap_inner File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/threading.py", line 973 in _bootstrap Thread 0x00007f9ca575e740 (most recent call first): File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/threading.py", line 320 in wait File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/queue.py", line 171 in get File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/dask/local.py", line 137 in queue_get File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/dask/local.py", line 500 in get_async File 
"/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/dask/threaded.py", line 89 in get File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/dask/base.py", line 595 in compute File "/home/runner/work/xarray/xarray/xarray/backends/api.py", line 1046 in open_mfdataset File "/home/runner/work/xarray/xarray/xarray/tests/test_backends.py", line 3295 in test_open_mfdataset_manyfiles File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/_pytest/python.py", line 194 in pytest_pyfunc_call File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/pluggy/_callers.py", line 39 in _multicall File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/pluggy/_manager.py", line 80 in _hookexec File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/pluggy/_hooks.py", line 265 in __call__ File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/_pytest/python.py", line 1799 in runtest File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/_pytest/runner.py", line 169 in pytest_runtest_call File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/pluggy/_callers.py", line 39 in _multicall File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/pluggy/_manager.py", line 80 in _hookexec File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/pluggy/_hooks.py", line 265 in __call__ File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/_pytest/runner.py", line 262 in <lambda> File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/_pytest/runner.py", line 341 in from_call File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/_pytest/runner.py", line 261 in call_runtest_hook File 
"/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/_pytest/runner.py", line 222 in call_and_report File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/_pytest/runner.py", line 133 in runtestprotocol File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/_pytest/runner.py", line 114 in pytest_runtest_protocol File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/pluggy/_callers.py", line 39 in _multicall File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/pluggy/_manager.py", line 80 in _hookexec File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/pluggy/_hooks.py", line 265 in __call__ File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/_pytest/main.py", line 348 in pytest_runtestloop File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/pluggy/_callers.py", line 39 in _multicall File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/pluggy/_manager.py", line 80 in _hookexec File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/pluggy/_hooks.py", line 265 in __call__ File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/_pytest/main.py", line 323 in _main File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/_pytest/main.py", line 269 in wrap_session File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/_pytest/main.py", line 316 in pytest_cmdline_main File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/pluggy/_callers.py", line 39 in _multicall File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/pluggy/_manager.py", line 80 in _hookexec File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/pluggy/_hooks.py", line 265 
in __call__ File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/_pytest/config/__init__.py", line 166 in main File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/_pytest/config/__init__.py", line 189 in console_main File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/pytest/__main__.py", line 5 in <module> File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/runpy.py", line 86 in _run_code File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/runpy.py", line 196 in _run_module_as_main Extension modules: numpy.core._multiarray_umath, numpy.core._multiarray_tests, numpy.linalg._umath_linalg, numpy.fft._pocketfft_internal, numpy.random._common, numpy.random.bit_generator, numpy.random._bounded_integers, numpy.random._mt19937, numpy.random.mtrand, numpy.random._philox, numpy.random._pcg64, numpy.random._sfc64, numpy.random._generator, pandas._libs.tslibs.np_datetime, pandas._libs.tslibs.dtypes, pandas._libs.tslibs.base, pandas._libs.tslibs.nattype, pandas._libs.tslibs.timezones, pandas._libs.tslibs.ccalendar, pandas._libs.tslibs.fields, pandas._libs.tslibs.timedeltas, pandas._libs.tslibs.tzconversion, pandas._libs.tslibs.timestamps, pandas._libs.properties, pandas._libs.tslibs.offsets, pandas._libs.tslibs.strptime, pandas._libs.tslibs.parsing, pandas._libs.tslibs.conversion, pandas._libs.tslibs.period, pandas._libs.tslibs.vectorized, pandas._libs.ops_dispatch, pandas._libs.missing, pandas._libs.hashtable, pandas._libs.algos, pandas._libs.interval, pandas._libs.lib, pandas._libs.ops, numexpr.interpreter, bottleneck.move, bottleneck.nonreduce, bottleneck.nonreduce_axis, bottleneck.reduce, pandas._libs.arrays, pandas._libs.tslib, pandas._libs.sparse, pandas._libs.indexing, pandas._libs.index, pandas._libs.internals, pandas._libs.join, pandas._libs.writers, pandas._libs.window.aggregations, pandas._libs.window.indexers, pandas._libs.reshape, 
pandas._libs.groupby, pandas._libs.json, pandas._libs.parsers, pandas._libs.testing, cftime._cftime, yaml._yaml, cytoolz.utils, cytoolz.itertoolz, cytoolz.functoolz, cytoolz.dicttoolz, cytoolz.recipes, xxhash._xxhash, psutil._psutil_linux, psutil._psutil_posix, markupsafe._speedups, numpy.linalg.lapack_lite, matplotlib._c_internal_utils, PIL._imaging, matplotlib._path, kiwisolver._cext, scipy._lib._ccallback_c, _cffi_backend, unicodedata2, netCDF4._netCDF4, h5py._errors, h5py.defs, h5py._objects, h5py.h5, h5py.h5r, h5py.utils, h5py.h5s, h5py.h5ac, h5py.h5p, h5py.h5t, h5py._conv, h5py.h5z, h5py._proxy, h5py.h5a, h5py.h5d, h5py.h5ds, h5py.h5g, h5py.h5i, h5py.h5f, h5py.h5fd, h5py.h5pl, h5py.h5o, h5py.h5l, h5py._selector, pyproj._compat, pyproj._datadir, pyproj._network, pyproj._geod, pyproj.list, pyproj._crs, pyproj.database, pyproj._transformer, pyproj._sync, matplotlib._image, rasterio._version, rasterio._err, rasterio._filepath, rasterio._env, rasterio._transform, rasterio._base, rasterio.crs, rasterio._features, rasterio._warp, rasterio._io, numcodecs.compat_ext, numcodecs.blosc, numcodecs.zstd, numcodecs.lz4, numcodecs._shuffle, msgpack._cmsgpack, numcodecs.vlen, zstandard.backend_c, scipy.sparse._sparsetools, _csparsetools, scipy.sparse._csparsetools, scipy.sparse.linalg._isolve._iterative, scipy.linalg._fblas, scipy.linalg._flapack, scipy.linalg.cython_lapack, scipy.linalg._cythonized_array_utils, scipy.linalg._solve_toeplitz, scipy.linalg._flinalg, scipy.linalg._matfuncs_sqrtm_triu, scipy.linalg.cython_blas, scipy.linalg._matfuncs_expm, scipy.linalg._decomp_update, scipy.sparse.linalg._dsolve._superlu, scipy.sparse.linalg._eigen.arpack._arpack, scipy.sparse.csgraph._tools, scipy.sparse.csgraph._shortest_path, scipy.sparse.csgraph._traversal, scipy.sparse.csgraph._min_spanning_tree, scipy.sparse.csgraph._flow, scipy.sparse.csgraph._matching, scipy.sparse.csgraph._reordering, scipy.spatial._ckdtree, scipy._lib.messagestream, scipy.spatial._qhull, 
scipy.spatial._voronoi, scipy.spatial._distance_wrap, scipy.spatial._hausdorff, scipy.special._ufuncs_cxx, scipy.special._ufuncs, scipy.special._specfun, scipy.special._comb, scipy.special._ellip_harm_2, scipy.spatial.transform._rotation, scipy.ndimage._nd_image, _ni_label, scipy.ndimage._ni_label, scipy.optimize._minpack2, scipy.optimize._group_columns, scipy.optimize._trlib._trlib, scipy.optimize._lbfgsb, _moduleTNC, scipy.optimize._moduleTNC, scipy.optimize._cobyla, scipy.optimize._slsqp, scipy.optimize._minpack, scipy.optimize._lsq.givens_elimination, scipy.optimize._zeros, scipy.optimize.__nnls, scipy.optimize._highs.cython.src._highs_wrapper, scipy.optimize._highs._highs_wrapper, scipy.optimize._highs.cython.src._highs_constants, scipy.optimize._highs._highs_constants, scipy.linalg._interpolative, scipy.optimize._bglu_dense, scipy.optimize._lsap, scipy.optimize._direct, scipy.integrate._odepack, scipy.integrate._quadpack, scipy.integrate._vode, scipy.integrate._dop, scipy.integrate._lsoda, scipy.special.cython_special, scipy.stats._stats, scipy.stats.beta_ufunc, scipy.stats._boost.beta_ufunc, scipy.stats.binom_ufunc, scipy.stats._boost.binom_ufunc, scipy.stats.nbinom_ufunc, scipy.stats._boost.nbinom_ufunc, scipy.stats.hypergeom_ufunc, scipy.stats._boost.hypergeom_ufunc, scipy.stats.ncf_ufunc, scipy.stats._boost.ncf_ufunc, scipy.stats.ncx2_ufunc, scipy.stats._boost.ncx2_ufunc, scipy.stats.nct_ufunc, scipy.stats._boost.nct_ufunc, scipy.stats.skewnorm_ufunc, scipy.stats._boost.skewnorm_ufunc, scipy.stats.invgauss_ufunc, scipy.stats._boost.invgauss_ufunc, scipy.interpolate._fitpack, scipy.interpolate.dfitpack, scipy.interpolate._bspl, scipy.interpolate._ppoly, scipy.interpolate.interpnd, scipy.interpolate._rbfinterp_pythran, scipy.interpolate._rgi_cython, scipy.stats._biasedurn, scipy.stats._levy_stable.levyst, scipy.stats._stats_pythran, scipy._lib._uarray._uarray, scipy.stats._statlib, scipy.stats._sobol, scipy.stats._qmc_cy, scipy.stats._mvn, 
scipy.stats._rcont.rcont, scipy.cluster._vq, scipy.cluster._hierarchy, scipy.cluster._optimal_leaf_ordering, shapely.lib, shapely._geos, shapely._geometry_helpers, cartopy.trace, scipy.fftpack.convolve, tornado.speedups, cf_units._udunits2, scipy.io.matlab._mio_utils, scipy.io.matlab._streams, scipy.io.matlab._mio5_utils (total: 241)

/home/runner/work/_temp/b3f3888c-5349-4d19-80f6-41d140b86db5.sh: line 3: 6114 Segmentation fault (core dumped) python -m pytest --timeout=60 -rf --report-log output-3.10-log.jsonl
```
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/7879/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  not_planned xarray 13221727 issue
1845449919 PR_kwDOAMm_X85Xp1U1 8064 adapt to NEP 51 keewis 14808389 closed 0     7 2023-08-10T15:43:13Z 2023-09-30T09:27:25Z 2023-09-25T04:46:49Z MEMBER   0 pydata/xarray/pulls/8064

With NEP 51 (and the changes to numpy main), scalar types no longer pretend to be standard python types in their string representation. This fixes most of the errors in the tests, but a few remain in the doctests (in particular, the doctests for the private plotting utils).

  • [x] towards #8091
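A quick illustration of the NEP 51 change (the new behavior assumes numpy>=2.0; `str()` is unaffected, which is what the doctest fixes rely on):

```python
import numpy as np

value = np.float64(3.0)

# numpy < 2.0 printed "3.0" here; with NEP 51 (numpy >= 2.0) the repr spells
# out the scalar type, e.g. "np.float64(3.0)", which is what breaks doctests:
print(repr(value))

# str() still renders like the python builtin on both versions:
print(str(value))  # 3.0
```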
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/8064/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1898193938 PR_kwDOAMm_X85abbJ4 8188 fix the failing docs keewis 14808389 closed 0     7 2023-09-15T11:01:42Z 2023-09-20T11:04:03Z 2023-09-15T13:26:24Z MEMBER   0 pydata/xarray/pulls/8188

The docs have been failing because of a malformatted docstring we inherit from pandas, and this caused us to miss another error in #8183. The fix is to avoid installing pandas=2.1.0, which should be the only version with the malformatted docstring, and to apply the missing changes from #8183 here.

  • [x] Closes #8157, follow-up to #8183
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/8188/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1869782001 PR_kwDOAMm_X85Y76lw 8117 fix miscellaneous `numpy=2.0` errors keewis 14808389 closed 0     9 2023-08-28T13:34:56Z 2023-09-13T15:34:15Z 2023-09-11T03:55:52Z MEMBER   0 pydata/xarray/pulls/8117
  • [x] towards #8091
  • [x] closes #8133

Edit: looking at the relevant numpy issues, it appears numpy will stay a bit unstable for the next few weeks / months. Not sure how quickly we should try to adapt to those changes.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/8117/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1845114193 PR_kwDOAMm_X85Xorkf 8061 unpin `numpy` keewis 14808389 closed 0     8 2023-08-10T12:43:32Z 2023-08-17T18:14:22Z 2023-08-17T18:14:21Z MEMBER   0 pydata/xarray/pulls/8061
  • [x] follow-up to #7415

It seems that in a previous PR I "temporarily" pinned numpy to get CI to pass, but then forgot to unpin it and merged as-is. As a result, we have not been running the main CI with numpy>=1.24 ever since, even though numpy=1.25 has now been around for a while.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/8061/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1423972935 PR_kwDOAMm_X85BlCII 7225 join together duplicate entries in the text `repr` keewis 14808389 closed 0     4 2022-10-26T12:53:49Z 2023-07-24T18:37:05Z 2023-07-20T21:13:57Z MEMBER   0 pydata/xarray/pulls/7225

Indexes contains one entry per coordinate, even if a single index covers multiple coordinates. This deduplicates the entries, but the exact format is still up for discussion.

  • [x] follow-up to #6795
  • [x] Tests added
  • [x] User visible changes (including notable bug fixes) are documented in whats-new.rst

The formatting options we were able to come up with:

1. separate with just newlines (see e29aeb9085bc677e16dabd4a2b94cf63d06c155e):
   ```
   Indexes:
       one    CustomIndex
       two
       three  PandasIndex
   ```
2. mark unique indexes with a prefix to make it look like a list (see 9b90f8bceda6f012c863927865cc638e0ff3fb88):
   ```
   Indexes:
     - one    CustomIndex
       two
     - three  PandasIndex
   ```
3. use unicode box components (in front of the coordinate names) (see 2cec070ebf4f1958d4ffef98d7649eda21ac09a3):
   ```
   Indexes:
     ┌ one    CustomIndex
     │ two
     └ three
       four   PandasIndex
   ```
4. use unicode box components (after the coordinate names) (see 492ab47ccce8c43d264d5c759841060d33cafe4d):
   ```
   Indexes:
       one   ┐ CustomIndex
       two   │
       three ┘
       four    PandasIndex
   ```

For the unicode box components, we can choose between the light and heavy variants.

@benbovy and I think the unicode variants (especially variant 3) are the easiest to understand, but we would need to decide whether we care about terminals that don't support unicode.

Edit: in the meeting we decided that support for the subsection of unicode should be common enough that we can use it. I'll clean this PR up to implement option 3, then.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/7225/reactions",
    "total_count": 1,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 1,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1789429376 PR_kwDOAMm_X85Uso19 7961 manually unshallow the repository on RTD keewis 14808389 closed 0     0 2023-07-05T12:15:31Z 2023-07-11T13:24:18Z 2023-07-05T15:44:09Z MEMBER   0 pydata/xarray/pulls/7961

RTD is deprecating the feature flag we made use of before.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/7961/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1745794965 PR_kwDOAMm_X85SaJCg 7899 use trusted publishers instead of a API token keewis 14808389 closed 0     4 2023-06-07T12:30:56Z 2023-06-16T08:58:05Z 2023-06-16T08:37:07Z MEMBER   0 pydata/xarray/pulls/7899

PyPI introduced the concept of "trusted publishers" a few months ago, which allows requesting short-lived API tokens for trusted publishing services (such as GHA, in our case).

Someone with the appropriate rights will have to enable this on PyPI, and I will do the same for TestPyPI.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/7899/reactions",
    "total_count": 2,
    "+1": 2,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1730664352 PR_kwDOAMm_X85RmgD2 7880 don't use `CacheFileManager.__del__` on interpreter shutdown keewis 14808389 closed 0     9 2023-05-29T12:16:06Z 2023-06-06T20:37:40Z 2023-06-06T15:14:37Z MEMBER   0 pydata/xarray/pulls/7880

Storing a reference to the function on the class tells the garbage collector to not collect the function before the class, such that any instance can safely complete its __del__.

No tests because I don't know how to properly test this. Any ideas?

  • [x] Closes #7814
  • [ ] Tests added
  • [x] User visible changes (including notable bug fixes) are documented in whats-new.rst
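A minimal sketch of the pattern with a hypothetical FileManager (not xarray's actual CachingFileManager code): binding the cleanup function to the class keeps it reachable during interpreter shutdown, when module globals may already have been cleared.

```python
import os
import tempfile

class FileManager:
    # Class-level reference to the cleanup function: it stays alive at least
    # as long as the class itself, so __del__ can still call it during
    # interpreter shutdown instead of hitting a half-torn-down module.
    _close = staticmethod(os.close)

    def __init__(self, path):
        self._fd = os.open(path, os.O_RDONLY)

    def __del__(self):
        self._close(self._fd)

def fd_is_open(fd):
    try:
        os.fstat(fd)
        return True
    except OSError:
        return False

# demo on a throwaway file
handle, path = tempfile.mkstemp()
os.close(handle)
manager = FileManager(path)
fd = manager._fd
was_open = fd_is_open(fd)
del manager  # CPython finalizes immediately; _close is reachable via the class
now_open = fd_is_open(fd)
os.unlink(path)
print(was_open, now_open)
```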
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/7880/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1738586208 PR_kwDOAMm_X85SBfsz 7889 retire the TestPyPI workflow keewis 14808389 closed 0     1 2023-06-02T17:54:04Z 2023-06-04T19:58:08Z 2023-06-04T18:46:14Z MEMBER   0 pydata/xarray/pulls/7889

With the recent addition of the workflow to upload nightly releases to anaconda.org/scientific-python-nightly-wheels, we don't really need the TestPyPI workflow anymore, especially since PyPI instances are not designed to automatically delete very old releases.

  • [x] Follow-up to #7863 and #7865
  • [x] User visible changes (including notable bug fixes) are documented in whats-new.rst
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/7889/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1721896187 PR_kwDOAMm_X85RIyh6 7867 add `numba` to the py3.11 environment keewis 14808389 closed 0     1 2023-05-23T11:49:37Z 2023-06-03T11:36:10Z 2023-05-28T06:30:10Z MEMBER   0 pydata/xarray/pulls/7867

numba=0.57.0 has been out for quite some time already and, as of last week, is available on conda-forge, which means we can almost retire the separate python=3.11 environment file.

I'm not sure what to do about cdms2 (we will see if that fails in CI), but in any case we should just deprecate and remove any functions that use it: if I understand correctly, cdms2 is in maintenance mode until the end of the year and will be discontinued afterwards.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/7867/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1730414479 PR_kwDOAMm_X85RlpAe 7878 move to `setup-micromamba` keewis 14808389 closed 0     0 2023-05-29T09:27:15Z 2023-06-01T16:21:57Z 2023-06-01T16:21:56Z MEMBER   0 pydata/xarray/pulls/7878

The provision-with-micromamba action has been deprecated and will not receive any further releases. It is replaced by the setup-micromamba action, which does basically the same thing, but with a different implementation and a few changes to the configuration.

  • [x] Closes #7877
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/7878/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1729709527 PR_kwDOAMm_X85RjPc9 7876 deprecate the `cdms2` conversion methods keewis 14808389 closed 0     2 2023-05-28T22:18:55Z 2023-05-30T20:59:48Z 2023-05-29T19:01:20Z MEMBER   0 pydata/xarray/pulls/7876

As the cdms2 library has been deprecated and will be retired at the end of this year, maintaining conversion functions does not make sense anymore. Additionally, one of the tests is currently failing on the upstream-dev CI (cdms2 is incompatible with a recent change to numpy), and it seems unlikely this will ever be fixed (there's also no python=3.11-compatible release, blocking us from merging the py311 environment with the default one).

cc @tomvothecoder for visibility

  • [x] Closes #7707
  • [x] Tests added
  • [x] User visible changes (including notable bug fixes) are documented in whats-new.rst
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/7876/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1726529405 PR_kwDOAMm_X85RYfGo 7875 defer to `numpy` for the expected result keewis 14808389 closed 0     1 2023-05-25T21:48:18Z 2023-05-27T19:53:08Z 2023-05-27T19:53:07Z MEMBER   0 pydata/xarray/pulls/7875

numpy has recently changed the result of np.cos(0) by a very small value, which makes our tests break.

I'm not really sure what the best fix is, so I split the changes into two parts: the first commit uses

```python
xr.testing.assert_allclose(a + 1, np.cos(a))
```

to test the result, while the second commit uses

```python
expected = xr.full_like(a, fill_value=np.cos(0), dtype=float)
actual = np.cos(a)
xr.testing.assert_identical(actual, expected)
```

  • [x] towards #7707
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/7875/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1718144679 PR_kwDOAMm_X85Q8Hne 7855 adapt the `pint` + `dask` test to the newest version of `pint` keewis 14808389 closed 0     0 2023-05-20T11:35:47Z 2023-05-25T17:25:01Z 2023-05-25T17:24:34Z MEMBER   0 pydata/xarray/pulls/7855
  • [x] towards #7707

pint recently improved its support for wrapping dask, breaking our older tests. With this change, we basically require the newest version of pint (pint>=0.21.1, which should be released pretty soon) to interact with dask, as 0.21.0 did break our use of np.allclose and np.isclose. However, I guess dask support has never been properly tested and should thus be considered "experimental" anyway.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/7855/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1688716198 PR_kwDOAMm_X85PZRyC 7793 adjust the deprecation policy for python keewis 14808389 closed 0     2 2023-04-28T15:03:51Z 2023-05-02T11:51:27Z 2023-05-01T22:26:55Z MEMBER   0 pydata/xarray/pulls/7793

As discussed in #7765, this extends the policy months by 6 to a total of 30 months. With that, the support for a python version can be removed as soon as the next version is at least 30 months old. Together with the 12 month release cycle python has, we get the 42 month release window from NEP 29.

Note that this is still missing the release overview proposed in #7765, I'm still thinking about how to best implement the automatic update / formatting, and how to coordinate it with the (still manual) version overrides.

  • [x] towards #7765, closes #7777
  • [x] User visible changes (including notable bug fixes) are documented in whats-new.rst
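The arithmetic above can be sketched as a small helper (a hypothetical illustration, not part of this PR; 2021-10-04 is Python 3.10's actual release date):

```python
from datetime import date

POLICY_MONTHS = 30  # the extended policy window from this PR

def earliest_drop_date(successor_release: date) -> date:
    """Date at which a python version may be dropped: POLICY_MONTHS after
    the release of its successor."""
    months = successor_release.month - 1 + POLICY_MONTHS
    return date(
        successor_release.year + months // 12, months % 12 + 1, successor_release.day
    )

# python 3.9 becomes droppable 30 months after python 3.10 (2021-10-04) --
# i.e. 42 months after 3.9 itself, given python's 12 month release cycle:
print(earliest_drop_date(date(2021, 10, 4)))  # 2024-04-04
```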
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/7793/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1440494247 I_kwDOAMm_X85V3DKn 7270 type checking CI is failing keewis 14808389 closed 0     3 2022-11-08T16:10:24Z 2023-04-15T18:31:59Z 2023-04-15T18:31:59Z MEMBER      

The most recent runs of the type checking CI have started to fail with a segfault:

```
/home/runner/work/_temp/dac0c060-b19a-435a-8063-bbc5b8ffbf24.sh: line 1: 2945 Segmentation fault (core dumped) python -m mypy --install-types --non-interactive --cobertura-xml-report mypy_report
```

This seems to be due to the release of mypy=0.990.

#7269 pinned mypy to mypy<0.990, which should be undone once we figure out how to fix the segfault (most likely by waiting for the next release) and address any complaints the new version has (see this workflow run).

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/7270/reactions",
    "total_count": 2,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 1,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
1668326257 PR_kwDOAMm_X85OVLQA 7756 remove the `black` hook keewis 14808389 closed 0     0 2023-04-14T14:10:36Z 2023-04-14T17:42:49Z 2023-04-14T16:36:18Z MEMBER   0 pydata/xarray/pulls/7756

Apparently, in addition to formatting notebooks, black-jupyter does exactly the same thing as black.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/7756/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1668319039 PR_kwDOAMm_X85OVJv5 7755 reword the what's new entry for the `pandas` 2.0 dtype changes keewis 14808389 closed 0     0 2023-04-14T14:06:54Z 2023-04-14T14:30:51Z 2023-04-14T14:30:50Z MEMBER   0 pydata/xarray/pulls/7755

As a follow-up to #7724, this makes the what's new entry a bit more precise.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/7755/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1655782486 PR_kwDOAMm_X85Nr3hH 7724 `pandas=2.0` support keewis 14808389 closed 0     8 2023-04-05T14:52:30Z 2023-04-12T13:24:07Z 2023-04-12T13:04:11Z MEMBER   0 pydata/xarray/pulls/7724

As mentioned in https://github.com/pydata/xarray/issues/7716#issuecomment-1497623839, this tries to unpin pandas.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/7724/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1637616804 PR_kwDOAMm_X85MvWza 7664 use the `files` interface instead of the deprecated `read_binary` keewis 14808389 closed 0     2 2023-03-23T14:06:36Z 2023-03-30T14:59:22Z 2023-03-30T14:58:43Z MEMBER   0 pydata/xarray/pulls/7664

Apparently, read_binary has been marked as deprecated in python=3.11, and is to be replaced by importlib.resources.files, which has been available since python=3.9. Since we dropped support for python=3.8 a while ago, we can safely follow the instructions in the deprecation warning.
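The migration in a nutshell (using the stdlib email package as a stand-in for the actual resource):

```python
from importlib import resources

# deprecated since python=3.11:
#   data = importlib.resources.read_binary("email", "__init__.py")
# replacement, available since python=3.9, via the files() traversable API:
data = resources.files("email").joinpath("__init__.py").read_bytes()
print(type(data), len(data) > 0)
```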

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/7664/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1635470616 PR_kwDOAMm_X85MoK6O 7657 add timeouts for tests keewis 14808389 closed 0     9 2023-03-22T10:20:04Z 2023-03-24T16:42:48Z 2023-03-24T15:49:22Z MEMBER   0 pydata/xarray/pulls/7657

The macos 3.11 CI seems to stall very often at the moment, which makes it hit the 6 hour mark and get cancelled. Since our tests should never run that long (the ubuntu runners usually take between 15 and 20 minutes), I'm introducing a pretty generous timeout of 5 minutes. By comparison, we already have a timeout of 60 seconds in the upstream-dev CI, but that's on a ubuntu runner, which usually is much faster than any of the macos / windows runners.

Tests that time out raise an error, which might help us with figuring out which test it is that stalls, and also if we can do anything about that.
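For per-test limits, pytest-timeout (the plugin behind the `--timeout` flag seen in the CI logs above) also offers a marker; a short sketch with a hypothetical test name:

```python
import pytest

# a generous limit for one known-slow test, in addition to the global flag:
@pytest.mark.timeout(300)  # seconds; the runner fails the test instead of stalling
def test_slow_on_macos_runners():
    assert sum(range(10)) == 45

# the marker is attached to the function even before pytest collects it:
print([mark.name for mark in test_slow_on_macos_runners.pytestmark])
```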

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/7657/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1615259652 PR_kwDOAMm_X85LkhcR 7594 ignore the `pkg_resources` deprecation warning keewis 14808389 closed 0     0 2023-03-08T13:18:10Z 2023-03-08T13:41:55Z 2023-03-08T13:41:54Z MEMBER   0 pydata/xarray/pulls/7594

In one of the recent setuptools releases pkg_resources finally got deprecated in favor of the importlib.* modules. We don't use pkg_resources directly, so the DeprecationWarning that causes the doctest CI to fail is from a dependency. As such, the only way to get it to work again is to ignore the warning until the upstream packages have released versions that don't use/import pkg_resources.
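The mechanics of such an ignore, as a self-contained sketch (xarray's actual filter lives in its pytest configuration, and the warning message here is abbreviated):

```python
import warnings

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    # ignore by message prefix and category, leaving other warnings visible:
    warnings.filterwarnings(
        "ignore", message="pkg_resources is deprecated", category=DeprecationWarning
    )
    warnings.warn("pkg_resources is deprecated as an API", DeprecationWarning)

print(len(caught))  # the matching warning was swallowed
```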

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/7594/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1534634670 PR_kwDOAMm_X85Hc1wx 7442 update the docs environment keewis 14808389 closed 0     5 2023-01-16T09:58:43Z 2023-03-03T10:17:14Z 2023-03-03T10:14:13Z MEMBER   0 pydata/xarray/pulls/7442

Most notably:

- bump python to 3.10
- bump sphinx to at least 5.0
- remove the pydata-sphinx-theme pin: sphinx-book-theme pins to an exact minor version so pinning as well does not change anything
- xref https://github.com/executablebooks/sphinx-book-theme/issues/686

~Edit: it seems this is blocked by sphinx-book-theme pinning sphinx to >=3,<5. They already changed the pin, we're just waiting on a release~

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/7442/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1556739109 PR_kwDOAMm_X85IhJib 7477 RTD maintenance keewis 14808389 closed 0     0 2023-01-25T14:20:27Z 2023-01-25T14:58:50Z 2023-01-25T14:58:46Z MEMBER   0 pydata/xarray/pulls/7477

Since the last time the RTD configuration was updated, a few things have changed: - the OS image is now a bit old - we can tell git (and thus setuptools_scm) to ignore changes by RTD

If I read the documentation / history of RTD correctly, they removed the DONT_SHALLOW_CLONE feature flag from the documentation (it is still enabled and works for this repository, though), so we might soon have to migrate to a post_checkout build job that does the unshallowing.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/7477/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1556471464 PR_kwDOAMm_X85IgPzN 7476 fix the RTD build skipping feature keewis 14808389 closed 0     0 2023-01-25T11:15:26Z 2023-01-25T11:18:00Z 2023-01-25T11:17:57Z MEMBER   0 pydata/xarray/pulls/7476

We can't use the example grep options because we usually enclose tags with [], which grep will interpret as character classes.
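To illustrate with Python's re module (the regex semantics are the same as grep's here; "[skip-rtd]" is a made-up tag):

```python
import re

message = "[skip-rtd] fix typo"

# unescaped, "[skip-rtd]" is a character class (s, k, i, p-r, t, d), so it
# matches any string containing one of those characters:
assert re.search("[skip-rtd]", "quartz")  # 'q' falls into the p-r range

# escaping the brackets restores the intended literal match:
assert re.search(re.escape("[skip-rtd]"), message)
assert not re.search(re.escape("[skip-rtd]"), "quartz")
print("all matches behaved as expected")
```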

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/7476/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1553155277 PR_kwDOAMm_X85IVH-_ 7470 allow skipping RTD builds keewis 14808389 closed 0     1 2023-01-23T14:02:30Z 2023-01-24T16:09:54Z 2023-01-24T16:09:51Z MEMBER   0 pydata/xarray/pulls/7470

RTD somewhat recently introduced the build.jobs setting, which allows skipping builds (technically it's more of an "automated cancel" with a special error code instead of a skip) on user-defined conditions. We can use that to manually "skip" builds we don't need a documentation build for.

Edit: the only downside seems to be that the build is not actually marked as "skipped" Edit2: apparently that's a bug that's being worked on, see readthedocs/readthedocs.org#9807
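The idea behind the build.jobs-based skip, sketched in Python (RTD documents a special exit code, 183, that a build job returns to cancel the build; the "[skip-rtd]" tag is hypothetical):

```python
RTD_CANCEL_EXIT_CODE = 183  # RTD's documented "cancel this build" exit code

def build_exit_code(commit_message: str) -> int:
    """Exit code a post_checkout job could return to skip unneeded builds."""
    if "[skip-rtd]" in commit_message:
        return RTD_CANCEL_EXIT_CODE
    return 0  # proceed with the build

print(build_exit_code("[skip-rtd] fix typo"), build_exit_code("update docs"))
```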

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/7470/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1552997940 PR_kwDOAMm_X85IUl4t 7469 create separate environment files for `python=3.11` keewis 14808389 closed 0     0 2023-01-23T12:17:08Z 2023-01-23T14:03:36Z 2023-01-23T13:03:13Z MEMBER   0 pydata/xarray/pulls/7469

This builds on #7353, which found that cdms2 and numba (and thus also numbagg and sparse) don't yet support python=3.11.

In order to still test that we support python=3.11 but without dropping those dependencies from the other environments, this adds separate environment files, which should be removed once cdms2 and numba support python=3.11.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/7469/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1519058102 PR_kwDOAMm_X85GoVw9 7415 install `numbagg` from `conda-forge` keewis 14808389 closed 0     5 2023-01-04T14:17:44Z 2023-01-20T19:46:46Z 2023-01-20T19:46:43Z MEMBER   0 pydata/xarray/pulls/7415

It seems there is a numbagg package on conda-forge now.

Not sure what to do about the min-all-deps CI, but given that the most recent numbagg release happened more than 12 months ago (more than 18 months, even), maybe we can just bump it to the version on conda-forge?

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/7415/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1519154372 PR_kwDOAMm_X85Goq2J 7416 remove `numbagg` and `numba` from the upstream-dev CI keewis 14808389 closed 0     3 2023-01-04T15:18:51Z 2023-01-04T20:07:33Z 2023-01-04T20:07:29Z MEMBER   0 pydata/xarray/pulls/7416
  • [x] opposite of #7311
  • [x] closes #7306

Using the numpy HEAD together with numba is, in general, not supported (see numba/numba#8615). So in order to avoid the current failures and still be able to test numpy HEAD, this (temporarily) removes numbagg and numba (pulled in through numbagg), from the upstream-dev CI.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/7416/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
760574919 MDU6SXNzdWU3NjA1NzQ5MTk= 4670 increase the visibility of the upstream-dev PR CI keewis 14808389 closed 0     3 2020-12-09T18:37:57Z 2022-12-29T21:15:05Z 2021-01-19T15:27:26Z MEMBER      

We currently have two upstream-dev PR CIs: the old pipelines CI, and the new github actions CI added together with the scheduled upstream-dev ("nightly") CI. Since we don't need both, I think we should disable one of them, presumably the old pipelines CI.

There's an issue with the CI result, though: since github doesn't have an icon for "passed with issues", we have to choose between "passed" and "failed" as the CI result (neither of which is optimal).

The advantage of using "failed" is that a failure is easily visible, but often the total CI result on PRs is set to "failed" because we didn't get around to fixing bugs introduced by recent changes to dependencies (which is confusing for contributors).

In #4584 I switched the pipelines upstream-dev CI to "allowed failure" so we get a warning instead of a failure. However, github doesn't print the number of warnings in the summary line, which means that if the icon is green nobody checks the status and upstream-dev CI failures are easily missed.

Our new scheduled nightly CI improves the situation quite a bit since we automatically get an issue containing the failures, but it means we aren't able to catch these failures before actually merging. As pointed out in https://github.com/pydata/xarray/issues/4574#issuecomment-725795622 that might be acceptable, though.

If we still want to fix this, we could have the PR CI automatically add a comment to the PR, which would contain a summary of the failures but also state that these failures can be ignored as long as they get the approval of a maintainer. This would increase the noise on PRs, though.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4670/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
1440354343 PR_kwDOAMm_X85Cbqgn 7267 `keep_attrs` for pad keewis 14808389 closed 0     6 2022-11-08T14:55:05Z 2022-12-12T15:59:46Z 2022-12-12T15:59:42Z MEMBER   0 pydata/xarray/pulls/7267

I ran into this while trying DataTree.pad, which silently dropped the attrs, even with keep_attrs=True.

  • [x] Tests added
  • [x] User visible changes (including notable bug fixes) are documented in whats-new.rst
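A small demonstration of the intended behavior (assuming a recent xarray containing this fix; the example data is made up):

```python
import numpy as np
import xarray as xr

da = xr.DataArray(np.arange(3), dims="x", attrs={"units": "m"})
padded = da.pad(x=(1, 1), constant_values=0, keep_attrs=True)

print(padded.sizes)  # x grows from 3 to 5
print(padded.attrs)  # with the fix, {"units": "m"} survives the pad
```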
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/7267/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1440486521 PR_kwDOAMm_X85CcHQ5 7269 pin mypy to a known good version keewis 14808389 closed 0     0 2022-11-08T16:06:47Z 2022-11-08T16:49:16Z 2022-11-08T16:49:13Z MEMBER   0 pydata/xarray/pulls/7269

The type checking CI has started to fail with the new upgrade. In order not to disturb unrelated PRs, this pins mypy to a known good version (mypy<0.990).

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/7269/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1438416173 PR_kwDOAMm_X85CVEkR 7260 use the moving release tag of `issue-from-pytest-log` keewis 14808389 closed 0     0 2022-11-07T14:01:35Z 2022-11-07T14:40:28Z 2022-11-07T14:32:02Z MEMBER   0 pydata/xarray/pulls/7260

Like most github actions (or at least all the official actions/* ones), issue-from-pytest-log maintains a moving release tag that points to the most recently released version.

In order to reduce the number of version-bump PRs, this makes use of that tag.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/7260/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1429769840 PR_kwDOAMm_X85B4VCq 7242 fix the environment setup: actually use the python version keewis 14808389 closed 0     0 2022-10-31T12:30:37Z 2022-10-31T13:24:36Z 2022-10-31T13:16:46Z MEMBER   0 pydata/xarray/pulls/7242

While working on #7241, I realized that I had forgotten to specify the python version in our main CI, with the effect that we've been testing python 3.10 twice for each OS since the introduction of the micromamba action.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/7242/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1422482990 PR_kwDOAMm_X85BgEYZ 7212 use the new action to create an issue from the output of reportlog keewis 14808389 closed 0     0 2022-10-25T13:37:42Z 2022-10-26T09:34:09Z 2022-10-26T09:12:42Z MEMBER   0 pydata/xarray/pulls/7212

I'm not sure if we actually need the dedicated "report" job, or whether adding an additional step to the main ci job would suffice?

I can see two reasons for a separate job: 1. we get to control the python version independently from the version of the main job 2. we upload the reportlog files as artifacts

I think if we can change the action to abort if it is run on a python version it does not support the first concern would not matter anymore, and for 2 we might just keep the "upload artifact" action (but is it even possible to manually access artifacts? If not we might not even need the upload).

Since I don't think either is a major concern, I went ahead and joined the jobs.

  • [x] Closes #6810
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/7212/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1422502185 PR_kwDOAMm_X85BgIjA 7213 use the moving release tag of ci-trigger keewis 14808389 closed 0     0 2022-10-25T13:50:13Z 2022-10-25T14:29:54Z 2022-10-25T14:29:51Z MEMBER   0 pydata/xarray/pulls/7213

In order to follow the minor releases of ci-trigger, we can use the new v1 release tag that will always point to the most recent release.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/7213/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1413425793 PR_kwDOAMm_X85BBvaI 7185 indexes section in the HTML repr keewis 14808389 closed 0     4 2022-10-18T15:25:34Z 2022-10-20T06:59:05Z 2022-10-19T21:12:46Z MEMBER   0 pydata/xarray/pulls/7185

To see the effect, try this:

```python
import xarray as xr
from xarray.core.indexes import Index


class CustomIndex(Index):
    def __init__(self, names, options):
        self.names = names
        self.options = options

    @classmethod
    def from_variables(cls, variables, options):
        names = list(variables.keys())
        return cls(names, options)

    def __repr__(self):
        options = (
            {"names": repr(self.names)}
            | {str(k): str(v) for k, v in self.options.items()}
        )
        return f"CustomIndex({', '.join(k + '=' + v for k, v in options.items())})"

    def _repr_html_(self):
        header_row = "<tr><td>KDTree params</td></tr>"
        option_rows = [
            f"<tr><td>{option}</td><td>{value}</td></tr>"
            for option, value in self.options.items()
        ]
        return f"<left><table>{header_row}{''.join(option_rows)}</table></left>"


ds = xr.tutorial.open_dataset("rasm")
ds1 = ds.set_xindex(["xc", "yc"], CustomIndex, param1="a", param2="b")

with xr.set_options(display_style="text"):
    display(ds1)

with xr.set_options(display_style="html"):
    display(ds1)
```

~The repr looks a bit strange because I've been borrowing the variable CSS classes.~ Edit: @benbovy fixed that for me

Also, the discussion about what _repr_inline_ should include from #7183 is relevant here as well.

  • [x] Follow-up to #6795, depends on #7183
  • [x] Tests added
  • [x] User visible changes (including notable bug fixes) are documented in whats-new.rst
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/7185/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1412926287 PR_kwDOAMm_X85BADV9 7183 use `_repr_inline_` for indexes that define it keewis 14808389 closed 0     6 2022-10-18T10:00:47Z 2022-10-19T14:06:51Z 2022-10-19T14:06:47Z MEMBER   0 pydata/xarray/pulls/7183

Also, some tests for the index summarizer.

  • [x] Follow-up to #6795
  • [x] Tests added
  • [ ] User visible changes (including notable bug fixes) are documented in whats-new.rst
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/7183/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1306887842 PR_kwDOAMm_X847g7WQ 6795 display the indexes in the string reprs keewis 14808389 closed 0     7 2022-07-16T19:42:19Z 2022-10-15T18:28:36Z 2022-10-12T16:52:53Z MEMBER   0 pydata/xarray/pulls/6795

With the flexible indexes refactor indexes have become much more important, which means we should include them in the reprs of DataArray and Dataset objects.

This is an initial attempt, covering only the string reprs, with a few unanswered questions:
- how do we format indexes? Do we delegate to their __repr__ or some other method?
- should we skip PandasIndex and PandasMultiIndex?
- how do we present indexes that wrap multiple columns? At the moment, they are duplicated (see also the discussion in #6392)
- what do we do with the index marker in the coords repr?

(also, how do we best test this?)

  • [ ] Tests added
  • [ ] User visible changes (including notable bug fixes) are documented in whats-new.rst
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/6795/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1404894283 PR_kwDOAMm_X85AlZGn 7153 use a hook to synchronize the versions of `black` keewis 14808389 closed 0     5 2022-10-11T16:07:05Z 2022-10-12T08:00:10Z 2022-10-12T08:00:07Z MEMBER   0 pydata/xarray/pulls/7153

We started to pin the version of black used in the environment of blackdoc, but the version becomes out-of-date pretty quickly. The new hook I'm adding here is still experimental, but pretty limited in what it can destroy (the pre-commit configuration) so for now we can just review any new autoupdate PRs from the pre-commit-ci a bit more thoroughly.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/7153/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1327082521 PR_kwDOAMm_X848kdpQ 6873 rely on `numpy`'s version of `nanprod` and `nansum` keewis 14808389 closed 0     1 2022-08-03T11:33:35Z 2022-08-09T17:31:27Z 2022-08-09T14:55:21Z MEMBER   0 pydata/xarray/pulls/6873

At the moment, nanprod and nansum will replace any nan values with 1 for nanprod or 0 for nansum. For pint, inserting dimensionless values into quantities is explicitly not allowed, so our version of nanprod cannot be used on pint quantities (0 is an exception, so nansum does work).

I'm proposing to rely on numpy's version for that, which would allow pint to customize the behavior.
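To illustrate the equivalence on plain numpy arrays (a minimal sketch; the actual implementations differ): replacing NaN with the operation's identity element gives the same result as numpy's nan-aware reductions, but only the latter can be customized by duck arrays such as pint quantities.

```python
import numpy as np

x = np.array([1.0, 2.0, np.nan, 4.0])

# the old approach: replace NaN with the operation's identity element
manual_prod = np.prod(np.where(np.isnan(x), 1.0, x))
manual_sum = np.sum(np.where(np.isnan(x), 0.0, x))

# numpy's nan-aware reductions give the same result for plain arrays,
# but duck arrays can override them via the array function protocol
assert np.nanprod(x) == manual_prod == 8.0
assert np.nansum(x) == manual_sum == 7.0
```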

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/6873/reactions",
    "total_count": 2,
    "+1": 2,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1326997649 PR_kwDOAMm_X848kLEu 6872 skip creating a cupy-backed IndexVariable keewis 14808389 closed 0     0 2022-08-03T10:21:06Z 2022-08-03T15:34:18Z 2022-08-03T15:04:56Z MEMBER   0 pydata/xarray/pulls/6872

We could probably replace the default indexes with cudf indexes, but with pandas indexes this test doesn't make too much sense.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/6872/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1250755592 I_kwDOAMm_X85KjQQI 6645 pre-release for v2022.06.0 keewis 14808389 closed 0     14 2022-05-27T13:14:06Z 2022-07-22T15:44:59Z 2022-07-22T15:44:59Z MEMBER      

There are a few unreleased and potentially breaking changes in main, most importantly the index refactor and the new groupby using numpy-groupies and flox. During the meeting on Wednesday we decided to release a preview version to get feedback before releasing a full version, especially from those who don't run their tests against our main branch.

I am planning to create the pre-release tomorrow, but if there's any big changes that should be included please post here.

cc @TomNicholas

Edit: the version will be called 2022.05.0.dev0, which will ensure that e.g. pip will require the --pre flag to install it.
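A minimal stdlib-only sketch of the versioning rule relied on here: PEP 440 treats a trailing .devN segment as a developmental (pre-)release, which pip will not install unless --pre is passed.

```python
import re

version = "2022.05.0.dev0"

# PEP 440: a trailing ".devN" segment marks a developmental release,
# which pip only installs when --pre is given
is_dev = re.search(r"\.dev\d+$", version) is not None
assert is_dev
```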

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/6645/reactions",
    "total_count": 5,
    "+1": 5,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
818957820 MDU6SXNzdWU4MTg5NTc4MjA= 4976 reported version in the docs is misleading keewis 14808389 closed 0     3 2021-03-01T15:08:12Z 2022-07-10T13:00:46Z 2022-07-10T13:00:46Z MEMBER      

The editable install on RTD is reported to have the version 0.17.1.dev0+g835a53e6.d20210226 (which technically is correct but it would be better to have a clean version on tags).

This is not something I can reproduce with python -m pip install -e ., so it is either some RTD weirdness or happens because we get mamba to do the editable install for us.

We should try to get the version right.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4976/reactions",
    "total_count": 3,
    "+1": 3,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
1284543780 PR_kwDOAMm_X846Wk8u 6727 resolve the timeouts on RTD keewis 14808389 closed 0     1 2022-06-25T10:38:36Z 2022-06-30T01:00:21Z 2022-06-25T11:00:50Z MEMBER   0 pydata/xarray/pulls/6727

The reason the changes introduced in #6542 caused timeouts is that they redefined ds to a bigger dataset, which would then be used to demonstrate to_dict.

The fix is to explicitly set the dataset before calling to_dict, which also makes that section a bit easier to follow.

  • [x] Closes #6720
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/6727/reactions",
    "total_count": 4,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 1,
    "confused": 0,
    "heart": 3,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1280027449 I_kwDOAMm_X85MS6s5 6714 `mypy` CI is failing on `main` keewis 14808389 closed 0     1 2022-06-22T11:54:16Z 2022-06-22T16:01:45Z 2022-06-22T16:01:45Z MEMBER      

The most recent runs of the mypy CI on main are failing with:

```
xarray/core/dataset.py:6934: error: Incompatible types in assignment (expression has type "str", variable has type "Optional[Optional[Literal['Y', 'M', 'W', 'D', 'h', 'm', 's', 'ms', 'us', 'ns', 'ps', 'fs', 'as']]]")  [assignment]
```

I'm not sure what changed since the last pass, though.
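For illustration, a reduced, hypothetical version of the situation mypy complains about (not xarray's actual code): assigning a plain str to a variable annotated with an Optional Literal type triggers exactly this [assignment] error.

```python
from typing import Literal, Optional

# hypothetical reduced version of the annotation from the error message
Unit = Literal["Y", "M", "W", "D", "h", "m", "s", "ms", "us", "ns"]


def normalize(unit: Optional[Unit]) -> str:
    # writing `unit = some_str` here is what mypy rejects with
    # "Incompatible types in assignment [assignment]"; Literal types
    # are not enforced at runtime, only by the type checker
    return unit if unit is not None else "ns"


assert normalize("ms") == "ms"
assert normalize(None) == "ns"
```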

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/6714/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
1274838660 PR_kwDOAMm_X8451-Gp 6701 try to import `UndefinedVariableError` from the new location keewis 14808389 closed 0     1 2022-06-17T10:05:18Z 2022-06-22T10:33:28Z 2022-06-22T10:33:25Z MEMBER   0 pydata/xarray/pulls/6701
  • [x] Closes #6698
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/6701/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1271869460 PR_kwDOAMm_X845sEYR 6699 use `pytest-reportlog` to generate upstream-dev CI failure reports keewis 14808389 closed 0     0 2022-06-15T08:31:00Z 2022-06-16T08:12:28Z 2022-06-16T08:11:22Z MEMBER   0 pydata/xarray/pulls/6699

We currently use the output of pytest to generate our test results, which is both fragile and does not detect import errors on test collection.

Instead, we can use pytest-reportlog to generate a machine-readable file that is easy to parse (junit XML would probably work, too, but apparently does not contain all the different failure modes of pytest).

The new script will collect failure summaries like the old version, but it should be fairly easy to create a fancy report with more information.
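As a sketch of what parsing such a file could look like (the two log lines below are hypothetical, but the JSON-lines layout with a "$report_type" field per record follows the pytest-reportlog format):

```python
import io
import json

# hypothetical two-line excerpt of a pytest-reportlog file: one JSON
# object per line, with "$report_type" marking the kind of record
log = io.StringIO(
    '{"$report_type": "TestReport", "nodeid": "test_a.py::test_ok", '
    '"when": "call", "outcome": "passed"}\n'
    '{"$report_type": "TestReport", "nodeid": "test_a.py::test_bad", '
    '"when": "call", "outcome": "failed"}\n'
)

# collect the ids of tests that failed during the "call" phase
failed = []
for line in log:
    entry = json.loads(line)
    if entry["$report_type"] == "TestReport" and entry["when"] == "call":
        if entry["outcome"] == "failed":
            failed.append(entry["nodeid"])

assert failed == ["test_a.py::test_bad"]
```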

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/6699/reactions",
    "total_count": 3,
    "+1": 3,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1264767977 PR_kwDOAMm_X845Uh71 6674 use micromamba instead of mamba keewis 14808389 closed 0     3 2022-06-08T13:38:15Z 2022-06-10T11:33:05Z 2022-06-10T11:33:00Z MEMBER   0 pydata/xarray/pulls/6674

I'm not sure if this is exactly equal to what we had before, but we might be able to save 3-4 minutes of CI time with this.

  • [x] supersedes and closes #6544
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/6674/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1264868027 PR_kwDOAMm_X845U3bZ 6675 install the development version of `matplotlib` into the upstream-dev CI keewis 14808389 closed 0     1 2022-06-08T14:42:15Z 2022-06-10T11:25:33Z 2022-06-10T11:25:31Z MEMBER   0 pydata/xarray/pulls/6675
  • [x] follow-up to #4947
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/6675/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1264971669 PR_kwDOAMm_X845VNfZ 6676 release notes for the pre-release keewis 14808389 closed 0     2 2022-06-08T15:59:28Z 2022-06-09T14:41:34Z 2022-06-09T14:41:32Z MEMBER   0 pydata/xarray/pulls/6676

The only thing it contains so far is the known regression, do we want to have a summary of the most notable changes like in the previous releases?

  • [x] towards #6645

cc @pydata/xarray

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/6676/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1264494714 PR_kwDOAMm_X845TmsM 6673 more testpypi workflow fixes keewis 14808389 closed 0     0 2022-06-08T09:57:50Z 2022-06-08T13:59:29Z 2022-06-08T13:52:08Z MEMBER   0 pydata/xarray/pulls/6673

Hopefully the final follow-up to #6660.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/6673/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1262891004 PR_kwDOAMm_X845OOpR 6671 try to finally fix the TestPyPI workflow keewis 14808389 closed 0     0 2022-06-07T08:04:32Z 2022-06-07T08:33:15Z 2022-06-07T08:33:01Z MEMBER   0 pydata/xarray/pulls/6671

As per https://github.com/pydata/xarray/issues/6659#issuecomment-1148285401, don't invoke git restore.

  • [x] Closes #6659
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/6671/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1262480325 PR_kwDOAMm_X845M3wk 6669 pin setuptools in the testpypi configure script keewis 14808389 closed 0     1 2022-06-06T22:04:12Z 2022-06-06T22:38:48Z 2022-06-06T22:33:16Z MEMBER   0 pydata/xarray/pulls/6669

This works around an incompatibility between setuptools>=60 and setuptools_scm.

  • [x] Closes #6659
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/6669/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1262396039 PR_kwDOAMm_X845MlHs 6668 fix the python version for the TestPyPI release workflow keewis 14808389 closed 0     0 2022-06-06T20:42:51Z 2022-06-06T21:16:01Z 2022-06-06T21:12:23Z MEMBER   0 pydata/xarray/pulls/6668

follow-up to #6660

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/6668/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1259827097 PR_kwDOAMm_X845EJs5 6660 upload wheels from `main` to TestPyPI keewis 14808389 closed 0     4 2022-06-03T12:00:02Z 2022-06-06T19:49:08Z 2022-06-06T19:49:02Z MEMBER   0 pydata/xarray/pulls/6660

This adds a new workflow that uploads every commit to main as a new wheel to TestPyPI. No tests, though, so those wheels might be broken (but that's fine, I guess).

Should we document this somewhere, like the install guide?

  • [x] Closes #6659
  • [x] User visible changes (including notable bug fixes) are documented in whats-new.rst
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/6660/reactions",
    "total_count": 4,
    "+1": 4,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
628436420 MDU6SXNzdWU2Mjg0MzY0MjA= 4116 xarray ufuncs keewis 14808389 closed 0     5 2020-06-01T13:25:54Z 2022-04-19T03:26:53Z 2022-04-19T03:26:53Z MEMBER      

The documentation warns that the universal functions in xarray.ufuncs should not be used unless compatibility with numpy < 1.13 is required.

Since we only support numpy >= 1.15: is it time to remove that (already deprecated) module?

Since there are also functions that are not true ufuncs (e.g. np.angle and np.median) and need __array_function__ (or something similar, see #3917), we could also keep those and just remove the ones that are dispatched using __array_ufunc__.
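For context, a minimal toy sketch (not xarray's actual implementation) of the __array_ufunc__ dispatch that makes dedicated wrappers for true ufuncs redundant: any object defining the method receives numpy's ufunc calls and can return its own type.

```python
import numpy as np


class Wrapped:
    # toy duck array: defining __array_ufunc__ lets numpy ufuncs like
    # np.add or np.sin dispatch to this class instead of coercing it
    def __init__(self, data):
        self.data = np.asarray(data)

    def __array_ufunc__(self, ufunc, method, *inputs, **kwargs):
        unwrapped = [x.data if isinstance(x, Wrapped) else x for x in inputs]
        return Wrapped(getattr(ufunc, method)(*unwrapped, **kwargs))


result = np.add(Wrapped([0.0, 1.0]), 1.0)  # dispatches to __array_ufunc__
assert isinstance(result, Wrapped)
assert result.data.tolist() == [1.0, 2.0]
```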

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4116/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
1183366218 PR_kwDOAMm_X841Jvka 6419 unpin `jinja2` keewis 14808389 closed 0     0 2022-03-28T12:27:48Z 2022-03-30T13:54:51Z 2022-03-30T13:54:50Z MEMBER   0 pydata/xarray/pulls/6419

nbconvert released a fixed version today, so we can remove the pin of jinja2.

  • [x] Follow-up to #6415
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/6419/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1181704575 PR_kwDOAMm_X841EMNH 6414 use the `DaskIndexingAdapter` for `duck dask` arrays keewis 14808389 closed 0     2 2022-03-26T12:00:34Z 2022-03-27T20:38:43Z 2022-03-27T20:38:40Z MEMBER   0 pydata/xarray/pulls/6414

(detected while trying to implement a PintMetaIndex in xarray-contrib/pint-xarray#163)

This fixes position-based indexing of duck dask arrays:

```python
In [1]: import xarray as xr
   ...: import dask.array as da
   ...: import pint
   ...:
   ...: ureg = pint.UnitRegistry(force_ndarray_like=True)
   ...:
   ...: a = da.zeros((20, 20), chunks=(10, 10))
   ...: q = ureg.Quantity(a, "m")
   ...:
   ...: arr1 = xr.DataArray(a, dims=("x", "y"))
   ...: arr2 = xr.DataArray(q, dims=("x", "y"))

In [2]: arr1.isel(x=[0, 2, 4], y=[1, 3, 5])
Out[2]:
<xarray.DataArray 'zeros_like-d81259c3a77e6dff3e60975e2afe4ff9' (x: 3, y: 3)>
dask.array<getitem, shape=(3, 3), dtype=float64, chunksize=(3, 3), chunktype=numpy.ndarray>
Dimensions without coordinates: x, y

In [3]: arr2.isel(x=[0, 2, 4], y=[1, 3, 5])
---------------------------------------------------------------------------
NotImplementedError                       Traceback (most recent call last)
Input In [3], in <module>
----> 1 arr2.isel(x=[0, 2, 4], y=[1, 3, 5])

File .../xarray/core/dataarray.py:1220, in DataArray.isel(self, indexers, drop, missing_dims, **indexers_kwargs)
   1215     return self._from_temp_dataset(ds)
   1217 # Much faster algorithm for when all indexers are ints, slices, one-dimensional
   1218 # lists, or zero or one-dimensional np.ndarray's
-> 1220 variable = self._variable.isel(indexers, missing_dims=missing_dims)
   1221 indexes, index_variables = isel_indexes(self.xindexes, indexers)
   1223 coords = {}

File .../xarray/core/variable.py:1172, in Variable.isel(self, indexers, missing_dims, **indexers_kwargs)
   1169 indexers = drop_dims_from_indexers(indexers, self.dims, missing_dims)
   1171 key = tuple(indexers.get(dim, slice(None)) for dim in self.dims)
-> 1172 return self[key]

File .../xarray/core/variable.py:765, in Variable.__getitem__(self, key)
    752 """Return a new Variable object whose contents are consistent with
    753 getting the provided key from the underlying data.
    (...)
    762 array `x.values` directly.
    763 """
    764 dims, indexer, new_order = self._broadcast_indexes(key)
--> 765 data = as_indexable(self._data)[indexer]
    766 if new_order:
    767     data = np.moveaxis(data, range(len(new_order)), new_order)

File .../xarray/core/indexing.py:1269, in NumpyIndexingAdapter.__getitem__(self, key)
   1267 def __getitem__(self, key):
   1268     array, key = self._indexing_array_and_key(key)
-> 1269     return array[key]

File .../lib/python3.9/site-packages/pint/quantity.py:1899, in Quantity.__getitem__(self, key)
   1897 def __getitem__(self, key):
   1898     try:
-> 1899         return type(self)(self._magnitude[key], self._units)
   1900     except PintTypeError:
   1901         raise

File .../lib/python3.9/site-packages/dask/array/core.py:1892, in Array.__getitem__(self, index)
   1889     return self
   1891 out = "getitem-" + tokenize(self, index2)
-> 1892 dsk, chunks = slice_array(out, self.name, self.chunks, index2, self.itemsize)
   1894 graph = HighLevelGraph.from_collections(out, dsk, dependencies=[self])
   1896 meta = meta_from_array(self._meta, ndim=len(chunks))

File .../lib/python3.9/site-packages/dask/array/slicing.py:174, in slice_array(out_name, in_name, blockdims, index, itemsize)
    171     index += (slice(None, None, None),) * missing
    173 # Pass down to next function
--> 174 dsk_out, bd_out = slice_with_newaxes(out_name, in_name, blockdims, index, itemsize)
    176 bd_out = tuple(map(tuple, bd_out))
    177 return dsk_out, bd_out

File .../lib/python3.9/site-packages/dask/array/slicing.py:196, in slice_with_newaxes(out_name, in_name, blockdims, index, itemsize)
    193         where_none[i] -= n
    195 # Pass down and do work
--> 196 dsk, blockdims2 = slice_wrap_lists(out_name, in_name, blockdims, index2, itemsize)
    198 if where_none:
    199     expand = expander(where_none)

File .../lib/python3.9/site-packages/dask/array/slicing.py:242, in slice_wrap_lists(out_name, in_name, blockdims, index, itemsize)
    238 where_list = [
    239     i for i, ind in enumerate(index) if is_arraylike(ind) and ind.ndim > 0
    240 ]
    241 if len(where_list) > 1:
--> 242     raise NotImplementedError("Don't yet support nd fancy indexing")
    243 # Is the single list an empty list? In this case just treat it as a zero
    244 # length slice
    245 if where_list and not index[where_list[0]].size:

NotImplementedError: Don't yet support nd fancy indexing
```

  • [x] Tests added
  • [x] User visible changes (including notable bug fixes) are documented in whats-new.rst
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/6414/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1181715207 PR_kwDOAMm_X841EOjL 6415 upgrade `sphinx` keewis 14808389 closed 0     2 2022-03-26T12:16:08Z 2022-03-26T22:13:53Z 2022-03-26T22:13:50Z MEMBER   0 pydata/xarray/pulls/6415

sphinx is now close to releasing 4.5, but we're still pinning it to sphinx<4.

Along with it, this updates our RTD configuration.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/6415/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1174386963 I_kwDOAMm_X85F_7kT 6382 `reindex` drops attrs keewis 14808389 closed 0     1 2022-03-19T22:37:46Z 2022-03-21T07:53:05Z 2022-03-21T07:53:04Z MEMBER      

What happened?

reindex stopped propagating attrs (detected in xarray-contrib/pint-xarray#159).

As far as I can tell, the new reindexing code in Aligner does not handle attrs yet?

Minimal Complete Verifiable Example

```python
# before #5692
In [1]: import xarray as xr
   ...:
   ...: xr.set_options(keep_attrs=True)
   ...:
   ...: ds = xr.tutorial.open_dataset("air_temperature")

In [2]: ds.reindex(lat=range(10, 80, 5)).lat
Out[2]:
<xarray.DataArray 'lat' (lat: 14)>
array([10, 15, 20, 25, 30, 35, 40, 45, 50, 55, 60, 65, 70, 75])
Coordinates:
  * lat      (lat) int64 10 15 20 25 30 35 40 45 50 55 60 65 70 75
Attributes:
    standard_name:  latitude
    long_name:      Latitude
    units:          degrees_north
    axis:           Y

In [3]: ds.reindex(lat=xr.DataArray(range(10, 80, 5), attrs={"attr": "value"}, dims="lat")).lat
Out[3]:
<xarray.DataArray 'lat' (lat: 14)>
array([10, 15, 20, 25, 30, 35, 40, 45, 50, 55, 60, 65, 70, 75])
Coordinates:
  * lat      (lat) int64 10 15 20 25 30 35 40 45 50 55 60 65 70 75
Attributes:
    standard_name:  latitude
    long_name:      Latitude
    units:          degrees_north
    axis:           Y

# after #5692
In [3]: import xarray as xr
   ...:
   ...: xr.set_options(keep_attrs=True)
   ...:
   ...: ds = xr.tutorial.open_dataset("air_temperature")

In [4]: ds.reindex(lat=range(10, 80, 5)).lat
Out[4]:
<xarray.DataArray 'lat' (lat: 14)>
array([10, 15, 20, 25, 30, 35, 40, 45, 50, 55, 60, 65, 70, 75])
Coordinates:
  * lat      (lat) int64 10 15 20 25 30 35 40 45 50 55 60 65 70 75

In [5]: ds.reindex(lat=xr.DataArray(range(10, 80, 5), attrs={"attr": "value"}, dims="lat")).lat
Out[5]:
<xarray.DataArray 'lat' (lat: 14)>
array([10, 15, 20, 25, 30, 35, 40, 45, 50, 55, 60, 65, 70, 75])
Coordinates:
  * lat      (lat) int64 10 15 20 25 30 35 40 45 50 55 60 65 70 75
```

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/6382/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
1170003912 PR_kwDOAMm_X840etlq 6361 Revert "explicitly install `ipython_genutils`" keewis 14808389 closed 0     1 2022-03-15T17:57:29Z 2022-03-15T19:06:32Z 2022-03-15T19:06:31Z MEMBER   0 pydata/xarray/pulls/6361

Since the dependency issue has been fixed upstream, this reverts pydata/xarray#6350

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/6361/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1166353506 PR_kwDOAMm_X840S8Yo 6350 explicitly install `ipython_genutils` keewis 14808389 closed 0     2 2022-03-11T12:19:49Z 2022-03-15T17:56:41Z 2022-03-11T14:54:45Z MEMBER   0 pydata/xarray/pulls/6350

This can be reverted once the nbconvert package on conda-forge has been updated.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/6350/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1125877338 I_kwDOAMm_X85DG4Za 6250 failing docs builds because the `scipy` intersphinx registry is unreachable keewis 14808389 closed 0     6 2022-02-07T11:44:44Z 2022-02-08T21:49:48Z 2022-02-08T21:49:47Z MEMBER      

What happened?

scipy seems to have some trouble with their documentation host setup, which means that trying to fetch its intersphinx registry returns a 404.

There's nothing we can do to really fix this, but we can try to avoid docs build failures by disabling that intersphinx entry (not sure if that results in other errors, though)

What did you expect to happen?

No response

Minimal Complete Verifiable Example

No response

Relevant log output

```
WARNING: failed to reach any of the inventories with the following issues:
intersphinx inventory 'https://docs.scipy.org/doc/scipy/objects.inv' not fetchable due to <class 'requests.exceptions.HTTPError'>: 404 Client Error: Not Found for url: https://docs.scipy.org/doc/scipy/objects.inv
```

Anything else we need to know?

No response

Environment

See RTD

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/6250/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
861860833 MDU6SXNzdWU4NjE4NjA4MzM= 5190 PRs cancel CI on push keewis 14808389 closed 0     1 2021-04-19T20:33:52Z 2022-01-31T16:59:27Z 2022-01-31T16:59:27Z MEMBER      

The cancel step doesn't seem to be configured properly, a push to a pull request shouldn't cancel CI on master. For reference, here are the logs for two examples: CI Additional, CI.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/5190/reactions",
    "total_count": 2,
    "+1": 2,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
765639913 MDExOlB1bGxSZXF1ZXN0NTM5MDE3MDE1 4687 keep attrs in xarray.where keewis 14808389 closed 0     16 2020-12-13T20:42:40Z 2022-01-19T20:06:48Z 2022-01-19T19:35:41Z MEMBER   0 pydata/xarray/pulls/4687

Since that question came up at least twice, this allows changing the behavior of where using the keep_attrs option. Dataset.where and DataArray.where always use keep_attrs=True (set in ops.where_method), so we could probably copy that.

Edit: we should also document that only the attributes of cond will be preserved.

  • [x] Closes #4682, closes #4141
  • [x] Tests added
  • [x] Passes isort . && black . && mypy . && flake8
  • [x] User visible changes (including notable bug fixes) are documented in whats-new.rst
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4687/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1043378880 PR_kwDOAMm_X84uB0QQ 5931 fix the detection of backend entrypoints keewis 14808389 closed 0     8 2021-11-03T10:40:02Z 2021-11-03T16:56:20Z 2021-11-03T16:55:55Z MEMBER   0 pydata/xarray/pulls/5931

In #5845, we accidentally broke the detection of the backends. Since this has a big impact we probably need to release v0.20.1 very soon.

I'm not sure if it's possible to add tests for this, though.

  • [x] Closes #5930
  • [ ] Tests added
  • [x] Passes pre-commit run --all-files
  • [x] User visible changes (including notable bug fixes) are documented in whats-new.rst
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/5931/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
990926533 MDExOlB1bGxSZXF1ZXN0NzI5NDY1NzE4 5778 Add `self` back to the backend entrypoint example classes keewis 14808389 closed 0     1 2021-09-08T09:23:39Z 2021-09-08T10:13:22Z 2021-09-08T09:44:53Z MEMBER   0 pydata/xarray/pulls/5778

Reverts pydata/xarray#5532

Not being able to pass entrypoints to the engine parameter is a bug which should be fixed by #5684.

  • [x] closes #5777
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/5778/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
976113045 MDExOlB1bGxSZXF1ZXN0NzE3MTM2OTg4 5724 extend show_versions keewis 14808389 closed 0     1 2021-08-21T11:14:37Z 2021-08-21T11:46:53Z 2021-08-21T11:24:37Z MEMBER   0 pydata/xarray/pulls/5724

add cupy, sparse and fsspec to the output of xr.show_versions()

  • [x] Passes pre-commit run --all-files
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/5724/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
935279688 MDExOlB1bGxSZXF1ZXN0NjgyMjIzODA2 5560 conditionally disable bottleneck keewis 14808389 closed 0     7 2021-07-01T23:12:03Z 2021-08-12T15:05:48Z 2021-08-12T14:41:34Z MEMBER   0 pydata/xarray/pulls/5560

As this came up in #5424 (and because I can't seem to reliably reproduce expected values in #4972 if it is enabled) this adds an option to disable bottleneck, even if it is installed.

In #5424 it was suggested to also allow replacing bottleneck with numbagg. If that's something we want (and numbagg supports the operations we need) I can try looking into renaming the new option to something like xr.set_options(accelerate_with="bottleneck") (or something else, if someone has a great idea).

Tests are missing because I have no idea how to check that this works (except by mocking bottleneck).

  • [x] Closes #5424
  • [x] Tests added
  • [x] Passes pre-commit run --all-files
  • [x] User visible changes (including notable bug fixes) are documented in whats-new.rst
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/5560/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
958203328 MDExOlB1bGxSZXF1ZXN0NzAxNTE3NDA4 5665 update the link to `scipy`'s intersphinx file keewis 14808389 closed 0     2 2021-08-02T14:27:24Z 2021-08-02T22:55:23Z 2021-08-02T15:55:49Z MEMBER   0 pydata/xarray/pulls/5665
  • [x] Passes pre-commit run --all-files
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/5665/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
956551664 MDExOlB1bGxSZXF1ZXN0NzAwMTI3NzA0 5650 fix the binder environment keewis 14808389 closed 0     3 2021-07-30T09:00:05Z 2021-07-30T19:45:30Z 2021-07-30T10:56:19Z MEMBER   0 pydata/xarray/pulls/5650

As far as I can tell, the examples on binder currently fail because pooch is not installed.

  • [x] Passes pre-commit run --all-files
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/5650/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
951121126 MDExOlB1bGxSZXF1ZXN0Njk1NTYxMzc1 5630 remove deprecations scheduled for 0.19 keewis 14808389 closed 0     10 2021-07-22T23:02:16Z 2021-07-23T20:41:23Z 2021-07-23T20:12:38Z MEMBER   0 pydata/xarray/pulls/5630

As discussed in the meeting yesterday, here's the PR for the scheduled deprecations. We also have quite a few deprecations which are not scheduled... we should check which of them can be removed / completed, but that can wait until after the release.

I tried to keep the changes for each individual deprecation in their own commit, which should make rolling back easier in case we decide removing / completing a specific deprecation is too aggressive.

whats-new.rst still needs to be updated, but I'll leave that until after the code changes have been approved.

  • [x] Passes pre-commit run --all-files
  • [x] User visible changes (including notable bug fixes) are documented in whats-new.rst
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/5630/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
949888403 MDExOlB1bGxSZXF1ZXN0Njk0NTIwMTY2 5627 pin fsspec keewis 14808389 closed 0     1 2021-07-21T16:19:16Z 2021-07-21T17:05:16Z 2021-07-21T16:43:32Z MEMBER   0 pydata/xarray/pulls/5627

As discussed, this pins fsspec to known-good versions, which should make sure the CI passes.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/5627/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
945390900 MDExOlB1bGxSZXF1ZXN0NjkwNzI2OTE4 5608 check the development version of fsspec in the upstream-dev CI keewis 14808389 closed 0     2 2021-07-15T13:39:35Z 2021-07-15T14:09:16Z 2021-07-15T14:09:13Z MEMBER   0 pydata/xarray/pulls/5608

cc @martindurant

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/5608/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
936485545 MDExOlB1bGxSZXF1ZXN0NjgzMTk1NDI5 5573 remove the sync script keewis 14808389 closed 0     1 2021-07-04T15:45:13Z 2021-07-11T11:41:33Z 2021-07-11T11:19:31Z MEMBER   0 pydata/xarray/pulls/5573

follow-up to #5468. We removed the external version file but the sync script is still run in CI, which obviously fails and causes the report to be stripped from the PR comment.

For a future PR / investigation: pre-commit allows installing dependencies from the language's package index (e.g. PyPI for python), but packages installed this way are not updated until the hook itself is updated or reinstalled. For blackdoc we were considering pinning black at some point, so we could work around that by synchronizing the pinned version with the hook version, but this is probably more work than it is worth. I guess this means we have to wait for pre-commit to fix this, and in the meantime we'll probably have to tell people to clean their pre-commit caches whenever black or numpy update.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/5573/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
928501063 MDExOlB1bGxSZXF1ZXN0Njc2NDk1ODY2 5520 update references to `master` keewis 14808389 closed 0     2 2021-06-23T17:48:58Z 2021-06-24T16:10:44Z 2021-06-24T08:53:36Z MEMBER   0 pydata/xarray/pulls/5520

towards #5516. With this, all references to the master branch have been updated to point to main.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/5520/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
928448281 MDU6SXNzdWU5Mjg0NDgyODE= 5517 mypy issues on `main` keewis 14808389 closed 0     3 2021-06-23T16:38:24Z 2021-06-24T08:58:06Z 2021-06-24T08:58:06Z MEMBER      

The pre-commit CI is reporting some mypy issues on the main branch:

xarray/core/formatting.py:193: error: No overload variant of "max" matches argument types "signedinteger[Any]", "int" [call-overload]
xarray/core/formatting.py:193: note: Possible overload variant:
xarray/core/formatting.py:193: note: def [SupportsLessThanT <: SupportsLessThan] max(__arg1: SupportsLessThanT, __arg2: SupportsLessThanT, *_args: SupportsLessThanT, key: None = ...) -> SupportsLessThanT
xarray/core/formatting.py:193: note: <5 more non-matching overloads not shown>
xarray/core/indexes.py:166: error: Incompatible types in assignment (expression has type "Union[dtype[object_], dtype[Any], dtype[void]]", variable has type "dtype[object_]") [assignment]
xarray/core/computation.py:607: error: Argument 1 to "append" of "list" has incompatible type "None"; expected "slice" [arg-type]
xarray/core/accessor_str.py:277: error: <nothing> not callable [misc]
Found 4 errors in 4 files (checked 143 source files)

does anyone know how to fix those?

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/5517/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
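For the first of the errors above, one common way to satisfy mypy's `max` overloads is to convert the numpy scalar to a builtin `int` before the call. A sketch under that assumption — the function name and arguments are illustrative, not the code at `formatting.py:193`, and this may not be the fix that was ultimately applied:

```python
import numpy as np


def padded_width(width: np.signedinteger, minimum: int) -> int:
    # mypy rejects max(signedinteger, int) because no overload of max()
    # matches the mixed argument types; converting the numpy scalar to a
    # builtin int first makes both arguments the same type.
    return max(int(width), minimum)
```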
928463126 MDExOlB1bGxSZXF1ZXN0Njc2NDYzNDYy 5518 fix RTD keewis 14808389 closed 0     1 2021-06-23T16:57:31Z 2021-06-23T17:15:58Z 2021-06-23T17:14:12Z MEMBER   0 pydata/xarray/pulls/5518

The stable builds on RTD are currently failing. latest is failing, too, but that might be fixed by #5476 (I can also cherry-pick the merge commit to main, though).

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/5518/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
895470157 MDU6SXNzdWU4OTU0NzAxNTc= 5344 failing cftime plot tests keewis 14808389 closed 0     2 2021-05-19T13:48:49Z 2021-06-12T12:57:52Z 2021-06-12T12:57:52Z MEMBER      

This was reported in #5077 and seems to be an upstream issue (see https://github.com/pydata/xarray/issues/5077#issuecomment-808605313):

FAILED xarray/tests/test_plot.py::TestCFDatetimePlot::test_cfdatetime_line_plot
FAILED xarray/tests/test_plot.py::TestCFDatetimePlot::test_cfdatetime_pcolormesh_plot
FAILED xarray/tests/test_plot.py::TestCFDatetimePlot::test_cfdatetime_contour_plot

In order to let the upstream-dev CI open new issues for other failing tests, I've xfail'ed these in #5343. This should be undone once the issue is fixed in either cftime or nc-time-axis.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/5344/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
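Marking a test as an expected failure pending an upstream fix typically uses pytest's `xfail` mark. A sketch — the reason string and the test body are illustrative, not the actual code in `xarray/tests/test_plot.py`:

```python
import pytest


@pytest.mark.xfail(reason="upstream cftime / nc-time-axis incompatibility")
def test_cfdatetime_line_plot():
    # Illustrative body standing in for the real plotting test: pytest will
    # report this as XFAIL instead of a hard failure, so the upstream-dev CI
    # stays green while the upstream bug is unresolved.
    raise AssertionError("plotting cftime coordinates currently fails")
```

If the upstream fix lands and the test starts passing, pytest reports it as XPASS, which is the signal to remove the mark again.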

CREATE TABLE [issues] (
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [number] INTEGER,
   [title] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [state] TEXT,
   [locked] INTEGER,
   [assignee] INTEGER REFERENCES [users]([id]),
   [milestone] INTEGER REFERENCES [milestones]([id]),
   [comments] INTEGER,
   [created_at] TEXT,
   [updated_at] TEXT,
   [closed_at] TEXT,
   [author_association] TEXT,
   [active_lock_reason] TEXT,
   [draft] INTEGER,
   [pull_request] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [state_reason] TEXT,
   [repo] INTEGER REFERENCES [repos]([id]),
   [type] TEXT
);
CREATE INDEX [idx_issues_repo]
    ON [issues] ([repo]);
CREATE INDEX [idx_issues_milestone]
    ON [issues] ([milestone]);
CREATE INDEX [idx_issues_assignee]
    ON [issues] ([assignee]);
CREATE INDEX [idx_issues_user]
    ON [issues] ([user]);