id,node_id,number,title,user,state,locked,assignee,milestone,comments,created_at,updated_at,closed_at,author_association,active_lock_reason,draft,pull_request,body,reactions,performed_via_github_app,state_reason,repo,type
2194953062,PR_kwDOAMm_X85qFqp1,8854,array api-related upstream-dev failures,14808389,open,0,,,15,2024-03-19T13:17:09Z,2024-05-03T22:46:41Z,,MEMBER,,0,pydata/xarray/pulls/8854,"- [x] towards #8844

This ""fixes"" the upstream-dev failures related to the removal of `numpy.array_api`. There are a couple of open questions, though:
- `array-api-strict` is not installed by default, so `namedarray` would get a new dependency. Not sure how to deal with that – as far as I can tell, `numpy.array_api` was not supposed to be used that way, so maybe we need to use `array-api-compat` instead? What do you think, @andersy005, @Illviljan?
- `array-api-strict` does not define `Array.nbytes` (causing a funny exception that wrongly claims `DataArray` does not define `nbytes`)
- `array-api-strict` has a different `DType` class, which makes it tricky to work with both `numpy` dtypes and said dtype class in the same code. In particular, if I understand correctly, we're supposed to check dtypes using `isdtype`, but `numpy.isdtype` will only exist in `numpy>=2`, `array-api-strict`'s version does not define datetime / string / object dtypes, and `numpy.issubdtype` does not work with the non-`numpy` dtype class. So maybe we need to use `array-api-compat` internally?","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8854/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
2269295936,PR_kwDOAMm_X85uBwtv,8983,fixes for the `pint` tests,14808389,open,0,,,0,2024-04-29T15:09:28Z,2024-05-03T18:30:06Z,,MEMBER,,0,pydata/xarray/pulls/8983,"This removes the use of the deprecated `numpy.core._exceptions.UFuncError` (and multiplication as a way to attach units), and makes sure we run the `pint` tests in the upstream-dev CI again.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8983/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
2275404926,PR_kwDOAMm_X85uWjVP,8993,call `np.cross` with 3D vectors only,14808389,closed,0,,,1,2024-05-02T12:21:30Z,2024-05-03T15:56:49Z,2024-05-03T15:22:26Z,MEMBER,,0,pydata/xarray/pulls/8993,"- [x] towards #8844

In the tests, we've been calling `np.cross` with vectors of 2 or 3 dimensions, but `numpy>=2` deprecates 2D vectors (*plus*, we're now raising on warnings). Thus, we 0-pad the inputs before generating the expected result (which generally should not change the outcome of the tests).

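For illustration, the 0-padding could look roughly like this (a sketch; `pad_vectors` is a hypothetical helper, not code from this PR):
```python
import numpy as np

def pad_vectors(arr, axis=-1):
    # zero-pad 2-element vectors to 3 elements; the z component of the
    # padded cross product equals the scalar 2D cross product
    if arr.shape[axis] == 2:
        widths = [(0, 0)] * arr.ndim
        widths[axis] = (0, 1)
        return np.pad(arr, widths)
    return arr

a = np.array([1.0, 2.0])
b = np.array([3.0, 4.0])
expected = np.cross(pad_vectors(a), pad_vectors(b))  # array([ 0.,  0., -2.])
```
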
For a later PR: add tests to check if `xr.cross` works if more than a single dimension is present, and pre-compute the expected result. Also, for property-based testing: the cross-product of two vectors is perpendicular to both input vectors (use the dot product to check that), and its length (l2-norm) is the product of the lengths of the input vectors.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8993/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
2241526039,PR_kwDOAMm_X85skMs0,8939,avoid a couple of warnings in `polyfit`,14808389,closed,0,,,14,2024-04-13T11:49:13Z,2024-05-01T16:42:06Z,2024-05-01T15:34:20Z,MEMBER,,0,pydata/xarray/pulls/8939,"- [x] towards #8844
---
- replace `numpy.core.finfo` with `numpy.finfo`
- add `dtype` and `copy` parameters to all definitions of `__array__`","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8939/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
2270984193,PR_kwDOAMm_X85uHk70,8986,clean up the upstream-dev setup script,14808389,closed,0,,,1,2024-04-30T09:34:04Z,2024-04-30T23:26:13Z,2024-04-30T20:59:56Z,MEMBER,,0,pydata/xarray/pulls/8986,"In trying to install packages that are compatible with `numpy>=2` I added several projects that are built in CI without build isolation (so that they will be built with the nightly version of `numpy`). That was a temporary workaround, so we should start thinking about cleaning this up.

Since `numcodecs` now seems to be compatible (or uses less of `numpy` in compiled code, not sure), this is an attempt to see if CI works when we use the version from `conda-forge`.

`bottleneck` and `cftime` now build against `numpy>=2.0.0rc1`, so we can stop building them without build isolation.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8986/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
2267711587,PR_kwDOAMm_X85t8VWy,8978,more engine environment tricks in preparation for `numpy>=2`,14808389,closed,0,,,7,2024-04-28T17:54:38Z,2024-04-29T14:56:22Z,2024-04-29T14:56:21Z,MEMBER,,0,pydata/xarray/pulls/8978,"Turns out `pydap` also needs to build with `numpy>=2`. Until it does, we should remove it from the upstream-dev environment. Also, `numcodecs` build-depends on `setuptools-scm`.

And finally, the `h5py` nightlies might support `numpy>=2` (`h5py>=3.11` supposedly is `numpy>=2` compatible), so once again I'll try and see if CI passes.

- [x] towards #8844","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8978/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
590630281,MDU6SXNzdWU1OTA2MzAyODE=,3921,issues discovered by the all-but-dask CI,14808389,closed,0,,,4,2020-03-30T22:08:46Z,2024-04-25T14:48:15Z,2024-02-10T02:57:34Z,MEMBER,,,,"The `py38-all-but-dask` CI added in #3919 discovered a few backend issues:
- `zarr`:
  - [x] `open_zarr` with `chunks=""auto""` always tries to chunk, even if `dask` is not available (fixed in #3919)
  - [x] `ZarrArrayWrapper.__getitem__` incorrectly passes the indexer's `tuple` attribute to `_arrayize_vectorized_indexer` (this only happens if `dask` is not available) (fixed in #3919)
  - [x] slice indexers with negative steps get transformed incorrectly if `dask` is not available https://github.com/pydata/xarray/pull/8674
- `rasterio`:
  - ~calling `pickle.dumps` on a `Dataset` object returned by `open_rasterio` fails because a non-serializable lock was used (if `dask` is installed, a serializable lock is used instead)~","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/3921/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue
2234142680,PR_kwDOAMm_X85sK0g8,8923,"`""source""` encoding for datasets opened from `fsspec` objects",14808389,open,0,,,5,2024-04-09T19:12:45Z,2024-04-23T16:54:09Z,,MEMBER,,0,pydata/xarray/pulls/8923,"When opening files from path-like objects (`str`, `pathlib.Path`), the backend machinery (`_dataset_from_backend_dataset`) sets the `""source""` encoding. This is useful if we need the original path for additional processing, like writing to a similarly named file or extracting additional metadata. This would be useful as well when using `fsspec` to open remote files.

In this PR, I'm extracting the `path` attribute that most `fsspec` objects have and using it to set that value. I've considered using `isinstance` checks instead of the `getattr`-with-default, but the list of potential classes is too big to be practical (at least 4 classes just within `fsspec` itself).

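The core of the change is roughly this (a sketch with a hypothetical helper name, not the exact PR code):
```python
def infer_source(filename_or_obj):
    # most fsspec file objects expose the original location as `.path`;
    # `getattr` with a default avoids `isinstance` checks against the
    # many file classes within fsspec
    return getattr(filename_or_obj, 'path', None)
```
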
If this sounds like a good idea, I'll update the documentation of the `""source""` encoding to mention this feature.

- [x] Tests added
- [ ] User visible changes (including notable bug fixes) are documented in `whats-new.rst`","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8923/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
2255271332,PR_kwDOAMm_X85tSKJs,8961,use `nan` instead of `NaN`,14808389,closed,0,,,0,2024-04-21T21:26:18Z,2024-04-21T22:01:04Z,2024-04-21T22:01:03Z,MEMBER,,0,pydata/xarray/pulls/8961,"FYI @aulemahal, `numpy.NaN` will be removed in the upcoming `numpy=2.0` release.

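For reference:
```python
import numpy as np

x = np.nan   # lowercase spelling, valid on both numpy 1.x and 2.x
# np.NaN and np.NAN are aliases that `numpy=2.0` removes
```
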
- [x] follow-up to #8603","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8961/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
2241492018,PR_kwDOAMm_X85skF_A,8937,drop support for `python=3.9`,14808389,open,0,,,3,2024-04-13T10:18:04Z,2024-04-15T15:07:39Z,,MEMBER,,0,pydata/xarray/pulls/8937,"According to our policy (and NEP 29), we have been able to drop support for `python=3.9` since about a week ago. Interestingly, SPEC0 says we could have started doing this about half a year ago (Q4 2023).

We could delay this until we have a release that is compatible with `numpy>=2.0`, though (`numpy>=2.1` will drop support for `python=3.9`).

- [ ] User visible changes (including notable bug fixes) are documented in `whats-new.rst`","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8937/reactions"", ""total_count"": 3, ""+1"": 3, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
2241528898,PR_kwDOAMm_X85skNON,8940,adapt more tests to the copy-on-write behavior of pandas,14808389,closed,0,,,1,2024-04-13T11:57:10Z,2024-04-13T19:36:30Z,2024-04-13T14:44:50Z,MEMBER,,0,pydata/xarray/pulls/8940,- [x] follow-up to #8846,"{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8940/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
2241499231,PR_kwDOAMm_X85skHW9,8938,use `pd.to_timedelta` instead of `TimedeltaIndex`,14808389,closed,0,,,0,2024-04-13T10:38:12Z,2024-04-13T12:32:14Z,2024-04-13T12:32:13Z,MEMBER,,0,pydata/xarray/pulls/8938,"`pandas` recently removed the deprecated `unit` kwarg to `TimedeltaIndex`.

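The replacement pattern, roughly:
```python
import pandas as pd

# before (the `unit` kwarg to TimedeltaIndex was deprecated, then removed):
# index = pd.TimedeltaIndex([0, 1, 2], unit='D')
index = pd.to_timedelta([0, 1, 2], unit='D')  # still returns a TimedeltaIndex
```
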
- [x] towards #8844","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8938/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
2181644595,PR_kwDOAMm_X85pYPWY,8823,try to get the `upstream-dev` CI to complete again,14808389,closed,0,,,2,2024-03-12T13:36:20Z,2024-03-12T16:59:12Z,2024-03-12T16:04:53Z,MEMBER,,0,pydata/xarray/pulls/8823,"There are a couple of accumulated failures now, including a crash because `pandas` now apparently depends on `pyarrow`, which on `conda-forge` is not built for `numpy>=2.0`.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8823/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
2141273710,PR_kwDOAMm_X85nOs6t,8767,new whats-new section,14808389,closed,0,,,0,2024-02-19T00:39:59Z,2024-02-20T10:35:35Z,2024-02-20T10:35:35Z,MEMBER,,0,pydata/xarray/pulls/8767,,"{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8767/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
2141229636,PR_kwDOAMm_X85nOjve,8766,release v2024.02.0,14808389,closed,0,,,0,2024-02-18T23:06:06Z,2024-02-18T23:06:22Z,2024-02-18T23:06:22Z,MEMBER,,0,pydata/xarray/pulls/8766,,"{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8766/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
2141111970,PR_kwDOAMm_X85nOMmO,8764,release summary for 2024.02.0,14808389,closed,0,,,3,2024-02-18T17:45:01Z,2024-02-18T23:00:26Z,2024-02-18T22:52:14Z,MEMBER,,0,pydata/xarray/pulls/8764,- [x] closes #8748,"{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8764/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
1505375386,PR_kwDOAMm_X85F6MBQ,7395,implement `isnull` using `full_like` instead of `zeros_like`,14808389,closed,0,,,2,2022-12-20T22:07:30Z,2024-01-28T14:19:26Z,2024-01-23T18:29:14Z,MEMBER,,0,pydata/xarray/pulls/7395,"After a change to the behavior of the `*_like` implementations in `pint`, it seems comparisons fail now. As it turns out, this is because we're using `zeros_like` to return all-`False` arrays from `isnull` for input with non-nullable dtypes.

I'd argue that
```python
full_like(data, dtype=bool, fill_value=False)
```
is a little bit easier to understand than
```python
zeros_like(data, dtype=bool)
```
as the latter requires knowledge about the bit representation of `False`, so the change is not *only* to get `pint` to work properly.

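For a non-nullable dtype, the result stays all-`False` either way, e.g.:
```python
import numpy as np
import xarray as xr

arr = xr.DataArray(np.arange(3))  # integer dtype, has no missing-value sentinel
print(arr.isnull().values)        # [False False False]
```
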
- [x] Tests added","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7395/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
2010594399,PR_kwDOAMm_X85gWlAz,8483,import from the new location of `normalize_axis_index` if possible,14808389,closed,0,,,13,2023-11-25T12:19:32Z,2024-01-18T16:52:02Z,2024-01-18T15:34:57Z,MEMBER,,0,pydata/xarray/pulls/8483,"Another one of the `numpy=2.0` fixes: this time, `numpy.core.multiarray.normalize_axis_index` has been moved to `numpy.lib.array_utils` (and apparently this is the first time it has been officially exposed as public API).

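A sketch of the version-dependent import (using the two locations named above):
```python
try:
    # numpy >= 2: the officially exposed location
    from numpy.lib.array_utils import normalize_axis_index
except ImportError:
    # older numpy: the previous private location
    from numpy.core.multiarray import normalize_axis_index
```
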
Since, as far as I remember, `numpy` is working on removing `numpy.core` entirely, we might also want to change our usage of `defchararray` (in the formatting tests). Not sure how, though.

- [x] Towards #8091","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8483/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
2078559800,PR_kwDOAMm_X85j6NsC,8605,run CI on `python=3.12`,14808389,closed,0,,,7,2024-01-12T10:47:18Z,2024-01-17T21:54:13Z,2024-01-17T21:54:12Z,MEMBER,,0,pydata/xarray/pulls/8605,- [x] Closes #8580,"{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8605/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
2079089277,I_kwDOAMm_X8577GJ9,8607,allow computing just a small number of variables,14808389,open,0,,,4,2024-01-12T15:21:27Z,2024-01-12T20:20:29Z,,MEMBER,,,,"### Is your feature request related to a problem?

I frequently find myself computing a handful of variables of a dataset (typically coordinates) and assigning them back to the dataset, and wishing we had a method / function that allowed that.

### Describe the solution you'd like

I'd imagine something like
```python
ds.compute(variables=variable_names)
```
but I'm undecided on whether that's a good idea (it might make `.compute` more complex?).

### Describe alternatives you've considered

So far I've been using something like
```python
ds.assign_coords({k: lambda ds: ds[k].compute() for k in variable_names})
ds.pipe(lambda ds: ds.merge(ds[variable_names].compute()))
```
but neither is easy to type / understand (though having `.merge` take a callable would make this much easier). Also, the first option computes the variables separately, which may not be ideal.

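A standalone helper with the proposed semantics might look like this (a sketch; `compute_variables` is not existing API):
```python
import xarray as xr

def compute_variables(ds: xr.Dataset, names) -> xr.Dataset:
    # compute only the selected variables (in a single dask graph) and
    # merge them back into the otherwise-lazy dataset
    return ds.merge(ds[names].compute(), overwrite_vars=names)
```
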
### Additional context

_No response_","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8607/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,issue
1977836822,PR_kwDOAMm_X85enruo,8416,migrate the other CI to python 3.11,14808389,closed,0,,,3,2023-11-05T15:26:31Z,2024-01-03T20:17:11Z,2023-11-17T15:27:21Z,MEMBER,,0,pydata/xarray/pulls/8416,"(namely, additional CI and upstream-dev CI)

`python=3.11` was released more than a year ago and `python=3.12` is out as well, which means that it is a good idea to migrate sooner rather than later.

Regarding `python=3.12`, usually it is `numba` that keeps us from testing on a new `python` version for some time, and `numbagg` and `sparse` are the only dependencies that would use it. Should we create an environment without those two dependencies and switch back to the normal one once `numba` supports the new python version?

We still have the special environment files for `python>=3.11` because the normal ones still include `cdms2`. We deprecated that back in May – not sure which release that ended up in – but since `cdms2` will be abandoned at the end of this year, that's when we're free to drop support and merge both environments (though maybe we can justify dropping support earlier?)","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8416/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
2060807644,PR_kwDOAMm_X85i-Lpn,8576,ignore a `DeprecationWarning` emitted by `seaborn`,14808389,closed,0,,,0,2023-12-30T17:30:28Z,2023-12-30T22:10:08Z,2023-12-30T22:10:08Z,MEMBER,,0,pydata/xarray/pulls/8576,"Not sure if this is something that we'll only see on `main` after the next release of `pandas` (if ever), though.

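For reference, the equivalent runtime filter would be something like this (illustrative only; the actual entry lives in the shared test configuration):
```python
import warnings

# ignore DeprecationWarnings raised from within seaborn
warnings.filterwarnings('ignore', category=DeprecationWarning, module='seaborn')
```
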
I also moved the `hdf5` warning to `xarray/tests/__init__.py`, as that is usually the source of these warnings.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8576/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
2028193332,PR_kwDOAMm_X85hSQNW,8526,explicitly skip using `__array_namespace__` for `numpy.ndarray`,14808389,closed,0,,,3,2023-12-06T10:09:48Z,2023-12-07T09:18:05Z,2023-12-06T17:58:46Z,MEMBER,,0,pydata/xarray/pulls/8526,"- [x] towards #8091

`numpy` recently added `__array_namespace__` to the `ndarray` class, which returns the main `numpy` module. However, that module does not yet provide a couple of array API functions; in this case, we need `concat`. This adds an additional condition to `duck_array_ops.concat` that disables using `__array_namespace__` for `ndarray` objects.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8526/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
1655290694,I_kwDOAMm_X85iqbtG,7721,`as_shared_dtype` converts scalars to 0d `numpy` arrays if chunked `cupy` is involved,14808389,open,0,,,7,2023-04-05T09:48:34Z,2023-12-04T10:45:43Z,,MEMBER,,,,"I tried to run `where` with chunked `cupy` arrays:
```python
In [1]: import xarray as xr
   ...: import cupy
   ...: import dask.array as da
   ...: 
   ...: arr = xr.DataArray(cupy.arange(4), dims=""x"")
   ...: mask = xr.DataArray(cupy.array([False, True, True, False]), dims=""x"")
```
this works:
```python
In [2]: arr.where(mask)
Out[2]: 
<xarray.DataArray (x: 4)>
array([nan,  1.,  2., nan])
Dimensions without coordinates: x
```
this fails:
```python
In [4]: arr.chunk().where(mask).compute()
---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
Cell In[4], line 1
----> 1 arr.chunk().where(mask).compute()

File ~/repos/xarray/xarray/core/dataarray.py:1095, in DataArray.compute(self, **kwargs)
   1076 """"""Manually trigger loading of this array's data from disk or a
   1077 remote source into memory and return a new array. The original is
   1078 left unaltered.
   (...)
   1092 dask.compute
   1093 """"""
   1094 new = self.copy(deep=False)
-> 1095 return new.load(**kwargs)

File ~/repos/xarray/xarray/core/dataarray.py:1069, in DataArray.load(self, **kwargs)
   1051 def load(self: T_DataArray, **kwargs) -> T_DataArray:
   1052     """"""Manually trigger loading of this array's data from disk or a
   1053     remote source into memory and return this array.
   1054 
   (...)
   1067     dask.compute
   1068     """"""
-> 1069     ds = self._to_temp_dataset().load(**kwargs)
   1070     new = self._from_temp_dataset(ds)
   1071     self._variable = new._variable

File ~/repos/xarray/xarray/core/dataset.py:752, in Dataset.load(self, **kwargs)
    749 import dask.array as da
    751 # evaluate all the dask arrays simultaneously
--> 752 evaluated_data = da.compute(*lazy_data.values(), **kwargs)
    754 for k, data in zip(lazy_data, evaluated_data):
    755     self.variables[k].data = data

File ~/.local/opt/mambaforge/envs/xarray/lib/python3.10/site-packages/dask/base.py:600, in compute(traverse, optimize_graph, scheduler, get, *args, **kwargs)
    597     keys.append(x.__dask_keys__())
    598     postcomputes.append(x.__dask_postcompute__())
--> 600 results = schedule(dsk, keys, **kwargs)
    601 return repack([f(r, *a) for r, (f, a) in zip(results, postcomputes)])

File ~/.local/opt/mambaforge/envs/xarray/lib/python3.10/site-packages/dask/threaded.py:89, in get(dsk, keys, cache, num_workers, pool, **kwargs)
     86     elif isinstance(pool, multiprocessing.pool.Pool):
     87         pool = MultiprocessingPoolExecutor(pool)
---> 89 results = get_async(
     90     pool.submit,
     91     pool._max_workers,
     92     dsk,
     93     keys,
     94     cache=cache,
     95     get_id=_thread_get_id,
     96     pack_exception=pack_exception,
     97     **kwargs,
     98 )
    100 # Cleanup pools associated to dead threads
    101 with pools_lock:

File ~/.local/opt/mambaforge/envs/xarray/lib/python3.10/site-packages/dask/local.py:511, in get_async(submit, num_workers, dsk, result, cache, get_id, rerun_exceptions_locally, pack_exception, raise_exception, callbacks, dumps, loads, chunksize, **kwargs)
    509         _execute_task(task, data)  # Re-execute locally
    510     else:
--> 511         raise_exception(exc, tb)
    512 res, worker_id = loads(res_info)
    513 state[""cache""][key] = res

File ~/.local/opt/mambaforge/envs/xarray/lib/python3.10/site-packages/dask/local.py:319, in reraise(exc, tb)
    317 if exc.__traceback__ is not tb:
    318     raise exc.with_traceback(tb)
--> 319 raise exc

File ~/.local/opt/mambaforge/envs/xarray/lib/python3.10/site-packages/dask/local.py:224, in execute_task(key, task_info, dumps, loads, get_id, pack_exception)
    222 try:
    223     task, data = loads(task_info)
--> 224     result = _execute_task(task, data)
    225     id = get_id()
    226     result = dumps((result, id))

File ~/.local/opt/mambaforge/envs/xarray/lib/python3.10/site-packages/dask/core.py:119, in _execute_task(arg, cache, dsk)
    115     func, args = arg[0], arg[1:]
    116     # Note: Don't assign the subtask results to a variable. numpy detects
    117     # temporaries by their reference count and can execute certain
    118     # operations in-place.
--> 119     return func(*(_execute_task(a, cache) for a in args))
    120 elif not ishashable(arg):
    121     return arg

File ~/.local/opt/mambaforge/envs/xarray/lib/python3.10/site-packages/dask/optimization.py:990, in SubgraphCallable.__call__(self, *args)
    988 if not len(args) == len(self.inkeys):
    989     raise ValueError(""Expected %d args, got %d"" % (len(self.inkeys), len(args)))
--> 990 return core.get(self.dsk, self.outkey, dict(zip(self.inkeys, args)))

File ~/.local/opt/mambaforge/envs/xarray/lib/python3.10/site-packages/dask/core.py:149, in get(dsk, out, cache)
    147 for key in toposort(dsk):
    148     task = dsk[key]
--> 149     result = _execute_task(task, cache)
    150     cache[key] = result
    151 result = _execute_task(out, cache)

File ~/.local/opt/mambaforge/envs/xarray/lib/python3.10/site-packages/dask/core.py:119, in _execute_task(arg, cache, dsk)
    115     func, args = arg[0], arg[1:]
    116     # Note: Don't assign the subtask results to a variable. numpy detects
    117     # temporaries by their reference count and can execute certain
    118     # operations in-place.
--> 119     return func(*(_execute_task(a, cache) for a in args))
    120 elif not ishashable(arg):
    121     return arg

File <__array_function__ internals>:180, in where(*args, **kwargs)

File cupy/_core/core.pyx:1723, in cupy._core.core._ndarray_base.__array_function__()

File ~/.local/opt/mambaforge/envs/xarray/lib/python3.10/site-packages/cupy/_sorting/search.py:211, in where(condition, x, y)
    209 if fusion._is_fusing():
    210     return fusion._call_ufunc(_where_ufunc, condition, x, y)
--> 211 return _where_ufunc(condition.astype('?'), x, y)

File cupy/_core/_kernel.pyx:1287, in cupy._core._kernel.ufunc.__call__()

File cupy/_core/_kernel.pyx:160, in cupy._core._kernel._preprocess_args()

File cupy/_core/_kernel.pyx:146, in cupy._core._kernel._preprocess_arg()

TypeError: Unsupported type <class 'numpy.ndarray'>
```
this works again:
```python
In [7]: arr.chunk().where(mask.chunk(), cupy.array(cupy.nan)).compute()
Out[7]: 
<xarray.DataArray (x: 4)>
array([nan,  1.,  2., nan])
Dimensions without coordinates: x
```
And other methods like `fillna` show similar behavior.

I think the reason is that this: https://github.com/pydata/xarray/blob/d4db16699f30ad1dc3e6861601247abf4ac96567/xarray/core/duck_array_ops.py#L195 is not sufficient to detect `cupy` beneath other layers of duckarrays (most commonly `dask`, `pint`, or both). In this specific case we could extend the condition to also match chunked `cupy` arrays (like `arr.cupy.is_cupy` does, but using `is_duck_dask_array`), but this will still break for other duckarray layers or if `dask` is not involved, and we're also in the process of moving away from special-casing `dask`. So short of asking `cupy` to treat 0d arrays like scalars I'm not sure how to fix this.

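For reference, a check that also catches `cupy` below a `dask` layer could look roughly like this (a sketch relying on `dask`'s `_meta` attribute; it would still miss `pint` and other wrapper layers):
```python
def is_cupy_backed(arr):
    # dask arrays expose an empty array of the backing type as ._meta,
    # so unwrap one level before the isinstance check
    import cupy

    data = getattr(arr, '_meta', arr)
    return isinstance(data, cupy.ndarray)
```
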
cc @jacobtomlinson","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7721/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,issue
1988047821,PR_kwDOAMm_X85fKfB6,8441,remove `cdms2`,14808389,closed,0,,,4,2023-11-10T17:25:50Z,2023-11-14T17:34:57Z,2023-11-14T17:15:49Z,MEMBER,,0,pydata/xarray/pulls/8441,"`cdms2` is going to be abandoned at the end of this year; the recommended replacement is `xcdat`. We deprecated our conversion functions in 2023.06.0 (released on June 23), less than 6 months ago, but given that `cdms2` is already not installable on some architectures, it makes sense to remove them earlier.

This also appears to allow us to remove the special `python=3.11` environment files.

- [x] Follow-up to #7876, closes #8419
- [x] User visible changes (including notable bug fixes) are documented in `whats-new.rst`","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8441/reactions"", ""total_count"": 3, ""+1"": 3, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
1730451312,I_kwDOAMm_X85nJJdw,7879,occasional segfaults on CI,14808389,closed,0,,,3,2023-05-29T09:52:01Z,2023-11-06T22:03:43Z,2023-11-06T22:03:42Z,MEMBER,,,,"The upstream-dev CI currently fails sometimes due to a segfault (the normal CI crashes, too, but since we use `pytest-xdist` we only get a message stating ""worker x crashed"").

I'm not sure why, and I can't reproduce locally, either. Given that `dask`'s local scheduler is in the traceback and the failing test is `test_open_mfdataset_manyfiles`, I assume there's some issue with parallel disk access or the temporary file creation.

<details><summary>log of the segfaulting CI job</summary>

```
============================= test session starts ==============================
platform linux -- Python 3.10.11, pytest-7.3.1, pluggy-1.0.0
rootdir: /home/runner/work/xarray/xarray
configfile: setup.cfg
testpaths: xarray/tests, properties
plugins: env-0.8.1, xdist-3.3.1, timeout-2.1.0, cov-4.1.0, reportlog-0.1.2, hypothesis-6.75.6
timeout: 60.0s
timeout method: signal
timeout func_only: False
collected 16723 items / 2 skipped

xarray/tests/test_accessor_dt.py ....................................... [  0%]
........................................................................ [  0%]
........................................................................ [  1%]
........................................................................ [  1%]
...............................                                          [  1%]
xarray/tests/test_accessor_str.py ...................................... [  1%]
........................................................................ [  2%]
........................................................................ [  2%]
.............................................                            [  3%]
xarray/tests/test_array_api.py ...........                               [  3%]
xarray/tests/test_backends.py ........................X........x........ [  3%]
...................................s.........................X........x. [  3%]
.........................................s.........................X.... [  4%]
....x.......................................X........................... [  4%]
....................................X........x.......................... [  5%]
.............X.......................................................... [  5%]
....X........x....................x.x................X.................. [  5%]
x..x..x..x...................................X........x................. [  6%]
...x.x................X..................x..x..x..x..................... [  6%]
..............X........x....................x.x................X........ [  7%]
..........x..x..x..x.......................................X........x... [  7%]
...........................................X........x................... [  8%]
...ss........................X........x................................. [  8%]
.................X........x............................................. [  8%]
..X........x.............................................X........x..... [  9%]
............................................X........x.................. [  9%]
.........................................................X........x..... [ 10%]
......................................................................X. [ 10%]
.......x................................................................ [ 11%]
Fatal Python error: Segmentation fault

Thread 0x00007f9c7b8ff640 (most recent call first):
  File ""/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/concurrent/futures/thread.py"", line 81 in _worker
  File ""/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/threading.py"", line 953 in run
  File ""/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/threading.py"", line 1016 in _bootstrap_inner
  File ""/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/threading.py"", line 973 in _bootstrap

Current thread 0x00007f9c81f1d640 (most recent call first):
  File ""/home/runner/work/xarray/xarray/xarray/backends/file_manager.py"", line 216 in _acquire_with_cache_info
  File ""/home/runner/work/xarray/xarray/xarray/backends/file_manager.py"", line 198 in acquire_context
  File ""/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/contextlib.py"", line 135 in __enter__
  File ""/home/runner/work/xarray/xarray/xarray/backends/netCDF4_.py"", line 392 in _acquire
  File ""/home/runner/work/xarray/xarray/xarray/backends/netCDF4_.py"", line 398 in ds
  File ""/home/runner/work/xarray/xarray/xarray/backends/netCDF4_.py"", line 336 in __init__
  File ""/home/runner/work/xarray/xarray/xarray/backends/netCDF4_.py"", line 389 in open
  File ""/home/runner/work/xarray/xarray/xarray/backends/netCDF4_.py"", line 588 in open_dataset
  File ""/home/runner/work/xarray/xarray/xarray/backends/api.py"", line 566 in open_dataset
  File ""/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/dask/utils.py"", line 73 in apply
  File ""/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/dask/core.py"", line 121 in _execute_task
  File ""/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/dask/local.py"", line 224 in execute_task
  File ""/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/dask/local.py"", line 238 in <listcomp>
  File ""/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/dask/local.py"", line 238 in batch_execute_tasks
  File ""/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/concurrent/futures/thread.py"", line 58 in run
  File ""/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/concurrent/futures/thread.py"", line 83 in _worker
  File ""/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/threading.py"", line 953 in run
  File ""/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/threading.py"", line 1016 in _bootstrap_inner
  File ""/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/threading.py"", line 973 in _bootstrap

Thread 0x00007f9c82f1e640 (most recent call first):
  File ""/home/runner/work/xarray/xarray/xarray/backends/file_manager.py"", line 216 in _acquire_with_cache_info
  File ""/home/runner/work/xarray/xarray/xarray/backends/file_manager.py"", line 198 in acquire_context
  File ""/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/contextlib.py"", line 135 in __enter__
  File ""/home/runner/work/xarray/xarray/xarray/backends/netCDF4_.py"", line 392 in _acquire
  File ""/home/runner/work/xarray/xarray/xarray/backends/netCDF4_.py"", line 398 in ds
  File ""/home/runner/work/xarray/xarray/xarray/backends/netCDF4_.py"", line 336 in __init__
  File ""/home/runner/work/xarray/xarray/xarray/backends/netCDF4_.py"", line 389 in open
  File ""/home/runner/work/xarray/xarray/xarray/backends/netCDF4_.py"", line 588 in open_dataset
  File ""/home/runner/work/xarray/xarray/xarray/backends/api.py"", line 566 in open_dataset
  File ""/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/dask/utils.py"", line 73 in apply
  File ""/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/dask/core.py"", line 121 in _execute_task
  File ""/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/dask/local.py"", line 224 in execute_task
  File ""/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/dask/local.py"", line 238 in <listcomp>
  File ""/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/dask/local.py"", line 238 in batch_execute_tasks
  File ""/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/concurrent/futures/thread.py"", line 58 in run
  File ""/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/concurrent/futures/thread.py"", line 83 in _worker
  File ""/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/threading.py"", line 953 in run
  File ""/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/threading.py"", line 1016 in _bootstrap_inner
  File ""/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/threading.py"", line 973 in _bootstrap

Thread 0x00007f9ca575e740 (most recent call first):
  File ""/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/threading.py"", line 320 in wait
  File ""/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/queue.py"", line 171 in get
  File ""/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/dask/local.py"", line 137 in queue_get
  File ""/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/dask/local.py"", line 500 in get_async
  File ""/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/dask/threaded.py"", line 89 in get
  File ""/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/dask/base.py"", line 595 in compute
  File ""/home/runner/work/xarray/xarray/xarray/backends/api.py"", line 1046 in open_mfdataset
  File ""/home/runner/work/xarray/xarray/xarray/tests/test_backends.py"", line 3295 in test_open_mfdataset_manyfiles
  File ""/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/_pytest/python.py"", line 194 in pytest_pyfunc_call
  File ""/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/pluggy/_callers.py"", line 39 in _multicall
  File ""/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/pluggy/_manager.py"", line 80 in _hookexec
  File ""/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/pluggy/_hooks.py"", line 265 in __call__
  File ""/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/_pytest/python.py"", line 1799 in runtest
  File ""/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/_pytest/runner.py"", line 169 in pytest_runtest_call
  File ""/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/pluggy/_callers.py"", line 39 in _multicall
  File ""/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/pluggy/_manager.py"", line 80 in _hookexec
  File ""/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/pluggy/_hooks.py"", line 265 in __call__
  File ""/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/_pytest/runner.py"", line 262 in <lambda>
  File ""/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/_pytest/runner.py"", line 341 in from_call
  File ""/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/_pytest/runner.py"", line 261 in call_runtest_hook
  File ""/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/_pytest/runner.py"", line 222 in call_and_report
  File ""/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/_pytest/runner.py"", line 133 in runtestprotocol
  File ""/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/_pytest/runner.py"", line 114 in pytest_runtest_protocol
  File ""/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/pluggy/_callers.py"", line 39 in _multicall
  File ""/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/pluggy/_manager.py"", line 80 in _hookexec
  File ""/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/pluggy/_hooks.py"", line 265 in __call__
  File ""/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/_pytest/main.py"", line 348 in pytest_runtestloop
  File ""/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/pluggy/_callers.py"", line 39 in _multicall
  File ""/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/pluggy/_manager.py"", line 80 in _hookexec
  File ""/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/pluggy/_hooks.py"", line 265 in __call__
  File ""/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/_pytest/main.py"", line 323 in _main
  File ""/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/_pytest/main.py"", line 269 in wrap_session
  File ""/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/_pytest/main.py"", line 316 in pytest_cmdline_main
  File ""/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/pluggy/_callers.py"", line 39 in _multicall
  File ""/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/pluggy/_manager.py"", line 80 in _hookexec
  File ""/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/pluggy/_hooks.py"", line 265 in __call__
  File ""/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/_pytest/config/__init__.py"", line 166 in main
  File ""/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/_pytest/config/__init__.py"", line 189 in console_main
  File ""/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/pytest/__main__.py"", line 5 in <module>
  File ""/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/runpy.py"", line 86 in _run_code
  File ""/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/runpy.py"", line 196 in _run_module_as_main

Extension modules: numpy.core._multiarray_umath, numpy.core._multiarray_tests, numpy.linalg._umath_linalg, numpy.fft._pocketfft_internal, numpy.random._common, numpy.random.bit_generator, numpy.random._bounded_integers, numpy.random._mt19937, numpy.random.mtrand, numpy.random._philox, numpy.random._pcg64, numpy.random._sfc64, numpy.random._generator, pandas._libs.tslibs.np_datetime, pandas._libs.tslibs.dtypes, pandas._libs.tslibs.base, pandas._libs.tslibs.nattype, pandas._libs.tslibs.timezones, pandas._libs.tslibs.ccalendar, pandas._libs.tslibs.fields, pandas._libs.tslibs.timedeltas, pandas._libs.tslibs.tzconversion, pandas._libs.tslibs.timestamps, pandas._libs.properties, pandas._libs.tslibs.offsets, pandas._libs.tslibs.strptime, pandas._libs.tslibs.parsing, pandas._libs.tslibs.conversion, pandas._libs.tslibs.period, pandas._libs.tslibs.vectorized, pandas._libs.ops_dispatch, pandas._libs.missing, pandas._libs.hashtable, pandas._libs.algos, pandas._libs.interval, pandas._libs.lib, pandas._libs.ops, numexpr.interpreter, bottleneck.move, bottleneck.nonreduce, bottleneck.nonreduce_axis, bottleneck.reduce, pandas._libs.arrays, pandas._libs.tslib, pandas._libs.sparse, pandas._libs.indexing, pandas._libs.index, pandas._libs.internals, pandas._libs.join, pandas._libs.writers, pandas._libs.window.aggregations, pandas._libs.window.indexers, pandas._libs.reshape, pandas._libs.groupby, pandas._libs.json, pandas._libs.parsers, pandas._libs.testing, cftime._cftime, yaml._yaml, cytoolz.utils, cytoolz.itertoolz, cytoolz.functoolz, cytoolz.dicttoolz, cytoolz.recipes, xxhash._xxhash, psutil._psutil_linux, psutil._psutil_posix, markupsafe._speedups, numpy.linalg.lapack_lite, matplotlib._c_internal_utils, PIL._imaging, matplotlib._path, kiwisolver._cext, scipy._lib._ccallback_c, _cffi_backend, unicodedata2, netCDF4._netCDF4, h5py._errors, h5py.defs, h5py._objects, h5py.h5, h5py.h5r, h5py.utils, h5py.h5s, h5py.h5ac, h5py.h5p, h5py.h5t, h5py._conv, h5py.h5z, h5py._proxy, h5py.h5a, h5py.h5d, h5py.h5ds, h5py.h5g, h5py.h5i, h5py.h5f, h5py.h5fd, h5py.h5pl, h5py.h5o, h5py.h5l, h5py._selector, pyproj._compat, pyproj._datadir, pyproj._network, pyproj._geod, pyproj.list, pyproj._crs, pyproj.database, pyproj._transformer, pyproj._sync, matplotlib._image, rasterio._version, rasterio._err, rasterio._filepath, rasterio._env, rasterio._transform, rasterio._base, rasterio.crs, rasterio._features, rasterio._warp, rasterio._io, numcodecs.compat_ext, numcodecs.blosc, numcodecs.zstd, numcodecs.lz4, numcodecs._shuffle, msgpack._cmsgpack, numcodecs.vlen, zstandard.backend_c, scipy.sparse._sparsetools, _csparsetools, scipy.sparse._csparsetools, scipy.sparse.linalg._isolve._iterative, scipy.linalg._fblas, scipy.linalg._flapack, scipy.linalg.cython_lapack, scipy.linalg._cythonized_array_utils, scipy.linalg._solve_toeplitz, scipy.linalg._flinalg, scipy.linalg._matfuncs_sqrtm_triu, scipy.linalg.cython_blas, scipy.linalg._matfuncs_expm, scipy.linalg._decomp_update, scipy.sparse.linalg._dsolve._superlu, scipy.sparse.linalg._eigen.arpack._arpack, scipy.sparse.csgraph._tools, scipy.sparse.csgraph._shortest_path, scipy.sparse.csgraph._traversal, scipy.sparse.csgraph._min_spanning_tree, scipy.sparse.csgraph._flow, scipy.sparse.csgraph._matching, scipy.sparse.csgraph._reordering, scipy.spatial._ckdtree, scipy._lib.messagestream, scipy.spatial._qhull, scipy.spatial._voronoi, scipy.spatial._distance_wrap, scipy.spatial._hausdorff, scipy.special._ufuncs_cxx, scipy.special._ufuncs, scipy.special._specfun, scipy.special._comb, 
scipy.special._ellip_harm_2, scipy.spatial.transform._rotation, scipy.ndimage._nd_image, _ni_label, scipy.ndimage._ni_label, scipy.optimize._minpack2, scipy.optimize._group_columns, scipy.optimize._trlib._trlib, scipy.optimize._lbfgsb, _moduleTNC, scipy.optimize._moduleTNC, scipy.optimize._cobyla, scipy.optimize._slsqp, scipy.optimize._minpack, scipy.optimize._lsq.givens_elimination, scipy.optimize._zeros, scipy.optimize.__nnls, scipy.optimize._highs.cython.src._highs_wrapper, scipy.optimize._highs._highs_wrapper, scipy.optimize._highs.cython.src._highs_constants, scipy.optimize._highs._highs_constants, scipy.linalg._interpolative, scipy.optimize._bglu_dense, scipy.optimize._lsap, scipy.optimize._direct, scipy.integrate._odepack, scipy.integrate._quadpack, scipy.integrate._vode, scipy.integrate._dop, scipy.integrate._lsoda, scipy.special.cython_special, scipy.stats._stats, scipy.stats.beta_ufunc, scipy.stats._boost.beta_ufunc, scipy.stats.binom_ufunc, scipy.stats._boost.binom_ufunc, scipy.stats.nbinom_ufunc, scipy.stats._boost.nbinom_ufunc, scipy.stats.hypergeom_ufunc, scipy.stats._boost.hypergeom_ufunc, scipy.stats.ncf_ufunc, scipy.stats._boost.ncf_ufunc, scipy.stats.ncx2_ufunc, scipy.stats._boost.ncx2_ufunc, scipy.stats.nct_ufunc, scipy.stats._boost.nct_ufunc, scipy.stats.skewnorm_ufunc, scipy.stats._boost.skewnorm_ufunc, scipy.stats.invgauss_ufunc, scipy.stats._boost.invgauss_ufunc, scipy.interpolate._fitpack, scipy.interpolate.dfitpack, scipy.interpolate._bspl, scipy.interpolate._ppoly, scipy.interpolate.interpnd, scipy.interpolate._rbfinterp_pythran, scipy.interpolate._rgi_cython, scipy.stats._biasedurn, scipy.stats._levy_stable.levyst, scipy.stats._stats_pythran, scipy._lib._uarray._uarray, scipy.stats._statlib, scipy.stats._sobol, scipy.stats._qmc_cy, scipy.stats._mvn, scipy.stats._rcont.rcont, scipy.cluster._vq, scipy.cluster._hierarchy, scipy.cluster._optimal_leaf_ordering, shapely.lib, shapely._geos, shapely._geometry_helpers, cartopy.trace, scipy.fftpack.convolve, tornado.speedups, cf_units._udunits2, scipy.io.matlab._mio_utils, scipy.io.matlab._streams, scipy.io.matlab._mio5_utils (total: 241)
/home/runner/work/_temp/b3f3888c-5349-4d19-80f6-41d140b86db5.sh: line 3:  6114 Segmentation fault      (core dumped) python -m pytest --timeout=60 -rf --report-log output-3.10-log.jsonl
```

</details>","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7879/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,not_planned,13221727,issue
1158378382,I_kwDOAMm_X85FC3OO,6323,propagation of `encoding`,14808389,open,0,,,8,2022-03-03T12:57:29Z,2023-10-25T23:20:31Z,,MEMBER,,,,"### What is your issue?

We frequently get bug reports related to `encoding` that can usually be fixed by clearing it or by overriding it using the `encoding` parameter of the `to_*` methods, e.g.
- #4224
- #4380
- #4655
- #5427
- #5490
- fsspec/kerchunk#130

There are also a few discussions with more background:
- https://github.com/pydata/xarray/pull/5065#issuecomment-806154872
- https://github.com/pydata/xarray/issues/1614
- #5082
- #5336

We discussed this in the meeting yesterday and, as far as I can remember, agreed that the current default behavior is not ideal and decided to investigate #5336: a `keep_encoding` option, similar to `keep_attrs`, that would be `True` (propagate `encoding`) by default but would be changed to `False` (drop `encoding` on any operation) in the future.

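Usage might eventually look something like this (hypothetical sketch; the `keep_encoding` option does not exist yet, so this does not run today):
```python
import xarray as xr

ds = xr.Dataset({'a': ('time', [1.0, 2.0, 3.0])})
ds.a.encoding['dtype'] = 'int16'

# proposed: drop `encoding` on any operation instead of propagating it
with xr.set_options(keep_encoding=False):
    result = ds.mean('time')   # result's encoding would be empty
```
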
cc @rabernat, @shoyer","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/6323/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,issue
1845449919,PR_kwDOAMm_X85Xp1U1,8064,adapt to NEP 51,14808389,closed,0,,,7,2023-08-10T15:43:13Z,2023-09-30T09:27:25Z,2023-09-25T04:46:49Z,MEMBER,,0,pydata/xarray/pulls/8064,"With NEP 51 (and the changes to `numpy` `main`), scalar types no longer pretend to be standard python types in their string representation. This fixes most of the errors in the tests, but there are still a few remaining in the doctests (in particular, doctests for private plotting utils).

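For reference, the change in scalar `repr`s looks like this:
```python
import numpy as np

x = np.float64(3.0)
# numpy 1.x:        repr(x) == '3.0'
# NEP 51 (numpy 2): repr(x) == 'np.float64(3.0)'
print(repr(x))
```
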
- [x] towards #8091","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8064/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
1898193938,PR_kwDOAMm_X85abbJ4,8188,fix the failing docs,14808389,closed,0,,,7,2023-09-15T11:01:42Z,2023-09-20T11:04:03Z,2023-09-15T13:26:24Z,MEMBER,,0,pydata/xarray/pulls/8188,"The docs have been failing because of a malformatted docstring we inherit from `pandas`, and this caused us to miss another error in #8183. The fix is to avoid installing `pandas=2.1.0`, which should be the only version with the malformatted docstring, and to apply the missing changes from #8183 here.

- [x] Closes #8157, follow-up to #8183","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8188/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
1869782001,PR_kwDOAMm_X85Y76lw,8117,fix miscellaneous `numpy=2.0` errors,14808389,closed,0,,,9,2023-08-28T13:34:56Z,2023-09-13T15:34:15Z,2023-09-11T03:55:52Z,MEMBER,,0,pydata/xarray/pulls/8117,"- [x] towards #8091
- [x] closes #8133 

Edit: looking at the relevant `numpy` issues, it appears `numpy` will stay a bit unstable for the next few weeks / months. Not sure how quickly we should try to adapt to those changes.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8117/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
1845114193,PR_kwDOAMm_X85Xorkf,8061,unpin `numpy`,14808389,closed,0,,,8,2023-08-10T12:43:32Z,2023-08-17T18:14:22Z,2023-08-17T18:14:21Z,MEMBER,,0,pydata/xarray/pulls/8061,"- [x] follow-up to #7415

It seems that in a previous PR I ""temporarily"" pinned `numpy` to get CI to pass, but then forgot to unpin later and merged it as-is. As a result, we have not been running the main CI with `numpy>=1.24` ever since, even though `numpy=1.25` has now been around for a while.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8061/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
1423972935,PR_kwDOAMm_X85BlCII,7225,join together duplicate entries in the text `repr`,14808389,closed,0,,,4,2022-10-26T12:53:49Z,2023-07-24T18:37:05Z,2023-07-20T21:13:57Z,MEMBER,,0,pydata/xarray/pulls/7225,"`Indexes` contains one entry per coordinate, even if there are indexes that index multiple coordinates. This deduplicates the entries, but it is not quite clear which format works best.

- [x] follow-up to #6795
- [x] Tests added
- [x] User visible changes (including notable bug fixes) are documented in `whats-new.rst`

The formatting options we were able to come up with:
1. separate with just newlines (see e29aeb9085bc677e16dabd4a2b94cf63d06c155e):
```
Indexes:
    one    CustomIndex
    two
    three  PandasIndex
```
2. mark unique indexes with a prefix to make it look like a list (see 9b90f8bceda6f012c863927865cc638e0ff3fb88):
```
Indexes:
  - one    CustomIndex
    two
  - three  PandasIndex
```
3. use unicode box components (in front of the coordinate names) (see 2cec070ebf4f1958d4ffef98d7649eda21ac09a3):
```
Indexes:
  ┌ one    CustomIndex
  │ two
  └ three
    four  PandasIndex
```
4. use unicode box components (after the coordinate names) (see 492ab47ccce8c43d264d5c759841060d33cafe4d):
```
Indexes:
    one   ┐ CustomIndex
    two   │
    three ┘
    four    PandasIndex
```

For the unicode box components, we can choose between the light and heavy variants.

@benbovy and I think the unicode variants (especially variant 3) are the easiest to understand, but we would need to decide whether we care about terminals that don't support unicode.

Edit: in the meeting we decided that support for this subset of unicode should be common enough that we can use it. I'll clean this PR up to implement option 3, then.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7225/reactions"", ""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 1, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
1789429376,PR_kwDOAMm_X85Uso19,7961,manually unshallow the repository on RTD,14808389,closed,0,,,0,2023-07-05T12:15:31Z,2023-07-11T13:24:18Z,2023-07-05T15:44:09Z,MEMBER,,0,pydata/xarray/pulls/7961,RTD is deprecating the feature flag we made use of before.,"{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7961/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
1745794965,PR_kwDOAMm_X85SaJCg,7899,use trusted publishers instead of an API token,14808389,closed,0,,,4,2023-06-07T12:30:56Z,2023-06-16T08:58:05Z,2023-06-16T08:37:07Z,MEMBER,,0,pydata/xarray/pulls/7899,"PyPI introduced the concept of ""trusted publishers"" a few months ago, which allows requesting short-lived API tokens for trusted publishing services (such as GHA, in our case).

Someone with the appropriate rights will have to enable this on PyPI, and I will do the same for TestPyPI.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7899/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
1730664352,PR_kwDOAMm_X85RmgD2,7880,don't use `CacheFileManager.__del__` on interpreter shutdown,14808389,closed,0,,,9,2023-05-29T12:16:06Z,2023-06-06T20:37:40Z,2023-06-06T15:14:37Z,MEMBER,,0,pydata/xarray/pulls/7880,"Storing a reference to the function on the class tells the garbage collector to not collect the function *before* the class, such that any instance can safely complete its `__del__`.

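The pattern, with hypothetical names (not the exact xarray code):
```python
def _cleanup(key):
    # module-level helper called from __del__
    print('closing', key)

class FileManager:
    # storing the function on the class keeps it alive at least as long as
    # the class object itself, so instances can still call it from __del__
    # during interpreter shutdown, when module globals may already be gone
    _cleanup = staticmethod(_cleanup)

    def __init__(self, key):
        self._key = key

    def __del__(self):
        self._cleanup(self._key)
```
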
No tests because I don't know how to properly test this. Any ideas?

- [x] Closes #7814
- [ ] Tests added
- [x] User visible changes (including notable bug fixes) are documented in `whats-new.rst`","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7880/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
1738586208,PR_kwDOAMm_X85SBfsz,7889,retire the TestPyPI workflow,14808389,closed,0,,,1,2023-06-02T17:54:04Z,2023-06-04T19:58:08Z,2023-06-04T18:46:14Z,MEMBER,,0,pydata/xarray/pulls/7889,"With the recent addition of the workflow to upload nightly releases to `anaconda.org/scientific-python-nightly-wheels`, we don't really need the TestPyPI workflow anymore, especially since PyPI instances are not designed to automatically delete very old releases.

- [x] Follow-up to #7863 and #7865
- [x] User visible changes (including notable bug fixes) are documented in `whats-new.rst`","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7889/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
1721896187,PR_kwDOAMm_X85RIyh6,7867,add `numba` to the py3.11 environment,14808389,closed,0,,,1,2023-05-23T11:49:37Z,2023-06-03T11:36:10Z,2023-05-28T06:30:10Z,MEMBER,,0,pydata/xarray/pulls/7867,"`numba=0.57.0` has been out for quite some time already and is available on `conda-forge` as of last week, which means we can almost retire the separate `python=3.11` environment file.

I'm not sure what to do about `cdms2` (we will see if that fails in CI), but in any case we should just deprecate and remove any functions that use it: if I understand correctly, `cdms2` is in maintenance mode until the end of the year and will be discontinued afterwards.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7867/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
1730414479,PR_kwDOAMm_X85RlpAe,7878,move to `setup-micromamba`,14808389,closed,0,,,0,2023-05-29T09:27:15Z,2023-06-01T16:21:57Z,2023-06-01T16:21:56Z,MEMBER,,0,pydata/xarray/pulls/7878,"The `provision-with-micromamba` action has been deprecated and will not receive any further releases. It is replaced by the `setup-micromamba` action, which does basically the same thing, but with a different implementation and a few changes to the configuration.

- [x] Closes #7877","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7878/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
1729709527,PR_kwDOAMm_X85RjPc9,7876,deprecate the `cdms2` conversion methods,14808389,closed,0,,,2,2023-05-28T22:18:55Z,2023-05-30T20:59:48Z,2023-05-29T19:01:20Z,MEMBER,,0,pydata/xarray/pulls/7876,"As the `cdms2` library has been deprecated and will be retired at the end of this year, maintaining conversion functions does not make sense anymore. Additionally, one of the tests is currently failing on the upstream-dev CI (`cdms2` is incompatible with a recent change to `numpy`), and it seems unlikely this will ever be fixed (*and* there's no `python=3.11` release, blocking us from merging the `py311` environment with the default one).

cc @tomvothecoder for visibility

- [x] Closes #7707
- [x] Tests added
- [x] User visible changes (including notable bug fixes) are documented in `whats-new.rst`","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7876/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
1726529405,PR_kwDOAMm_X85RYfGo,7875,defer to `numpy` for the expected result,14808389,closed,0,,,1,2023-05-25T21:48:18Z,2023-05-27T19:53:08Z,2023-05-27T19:53:07Z,MEMBER,,0,pydata/xarray/pulls/7875,"`numpy` has recently changed the result of `np.cos(0)` by a very small value, which makes our tests break.

I'm not really sure what the best fix is, so I split the changes into two parts: the first commit uses
```python
xr.testing.assert_allclose(a + 1, np.cos(a))
```
to test the result, while the second commit uses
```python
expected = xr.full_like(a, fill_value=np.cos(0), dtype=float)
actual = np.cos(a)
xr.testing.assert_identical(actual, expected)
```

- [x] towards #7707","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7875/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
1718144679,PR_kwDOAMm_X85Q8Hne,7855,adapt the `pint` + `dask` test to the newest version of `pint`,14808389,closed,0,,,0,2023-05-20T11:35:47Z,2023-05-25T17:25:01Z,2023-05-25T17:24:34Z,MEMBER,,0,pydata/xarray/pulls/7855,"- [x] towards #7707

`pint` recently improved the support for wrapping `dask`, breaking our older tests. With this change, we basically require the newest version of `pint` (`pint>=0.21.1`, which should be released pretty soon) to interact with `dask`, as `0.21.0` broke our use of `np.allclose` and `np.isclose`. However, I guess `dask` support has never been properly tested and thus should be considered ""experimental"" anyway.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7855/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
1688716198,PR_kwDOAMm_X85PZRyC,7793,adjust the deprecation policy for python,14808389,closed,0,,,2,2023-04-28T15:03:51Z,2023-05-02T11:51:27Z,2023-05-01T22:26:55Z,MEMBER,,0,pydata/xarray/pulls/7793,"As discussed in #7765, this extends the policy by 6 months to a total of 30 months. With that, support for a python version can be removed as soon as the *next* version is at least 30 months old. Together with python's 12-month release cycle, that gives us the 42-month support window from NEP 29.

Note that this is still missing the release overview proposed in #7765; I'm still thinking about how best to implement the automatic update / formatting, and how to coordinate it with the (still manual) version overrides.

- [x] towards #7765, closes #7777
- [x] User visible changes (including notable bug fixes) are documented in `whats-new.rst`","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7793/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
1440494247,I_kwDOAMm_X85V3DKn,7270,type checking CI is failing,14808389,closed,0,,,3,2022-11-08T16:10:24Z,2023-04-15T18:31:59Z,2023-04-15T18:31:59Z,MEMBER,,,,"The most recent runs of the type checking CI have started to fail with a segfault:
```
/home/runner/work/_temp/dac0c060-b19a-435a-8063-bbc5b8ffbf24.sh: line 1:  2945 Segmentation fault      (core dumped) python -m mypy --install-types --non-interactive --cobertura-xml-report mypy_report
```
This seems to be due to the release of `mypy=0.990`.

#7269 pinned `mypy` to `mypy<0.990`, which should be undone once we've figured out how to fix the segfault (most likely by waiting on the next release) and have addressed any complaints the new version has (see this [workflow run](https://github.com/pydata/xarray/actions/runs/3412877230/jobs/5678944346)).","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7270/reactions"", ""total_count"": 2, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 1, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue
1668326257,PR_kwDOAMm_X85OVLQA,7756,remove the `black` hook,14808389,closed,0,,,0,2023-04-14T14:10:36Z,2023-04-14T17:42:49Z,2023-04-14T16:36:18Z,MEMBER,,0,pydata/xarray/pulls/7756,"Apparently, in addition to formatting notebooks, `black-jupyter` does exactly the same thing as `black`.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7756/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
1668319039,PR_kwDOAMm_X85OVJv5,7755,reword the what's new entry for the `pandas` 2.0 dtype changes,14808389,closed,0,,,0,2023-04-14T14:06:54Z,2023-04-14T14:30:51Z,2023-04-14T14:30:50Z,MEMBER,,0,pydata/xarray/pulls/7755,"As a follow-up to #7724, this makes the what's new entry a bit more precise.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7755/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
1655782486,PR_kwDOAMm_X85Nr3hH,7724,`pandas=2.0` support,14808389,closed,0,,,8,2023-04-05T14:52:30Z,2023-04-12T13:24:07Z,2023-04-12T13:04:11Z,MEMBER,,0,pydata/xarray/pulls/7724,"As mentioned in https://github.com/pydata/xarray/issues/7716#issuecomment-1497623839, this tries to unpin `pandas`.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7724/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
683142059,MDU6SXNzdWU2ODMxNDIwNTk=,4361,restructure the contributing guide,14808389,open,0,,,5,2020-08-20T22:51:39Z,2023-03-31T17:39:00Z,,MEMBER,,,,"From #4355

@max-sixty:
> Stepping back on the contributing doc — I admit I haven't looked at it in a while — I wonder whether we can slim it down a bit, for example by linking to other docs for generic tooling — I imagine we're unlikely to have the best docs on working with GH, for example. Or referencing our PR template rather than the (now out-of-date) PR checklist.

We could also add a docstring guide since the `numpydoc` guide does not cover every little detail (for example, `default` notation, type spec vs. type hint, space before the colon separating parameter names from types, no colon for parameters without types, etc.)","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/4361/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,issue
1637616804,PR_kwDOAMm_X85MvWza,7664,use the `files` interface instead of the deprecated `read_binary`,14808389,closed,0,,,2,2023-03-23T14:06:36Z,2023-03-30T14:59:22Z,2023-03-30T14:58:43Z,MEMBER,,0,pydata/xarray/pulls/7664,"Apparently, `read_binary` has been marked as deprecated in `python=3.11`, and is to be replaced by `importlib.resources.files`, which has been available since `python=3.9`. Since we dropped support for `python=3.8` a while ago, we can safely follow the instructions in the deprecation warning.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7664/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
1635470616,PR_kwDOAMm_X85MoK6O,7657,add timeouts for tests,14808389,closed,0,,,9,2023-03-22T10:20:04Z,2023-03-24T16:42:48Z,2023-03-24T15:49:22Z,MEMBER,,0,pydata/xarray/pulls/7657,"The `macos` 3.11 CI seems to stall very often at the moment, which makes it hit the 6-hour mark and get cancelled. Since our tests should never run that long (the ubuntu runners usually take 15-20 minutes), I'm introducing a pretty generous timeout of 5 minutes. By comparison, we already have a timeout of 60 seconds in the upstream-dev CI, but that's on an ubuntu runner, which is usually much faster than any of the macos / windows runners.
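
As a sketch, assuming the `pytest-timeout` plugin, such a timeout could look like this:
```python
import pytest

# fail any test in this module after 300 seconds instead of letting a
# stalled run block the job until the 6-hour limit
pytestmark = pytest.mark.timeout(300)
```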

Tests that time out raise an error, which might help us with figuring out which test it is that stalls, and also if we can do anything about that.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7657/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
1306795760,I_kwDOAMm_X85N5B7w,6793,improve docstrings with examples and links,14808389,open,0,,,10,2022-07-16T12:30:33Z,2023-03-24T16:33:28Z,,MEMBER,,,,"This is an (incomplete) checklist for #5816 to make it easier to find methods that are in need of examples and links to the narrative docs with further information (of course, changes to the docstrings of all other methods / functions part of the public API are also appreciated).

Good examples explicitly construct small xarray objects to make it easier to follow (e.g. use `np.{ones,full,zeros}` or the `np.array` constructor instead of `np.random` / loading from files) and show both input and output of the function.
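
For instance, a docstring example following these guidelines might look like this (hypothetical; the exact repr depends on the xarray version):
```python
>>> import numpy as np
>>> import xarray as xr
>>> arr = xr.DataArray(np.ones((2, 3)), dims=(""x"", ""y""))
>>> arr.sum(dim=""x"")
<xarray.DataArray (y: 3)>
array([2., 2., 2.])
Dimensions without coordinates: y
```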

Use
```sh
pytest --doctest-modules xarray --ignore xarray/tests/
```
to verify the examples, or push to a PR to have the CI do it for you (note that you will have much quicker feedback locally though).

To easily generate the expected output install `pytest-accept` ([docs]()) in your dev environment and then run
```
pytest --doctest-modules FILE_NAME --accept || true
```

To link to other documentation pages we can use
```rst
:doc:`project:label`
    Description of the linked page
```
where we can leave out `project` if we link to somewhere within xarray's documentation. To figure out the label, we can either look at the source, search the output of `python -m sphinx.ext.intersphinx https://docs.xarray.dev/en/latest/objects.inv`, or use `sphobjinv` (install from PyPI):
```sh
sphobjinv search -su https://docs.xarray.dev/en/latest/ missing
```

Top-level functions:
- [ ] `get_options`
- [ ] `decode_cf`
- [ ] `polyval`
- [ ] `unify_chunks`
- [ ] `infer_freq`
- [ ] `date_range`

I/O:
- [ ] `load_dataarray`
- [ ] `load_dataset`
- [ ] `open_dataarray`
- [ ] `open_dataset`
- [ ] `open_mfdataset`

Contents:
- [ ] `DataArray.assign_attrs`, `Dataset.assign_attrs`
- [ ] `DataArray.expand_dims`, `Dataset.expand_dims`
- [ ] `DataArray.drop_duplicates`, `Dataset.drop_duplicates`
- [ ] `DataArray.drop_vars`, `Dataset.drop_vars`
- [ ] `Dataset.drop_dims`
- [ ] `DataArray.convert_calendar`, `Dataset.convert_calendar`
- [ ] `DataArray.set_coords`, `Dataset.set_coords`
- [ ] `DataArray.reset_coords`, `Dataset.reset_coords`

Comparisons:
- [ ] `DataArray.equals`, `Dataset.equals`
- [ ] `DataArray.identical`, `Dataset.identical`
- [ ] `DataArray.broadcast_equals`, `Dataset.broadcast_equals`

Dask:
- [ ] `DataArray.compute`, `Dataset.compute`
- [ ] `DataArray.chunk`, `Dataset.chunk`
- [ ] `DataArray.persist`, `Dataset.persist`

Missing values:
- [ ] `DataArray.bfill`, `Dataset.bfill`
- [ ] `DataArray.ffill`, `Dataset.ffill`
- [ ] `DataArray.fillna`, `Dataset.fillna`
- [ ] `DataArray.dropna`, `Dataset.dropna`

Indexing:
- [ ] `DataArray.loc` (no docstring at all - came up in https://github.com/pydata/xarray/discussions/7528#discussion-4858556)
- [ ] `DataArray.drop_isel`
- [ ] `DataArray.drop_sel`
- [ ] `DataArray.head`, `Dataset.head`
- [ ] `DataArray.tail`, `Dataset.tail`
- [ ] `DataArray.interp_like`, `Dataset.interp_like`
- [ ] `DataArray.reindex_like`, `Dataset.reindex_like`
- [ ] `Dataset.isel`

Aggregations:
- [ ] `Dataset.argmax`
- [ ] `Dataset.argmin`
- [ ] `DataArray.cumsum`, `Dataset.cumsum` (intermediate to advanced)
- [ ] `DataArray.cumprod`, `Dataset.cumprod` (intermediate to advanced)
- [ ] `DataArray.reduce`, `Dataset.reduce`","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/6793/reactions"", ""total_count"": 4, ""+1"": 4, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,issue
1615259652,PR_kwDOAMm_X85LkhcR,7594,ignore the `pkg_resources` deprecation warning,14808389,closed,0,,,0,2023-03-08T13:18:10Z,2023-03-08T13:41:55Z,2023-03-08T13:41:54Z,MEMBER,,0,pydata/xarray/pulls/7594,"In one of the recent `setuptools` releases `pkg_resources` finally got deprecated in favor of the `importlib.*` modules. We don't use `pkg_resources` directly, so the `DeprecationWarning` that causes the doctest CI to fail is from a dependency. As such, the only way to get it to work again is to ignore the warning until the upstream packages have released versions that don't use/import `pkg_resources`.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7594/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
1534634670,PR_kwDOAMm_X85Hc1wx,7442,update the docs environment,14808389,closed,0,,,5,2023-01-16T09:58:43Z,2023-03-03T10:17:14Z,2023-03-03T10:14:13Z,MEMBER,,0,pydata/xarray/pulls/7442,"Most notably:
- bump `python` to `3.10`
- bump `sphinx` to at least `5.0`
- remove the `pydata-sphinx-theme` pin: `sphinx-book-theme` pins to an exact minor version, so pinning it ourselves does not change anything
- xref https://github.com/executablebooks/sphinx-book-theme/issues/686

~Edit: it seems this is blocked by `sphinx-book-theme` pinning `sphinx` to `>=3,<5`. They already changed the pin; we're just waiting on a release~","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7442/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
1556739109,PR_kwDOAMm_X85IhJib,7477,RTD maintenance,14808389,closed,0,,,0,2023-01-25T14:20:27Z,2023-01-25T14:58:50Z,2023-01-25T14:58:46Z,MEMBER,,0,pydata/xarray/pulls/7477,"Since the last time the RTD configuration was updated, a few things have changed:
- the OS image is now a bit old
- we can tell git (and thus `setuptools_scm`) to ignore changes by RTD

If I read the documentation / history of RTD correctly, they removed the `DONT_SHALLOW_CLONE` feature flag from the documentation (it is still enabled and works for this repository, though), so we might have to migrate to adding a `post_checkout` build job that does that soon.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7477/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
1556471464,PR_kwDOAMm_X85IgPzN,7476,fix the RTD build skipping feature,14808389,closed,0,,,0,2023-01-25T11:15:26Z,2023-01-25T11:18:00Z,2023-01-25T11:17:57Z,MEMBER,,0,pydata/xarray/pulls/7476,"We can't use the example `grep` options because we usually enclose tags with `[]`, which `grep` will interpret as character classes.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7476/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
1553155277,PR_kwDOAMm_X85IVH-_,7470,allow skipping RTD builds,14808389,closed,0,,,1,2023-01-23T14:02:30Z,2023-01-24T16:09:54Z,2023-01-24T16:09:51Z,MEMBER,,0,pydata/xarray/pulls/7470,"RTD somewhat recently introduced the [build.jobs](https://docs.readthedocs.io/en/stable/config-file/v2.html#build-jobs) setting, which allows [skipping builds](https://docs.readthedocs.io/en/stable/build-customization.html#cancel-build-based-on-a-condition) (technically it's more of an ""automated cancel"" with a special error code instead of a skip) on user-defined conditions. We can use that to manually ""skip"" builds on changes that don't need a documentation build.

Edit: the only downside seems to be that the build is not actually marked as ""skipped""
Edit2: apparently that's a bug that's being worked on, see readthedocs/readthedocs.org#9807","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7470/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
1552997940,PR_kwDOAMm_X85IUl4t,7469,create separate environment files for `python=3.11`,14808389,closed,0,,,0,2023-01-23T12:17:08Z,2023-01-23T14:03:36Z,2023-01-23T13:03:13Z,MEMBER,,0,pydata/xarray/pulls/7469,"This builds on #7353, which found that `cdms2` and `numba` (and thus also `numbagg` and `sparse`) don't yet support `python=3.11`.

In order to still test that we support `python=3.11` but without dropping those dependencies from the other environments, this adds separate environment files, which should be removed once `cdms2` and `numba` support `python=3.11`.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7469/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
1519058102,PR_kwDOAMm_X85GoVw9,7415,install `numbagg` from `conda-forge`,14808389,closed,0,,,5,2023-01-04T14:17:44Z,2023-01-20T19:46:46Z,2023-01-20T19:46:43Z,MEMBER,,0,pydata/xarray/pulls/7415,"It seems there is a `numbagg` package on `conda-forge` now.

Not sure what to do about the `min-all-deps` CI, but given that the most recent release of `numbagg` was more than 12 months ago (more than 18 months, even), maybe we can just bump it to the version on `conda-forge`?","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7415/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
1519154372,PR_kwDOAMm_X85Goq2J,7416,remove `numbagg` and `numba` from the upstream-dev CI,14808389,closed,0,,,3,2023-01-04T15:18:51Z,2023-01-04T20:07:33Z,2023-01-04T20:07:29Z,MEMBER,,0,pydata/xarray/pulls/7416,"- [x] opposite of #7311
- [x] closes #7306

Using the `numpy` HEAD together with `numba` is, in general, not supported (see numba/numba#8615). So in order to avoid the current failures and still be able to test `numpy` HEAD, this (temporarily) removes `numbagg` and `numba` (pulled in through `numbagg`) from the upstream-dev CI.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7416/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
760574919,MDU6SXNzdWU3NjA1NzQ5MTk=,4670,increase the visibility of the upstream-dev PR CI,14808389,closed,0,,,3,2020-12-09T18:37:57Z,2022-12-29T21:15:05Z,2021-01-19T15:27:26Z,MEMBER,,,,"We currently have two upstream-dev PR CIs: the old pipelines CI and the new github actions CI added together with the scheduled upstream-dev (""nightly"") CI. Since we don't need both, I think we should disable one of these, presumably the old pipelines CI.

There's an issue with the CI result, though: since github doesn't have an [icon for ""passed with issues""](https://github.com/actions/runner/issues/2347), we have to choose between ""passed"" and ""failed"" as the CI result (neither of which is optimal).

The advantage of using ""failed"" is that a failure is easily visible, but often the total CI result on PRs is set to ""failed"" because we didn't get around to fixing bugs introduced by recent changes to dependencies (which is confusing for contributors).

In #4584 I switched the pipelines upstream-dev CI to ""allowed failure"" so we get a warning instead of a failure. However, github doesn't print the number of warnings in the summary line, which means that if the icon is green nobody checks the status and upstream-dev CI failures are easily missed.

Our new scheduled nightly CI improves the situation quite a bit since we automatically get an issue containing the failures, but that means we aren't able to catch these failures before actually merging. As pointed out in https://github.com/pydata/xarray/issues/4574#issuecomment-725795622 that might be acceptable, though.

If we still want to fix this, we could have the PR CI automatically add a comment to the PR, which would contain a summary of the failures but also state that these failures can be ignored as long as they get the approval of a maintainer. This would increase the noise on PRs, though.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/4670/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue
1440354343,PR_kwDOAMm_X85Cbqgn,7267,`keep_attrs` for pad,14808389,closed,0,,,6,2022-11-08T14:55:05Z,2022-12-12T15:59:46Z,2022-12-12T15:59:42Z,MEMBER,,0,pydata/xarray/pulls/7267,"I ran into this while trying `DataTree.pad`, which silently dropped the attrs, even with `keep_attrs=True`.
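
A minimal sketch of the behavior this fixes (illustrative, not one of the added tests):
```python
import xarray as xr

arr = xr.DataArray([1, 2, 3], dims=""x"", attrs={""units"": ""m""})
padded = arr.pad(x=1, keep_attrs=True)
print(padded.attrs)  # with this fix: {'units': 'm'} instead of {}
```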

- [x] Tests added
- [x] User visible changes (including notable bug fixes) are documented in `whats-new.rst`","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7267/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
1440486521,PR_kwDOAMm_X85CcHQ5,7269,pin mypy to a known good version,14808389,closed,0,,,0,2022-11-08T16:06:47Z,2022-11-08T16:49:16Z,2022-11-08T16:49:13Z,MEMBER,,0,pydata/xarray/pulls/7269,"The type checking CI has started to fail with the new upgrade. In order not to disturb unrelated PRs, this pins `mypy` to a known good version (`mypy<0.990`).","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7269/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
1438416173,PR_kwDOAMm_X85CVEkR,7260,use the moving release tag of `issue-from-pytest-log`,14808389,closed,0,,,0,2022-11-07T14:01:35Z,2022-11-07T14:40:28Z,2022-11-07T14:32:02Z,MEMBER,,0,pydata/xarray/pulls/7260,"Like most github actions (or at least all the official `actions/*` ones), `issue-from-pytest-log` maintains a moving release tag that points to the most recently released version.

In order to decrease the amount of updating PRs, this makes use of that tag.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7260/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
1429769840,PR_kwDOAMm_X85B4VCq,7242,fix the environment setup: actually use the python version,14808389,closed,0,,,0,2022-10-31T12:30:37Z,2022-10-31T13:24:36Z,2022-10-31T13:16:46Z,MEMBER,,0,pydata/xarray/pulls/7242,"While working on #7241, I realized that I forgot to specify the python version in our main CI, with the effect that we've been testing python 3.10 twice for each OS since the introduction of the `micromamba` action.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7242/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
1422482990,PR_kwDOAMm_X85BgEYZ,7212,use the new action to create an issue from the output of reportlog,14808389,closed,0,,,0,2022-10-25T13:37:42Z,2022-10-26T09:34:09Z,2022-10-26T09:12:42Z,MEMBER,,0,pydata/xarray/pulls/7212,"I'm not sure whether we actually need the dedicated ""report"" job, or whether adding an additional step to the main CI job would suffice.

I can see two reasons for a separate job:
1. we get to control the python version independently from the version of the main job
2. we upload the reportlog files as artifacts

I think if we can change the action to abort when it is run on a python version it does not support, the first concern would not matter anymore; and for 2 we might just keep the ""upload artifact"" action (but is it even possible to manually access artifacts? If not, we might not even need the upload).

Since I don't think either is a major concern, I went ahead and joined the jobs.


- [x] Closes #6810","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7212/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
1422502185,PR_kwDOAMm_X85BgIjA,7213,use the moving release tag of ci-trigger,14808389,closed,0,,,0,2022-10-25T13:50:13Z,2022-10-25T14:29:54Z,2022-10-25T14:29:51Z,MEMBER,,0,pydata/xarray/pulls/7213,"In order to follow the minor releases of `ci-trigger`, we can use the new `v1` release tag that will always point to the most recent release.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7213/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
1413425793,PR_kwDOAMm_X85BBvaI,7185,indexes section in the HTML repr,14808389,closed,0,,,4,2022-10-18T15:25:34Z,2022-10-20T06:59:05Z,2022-10-19T21:12:46Z,MEMBER,,0,pydata/xarray/pulls/7185,"To see the effect, try this:

<details>

```python
import xarray as xr
from xarray.core.indexes import Index


class CustomIndex(Index):
    def __init__(self, names, options):
        self.names = names
        self.options = options
    
    @classmethod
    def from_variables(cls, variables, options):
        names = list(variables.keys())
        
        return cls(names, options)
    
    def __repr__(self):
        options = (
            {""names"": repr(self.names)}
            | {str(k): str(v) for k, v in self.options.items()}
        )
        
        return f""CustomIndex({', '.join(k + '=' + v for k, v in options.items())})""

    def _repr_html_(self):
        header_row = ""<tr><td>KDTree params</td></tr>""
        option_rows = [
            f""<tr><td>{option}</td><td>{value}</td></tr>""
            for option, value in self.options.items()
        ]
        return f""<left><table>{header_row}{''.join(option_rows)}</table></left>""

ds = xr.tutorial.open_dataset(""rasm"")
ds1 = ds.set_xindex([""xc"", ""yc""], CustomIndex, param1=""a"", param2=""b"")

with xr.set_options(display_style=""text""):
    display(ds1)

with xr.set_options(display_style=""html""):
    display(ds1)
```

</details>

~The repr looks a bit strange because I've been borrowing the variable CSS classes.~ Edit: @benbovy fixed that for me

Also, the discussion about what `_repr_inline_` should include from #7183 is relevant here as well.

- [x] Follow-up to #6795, depends on #7183
- [x] Tests added
- [x] User visible changes (including notable bug fixes) are documented in `whats-new.rst`","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7185/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
1412926287,PR_kwDOAMm_X85BADV9,7183,use `_repr_inline_` for indexes that define it,14808389,closed,0,,,6,2022-10-18T10:00:47Z,2022-10-19T14:06:51Z,2022-10-19T14:06:47Z,MEMBER,,0,pydata/xarray/pulls/7183,"Also, some tests for the index summarizer.

- [x] Follow-up to #6795
- [x] Tests added
- [ ] User visible changes (including notable bug fixes) are documented in `whats-new.rst`","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7183/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
1306887842,PR_kwDOAMm_X847g7WQ,6795,display the indexes in the string reprs,14808389,closed,0,,,7,2022-07-16T19:42:19Z,2022-10-15T18:28:36Z,2022-10-12T16:52:53Z,MEMBER,,0,pydata/xarray/pulls/6795,"With the flexible indexes refactor, indexes have become much more important, which means we should include them in the reprs of `DataArray` and `Dataset` objects.

This is an initial attempt, covering only the string reprs, with a few unanswered questions:
- how do we format indexes? Do we delegate to their `__repr__` or some other method?
- should we skip `PandasIndex` and `PandasMultiIndex`?
- how do we present indexes that wrap multiple columns? At the moment, they are duplicated (see also the discussion in #6392)
- what do we do with the index marker in the coords repr?

(also, how do we best test this?)

- [ ] Tests added
- [ ] User visible changes (including notable bug fixes) are documented in `whats-new.rst`","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/6795/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
1404894283,PR_kwDOAMm_X85AlZGn,7153,use a hook to synchronize the versions of `black`,14808389,closed,0,,,5,2022-10-11T16:07:05Z,2022-10-12T08:00:10Z,2022-10-12T08:00:07Z,MEMBER,,0,pydata/xarray/pulls/7153,"We started to pin the version of `black` used in the environment of `blackdoc`, but the version becomes out of date pretty quickly. The new hook I'm adding here is still experimental, but pretty limited in what it can destroy (the `pre-commit` configuration), so for now we can just review any new autoupdate PRs from the `pre-commit-ci` a bit more thoroughly.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7153/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
818059250,MDExOlB1bGxSZXF1ZXN0NTgxNDIzNTIx,4972,Automatic duck array testing - reductions,14808389,open,0,,,23,2021-02-27T23:57:23Z,2022-08-16T13:47:05Z,,MEMBER,,1,pydata/xarray/pulls/4972,"This is the first of a series of PRs to add a framework to make testing the integration of duck arrays as simple as possible. It uses `hypothesis` for increased coverage and maintainability.
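
As a rough sketch of the kind of property test this enables (hypothetical, using a plain `numpy` array as the ""duck array""):
```python
import hypothesis.strategies as st
import numpy as np
import xarray as xr
from hypothesis import given


@given(st.lists(st.floats(min_value=-1e6, max_value=1e6), min_size=1))
def test_sum_matches_duck_array(data):
    # a reduction on the wrapped array should agree with reducing the
    # duck array directly
    duck = np.array(data)
    arr = xr.DataArray(duck, dims=""x"")
    np.testing.assert_allclose(arr.sum().item(), duck.sum())
```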

- [x] Tests added
- [x] Passes `pre-commit run --all-files`
- [ ] User visible changes (including notable bug fixes) are documented in `whats-new.rst`
- [ ] New functions/methods are listed in `api.rst`
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/4972/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
1327082521,PR_kwDOAMm_X848kdpQ,6873,rely on `numpy`'s version of `nanprod` and `nansum`,14808389,closed,0,,,1,2022-08-03T11:33:35Z,2022-08-09T17:31:27Z,2022-08-09T14:55:21Z,MEMBER,,0,pydata/xarray/pulls/6873,"At the moment, `nanprod` and `nansum` will replace any `nan` values with `1` for `nanprod` or `0` for `nansum`. For `pint`, inserting dimensionless values into quantities is explicitly not allowed, so our version of `nanprod` cannot be used on `pint` quantities (`0` is an exception, so `nansum` does work).
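
A rough illustration of the mechanism (`naive_nanprod` is a hypothetical helper, not the actual implementation):
```python
import numpy as np


def naive_nanprod(a):
    # fine for plain numpy arrays, but fails for pint quantities because
    # the bare ``1`` is dimensionless while ``a`` may have units
    return np.where(np.isnan(a), 1, a).prod()
```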

I'm proposing to rely on `numpy`'s version for that, which would allow `pint` to customize the behavior.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/6873/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
1326997649,PR_kwDOAMm_X848kLEu,6872,skip creating a cupy-backed IndexVariable,14808389,closed,0,,,0,2022-08-03T10:21:06Z,2022-08-03T15:34:18Z,2022-08-03T15:04:56Z,MEMBER,,0,pydata/xarray/pulls/6872,"We could probably replace the default indexes with `cudf` indexes, but with `pandas` indexes this test doesn't make too much sense.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/6872/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
532696790,MDU6SXNzdWU1MzI2OTY3OTA=,3594,support for units with pint,14808389,open,0,,,7,2019-12-04T13:49:28Z,2022-08-03T11:44:05Z,,MEMBER,,,,"`pint`'s implementation of NEP-18 (see hgrecco/pint#905) is close enough that we can finally start working on the `pint` support (i.e. make the integration tests pass). This would be the list of tasks to get there:
* integration tests:
  - [x] implement integration tests for `DataArray`, `Dataset` and top-level functions (#3238, #3447, #3493)
  - [x] add tests for `Variable` as discussed in #3493 (#3654)
  - [x] clean up the current tests (#3600)
  - [x] use the standard `assert_identical` and `assert_allclose` functions (#3611, #3643, #3654, #3706, #3975)
  - [x] clean up the `TestVariable.test_pad` tests
* actually get xarray to support units:
  - [x] top-level functions (#3611)
  - [x] `Variable` (#3706)
    + `rolling_window` and `identical` need larger modifications
  - [x] `DataArray` (#3643)
  - [x] `Dataset`
  - [x] silence all the `UnitStrippedWarnings` in the testsuite (#4163)
  - [ ] try to get `nanprod` to work with quantities
  - [x] add support for per variable fill values (#4165)
  - [x] `repr` with units (#2773)
  - [ ] type hierarchy (e.g. for `np.maximum(data_array, quantity)` vs `np.maximum(quantity, data_array)`) (#3950)
* update the documentation
   - [x] point to [pint-xarray](https://github.com/xarray-contrib/pint-xarray) (see #4530)
   - [x] mention the requirement for `UnitRegistry(force_ndarray=True)` or `UnitRegistry(force_ndarray_like=True)` (see https://pint-xarray.readthedocs.io/en/stable/creation.html#attaching-units)
   - [x] list the known issues (see https://github.com/pydata/xarray/pull/3643#issue-354872657 and https://github.com/pydata/xarray/pull/3643#issuecomment-602225731) (#4530):
       + `pandas` (indexing)
       + `bottleneck` (`bfill`, `ffill`)
       + `scipy` (`interp`)
       + `numbagg` (`rolling_exp`)
       + `numpy.lib.stride_tricks.as_strided`: `rolling`
       + `numpy.vectorize`: `interpolate_na`
   - [x] ~update the install instructions (we can use standard `conda` / `pip` now)~ this should be done by `pint-xarray`","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/3594/reactions"", ""total_count"": 14, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 14, ""rocket"": 0, ""eyes"": 0}",,,13221727,issue
1250755592,I_kwDOAMm_X85KjQQI,6645,pre-release for v2022.06.0,14808389,closed,0,,,14,2022-05-27T13:14:06Z,2022-07-22T15:44:59Z,2022-07-22T15:44:59Z,MEMBER,,,,"There are a few unreleased and potentially breaking changes in `main`, most importantly the index refactor and the new `groupby` using `numpy-groupies` and `flox`. During the meeting on Wednesday we decided to release a preview version to get feedback before releasing a full version, especially from those who don't run their tests against our `main` branch.

I am planning to create the pre-release tomorrow, but if there are any big changes that should be included, please post here.

cc @TomNicholas

Edit: the version will be called `2022.05.0.dev0`, which will ensure that e.g. `pip` will require the `--pre` flag to install it.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/6645/reactions"", ""total_count"": 5, ""+1"": 5, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue
818957820,MDU6SXNzdWU4MTg5NTc4MjA=,4976,reported version in the docs is misleading,14808389,closed,0,,,3,2021-03-01T15:08:12Z,2022-07-10T13:00:46Z,2022-07-10T13:00:46Z,MEMBER,,,,"The [editable install](https://readthedocs.org/projects/xray/builds/13110398/) on RTD is reported to have the version `0.17.1.dev0+g835a53e6.d20210226` (which technically is correct but it would be better to have a clean version on tags).

This is not something I can reproduce with `python -m pip install -e .`, so it is either some RTD weirdness or happens because we get `mamba` to do the editable install for us.

We should try to get the version right.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/4976/reactions"", ""total_count"": 3, ""+1"": 3, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue
1284543780,PR_kwDOAMm_X846Wk8u,6727,resolve the timeouts on RTD,14808389,closed,0,,,1,2022-06-25T10:38:36Z,2022-06-30T01:00:21Z,2022-06-25T11:00:50Z,MEMBER,,0,pydata/xarray/pulls/6727,"The reason the changes introduced in #6542 caused timeouts is that they redefined `ds` to a bigger dataset, which would then be used to demonstrate `to_dict`.

The fix is to explicitly set the dataset before calling `to_dict`, which also makes that section a bit easier to follow.

- [x] Closes #6720","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/6727/reactions"", ""total_count"": 4, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 1, ""confused"": 0, ""heart"": 3, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
1280027449,I_kwDOAMm_X85MS6s5,6714,`mypy` CI is failing on `main`,14808389,closed,0,,,1,2022-06-22T11:54:16Z,2022-06-22T16:01:45Z,2022-06-22T16:01:45Z,MEMBER,,,,"The most recent runs of the `mypy` CI on `main` are failing with:
```
xarray/core/dataset.py:6934: error: Incompatible types in assignment (expression has type ""str"", variable has type ""Optional[Optional[Literal['Y', 'M', 'W', 'D', 'h', 'm', 's', 'ms', 'us', 'ns', 'ps', 'fs', 'as']]]"")  [assignment]
```
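
The error boils down to assigning a plain `str` where a `Literal` union is expected; a hypothetical minimal reproducer:
```python
from typing import Literal, Optional

Unit = Optional[Literal[""Y"", ""M"", ""W"", ""D"", ""h"", ""m"", ""s"", ""ms"", ""us"", ""ns""]]

unit: Unit = None
value: str = ""s""
unit = value  # mypy rejects this: ``str`` is wider than the Literal union
```
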
I'm not sure what changed since the last pass, though.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/6714/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue
1274838660,PR_kwDOAMm_X8451-Gp,6701,try to import `UndefinedVariableError` from the new location,14808389,closed,0,,,1,2022-06-17T10:05:18Z,2022-06-22T10:33:28Z,2022-06-22T10:33:25Z,MEMBER,,0,pydata/xarray/pulls/6701,- [x] Closes #6698,"{""url"": ""https://api.github.com/repos/pydata/xarray/issues/6701/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
1271869460,PR_kwDOAMm_X845sEYR,6699,use `pytest-reportlog` to generate upstream-dev CI failure reports,14808389,closed,0,,,0,2022-06-15T08:31:00Z,2022-06-16T08:12:28Z,2022-06-16T08:11:22Z,MEMBER,,0,pydata/xarray/pulls/6699,"We currently use the console output of `pytest` to generate our failure reports, which is both fragile and does not detect import errors during test collection.

Instead, we can use `pytest-reportlog` to generate a machine-readable file that is easy to parse (`junit` XML would probably work, too, but apparently does not contain all the different failure modes of `pytest`).
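
For illustration, consuming such a file could be as simple as this sketch (the file name is hypothetical; the keys follow the `pytest-reportlog` format):
```python
import json

with open(""pytest-log.jsonl"") as f:
    failures = [
        entry[""nodeid""]
        for entry in map(json.loads, f)
        if entry.get(""$report_type"") == ""TestReport""
        and entry.get(""when"") == ""call""
        and entry.get(""outcome"") == ""failed""
    ]
```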

The new script will collect failure summaries like the old version, but it should be fairly easy to create a fancy report with more information.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/6699/reactions"", ""total_count"": 3, ""+1"": 3, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
1264767977,PR_kwDOAMm_X845Uh71,6674,use micromamba instead of mamba,14808389,closed,0,,,3,2022-06-08T13:38:15Z,2022-06-10T11:33:05Z,2022-06-10T11:33:00Z,MEMBER,,0,pydata/xarray/pulls/6674,"I'm not sure if this is exactly equal to what we had before, but we might be able to save 3-4 minutes of CI time with this.

- [x] supersedes and closes #6544","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/6674/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
1264868027,PR_kwDOAMm_X845U3bZ,6675,install the development version of `matplotlib` into the upstream-dev CI,14808389,closed,0,,,1,2022-06-08T14:42:15Z,2022-06-10T11:25:33Z,2022-06-10T11:25:31Z,MEMBER,,0,pydata/xarray/pulls/6675,- [x] follow-up to #4947,"{""url"": ""https://api.github.com/repos/pydata/xarray/issues/6675/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
597566530,MDExOlB1bGxSZXF1ZXN0NDAxNjU2MTc1,3960,examples for special methods on accessors,14808389,open,0,,,6,2020-04-09T21:34:30Z,2022-06-09T14:50:17Z,,MEMBER,,0,pydata/xarray/pulls/3960,"This starts adding the parametrized accessor examples from #3829 to the accessor documentation as suggested by @jhamman. Since then the `weighted` methods have been added, though, so I'd like to use a different example instead (ideas welcome).

Also, this feature can be abused to add functions to the main `DataArray` / `Dataset` namespace (by registering a function with the `register_*_accessor` decorators, see the second example). Is this something we want to explicitly discourage?
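
A hypothetical illustration of that pattern:
```python
import xarray as xr


@xr.register_dataarray_accessor(""center"")
def center(da):
    # registering a plain function instead of a class means attribute
    # access calls the function with the object and caches the result
    return da - da.mean()


arr = xr.DataArray([1.0, 2.0, 3.0], dims=""x"")
print(arr.center)  # a new, centered DataArray
```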

(~When trying to build the docs locally, sphinx keeps complaining about a code block without code. Not sure what that is about~ seems the `ipython` directive does not allow more than one expression, so I used `code` instead)

 - [x] Closes #3829
 - [x] Passes `isort -rc . && black . && mypy . && flake8`
 - [x] Fully documented, including `whats-new.rst` for all changes and `api.rst` for new API
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/3960/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
801728730,MDExOlB1bGxSZXF1ZXN0NTY3OTkzOTI3,4863,apply to dataset,14808389,open,0,,,14,2021-02-05T00:05:22Z,2022-06-09T14:50:17Z,,MEMBER,,0,pydata/xarray/pulls/4863,"as discussed in #4837, this adds a method that applies a function to a `DataArray` by first converting it to a temporary dataset using `_to_temp_dataset`, applying the function, and converting it back. I'm not really happy with the name but I can't find a better one.

This function is really similar to `pipe`, so I guess a keyword argument to pipe would work, too. The disadvantage of that is that `pipe` passes all kwargs to the passed function, which means we would shadow a specific kwarg.
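
A rough sketch of that round trip (standalone illustration; the PR implements it as a method, and `_to_temp_dataset` / `_from_temp_dataset` are private helpers):
```python
def apply_to_dataset(da, func, *args, **kwargs):
    # wrap the DataArray in a single-variable Dataset, apply the
    # Dataset-level function, then convert the result back
    ds = da._to_temp_dataset()
    result = func(ds, *args, **kwargs)
    return da._from_temp_dataset(result)
```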

- [x] Closes #4837
- [x] Tests added
- [x] Passes `pre-commit run --all-files`
- [x] User visible changes (including notable bug fixes) are documented in `whats-new.rst`
- [x] New functions/methods are listed in `api.rst`
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/4863/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
959063390,MDExOlB1bGxSZXF1ZXN0NzAyMjM0ODc1,5668,create the context objects passed to custom `combine_attrs` functions,14808389,open,0,,,1,2021-08-03T12:24:50Z,2022-06-09T14:50:16Z,,MEMBER,,0,pydata/xarray/pulls/5668,"Follow-up to #4896: this creates the context object in reduce methods and passes it to `merge_attrs`, with more planned.

- [ ] might help with xarray-contrib/cf-xarray#228
- [ ] Tests added
- [x] Passes `pre-commit run --all-files`
- [ ] User visible changes (including notable bug fixes) are documented in `whats-new.rst`
- [ ] New functions/methods are listed in `api.rst`

Note that for now this is a bit inconvenient to use for provenance tracking (as discussed in the `cf-xarray` issue) because functions implementing that would still have to deal with merging the `attrs`.
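
As a hypothetical illustration, a provenance-tracking `combine_attrs` callable receiving that context might look like this:
```python
import xarray as xr


def track_provenance(attrs_list, context):
    # the function still has to merge the attrs itself, as noted above
    merged = dict(attrs_list[0]) if attrs_list else {}
    merged[""history""] = f""combined {len(attrs_list)} objects""
    return merged


merged = xr.merge(
    [xr.Dataset(attrs={""source"": ""a""}), xr.Dataset(attrs={""source"": ""b""})],
    combine_attrs=track_provenance,
)
```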
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/5668/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
1264971669,PR_kwDOAMm_X845VNfZ,6676,release notes for the pre-release,14808389,closed,0,,,2,2022-06-08T15:59:28Z,2022-06-09T14:41:34Z,2022-06-09T14:41:32Z,MEMBER,,0,pydata/xarray/pulls/6676,"The only thing it contains so far is the known regression; do we want to have a summary of the most notable changes, like in the previous releases?

- [x] towards #6645

cc @pydata/xarray","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/6676/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
1265366275,I_kwDOAMm_X85La_UD,6678,exception groups,14808389,open,0,,,1,2022-06-08T22:09:37Z,2022-06-08T23:38:28Z,,MEMBER,,,,"### What is your issue?

As I mentioned in the meeting today, we have a lot of features where the exception group support from [PEP654](https://peps.python.org/pep-0654/) (which is scheduled for python 3.11 and consists of the class and a syntax change) might be useful. For example, we might want to collect all errors raised by `rename` in an exception group instead of raising them one-by-one.
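
A hypothetical sketch of what that could look like (`names_to_rename` and `existing_names` are stand-ins):
```python
names_to_rename = [""a"", ""b""]
existing_names = {""a""}

errors = [
    ValueError(f""cannot rename {name!r}: not present in this dataset"")
    for name in names_to_rename
    if name not in existing_names
]
if errors:
    # ``ExceptionGroup`` is builtin on python 3.11; the backport provides
    # it for older versions
    raise ExceptionGroup(""rename failed"", errors)
```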

For `python<=3.10` there's a [backport](https://github.com/agronholm/exceptiongroup) that contains the class and a workaround for the new syntax.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/6678/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,issue
1264494714,PR_kwDOAMm_X845TmsM,6673,more testpypi workflow fixes,14808389,closed,0,,,0,2022-06-08T09:57:50Z,2022-06-08T13:59:29Z,2022-06-08T13:52:08Z,MEMBER,,0,pydata/xarray/pulls/6673,Hopefully the final follow-up to #6660.,"{""url"": ""https://api.github.com/repos/pydata/xarray/issues/6673/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
1262891004,PR_kwDOAMm_X845OOpR,6671,try to finally fix the TestPyPI workflow,14808389,closed,0,,,0,2022-06-07T08:04:32Z,2022-06-07T08:33:15Z,2022-06-07T08:33:01Z,MEMBER,,0,pydata/xarray/pulls/6671,"As per https://github.com/pydata/xarray/issues/6659#issuecomment-1148285401, don't invoke `git restore`.

- [x] Closes #6659","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/6671/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
1262480325,PR_kwDOAMm_X845M3wk,6669,pin setuptools in the testpypi configure script,14808389,closed,0,,,1,2022-06-06T22:04:12Z,2022-06-06T22:38:48Z,2022-06-06T22:33:16Z,MEMBER,,0,pydata/xarray/pulls/6669,"This works around an incompatibility between `setuptools>=60` and `setuptools_scm`.

- [x] Closes #6659","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/6669/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
1262396039,PR_kwDOAMm_X845MlHs,6668,fix the python version for the TestPyPI release workflow,14808389,closed,0,,,0,2022-06-06T20:42:51Z,2022-06-06T21:16:01Z,2022-06-06T21:12:23Z,MEMBER,,0,pydata/xarray/pulls/6668,follow-up to #6660,"{""url"": ""https://api.github.com/repos/pydata/xarray/issues/6668/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
1259827097,PR_kwDOAMm_X845EJs5,6660,upload wheels from `main` to TestPyPI,14808389,closed,0,,,4,2022-06-03T12:00:02Z,2022-06-06T19:49:08Z,2022-06-06T19:49:02Z,MEMBER,,0,pydata/xarray/pulls/6660,"This adds a new workflow that uploads a new wheel to TestPyPI for every commit on `main`. No tests, though, so those wheels might be broken (but that's fine, I guess).

Should we document this somewhere, like the install guide?

- [x] Closes #6659
- [x] User visible changes (including notable bug fixes) are documented in `whats-new.rst`","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/6660/reactions"", ""total_count"": 4, ""+1"": 4, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
624778130,MDU6SXNzdWU2MjQ3NzgxMzA=,4095,merging non-dimension coordinates with the Dataset constructor,14808389,open,0,,,1,2020-05-26T10:30:37Z,2022-04-19T13:54:43Z,,MEMBER,,,,"When adding two `DataArray` objects with different coordinates to a `Dataset`, a `MergeError` is raised even though one of the conflicting coords is a subset of the other. Merging dimension coordinates works, so I'd expect associated non-dimension coordinates to work, too.

This fails:
```python
In [1]: import xarray as xr
   ...: import numpy as np

In [2]: a = np.linspace(0, 1, 10)
   ...: b = np.linspace(-1, 0, 12)
   ...:
   ...: x_a = np.arange(10)
   ...: x_b = np.arange(12)
   ...:
   ...: y_a = x_a * 1000
   ...: y_b = x_b * 1000
   ...:
   ...: arr1 = xr.DataArray(data=a, coords={""x"": x_a, ""y"": (""x"", y_a)}, dims=""x"")
   ...: arr2 = xr.DataArray(data=b, coords={""x"": x_b, ""y"": (""x"", y_b)}, dims=""x"")
   ...:
   ...: xr.Dataset({""a"": arr1, ""b"": arr2})
...
MergeError: conflicting values for variable 'y' on objects to be combined. You can skip this check by specifying compat='override'.
```
While this works:
```python
In [3]: a = np.linspace(0, 1, 10)
   ...: b = np.linspace(-1, 0, 12)
   ...: 
   ...: x_a = np.arange(10)
   ...: x_b = np.arange(12)
   ...: 
   ...: y_a = x_a * 1000
   ...: y_b = x_b * 1000
   ...:
   ...: xr.Dataset({
   ...:     ""a"": xr.DataArray(data=a, coords={""x"": x_a}, dims=""x""),
   ...:     ""b"": xr.DataArray(data=b, coords={""x"": x_b}, dims=""x""),
   ...: })
Out[3]:
<xarray.Dataset>
Dimensions:  (x: 12)
Coordinates:
  * x        (x) int64 0 1 2 3 4 5 6 7 8 9 10 11
Data variables:
    a        (x) float64 0.0 0.1111 0.2222 0.3333 0.4444 ... 0.8889 1.0 nan nan
    b        (x) float64 -1.0 -0.9091 -0.8182 -0.7273 ... -0.1818 -0.09091 0.0
```

I can work around this by calling:
```python
In [4]: xr.merge([arr1.rename(""a"").to_dataset(), arr2.rename(""b"").to_dataset()])
Out[4]: 
<xarray.Dataset>
Dimensions:  (x: 12)
Coordinates:
  * x        (x) int64 0 1 2 3 4 5 6 7 8 9 10 11
    y        (x) float64 0.0 1e+03 2e+03 3e+03 ... 8e+03 9e+03 1e+04 1.1e+04
Data variables:
    a        (x) float64 0.0 0.1111 0.2222 0.3333 0.4444 ... 0.8889 1.0 nan nan
    b        (x) float64 -1.0 -0.9091 -0.8182 -0.7273 ... -0.1818 -0.09091 0.0
```
but I think the `Dataset` constructor should be capable of that, too.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/4095/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,issue
628436420,MDU6SXNzdWU2Mjg0MzY0MjA=,4116,xarray ufuncs,14808389,closed,0,,,5,2020-06-01T13:25:54Z,2022-04-19T03:26:53Z,2022-04-19T03:26:53Z,MEMBER,,,,"The documentation warns that the universal functions in `xarray.ufuncs` should not be used unless compatibility with `numpy < 1.13` is required.

Since we only support `numpy >= 1.15`: is it time to remove that (already deprecated) module?
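
For reference, with `__array_ufunc__` in place the plain `numpy` functions already dispatch correctly, so there is no need for the wrappers:
```python
import numpy as np
import xarray as xr

arr = xr.DataArray([0.0, 1.0], dims=""x"")
result = np.sin(arr)  # returns a DataArray, no xarray.ufuncs needed
```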

Since there are also functions that are not true ufuncs (e.g. `np.angle` and `np.median`) and need `__array_function__` (or something similar, see #3917), we could also keep those and just remove the ones that are dispatched using `__array_ufunc__`.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/4116/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue
1183366218,PR_kwDOAMm_X841Jvka,6419,unpin `jinja2`,14808389,closed,0,,,0,2022-03-28T12:27:48Z,2022-03-30T13:54:51Z,2022-03-30T13:54:50Z,MEMBER,,0,pydata/xarray/pulls/6419,"`nbconvert` released a fixed version today, so we can remove the pin of `jinja2`.

- [x] Follow-up to #6415","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/6419/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
1181704575,PR_kwDOAMm_X841EMNH,6414,use the `DaskIndexingAdapter` for `duck dask` arrays,14808389,closed,0,,,2,2022-03-26T12:00:34Z,2022-03-27T20:38:43Z,2022-03-27T20:38:40Z,MEMBER,,0,pydata/xarray/pulls/6414,"(detected while trying to implement a `PintMetaIndex` in xarray-contrib/pint-xarray#163)

This fixes position-based indexing of `duck dask arrays`:
``` python

In [1]: import xarray as xr
   ...: import dask.array as da
   ...: import pint
   ...: 
   ...: ureg = pint.UnitRegistry(force_ndarray_like=True)
   ...: 
   ...: a = da.zeros((20, 20), chunks=(10, 10))
   ...: q = ureg.Quantity(a, ""m"")
   ...: 
   ...: arr1 = xr.DataArray(a, dims=(""x"", ""y""))
   ...: arr2 = xr.DataArray(q, dims=(""x"", ""y""))

In [2]: arr1.isel(x=[0, 2, 4], y=[1, 3, 5])
Out[2]: 
<xarray.DataArray 'zeros_like-d81259c3a77e6dff3e60975e2afe4ff9' (x: 3, y: 3)>
dask.array<getitem, shape=(3, 3), dtype=float64, chunksize=(3, 3), chunktype=numpy.ndarray>
Dimensions without coordinates: x, y

In [3]: arr2.isel(x=[0, 2, 4], y=[1, 3, 5])
---------------------------------------------------------------------------
NotImplementedError                       Traceback (most recent call last)
Input In [3], in <module>
----> 1 arr2.isel(x=[0, 2, 4], y=[1, 3, 5])

File .../xarray/core/dataarray.py:1220, in DataArray.isel(self, indexers, drop, missing_dims, **indexers_kwargs)
   1215     return self._from_temp_dataset(ds)
   1217 # Much faster algorithm for when all indexers are ints, slices, one-dimensional
   1218 # lists, or zero or one-dimensional np.ndarray's
-> 1220 variable = self._variable.isel(indexers, missing_dims=missing_dims)
   1221 indexes, index_variables = isel_indexes(self.xindexes, indexers)
   1223 coords = {}

File .../xarray/core/variable.py:1172, in Variable.isel(self, indexers, missing_dims, **indexers_kwargs)
   1169 indexers = drop_dims_from_indexers(indexers, self.dims, missing_dims)
   1171 key = tuple(indexers.get(dim, slice(None)) for dim in self.dims)
-> 1172 return self[key]

File .../xarray/core/variable.py:765, in Variable.__getitem__(self, key)
    752 """"""Return a new Variable object whose contents are consistent with
    753 getting the provided key from the underlying data.
    754 
   (...)
    762 array `x.values` directly.
    763 """"""
    764 dims, indexer, new_order = self._broadcast_indexes(key)
--> 765 data = as_indexable(self._data)[indexer]
    766 if new_order:
    767     data = np.moveaxis(data, range(len(new_order)), new_order)

File .../xarray/core/indexing.py:1269, in NumpyIndexingAdapter.__getitem__(self, key)
   1267 def __getitem__(self, key):
   1268     array, key = self._indexing_array_and_key(key)
-> 1269     return array[key]

File .../lib/python3.9/site-packages/pint/quantity.py:1899, in Quantity.__getitem__(self, key)
   1897 def __getitem__(self, key):
   1898     try:
-> 1899         return type(self)(self._magnitude[key], self._units)
   1900     except PintTypeError:
   1901         raise

File .../lib/python3.9/site-packages/dask/array/core.py:1892, in Array.__getitem__(self, index)
   1889     return self
   1891 out = ""getitem-"" + tokenize(self, index2)
-> 1892 dsk, chunks = slice_array(out, self.name, self.chunks, index2, self.itemsize)
   1894 graph = HighLevelGraph.from_collections(out, dsk, dependencies=[self])
   1896 meta = meta_from_array(self._meta, ndim=len(chunks))

File .../lib/python3.9/site-packages/dask/array/slicing.py:174, in slice_array(out_name, in_name, blockdims, index, itemsize)
    171 index += (slice(None, None, None),) * missing
    173 # Pass down to next function
--> 174 dsk_out, bd_out = slice_with_newaxes(out_name, in_name, blockdims, index, itemsize)
    176 bd_out = tuple(map(tuple, bd_out))
    177 return dsk_out, bd_out

File .../lib/python3.9/site-packages/dask/array/slicing.py:196, in slice_with_newaxes(out_name, in_name, blockdims, index, itemsize)
    193         where_none[i] -= n
    195 # Pass down and do work
--> 196 dsk, blockdims2 = slice_wrap_lists(out_name, in_name, blockdims, index2, itemsize)
    198 if where_none:
    199     expand = expander(where_none)

File .../lib/python3.9/site-packages/dask/array/slicing.py:242, in slice_wrap_lists(out_name, in_name, blockdims, index, itemsize)
    238 where_list = [
    239     i for i, ind in enumerate(index) if is_arraylike(ind) and ind.ndim > 0
    240 ]
    241 if len(where_list) > 1:
--> 242     raise NotImplementedError(""Don't yet support nd fancy indexing"")
    243 # Is the single list an empty list? In this case just treat it as a zero
    244 # length slice
    245 if where_list and not index[where_list[0]].size:

NotImplementedError: Don't yet support nd fancy indexing
```
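
For reference, a hedged workaround sketch (not part of this PR's changes): with xarray's outer-indexing semantics, indexing one dimension per `isel` call selects the same subset, and each step only hands dask a single 1D integer indexer, which it does support:

```python
# workaround sketch: apply the orthogonal indexers one dimension at a time,
# so dask's __getitem__ never sees more than one array indexer per step
subset = arr2.isel(x=[0, 2, 4]).isel(y=[1, 3, 5])
```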

- [x] Tests added
- [x] User visible changes (including notable bug fixes) are documented in `whats-new.rst`","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/6414/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
1181715207,PR_kwDOAMm_X841EOjL,6415,upgrade `sphinx`,14808389,closed,0,,,2,2022-03-26T12:16:08Z,2022-03-26T22:13:53Z,2022-03-26T22:13:50Z,MEMBER,,0,pydata/xarray/pulls/6415,"`sphinx` is now close to releasing `4.5`, but we're still pinning it to `sphinx<4`.

Along with it, this updates our RTD configuration.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/6415/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
1174386963,I_kwDOAMm_X85F_7kT,6382,`reindex` drops attrs,14808389,closed,0,,,1,2022-03-19T22:37:46Z,2022-03-21T07:53:05Z,2022-03-21T07:53:04Z,MEMBER,,,,"### What happened?

`reindex` stopped propagating `attrs` (detected in xarray-contrib/pint-xarray#159).

As far as I can tell, the new reindexing code in `Aligner` does not yet handle `attrs`.

### Minimal Complete Verifiable Example

```Python
# before #5692
In [1]: import xarray as xr
   ...: 
   ...: xr.set_options(keep_attrs=True)
   ...: 
   ...: ds = xr.tutorial.open_dataset(""air_temperature"")

In [2]: ds.reindex(lat=range(10, 80, 5)).lat
Out[2]: 
<xarray.DataArray 'lat' (lat: 14)>
array([10, 15, 20, 25, 30, 35, 40, 45, 50, 55, 60, 65, 70, 75])
Coordinates:
  * lat      (lat) int64 10 15 20 25 30 35 40 45 50 55 60 65 70 75
Attributes:
    standard_name:  latitude
    long_name:      Latitude
    units:          degrees_north
    axis:           Y

In [3]: ds.reindex(lat=xr.DataArray(range(10, 80, 5), attrs={""attr"": ""value""}, dims=""lat"")).lat
Out[3]: 
<xarray.DataArray 'lat' (lat: 14)>
array([10, 15, 20, 25, 30, 35, 40, 45, 50, 55, 60, 65, 70, 75])
Coordinates:
  * lat      (lat) int64 10 15 20 25 30 35 40 45 50 55 60 65 70 75
Attributes:
    standard_name:  latitude
    long_name:      Latitude
    units:          degrees_north
    axis:           Y

# after #5692
In [3]: import xarray as xr
   ...: 
   ...: xr.set_options(keep_attrs=True)
   ...: 
   ...: ds = xr.tutorial.open_dataset(""air_temperature"")

In [4]: ds.reindex(lat=range(10, 80, 5)).lat
Out[4]: 
<xarray.DataArray 'lat' (lat: 14)>
array([10, 15, 20, 25, 30, 35, 40, 45, 50, 55, 60, 65, 70, 75])
Coordinates:
  * lat      (lat) int64 10 15 20 25 30 35 40 45 50 55 60 65 70 75

In [5]: ds.reindex(lat=xr.DataArray(range(10, 80, 5), attrs={""attr"": ""value""}, dims=""lat"")).lat
Out[5]: 
<xarray.DataArray 'lat' (lat: 14)>
array([10, 15, 20, 25, 30, 35, 40, 45, 50, 55, 60, 65, 70, 75])
Coordinates:
  * lat      (lat) int64 10 15 20 25 30 35 40 45 50 55 60 65 70 75
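
# hedged workaround until `Aligner` propagates attrs again: the original
# coordinate still carries its attrs, so copy them back over by hand
In [6]: reindexed = ds.reindex(lat=range(10, 80, 5))
   ...: reindexed[""lat""].attrs.update(ds[""lat""].attrs)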
```","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/6382/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue
1170003912,PR_kwDOAMm_X840etlq,6361,"Revert ""explicitly install `ipython_genutils`""",14808389,closed,0,,,1,2022-03-15T17:57:29Z,2022-03-15T19:06:32Z,2022-03-15T19:06:31Z,MEMBER,,0,pydata/xarray/pulls/6361,"Since the dependency issue has been fixed upstream, this reverts pydata/xarray#6350","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/6361/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
1166353506,PR_kwDOAMm_X840S8Yo,6350,explicitly install `ipython_genutils`,14808389,closed,0,,,2,2022-03-11T12:19:49Z,2022-03-15T17:56:41Z,2022-03-11T14:54:45Z,MEMBER,,0,pydata/xarray/pulls/6350,This can be reverted once the `nbconvert` package on `conda-forge` has been updated.,"{""url"": ""https://api.github.com/repos/pydata/xarray/issues/6350/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
1125877338,I_kwDOAMm_X85DG4Za,6250,failing docs builds because the `scipy` intersphinx registry is unreachable,14808389,closed,0,,,6,2022-02-07T11:44:44Z,2022-02-08T21:49:48Z,2022-02-08T21:49:47Z,MEMBER,,,,"### What happened?

`scipy` seems to be having some trouble with its documentation hosting setup, which means that trying to fetch its intersphinx inventory returns a 404.

There's nothing we can do to really fix this on our side, but we can try to avoid the docs build failures by disabling that intersphinx entry (not sure if that results in other errors, though).
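
A hedged sketch of what disabling the entry could look like in `doc/conf.py` (the surrounding entries here are illustrative; the real mapping may differ):

```python
# sketch: comment out the scipy entry in the intersphinx mapping so a 404
# from docs.scipy.org cannot fail the docs build
intersphinx_mapping = {
    ""python"": (""https://docs.python.org/3/"", None),
    ""numpy"": (""https://numpy.org/doc/stable"", None),
    # ""scipy"": (""https://docs.scipy.org/doc/scipy/"", None),  # temporarily disabled
}
```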

### What did you expect to happen?

_No response_

### Minimal Complete Verifiable Example

_No response_

### Relevant log output

```python
WARNING: failed to reach any of the inventories with the following issues:
intersphinx inventory 'https://docs.scipy.org/doc/scipy/objects.inv' not fetchable due to <class 'requests.exceptions.HTTPError'>: 404 Client Error: Not Found for url: https://docs.scipy.org/doc/scipy/objects.inv
```


### Anything else we need to know?

_No response_

### Environment

See RTD","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/6250/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue