id,node_id,number,title,user,state,locked,assignee,milestone,comments,created_at,updated_at,closed_at,author_association,active_lock_reason,draft,pull_request,body,reactions,performed_via_github_app,state_reason,repo,type
1462173557,I_kwDOAMm_X85XJv91,7316,Support for python 3.11,43316012,closed,0,,,2,2022-11-23T17:52:18Z,2024-03-15T06:07:26Z,2024-03-15T06:07:26Z,COLLABORATOR,,,,"### Is your feature request related to a problem?
Now that python 3.11 has been released, we should start to support it officially.
### Describe the solution you'd like
I guess the first step would be to replace python 3.10 as the maximum version in the tests and see what crashes (and hope we get lucky).
### Describe alternatives you've considered
_No response_
### Additional context
_No response_","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7316/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue
2034528244,I_kwDOAMm_X855RG_0,8537,Doctests failing,43316012,closed,0,,,1,2023-12-10T20:49:43Z,2023-12-11T21:00:03Z,2023-12-11T21:00:03Z,COLLABORATOR,,,,"### What is your issue?
The doctest is currently failing with
> E UserWarning: h5py is running against HDF5 1.14.3 when it was built against 1.14.2, this may cause problems","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8537/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue
1936080078,I_kwDOAMm_X85zZjzO,8291,`NamedArray.shape` does not support unknown dimensions,43316012,closed,0,,,1,2023-10-10T19:36:42Z,2023-10-18T06:22:54Z,2023-10-18T06:22:54Z,COLLABORATOR,,,,"### What is your issue?
According to the array api standard, the `shape` property returns `tuple[int | None, ...]`.
Currently we only support `tuple[int, ...]`.
This will raise errors if a duckarray actually returns None for some dimension.
E.g. `NamedArray.size` will fail.
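A minimal sketch, in plain Python rather than actual NamedArray code, of why a None in the shape breaks a product-of-shape size computation:

```python
import math

# the array api standard allows None for dimensions of unknown length
shape_static = (2, 3)
shape_unknown = (2, None)

print(math.prod(shape_static))  # 6
# math.prod(shape_unknown) raises TypeError, which is roughly what a
# size property computed as the product of the shape runs into
```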
(On a side note: dask arrays actually use NaN instead of None for some reason.... Only advantage of this is that `NamedArray.size` will actually also return NaN instead of raising...)","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8291/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue
1915876808,I_kwDOAMm_X85yMfXI,8236,DataArray with multiple (Pandas)Indexes on the same dimension is impossible to align,43316012,closed,0,,,3,2023-09-27T15:52:05Z,2023-10-02T06:53:27Z,2023-10-01T07:19:09Z,COLLABORATOR,,,,"### What happened?
I have a DataArray with a single dimension and multiple (Pandas)Indexes assigned to various coordinates for efficient indexing using sel.
Edit: the problem is even worse than originally described below: such a DataArray breaks all alignment and it's basically unusable...
----
When I try to add an additional coordinate without any index (I simply use the tuple[dimension, values] way) I get a ValueError about aligning with conflicting indexes.
If the original DataArray only has a single (Pandas)Index everything works as expected.
### What did you expect to happen?
I expected that I can simply assign new coordinates without an index.
### Minimal Complete Verifiable Example
```Python
import xarray as xr
da = xr.DataArray(
    [1, 2, 3],
    dims=""t"",
    coords={
        ""a"": (""t"", [3, 4, 5]),
        ""b"": (""t"", [5, 6, 7]),
    },
)
# set one index
da2 = da.set_xindex(""a"")
# set second index (same dimension, maybe that's a problem?)
da3 = da2.set_xindex(""b"")
# this works
da2.coords[""c""] = (""t"", [2, 3, 4])
# this does not
da3.coords[""c""] = (""t"", [2, 3, 4])
```
### MVCE confirmation
- [X] Minimal example — the example is as focused as reasonably possible to demonstrate the underlying issue in xarray.
- [X] Complete example — the example is self-contained, including all data and the text of any traceback.
- [X] Verifiable example — the example copy & pastes into an IPython prompt or [Binder notebook](https://mybinder.org/v2/gh/pydata/xarray/main?urlpath=lab/tree/doc/examples/blank_template.ipynb), returning the result.
- [X] New issue — a search of GitHub Issues suggests this is not a duplicate.
### Relevant log output
> ValueError: cannot re-index or align objects with conflicting indexes found for the following dimensions: 't' (2 conflicting indexes)
Conflicting indexes may occur when
- they relate to different sets of coordinate and/or dimension names
- they don't have the same type
- they may be used to reindex data along common dimensions
### Anything else we need to know?
_No response_
### Environment
INSTALLED VERSIONS
------------------
commit: None
python: 3.9.10 (main, Mar 21 2022, 13:08:11)
[GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
python-bits: 64
OS: Linux
OS-release: 3.10.0-1160.66.1.el7.x86_64
machine: x86_64
processor: x86_64
byteorder: little
LC_ALL: None
LANG: en_US.UTF-8
LOCALE: ('en_US', 'UTF-8')
libhdf5: 1.12.2
libnetcdf: 4.9.0
xarray: 2022.12.0
pandas: 2.0.2
numpy: 1.24.3
scipy: 1.10.0
netCDF4: 1.6.2
pydap: None
h5netcdf: None
h5py: None
Nio: None
zarr: None
cftime: 1.6.2
nc_time_axis: None
PseudoNetCDF: None
rasterio: None
cfgrib: None
iris: None
bottleneck: None
dask: None
distributed: None
matplotlib: 3.6.3
cartopy: None
seaborn: None
numbagg: None
fsspec: None
cupy: None
pint: None
sparse: None
flox: None
numpy_groupies: None
setuptools: 58.1.0
pip: 21.2.4
conda: None
pytest: 7.3.2
mypy: 1.0.0
IPython: 8.8.0
sphinx: None
I have not yet tried this with a newer version of xarray....","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8236/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,not_planned,13221727,issue
1401066481,I_kwDOAMm_X85TgpPx,7141,Coverage shows reduced value since mypy flag was added,43316012,closed,0,,,3,2022-10-07T12:01:15Z,2023-08-30T18:47:35Z,2023-08-30T18:47:35Z,COLLABORATOR,,,,"### What is your issue?
The coverage was reduced from ~94% to ~68% after merging #7126
See https://app.codecov.io/gh/pydata/xarray or our badge
I _think_ this is because the unittests never included the tests directory while mypy does. And codecov uses the sum of both coverage reports to come up with its number.
Adding the flag to the badge also does not seem to help?
Not sure how or even if that is possible to solve, maybe we need to ask in codecov?","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7141/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue
1648748263,I_kwDOAMm_X85iRebn,7703,Readthedocs build failing,43316012,closed,0,,,3,2023-03-31T06:20:53Z,2023-03-31T15:45:10Z,2023-03-31T15:45:10Z,COLLABORATOR,,,,"### What is your issue?
It seems that the readthedocs build is failing since some upstream update.
`pydata-sphinx-theme` seems to be incompatible with the `sphinx-book-theme`.
Maybe we have to pin to a specific or a maximum version for now.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7703/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue
1419825696,I_kwDOAMm_X85UoNIg,7199,Deprecate cfgrib backend,43316012,closed,0,,,4,2022-10-23T15:09:14Z,2023-03-29T15:19:53Z,2023-03-29T15:19:53Z,COLLABORATOR,,,,"### What is your issue?
Since version 0.9.9 (04/2021), cfgrib ships its own xarray backend plugin (which looks mainly like a copy of our internal version).
We should deprecate our internal plugin.
The deprecation is complicated since we usually bind the minimum version to a minor step, but cfgrib has been on 0.9 for 4 years already. Maybe an exception like for netCDF4?
Anyway, if we decide to leave it as it is for now, this ticket is just a reminder to remove it someday :)","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7199/reactions"", ""total_count"": 4, ""+1"": 4, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue
1603831809,I_kwDOAMm_X85fmIgB,7572,`test_open_nczarr` failing,43316012,closed,0,,,3,2023-02-28T21:20:22Z,2023-03-02T16:49:25Z,2023-03-02T16:49:25Z,COLLABORATOR,,,,"### What is your issue?
In the latest CI runs it seems that `test_backends.py::TestNCZarr::test_open_nczarr` is failing with
> KeyError: 'Zarr object is missing the attribute `_ARRAY_DIMENSIONS` and the NCZarr metadata, which are required for xarray to determine variable dimensions.'
I don't see an obvious reason for this, especially since the zarr version has not changed compared to some runs that were successful (2.13.6).","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7572/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue
1361246796,I_kwDOAMm_X85RIvpM,6985,FutureWarning for pandas date_range,43316012,closed,0,,,1,2022-09-04T20:35:17Z,2023-02-06T17:51:48Z,2023-02-06T17:51:48Z,COLLABORATOR,,,,"### What is your issue?
Xarray raises a FutureWarning in its date_range, also observable in our tests.
The precise warning is:
> xarray/coding/cftime_offsets.py:1130: FutureWarning: Argument `closed` is deprecated in favor of `inclusive`.
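For reference, a minimal sketch of what adapting to the new argument looks like on the pandas side (assuming pandas >= 1.4; this is an illustration, not xarray's code):

```python
import pandas as pd

# deprecated spelling: pd.date_range('2000-01-01', '2000-01-04', closed='left')
# replacement since pandas 1.4, with the same endpoint semantics:
idx = pd.date_range('2000-01-01', '2000-01-04', inclusive='left')
print(len(idx))  # 3: the right endpoint is excluded
```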
You should discuss if you will adapt the new `inclusive` argument or add a workaround.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/6985/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue
1464905814,I_kwDOAMm_X85XULBW,7322,Doctests failing,43316012,closed,0,,,4,2022-11-25T20:20:29Z,2022-11-28T19:31:04Z,2022-11-28T19:31:04Z,COLLABORATOR,,,,"### What is your issue?
It seems that some update in urllib3 causes our doctests to fail.
The reason seems to be that botocore uses an interesting construction to import deprecated urllib3 things:
```python
try:
    # pyopenssl will be removed in urllib3 2.0, we'll fall back to ssl_ at that point.
    # This can be removed once our urllib3 floor is raised to >= 2.0.
    with warnings.catch_warnings():
        warnings.simplefilter(""ignore"", category=DeprecationWarning)
        # Always import the original SSLContext, even if it has been patched
        from urllib3.contrib.pyopenssl import (
            orig_util_SSLContext as SSLContext,
        )
except ImportError:
    from urllib3.util.ssl_ import SSLContext
```
I assume that this fails because we use `-Werror` which translates the warning into an error which then is not ignored...
Not sure if this is an issue with botocore or we have to catch this?","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7322/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue
1347715262,I_kwDOAMm_X85QVIC-,6949,Plot accessors miss static typing,43316012,closed,0,,,0,2022-08-23T10:38:56Z,2022-10-16T09:26:55Z,2022-10-16T09:26:55Z,COLLABORATOR,,,,"### What happened?
The plot accessors, i.e. `dataarray.plot` of type `_PlotMethods`, are missing static typing, especially of function attributes. See #6947 for an example.
The problem is that many plotting methods are added using hooks via decorators, something that mypy does not understand.
### What did you expect to happen?
As a quick fix: type the plot accessors as `_PlotMethods | Any` to avoid false positives in mypy.
Better to either restructure the accessor with static methods instead of hooks or figure out another way of telling static type checkers about these methods.
Anyway: mypy should not complain.
### Minimal Complete Verifiable Example
```Python
import xarray as xr
da = xr.DataArray([[1, 2, 3], [4, 5, 6]], dims=[""x"", ""y""])
da.plot.contourf(x=""x"", y=""y"")
# mypy complains:
# error: ""_PlotMethods"" has no attribute ""contourf""
```
### MVCE confirmation
- [X] Minimal example — the example is as focused as reasonably possible to demonstrate the underlying issue in xarray.
- [X] Complete example — the example is self-contained, including all data and the text of any traceback.
- [X] Verifiable example — the example copy & pastes into an IPython prompt or [Binder notebook](https://mybinder.org/v2/gh/pydata/xarray/main?urlpath=lab/tree/doc/examples/blank_template.ipynb), returning the result.
- [X] New issue — a search of GitHub Issues suggests this is not a duplicate.
### Relevant log output
_No response_
### Anything else we need to know?
_No response_
### Environment
On mobile, can edit it later if required.
Newest xarray should have this problem, before the accessor was Any.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/6949/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue
1120405560,I_kwDOAMm_X85CyAg4,6229,[Bug]: rename_vars to dimension coordinate does not create an index,43316012,closed,0,,,6,2022-02-01T09:09:50Z,2022-09-27T09:33:42Z,2022-09-27T09:33:42Z,COLLABORATOR,,,,"### What happened?
We used `Data{set,Array}.rename{_vars}({coord: dim_coord})` to make a coordinate a dimension coordinate (instead of `set_index`).
This results in the coordinate correctly being displayed as a dimension coordinate (with the *) but it does not create an index, such that further operations like `sel` fail with a strange `KeyError`.
### What did you expect to happen?
I expect one of two things to be true:
1. `rename{_vars}` does not allow setting dimension coordinates (raises Error and tells you to use set_index)
2. `rename{_vars}` checks for this occasion and sets the index correctly
### Minimal Complete Verifiable Example
```python
import xarray as xr
data = xr.DataArray([5, 6, 7], coords={""c"": (""x"", [1, 2, 3])}, dims=""x"")
# <xarray.DataArray (x: 3)>
# array([5, 6, 7])
# Coordinates:
# c (x) int64 1 2 3
# Dimensions without coordinates: x
data_renamed = data.rename({""c"": ""x""})
# <xarray.DataArray (x: 3)>
# array([5, 6, 7])
# Coordinates:
# * x (x) int64 1 2 3
data_renamed.indexes
# Empty
data_renamed.sel(x=2)
# KeyError: 'no index found for coordinate x'
# if we use set_index it works
data_indexed = data.set_index({""x"": ""c""})
# looks the same as data_renamed!
# <xarray.DataArray (x: 3)>
# array([5, 6, 7])
# Coordinates:
# * x (x) int64 1 2 3
data_indexed.indexes
# x: Int64Index([1, 2, 3], dtype='int64', name='x')
```
### Relevant log output
_No response_
### Anything else we need to know?
_No response_
### Environment
INSTALLED VERSIONS
------------------
commit: None
python: 3.9.1 (default, Jan 13 2021, 15:21:08)
[GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
python-bits: 64
OS: Linux
OS-release: 3.10.0-1160.49.1.el7.x86_64
machine: x86_64
processor: x86_64
byteorder: little
LC_ALL: None
LANG: en_US.UTF-8
LOCALE: ('en_US', 'UTF-8')
libhdf5: 1.12.0
libnetcdf: 4.7.4
xarray: 0.20.2
pandas: 1.3.5
numpy: 1.21.5
scipy: 1.7.3
netCDF4: 1.5.8
pydap: None
h5netcdf: None
h5py: None
Nio: None
zarr: None
cftime: 1.5.1.1
nc_time_axis: None
PseudoNetCDF: None
rasterio: None
cfgrib: None
iris: None
bottleneck: None
dask: None
distributed: None
matplotlib: 3.5.1
cartopy: None
seaborn: None
numbagg: None
fsspec: None
cupy: None
pint: None
sparse: None
setuptools: 49.2.1
pip: 22.0.2
conda: None
pytest: 6.2.5
IPython: 8.0.0
sphinx: None","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/6229/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue
917034151,MDU6SXNzdWU5MTcwMzQxNTE=,5458,DataArray.rename docu missing renaming of dimensions,43316012,closed,0,,,0,2021-06-10T07:57:11Z,2022-07-18T14:48:02Z,2022-07-18T14:48:02Z,COLLABORATOR,,,,"**What happened**:
http://xarray.pydata.org/en/stable/generated/xarray.DataArray.rename.html#xarray.DataArray.rename
states that:
> Returns a new DataArray with renamed coordinates or a new name.
**What you expected to happen**:
It should state:
""Returns a new DataArray with renamed coordinates, dimensions or a new name.""
Since it definitely can do that.
**Minimal example**
```
xr.DataArray([1, 2, 3]).rename({""dim_0"": ""new""})
```
**Further**
While at it: Dataset.rename also does not mention explicitly that you can rename coordinates.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/5458/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue
1120378011,I_kwDOAMm_X85Cx5yb,6227,"[Bug]: Dataset.where(x, drop=True) behaves inconsistent",43316012,closed,0,,,0,2022-02-01T08:40:30Z,2022-06-12T22:06:51Z,2022-06-12T22:06:51Z,COLLABORATOR,,,,"### What happened?
I tried to reduce some dimensions using where (sel did not work in this case) and shorten the dimensions with ""drop=True"".
This works fine on DataArrays and Datasets with only a single dimension but fails as soon as you have a Dataset with two dimensions on different variables.
The dimensions are left untouched and you have NaNs in the data, just as if you were using ""drop=False"" (see example).
I am actually not sure what the expected behavior is, maybe I am wrong and it is correct due to some broadcasting rules?
### What did you expect to happen?
I expected that relevant dims are shortened.
If, after `ds.where` with ""drop=False"", all variables along a dimension have some NaNs, then using ""drop=True"" I expect these dimensions to be shortened and the NaNs removed.
### Minimal Complete Verifiable Example
```python
import xarray as xr
# this works
ds = xr.Dataset({""a"": (""x"", [1, 2, 3])})
ds.where(ds > 2, drop=True)
# returns:
# <xarray.Dataset>
# Dimensions: (x: 1)
# Dimensions without coordinates: x
# Data variables:
# a (x) float64 3.0
# this doesn't
ds = xr.Dataset({""a"": (""x"", [1, 2, 3]), ""b"": (""y"", [2, 3, 4])})
ds.where(ds > 2, drop=True)
# returns:
# <xarray.Dataset>
# Dimensions: (x: 3, y: 3)
# Dimensions without coordinates: x, y
# Data variables:
# a (x) float64 nan nan 3.0
# b (y) float64 nan 3.0 4.0
```
### Relevant log output
_No response_
### Anything else we need to know?
_No response_
### Environment
INSTALLED VERSIONS
------------------
commit: None
python: 3.9.1 (default, Jan 13 2021, 15:21:08)
[GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
python-bits: 64
OS: Linux
OS-release: 3.10.0-1160.49.1.el7.x86_64
machine: x86_64
processor: x86_64
byteorder: little
LC_ALL: None
LANG: en_US.UTF-8
LOCALE: ('en_US', 'UTF-8')
libhdf5: 1.12.0
libnetcdf: 4.7.4
xarray: 0.20.2
pandas: 1.3.5
numpy: 1.21.5
scipy: 1.7.3
netCDF4: 1.5.8
pydap: None
h5netcdf: None
h5py: None
Nio: None
zarr: None
cftime: 1.5.1.1
nc_time_axis: None
PseudoNetCDF: None
rasterio: None
cfgrib: None
iris: None
bottleneck: None
dask: None
distributed: None
matplotlib: 3.5.1
cartopy: None
seaborn: None
numbagg: None
fsspec: None
cupy: None
pint: None
sparse: None
setuptools: 49.2.1
pip: 22.0.2
conda: None
pytest: 6.2.5
IPython: 8.0.0
sphinx: None","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/6227/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue
1245726154,I_kwDOAMm_X85KQEXK,6632,Literal type of engine argument incompatible with custom backends,43316012,closed,0,,,5,2022-05-23T21:40:14Z,2022-05-28T10:29:16Z,2022-05-28T10:29:16Z,COLLABORATOR,,,,"### What is your issue?
In the recent typing improvements the `engine` argument for `open_dataset` was changed from `str` to a Literal of xarray's internal engines.
This will cause problems for all third party backend plugins.
We have several possibilities:
1. I don't know if there is a way to know installed backends at type checking time. Then we could add this support. (I doubt this is possible seeing how dynamic these imports are)
2. Is it possible for these plugins to tell type checkers that their engine is valid, i.e. change the type signature of xarray's function? Then we should add a how-to in the docs.
3. Else we should probably revert to using `str`.
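A toy sketch of how option 3 would look and what it costs (hypothetical names, not the real `open_dataset` signature):

```python
from typing import Literal, Union

# widening to Union[Literal[...], str] is equivalent to plain str for a type
# checker, so third-party engine names pass, but the Literal completions and
# narrowing are effectively lost
Engine = Union[Literal['netcdf4', 'h5netcdf', 'zarr'], str]

def open_dataset_stub(filename: str, engine: Engine = 'netcdf4') -> str:
    # toy stand-in that just reports which engine was requested
    return engine

print(open_dataset_stub('file.nc', engine='my_third_party_engine'))
```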
Any typing experts here that could help?","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/6632/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue
1222103599,I_kwDOAMm_X85I19Iv,6554,isel with drop=True does not drop coordinates if using scalar DataArray as indexer,43316012,closed,0,,,2,2022-05-01T10:14:37Z,2022-05-10T06:18:19Z,2022-05-10T06:18:19Z,COLLABORATOR,,,,"### What happened?
When using `DataArray/Dataset.isel` with `drop=True` and a scalar DataArray as indexer (see example), the resulting scalar coordinates do not get dropped.
When using an integer the behavior is as expected.
### What did you expect to happen?
I expect that using a scalar DataArray behaves the same as an integer.
### Minimal Complete Verifiable Example
```Python
import xarray as xr
da = xr.DataArray([1, 2, 3], dims=""x"", coords={""k"": (""x"", [0, 1, 2])})
# <xarray.DataArray (x: 3)>
# array([1, 2, 3])
# Coordinates:
# k (x) int32 0 1 2
da.isel({""x"": 1}, drop=True)
# works
# <xarray.DataArray ()>
# array(2)
da.isel({""x"": xr.DataArray(1)}, drop=True)
# does not drop ""k"" coordinate
# <xarray.DataArray ()>
# array(2)
# Coordinates:
# k int32 1
```
### Relevant log output
_No response_
### Anything else we need to know?
_No response_
### Environment
INSTALLED VERSIONS
------------------
commit: 4fbca23a9fd8458ec8f917dd0e54656925503e90
python: 3.9.6 | packaged by conda-forge | (default, Jul 6 2021, 08:46:02) [MSC v.1916 64 bit (AMD64)]
python-bits: 64
OS: Windows
OS-release: 10
machine: AMD64
processor: AMD64 Family 23 Model 113 Stepping 0, AuthenticAMD
byteorder: little
LC_ALL: None
LANG: en_US.UTF-8
LOCALE: ('de_DE', 'cp1252')
libhdf5: 1.10.6
libnetcdf: 4.7.4
xarray: 0.18.2.dev76+g3a7e7ca2.d20210706
pandas: 1.3.0
numpy: 1.21.0
scipy: 1.7.0
netCDF4: 1.5.6
pydap: installed
h5netcdf: 0.11.0
h5py: 3.3.0
Nio: None
zarr: 2.8.3
cftime: 1.5.0
nc_time_axis: 1.3.1
PseudoNetCDF: installed
cfgrib: None
iris: 2.4.0
bottleneck: 1.3.2
dask: 2021.06.2
distributed: 2021.06.2
matplotlib: 3.4.2
cartopy: 0.19.0.post1
seaborn: 0.11.1
numbagg: 0.2.1
fsspec: 2021.06.1
cupy: None
pint: 0.17
sparse: 0.12.0
setuptools: 49.6.0.post20210108
pip: 21.3.1
conda: None
pytest: 6.2.4
IPython: None
sphinx: None
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/6554/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue
1217543476,I_kwDOAMm_X85Ikj00,6526,xr.polyval first arg requires name attribute,43316012,closed,0,,,2,2022-04-27T15:47:02Z,2022-05-05T19:15:58Z,2022-05-05T19:15:58Z,COLLABORATOR,,,,"### What happened?
I have some polynomial coefficients and want to evaluate them at some values using `xr.polyval`.
As described in the docstring/docs I created a 1D coordinate DataArray and pass it to `xr.polyval`, but it raises a KeyError (see example).
### What did you expect to happen?
I expected that the polynomial would be evaluated at the given points.
### Minimal Complete Verifiable Example
```Python
import xarray as xr
coeffs = xr.DataArray([1, 2, 3], dims=""degree"")
# With a ""handmade"" coordinate it fails:
coord = xr.DataArray([0, 1, 2], dims=""x"")
xr.polyval(coord, coeffs)
# raises:
# Traceback (most recent call last):
# File """", line 1, in
# File ""xarray/core/computation.py"", line 1847, in polyval
# x = get_clean_interp_index(coord, coord.name, strict=False)
# File ""xarray/core/missing.py"", line 252, in get_clean_interp_index
# index = arr.get_index(dim)
# File ""xarray/core/common.py"", line 404, in get_index
# raise KeyError(key)
# KeyError: None
# If one adds a name to the coord that matches the dimension name:
coord2 = xr.DataArray([0, 1, 2], dims=""x"", name=""x"")
xr.polyval(coord2, coeffs)
# works
```
### Relevant log output
_No response_
### Anything else we need to know?
I assume that the ""standard"" workflow is to obtain the `coord` argument from an existing DataArray's coordinate, where the name would be correctly set already.
However, that is not clear from the description, and also prevents my ""manual"" workflow.
It could be that the problem will be solved by replacing the coord DataArray argument by an explicit Index in the future.
### Environment
INSTALLED VERSIONS
------------------
commit: None
python: 3.9.10 (main, Mar 15 2022, 15:56:56)
[GCC 7.5.0]
python-bits: 64
OS: Linux
OS-release: 3.10.0-1160.49.1.el7.x86_64
machine: x86_64
processor: x86_64
byteorder: little
LC_ALL: None
LANG: en_US.UTF-8
LOCALE: ('en_US', 'UTF-8')
libhdf5: 1.12.0
libnetcdf: 4.7.4
xarray: 2022.3.0
pandas: 1.4.2
numpy: 1.22.3
scipy: None
netCDF4: 1.5.8
pydap: None
h5netcdf: None
h5py: None
Nio: None
zarr: None
cftime: 1.6.0
nc_time_axis: None
PseudoNetCDF: None
rasterio: None
cfgrib: None
iris: None
bottleneck: None
dask: None
distributed: None
matplotlib: 3.5.1
cartopy: 0.20.2
seaborn: None
numbagg: None
fsspec: None
cupy: None
pint: None
sparse: None
setuptools: 58.1.0
pip: 22.0.4
conda: None
pytest: None
IPython: 8.2.0
sphinx: None
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/6526/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue
1222215528,I_kwDOAMm_X85I2Ydo,6555,sortby with ascending=False should create an index,43316012,closed,0,,,4,2022-05-01T16:57:51Z,2022-05-01T22:17:50Z,2022-05-01T22:17:50Z,COLLABORATOR,,,,"### Is your feature request related to a problem?
When using `sortby` with `ascending=False` on a DataArray/Dataset **without** an explicit index, the data gets correctly reversed, but it is not possible to tell anymore which ordering the data has.
If an explicit index (like [0, 1, 2]) exists, it gets correctly reordered and allows correct alignment.
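For comparison, a small sketch (my own toy data) of the indexed case described above:

```python
import xarray as xr

da = xr.DataArray([10, 20, 30], dims='x', coords={'x': [0, 1, 2]})
rev = da.sortby('x', ascending=False)
print(list(rev.x.values))  # [2, 1, 0]: the ordering stays recoverable
```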
### Describe the solution you'd like
For consistency with alignment, xarray should create a new index that indicates that the data has been reordered, i.e. [2, 1, 0].
Only downside: this will break code that relies on non-existent indexes.
### Describe alternatives you've considered
_No response_
### Additional context
_No response_","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/6555/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue
1155321209,I_kwDOAMm_X85E3M15,6313,groupby on array with multiindex renames indices,43316012,closed,0,,,1,2022-03-01T13:08:30Z,2022-03-17T17:11:44Z,2022-03-17T17:11:44Z,COLLABORATOR,,,,"### What happened?
When grouping and reducing an array or dataset over a multi-index the coordinates that make up the multi-index get renamed to ""{name_of_multiindex}\_level\_{i}"".
It only works correctly when the MultiIndex is a ""homogeneous grid"", i.e. as obtained by stacking.
### What did you expect to happen?
I expect that all coordinates keep their initial names.
### Minimal Complete Verifiable Example
```Python
import xarray as xr
# this works:
d = xr.DataArray(range(4), dims=""t"", coords={""x"": (""t"", [0, 0, 1, 1]), ""y"": (""t"", [0, 1, 0, 1])})
dd = d.set_index({""t"": [""x"", ""y""]})
# returns
# <xarray.DataArray (t: 4)>
# array([0, 1, 2, 3])
# Coordinates:
# * t (t) MultiIndex
# - x (t) int64 0 0 1 1
# - y (t) int64 0 1 0 1
dd.groupby(""t"").mean(...)
# returns
# <xarray.DataArray (t: 4)>
# array([0., 1., 2., 3.])
# Coordinates:
# * t (t) MultiIndex
# - x (t) int64 0 0 1 1
# - y (t) int64 0 1 0 1
# this does not work
d2 = xr.DataArray(range(6), dims=""t"", coords={""x"": (""t"", [0, 0, 1, 1, 0, 1]), ""y"": (""t"", [0, 1, 0, 1, 0, 0])})
dd2 = d2.set_index({""t"": [""x"", ""y""]})
# returns
# <xarray.DataArray (t: 6)>
# array([0, 1, 2, 3, 4, 5])
# Coordinates:
# * t (t) MultiIndex
# - x (t) int64 0 0 1 1 0 1
# - y (t) int64 0 1 0 1 0 0
dd2.groupby(""t"").mean(...)
# returns
# <xarray.DataArray (t: 4)>
# array([2. , 1. , 3.5, 3. ])
# Coordinates:
# * t (t) MultiIndex
# - t_level_0 (t) int64 0 0 1 1
# - t_level_1 (t) int64 0 1 0 1
```
### Relevant log output
_No response_
### Anything else we need to know?
_No response_
### Environment
INSTALLED VERSIONS
------------------
commit: None
python: 3.9.1 (default, Jan 13 2021, 15:21:08)
[GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
python-bits: 64
OS: Linux
OS-release: 3.10.0-1160.49.1.el7.x86_64
machine: x86_64
processor: x86_64
byteorder: little
LC_ALL: None
LANG: en_US.UTF-8
LOCALE: ('en_US', 'UTF-8')
libhdf5: 1.12.0
libnetcdf: 4.7.4
xarray: 0.21.1
pandas: 1.4.0
numpy: 1.21.5
scipy: 1.7.3
netCDF4: 1.5.8
pydap: None
h5netcdf: None
h5py: None
Nio: None
zarr: None
cftime: 1.5.1.1
nc_time_axis: None
PseudoNetCDF: None
rasterio: None
cfgrib: None
iris: None
bottleneck: None
dask: None
distributed: None
matplotlib: 3.5.1
cartopy: None
seaborn: None
numbagg: None
fsspec: None
cupy: None
pint: None
sparse: None
setuptools: 49.2.1
pip: 22.0.3
conda: None
pytest: 6.2.5
IPython: 8.0.0
sphinx: None","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/6313/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue
932677183,MDU6SXNzdWU5MzI2NzcxODM=,5550,Dataset.transpose support for missing_dims,43316012,closed,0,,,6,2021-06-29T13:32:37Z,2021-07-17T21:02:59Z,2021-07-17T21:02:59Z,COLLABORATOR,,,,"**Is your feature request related to a problem? Please describe.**
I have a dataset where I do not know which of two dimensions (let's call them `a` and `b`) exists in this dataset (so either it has dims (""a"", ""other"") or (""b"", ""other"")).
I would like to make sure that this dimension is first using transpose, but currently this is only possible using `if` or `try` statements. Just using `ds.transpose(""a"", ""b"", ""other"")` raises a `ValueError arguments to transpose XXX must be permuted dataset dimensions YYY`.
**Describe the solution you'd like**
It would be nice if I could just use `ds.transpose(""a"", ""b"", ""other"", missing_dims=""ignore"")` similar to how `DataArray.transpose` handles it.
**Describe alternatives you've considered**
Currently I'm also using `ds.map(lambda x: x.transpose(""a"", ""b"", ""other"", missing_dims=""ignore""))`, which could (maybe?) replace the current implementation of the transpose.
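A self-contained sketch of that workaround, using toy data; `Dataset.transpose` lacked `missing_dims` at the time of writing, while `DataArray.transpose` supports it:

```python
import xarray as xr

ds = xr.Dataset({'v': (('other', 'a'), [[1, 2], [3, 4]])})
# map the per-variable DataArray.transpose, which accepts missing_dims:
out = ds.map(lambda x: x.transpose('a', 'b', 'other', missing_dims='ignore'))
print(out['v'].dims)
```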
While at it, `transpose_coords` could also be exposed to Dataset.transpose.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/5550/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue