id,node_id,number,title,user,state,locked,assignee,milestone,comments,created_at,updated_at,closed_at,author_association,active_lock_reason,draft,pull_request,body,reactions,performed_via_github_app,state_reason,repo,type
485508509,MDU6SXNzdWU0ODU1MDg1MDk=,3267,Resample execution time is significantly longer in version 0.12 than 0.11,20617032,closed,0,,,5,2019-08-26T23:51:11Z,2023-04-29T03:30:20Z,2023-04-29T03:30:20Z,NONE,,,,"#### MCVE Code Sample
```python
import numpy as np
import xarray as xr
import pandas as pd
import time
size = 1000000
data = np.random.random(size)
times = pd.date_range('2019-01-01', periods=size, freq='ms')
da = xr.DataArray(data, dims=['time'], coords={'time': times})
start = time.time()
da.resample(time='s').mean()
print('Elapsed time: ' + str(time.time() - start))
print('xarray version: ' + str(xr.__version__))
```
* Output for xarray==0.11.3
  - Elapsed time: 0.2671010494232178
  - xarray version: 0.11.3
* Output for xarray==0.12.0
  - Elapsed time: 6.652455568313599
  - xarray version: 0.12.0
#### Expected Output
I expect that, with the default parameters, `resample` should take a similar amount of time as in previous versions. Alternatively, the documentation should specify which parameters must be passed to `resample` or `mean` to achieve the same execution time as in version 0.11.3.
#### Problem Description
Resampling a DataArray or Dataset and then calling `mean` takes significantly longer in xarray 0.12 than in 0.11. The changelog lists several changes to the `resample` method and new parameters, but it's not clear which of them one would need to specify to recover the previous execution time.
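For comparison, here is a rough baseline against pandas on the same data (just a sketch to quantify the slowdown; absolute numbers will of course vary by machine). `to_series()` converts the DataArray to a pandas Series with the same time index:
```python
import time
import numpy as np
import pandas as pd
import xarray as xr
size = 1000000
data = np.random.random(size)
times = pd.date_range('2019-01-01', periods=size, freq='ms')
da = xr.DataArray(data, dims=['time'], coords={'time': times})
# pandas baseline: resample the equivalent Series directly
series = da.to_series()
start = time.time()
series.resample('s').mean()
print('pandas elapsed time: ' + str(time.time() - start))
# xarray path that regressed between 0.11.3 and 0.12.0
start = time.time()
da.resample(time='s').mean()
print('xarray elapsed time: ' + str(time.time() - start))
```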
#### Output of ``xr.show_versions()``
`xr.show_versions()` gives me the following error:
`AttributeError: module 'h5py' has no attribute '__hdf5libversion__'`
So I have just printed the xarray version instead. I switched xarray versions using `pip install xarray==0.11.3`.
Here is my conda environment:
_anaconda_depends 2019.03 py37_0
_ipyw_jlab_nb_ext_conf 0.1.0 py37_0
alabaster 0.7.12 py37_0
anaconda custom py37_1
anaconda-client 1.7.2 py37_0
anaconda-navigator 1.9.7 py37_0
anaconda-project 0.8.3 py_0
asn1crypto 0.24.0 py37_0
astroid 2.2.5 py37_0
astropy 3.2.1 py37he774522_0
atomicwrites 1.3.0 py37_1
attrs 19.1.0 py37_1
babel 2.7.0 py_0
backcall 0.1.0 py37_0
backports 1.0 py_2
backports.functools_lru_cache 1.5 py_2
backports.os 0.1.1 py37_0
backports.shutil_get_terminal_size 1.0.0 py37_2
backports.tempfile 1.0 py_1
backports.weakref 1.0.post1 py_1
beautifulsoup4 4.7.1 py37_1
bitarray 0.9.3 py37he774522_0
bkcharts 0.2 py37_0
blas 1.0 mkl
bleach 3.1.0 py37_0
blosc 1.16.3 h7bd577a_0
bokeh 1.2.0 py37_0
boto 2.49.0 py37_0
bottleneck 1.2.1 py37h452e1ab_1
bzip2 1.0.8 he774522_0
ca-certificates 2019.5.15 0
cantera 2.4.0 np115py37_2 cantera
certifi 2019.6.16 py37_0
cffi 1.12.3 py37h7a1dbc1_0
chardet 3.0.4 py37_1
click 7.0 py37_0
cloudpickle 1.2.1 py_0
clyent 1.2.2 py37_1
colorama 0.4.1 py37_0
comtypes 1.1.7 py37_0
conda 4.7.5 py37_0
conda-build 3.18.7 py37_0
conda-env 2.6.0 1
conda-package-handling 1.3.11 py37_0
conda-verify 3.4.2 py_0
console_shortcut 0.1.1 3
contextlib2 0.5.5 py37_0
cryptography 2.7 py37h7a1dbc1_0
curl 7.64.1 h2a8f88b_0
cycler 0.10.0 py37_0
cython 0.29.12 py37ha925a31_0
cytoolz 0.10.0 py37he774522_0
dask 2.1.0 py_0
dask-core 2.1.0 py_0
decorator 4.4.0 py37_1
defusedxml 0.6.0 pypi_0 pypi
distributed 2.1.0 py_0
docutils 0.14 py37_0
easygui 0.98.1 pypi_0 pypi
entrypoints 0.3 py37_0
et_xmlfile 1.0.1 py37_0
fastcache 1.1.0 py37he774522_0
filelock 3.0.12 py_0
flask 1.1.1 py_0
freetype 2.9.1 ha9979f8_1
future 0.17.1 py37_0
get_terminal_size 1.0.0 h38e98db_0
gevent 1.4.0 py37he774522_0
gitdb2 2.0.5 pypi_0 pypi
gitpython 2.1.11 pypi_0 pypi
glob2 0.7 py_0
greenlet 0.4.15 py37hfa6e2cd_0
h5netcdf 0.7.4 pypi_0 pypi
h5py 2.9.0 py37h5e291fa_0
hdf5 1.10.4 h7ebc959_0
heapdict 1.0.0 py37_2
html5lib 1.0.1 py37_0
icc_rt 2019.0.0 h0cc432a_1
icu 58.2 ha66f8fd_1
idna 2.8 py37_0
imageio 2.5.0 py37_0
imagesize 1.1.0 py37_0
importlib_metadata 0.17 py37_1
intel-openmp 2019.4 245
ipykernel 5.1.1 py37h39e3cac_0
ipython 7.6.1 py37h39e3cac_0
ipython_genutils 0.2.0 py37_0
ipywidgets 7.5.0 py_0
isort 4.3.21 py37_0
itsdangerous 1.1.0 py37_0
jdcal 1.4.1 py_0
jedi 0.13.3 py37_0
jinja2 2.10.1 py37_0
joblib 0.13.2 py37_0
jpeg 9b hb83a4c4_2
json5 0.8.4 py_0
jsonschema 3.0.1 py37_0
jupyter 1.0.0 py37_7
jupyter_client 5.3.1 py_0
jupyter_console 6.0.0 py37_0
jupyter_core 4.5.0 py_0
jupyterlab 1.0.2 py37hf63ae98_0
jupyterlab-git 0.6.1 pypi_0 pypi
jupyterlab_server 1.0.0 py_0
keyring 18.0.0 py37_0
kiwisolver 1.1.0 py37ha925a31_0
krb5 1.16.1 hc04afaa_7
lazy-object-proxy 1.4.1 py37he774522_0
libarchive 3.3.3 h0643e63_5
libcurl 7.64.1 h2a8f88b_0
libiconv 1.15 h1df5818_7
liblief 0.9.0 ha925a31_2
libpng 1.6.37 h2a8f88b_0
libsodium 1.0.16 h9d3ae62_0
libssh2 1.8.2 h7a1dbc1_0
libtiff 4.0.10 hb898794_2
libxml2 2.9.9 h464c3ec_0
libxslt 1.1.33 h579f668_0
llvmlite 0.29.0 py37ha925a31_0
locket 0.2.0 py37_1
lxml 4.3.4 py37h1350720_0
lz4-c 1.8.1.2 h2fa13f4_0
lzo 2.10 h6df0209_2
m2w64-gcc-libgfortran 5.3.0 6
m2w64-gcc-libs 5.3.0 7
m2w64-gcc-libs-core 5.3.0 7
m2w64-gmp 6.1.0 2
m2w64-libwinpthread-git 5.0.0.4634.697f757 2
markupsafe 1.1.1 py37he774522_0
matplotlib 3.1.0 py37hc8f65d3_0
mccabe 0.6.1 py37_1
menuinst 1.4.16 py37he774522_0
mistune 0.8.4 py37he774522_0
mkl 2019.4 245
mkl-service 2.0.2 py37he774522_0
mkl_fft 1.0.12 py37h14836fe_0
mkl_random 1.0.2 py37h343c172_0
mock 3.0.5 py37_0
more-itertools 7.0.0 py37_0
mpmath 1.1.0 py37_0
msgpack-python 0.6.1 py37h74a9793_1
msys2-conda-epoch 20160418 1
multipledispatch 0.6.0 py37_0
navigator-updater 0.2.1 py37_0
nbconvert 5.5.0 py_0
nbdime 1.0.7 pypi_0 pypi
nbformat 4.4.0 py37_0
networkx 2.3 py_0
nltk 3.4.4 py37_0
nose 1.3.7 py37_2
notebook 5.7.8 py37_0
nptdms 0.14.0 pypi_0 pypi
numba 0.44.1 py37hf9181ef_0
numexpr 2.6.9 py37hdce8814_0
numpy 1.15.4 py37h19fb1c0_0
numpy-base 1.15.4 py37hc3f5095_0
numpydoc 0.9.1 py_0
olefile 0.46 py37_0
openpyxl 2.6.2 py_0
openssl 1.1.1c he774522_1
packaging 19.0 py37_0
pandas 0.24.2 py37ha925a31_0
pandoc 2.2.3.2 0
pandocfilters 1.4.2 py37_1
parso 0.5.0 py_0
partd 1.0.0 py_0
path.py 12.0.1 py_0
pathlib2 2.3.4 py37_0
patsy 0.5.1 py37_0
pep8 1.7.1 py37_0
pickleshare 0.7.5 py37_0
pillow 6.1.0 py37hdc69c19_0
pip 19.1.1 py37_0
pkginfo 1.5.0.1 py37_0
plotly 4.0.0 py_0 plotly
pluggy 0.12.0 py_0
ply 3.11 py37_0
powershell_shortcut 0.0.1 2
prometheus_client 0.7.1 py_0
prompt_toolkit 2.0.9 py37_0
psutil 5.6.3 py37he774522_0
py 1.8.0 py37_0
py-lief 0.9.0 py37ha925a31_2
pycodestyle 2.5.0 py37_0
pycosat 0.6.3 py37hfa6e2cd_0
pycparser 2.19 py37_0
pycrypto 2.6.1 py37hfa6e2cd_9
pycurl 7.43.0.3 py37h7a1dbc1_0
pyflakes 2.1.1 py37_0
pygments 2.4.2 py_0
pylint 2.3.1 py37_0
pyodbc 4.0.26 py37ha925a31_0
pyopenssl 19.0.0 py37_0
pyparsing 2.4.0 py_0
pyqt 5.9.2 py37h6538335_2
pyqt5 5.13.0 pypi_0 pypi
pyqt5-sip 4.19.18 pypi_0 pypi
pyreadline 2.1 py37_1
pyrsistent 0.14.11 py37he774522_0
pysocks 1.7.0 py37_0
pytables 3.5.2 py37h1da0976_1
pytest 5.0.1 py37_0
pytest-arraydiff 0.3 py37h39e3cac_0
pytest-astropy 0.5.0 py37_0
pytest-doctestplus 0.3.0 py37_0
pytest-openfiles 0.3.2 py37_0
pytest-remotedata 0.3.1 py37_0
python 3.7.3 h8c8aaf0_1
python-dateutil 2.8.0 py37_0
python-libarchive-c 2.8 py37_10
pythonnet 2.4.0 pypi_0 pypi
pytz 2019.1 py_0
pywavelets 1.0.3 py37h8c2d366_1
pywin32 223 py37hfa6e2cd_1
pywinpty 0.5.5 py37_1000
pyyaml 5.1.1 py37he774522_0
pyzmq 18.0.0 py37ha925a31_0
qt 5.9.7 vc14h73c81de_0
qtawesome 0.5.7 py37_1
qtconsole 4.5.1 py_0
qtpy 1.8.0 py_0
requests 2.22.0 py37_0
retrying 1.3.3 py37_2
rope 0.14.0 py_0
ruamel_yaml 0.15.46 py37hfa6e2cd_0
scikit-image 0.15.0 py37ha925a31_0
scikit-learn 0.21.2 py37h6288b17_0
scipy 1.2.1 py37h29ff71c_0
seaborn 0.9.0 py37_0
send2trash 1.5.0 py37_0
setuptools 41.0.1 py37_0
simplegeneric 0.8.1 py37_2
singledispatch 3.4.0.3 py37_0
sip 4.19.8 py37h6538335_0
six 1.12.0 py37_0
smmap2 2.0.5 pypi_0 pypi
snappy 1.1.7 h777316e_3
snowballstemmer 1.9.0 py_0
sortedcollections 1.1.2 py37_0
sortedcontainers 2.1.0 py37_0
soupsieve 1.8 py37_0
spe2py 2.0.0 pypi_0 pypi
sphinx 2.1.2 py_0
sphinxcontrib 1.0 py37_1
sphinxcontrib-applehelp 1.0.1 py_0
sphinxcontrib-devhelp 1.0.1 py_0
sphinxcontrib-htmlhelp 1.0.2 py_0
sphinxcontrib-jsmath 1.0.1 py_0
sphinxcontrib-qthelp 1.0.2 py_0
sphinxcontrib-serializinghtml 1.1.3 py_0
sphinxcontrib-websupport 1.1.2 py_0
spyder 3.3.6 py37_0
spyder-kernels 0.5.1 py37_0
sqlalchemy 1.3.5 py37he774522_0
sqlite 3.29.0 he774522_0
statsmodels 0.10.0 py37h8c2d366_0
sympy 1.4 py37_0
tblib 1.4.0 py_0
terminado 0.8.2 py37_0
testpath 0.4.2 py37_0
tk 8.6.8 hfa6e2cd_0
toolz 0.10.0 py_0
tornado 6.0.3 py37he774522_0
tqdm 4.32.1 py_0
traitlets 4.3.2 py37_0
tzlocal 2.0.0b3 pypi_0 pypi
unicodecsv 0.14.1 py37_0
urllib3 1.24.2 py37_0
vc 14.1 h0510ff6_4
vs2015_runtime 14.15.26706 h3a45250_4
wcwidth 0.1.7 py37_0
webencodings 0.5.1 py37_1
werkzeug 0.15.4 py_0
wheel 0.33.4 py37_0
widgetsnbextension 3.5.0 py37_0
win_inet_pton 1.1.0 py37_0
win_unicode_console 0.5 py37_0
wincertstore 0.2 py37_0
winpty 0.4.3 4
wrapt 1.11.2 py37he774522_0
xarray 0.12.0 pypi_0 pypi
xlrd 1.2.0 py37_0
xlsxwriter 1.1.8 py_0
xlwings 0.15.8 py37_0
xlwt 1.3.0 py37_0
xyzpy 0.3.1 pypi_0 pypi
xz 5.2.4 h2fa13f4_4
yaml 0.1.7 hc54c509_2
zeromq 4.3.1 h33f27b4_3
zict 1.0.0 py_0
zipp 0.5.1 py_0
zlib 1.2.11 h62dcd97_3
zstd 1.3.7 h508b16e_0
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/3267/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue
619327957,MDExOlB1bGxSZXF1ZXN0NDE4ODc5MTU3,4067,make text wrap width an argument in label_from_attrs,20617032,closed,0,,,4,2020-05-15T23:44:09Z,2023-03-26T20:06:29Z,2023-03-26T20:06:28Z,NONE,,0,pydata/xarray/pulls/4067,"label_from_attrs used textwrap.wrap with a default wrap width of 30,
so this commit changes `label_from_attrs` to instead take a `wrap_width` argument that specifies the wrap width.
Context: I find the `label_from_attrs` function useful for plots that I make using pyplot, but I needed to change the text wrap width.
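For illustration, this is roughly how I use it in my own plotting code (a sketch only: `label_from_attrs` is an internal helper in `xarray.plot.utils` rather than public API, the attribute values below are made up, and `wrap_width` only exists with this change applied):
```python
import matplotlib.pyplot as plt
import numpy as np
import xarray as xr
from xarray.plot.utils import label_from_attrs  # internal helper, not public API
da = xr.DataArray(
    np.random.random(100),
    dims=['time'],
    attrs={'long_name': 'a fairly long descriptive quantity name', 'units': 'm/s'},
)
fig, ax = plt.subplots()
ax.plot(da.values)
# wrap_width is the argument added in this PR; without the patch the width is fixed at 30
ax.set_ylabel(label_from_attrs(da, wrap_width=60))
plt.show()
```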
I am new to contributing to projects like this, so bear with me. I only ran the mypy and flake8 linters and figured that was good enough for a minor change like this. I didn't see `label_from_attrs` in the API documentation, so I just added a line to `whats-new.rst`.
- [x] Passes `mypy . && flake8`
- [x] Fully documented, including `whats-new.rst`
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/4067/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
1352621981,I_kwDOAMm_X85Qn1-d,6959,Assigning coordinate level to MultiIndex fails if MultiIndex only has one level,20617032,closed,0,4160723,,0,2022-08-26T18:48:18Z,2022-09-27T10:35:39Z,2022-09-27T10:35:39Z,NONE,,,,"### What happened?
This issue originates from [this discussion](https://github.com/pydata/xarray/discussions/6936), where I was trying to figure out the best way to replace coordinate values in a MultiIndex level. I found that removing the level with `reset_index` and replacing the coordinate level with `assign_coords` works, except when removing the level leaves you with a MultiIndex that has only one level. In that case a ValueError is thrown.
### What did you expect to happen?
I expect that removing and replacing a coordinate level would work the same regardless of the number of levels in the MultiIndex.
### Minimal Complete Verifiable Example
```Python
import numpy as np
import pandas as pd
import xarray as xr
# Replace the coordinates in level 'one'. This works as expected.
midx = pd.MultiIndex.from_product([[0,1,2], [3, 4], [5,6]], names=(""one"", ""two"",""three""))
mda = xr.DataArray(np.random.rand(12, 3), [(""x"", midx), (""y"", range(3))])
new_coords = mda.coords['one'].values*2
mda.reset_index('one', drop=True).assign_coords(one= ('x',new_coords)).set_index(x='one',append=True) #Works
# Drop the 'two' level beforehand as well, so that the intermediate state is a
# MultiIndex with only the 'three' level; this throws a ValueError
mda.reset_index('two', drop=True).reset_index('one', drop=True).assign_coords(one=('x', new_coords))  # ValueError
# We can also initialize a DataArray with only two levels and drop only the 'one'
# level, which gives the same ValueError. This shows that the problem is not
# due to something about dropping the 'two' level above, but something inherent
# to dropping down to a MultiIndex with only one level
midx = pd.MultiIndex.from_product([[0,1,2], [3, 4]], names=(""one"", ""two""))
mda = xr.DataArray(np.random.rand(6, 2), [(""x"", midx), (""y"", range(2))])
new_coords = mda.coords['one'].values*2
mda.reset_index('one', drop=True).assign_coords(one= ('x',new_coords)).set_index(x='one',append=True) #ValueError
```
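For completeness, one possible way to sidestep the error (a rough sketch of my own, not from the linked discussion, and untested here; it assumes `reset_index('x')` drops the whole MultiIndex and leaves 'one' and 'two' as plain coordinates) is to reset the entire MultiIndex in one step so that no single-level intermediate index is ever created, replace the values, and then rebuild the index:
```Python
import numpy as np
import pandas as pd
import xarray as xr
midx = pd.MultiIndex.from_product([[0, 1, 2], [3, 4]], names=('one', 'two'))
mda = xr.DataArray(np.random.rand(6, 2), [('x', midx), ('y', range(2))])
new_coords = mda.coords['one'].values * 2
# Reset the entire MultiIndex at once: 'one' and 'two' become plain coordinates
flat = mda.reset_index('x')
# With no index left on 'x', assigning over 'one' no longer conflicts with an index
flat = flat.assign_coords(one=('x', new_coords))
# Rebuild the MultiIndex from the updated coordinates
result = flat.set_index(x=['one', 'two'])
```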
### MVCE confirmation
- [X] Minimal example — the example is as focused as reasonably possible to demonstrate the underlying issue in xarray.
- [X] Complete example — the example is self-contained, including all data and the text of any traceback.
- [X] Verifiable example — the example copy & pastes into an IPython prompt or [Binder notebook](https://mybinder.org/v2/gh/pydata/xarray/main?urlpath=lab/tree/doc/examples/blank_template.ipynb), returning the result.
- [X] New issue — a search of GitHub Issues suggests this is not a duplicate.
### Relevant log output
```Python
# First example, starting from a three-level MultiIndex and dropping two levels
ValueError Traceback (most recent call last)
c:\Users\aspit\Git\Learn\xarray\replace_coord_issue.py in
15 # Drop the two level before had such that the intermediate state has a multindex
16 # with only the 'three' level, this throws a ValueError
---> 17 mda.reset_index('two',drop=True).reset_index('one', drop=True).assign_coords(one= ('x',new_coords))
c:\Users\aspit\anaconda3\envs\dataanalysis\lib\site-packages\xarray\core\common.py in assign_coords(self, coords, **coords_kwargs)
590 data = self.copy(deep=False)
591 results: dict[Hashable, Any] = self._calc_assign_results(coords_combined)
--> 592 data.coords.update(results)
593 return data
594
c:\Users\aspit\anaconda3\envs\dataanalysis\lib\site-packages\xarray\core\coordinates.py in update(self, other)
160 other_vars = getattr(other, ""variables"", other)
161 self._maybe_drop_multiindex_coords(set(other_vars))
--> 162 coords, indexes = merge_coords(
163 [self.variables, other_vars], priority_arg=1, indexes=self.xindexes
164 )
c:\Users\aspit\anaconda3\envs\dataanalysis\lib\site-packages\xarray\core\merge.py in merge_coords(objects, compat, join, priority_arg, indexes, fill_value)
564 collected = collect_variables_and_indexes(aligned)
565 prioritized = _get_priority_vars_and_indexes(aligned, priority_arg, compat=compat)
--> 566 variables, out_indexes = merge_collected(collected, prioritized, compat=compat)
567 return variables, out_indexes
568
c:\Users\aspit\anaconda3\envs\dataanalysis\lib\site-packages\xarray\core\merge.py in merge_collected(grouped, prioritized, compat, combine_attrs, equals)
252
253 _assert_compat_valid(compat)
--> 254 _assert_prioritized_valid(grouped, prioritized)
255
256 merged_vars: dict[Hashable, Variable] = {}
c:\Users\aspit\anaconda3\envs\dataanalysis\lib\site-packages\xarray\core\merge.py in _assert_prioritized_valid(grouped, prioritized)
199 common_names_str = "", "".join(f""{k!r}"" for k in common_names)
200 index_names_str = "", "".join(f""{k!r}"" for k in index_coord_names)
--> 201 raise ValueError(
202 f""cannot set or update variable(s) {common_names_str}, which would corrupt ""
203 f""the following index built from coordinates {index_names_str}:\n""
ValueError: cannot set or update variable(s) 'one', which would corrupt the following index built from coordinates 'x', 'one', 'three':
# Second example, starting from a two-level MultiIndex and dropping one level
ValueError Traceback (most recent call last)
c:\Users\aspit\Git\Learn\xarray\replace_coord_issue.py in
11
12 new_coords = mda.coords['one'].values*2
---> 13 mda.reset_index('one', drop=True).assign_coords(one= ('x',new_coords)).set_index(x='one',append=True)
c:\Users\aspit\anaconda3\envs\dataanalysis\lib\site-packages\xarray\core\common.py in assign_coords(self, coords, **coords_kwargs)
590 data = self.copy(deep=False)
591 results: dict[Hashable, Any] = self._calc_assign_results(coords_combined)
--> 592 data.coords.update(results)
593 return data
594
c:\Users\aspit\anaconda3\envs\dataanalysis\lib\site-packages\xarray\core\coordinates.py in update(self, other)
160 other_vars = getattr(other, ""variables"", other)
161 self._maybe_drop_multiindex_coords(set(other_vars))
--> 162 coords, indexes = merge_coords(
163 [self.variables, other_vars], priority_arg=1, indexes=self.xindexes
164 )
c:\Users\aspit\anaconda3\envs\dataanalysis\lib\site-packages\xarray\core\merge.py in merge_coords(objects, compat, join, priority_arg, indexes, fill_value)
564 collected = collect_variables_and_indexes(aligned)
565 prioritized = _get_priority_vars_and_indexes(aligned, priority_arg, compat=compat)
--> 566 variables, out_indexes = merge_collected(collected, prioritized, compat=compat)
567 return variables, out_indexes
568
c:\Users\aspit\anaconda3\envs\dataanalysis\lib\site-packages\xarray\core\merge.py in merge_collected(grouped, prioritized, compat, combine_attrs, equals)
252
253 _assert_compat_valid(compat)
--> 254 _assert_prioritized_valid(grouped, prioritized)
255
256 merged_vars: dict[Hashable, Variable] = {}
c:\Users\aspit\anaconda3\envs\dataanalysis\lib\site-packages\xarray\core\merge.py in _assert_prioritized_valid(grouped, prioritized)
199 common_names_str = "", "".join(f""{k!r}"" for k in common_names)
200 index_names_str = "", "".join(f""{k!r}"" for k in index_coord_names)
--> 201 raise ValueError(
202 f""cannot set or update variable(s) {common_names_str}, which would corrupt ""
203 f""the following index built from coordinates {index_names_str}:\n""
ValueError: cannot set or update variable(s) 'one', which would corrupt the following index built from coordinates 'x', 'one', 'two':
```
### Anything else we need to know?
_No response_
### Environment
INSTALLED VERSIONS
------------------
commit: None
python: 3.9.7 | packaged by conda-forge | (default, Sep 29 2021, 19:15:42) [MSC v.1916 64 bit (AMD64)]
python-bits: 64
OS: Windows
OS-release: 10
machine: AMD64
processor: Intel64 Family 6 Model 142 Stepping 12, GenuineIntel
byteorder: little
LC_ALL: None
LANG: None
LOCALE: ('English_United States', '1252')
libhdf5: 1.12.1
libnetcdf: 4.8.1
xarray: 2022.6.0
pandas: 1.3.4
numpy: 1.21.4
scipy: 1.7.3
netCDF4: 1.5.8
pydap: None
h5netcdf: 1.0.2
h5py: 3.7.0
Nio: None
zarr: None
cftime: 1.5.1.1
nc_time_axis: None
PseudoNetCDF: None
rasterio: None
cfgrib: None
iris: None
bottleneck: 1.3.5
dask: 2022.02.1
distributed: 2022.2.1
matplotlib: 3.4.3
cartopy: None
seaborn: None
numbagg: None
fsspec: 2022.7.1
cupy: None
pint: 0.18
sparse: None
flox: None
numpy_groupies: None
setuptools: 59.1.0
pip: 21.3.1
conda: None
pytest: 6.2.5
IPython: 7.29.0
sphinx: None
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/6959/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue