id,node_id,number,title,user,state,locked,assignee,milestone,comments,created_at,updated_at,closed_at,author_association,active_lock_reason,draft,pull_request,body,reactions,performed_via_github_app,state_reason,repo,type 2105738254,PR_kwDOAMm_X85lVsgz,8680,use ruff.flake8-tidy-imports to enforce absolute imports,10194086,closed,0,,,1,2024-01-29T15:19:34Z,2024-01-30T16:42:46Z,2024-01-30T16:38:48Z,MEMBER,,0,pydata/xarray/pulls/8680," use ruff.flake8-tidy-imports to enforce absolute imports - https://github.com/MarcoGorelli/absolufy-imports has been archived (no reason given) - removes a pre-commit hook which should make it faster locally ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8680/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 2097971637,PR_kwDOAMm_X85k789-,8649,ruff: use extend-exclude,10194086,closed,0,,,1,2024-01-24T10:39:46Z,2024-01-24T18:32:20Z,2024-01-24T15:59:11Z,MEMBER,,0,pydata/xarray/pulls/8649," I think we should use `extend-exclude` instead of `exclude` for ruff. We can then also remove `"".eggs""` as this is in the default. From https://docs.astral.sh/ruff/settings/#exclude: > Note that you'll typically want to use [extend-exclude](https://docs.astral.sh/ruff/settings/#extend-exclude) to modify the excluded paths. > > Default value: ["".bzr"", "".direnv"", "".eggs"", "".git"", "".git-rewrite"", "".hg"", "".mypy_cache"", "".nox"", "".pants.d"", "".pytype"", "".ruff_cache"", "".svn"", "".tox"", "".venv"", ""\_\_pypackages\_\_"", ""_build"", ""buck-out"", ""build"", ""dist"", ""node_modules"", ""venv""] (I really dislike how github formats toml files... What would be the correct syntax, then?)","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8649/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 2070895451,PR_kwDOAMm_X85jf-2J,8600,fix and test empty CFTimeIndex,10194086,closed,0,,,1,2024-01-08T17:11:43Z,2024-01-17T12:29:11Z,2024-01-15T21:49:34Z,MEMBER,,0,pydata/xarray/pulls/8600," - [x] Closes #7298 - [x] Tests added - [ ] User visible changes (including notable bug fixes) are documented in `whats-new.rst` - [ ] New functions/methods are listed in `api.rst` Otherwise `da.indexes` and the html repr raise a `ValueError`. I first had `""""` but I think `None` is better. cc @spencerkclark @keewis ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8600/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 1455395909,I_kwDOAMm_X85Wv5RF,7298,html repr fails for empty cftime arrays,10194086,closed,0,,,1,2022-11-18T16:09:00Z,2024-01-15T21:49:36Z,2024-01-15T21:49:35Z,MEMBER,,,,"### What happened? The html repr of a cftime array wants to display the ""calendar"", which it cannot if it is empty. ### What did you expect to happen? No error. ### Minimal Complete Verifiable Example ```Python import numpy as np import xarray as xr data_obs = np.random.randn(3) time_obs = xr.date_range(""2000-01-01"", periods=3, freq=""YS"", calendar=""noleap"") obs = xr.DataArray(data_obs, coords={""time"": time_obs}) o = obs[:0] xr.core.formatting_html.array_repr(o) ``` ### MVCE confirmation - [ ] Minimal example — the example is as focused as reasonably possible to demonstrate the underlying issue in xarray. 
- [ ] Complete example — the example is self-contained, including all data and the text of any traceback. - [ ] Verifiable example — the example copy & pastes into an IPython prompt or [Binder notebook](https://mybinder.org/v2/gh/pydata/xarray/main?urlpath=lab/tree/doc/examples/blank_template.ipynb), returning the result. - [ ] New issue — a search of GitHub Issues suggests this is not a duplicate. ### Relevant log output ```Python ValueError Traceback (most recent call last) Input In [1], in () 8 obs = xr.DataArray(data_obs, coords={""time"": time_obs}) 10 o = obs[:0] ---> 12 xr.core.formatting_html.array_repr(o) File ~/code/xarray/xarray/core/formatting_html.py:318, in array_repr(arr) 316 if hasattr(arr, ""xindexes""): 317 indexes = _get_indexes_dict(arr.xindexes) --> 318 sections.append(index_section(indexes)) 320 sections.append(attr_section(arr.attrs)) 322 return _obj_repr(arr, header_components, sections) File ~/code/xarray/xarray/core/formatting_html.py:195, in _mapping_section(mapping, name, details_func, max_items_collapse, expand_option_name, enabled) 188 expanded = _get_boolean_with_default( 189 expand_option_name, n_items < max_items_collapse 190 ) 191 collapsed = not expanded 193 return collapsible_section( 194 name, --> 195 details=details_func(mapping), 196 n_items=n_items, 197 enabled=enabled, 198 collapsed=collapsed, 199 ) File ~/code/xarray/xarray/core/formatting_html.py:155, in summarize_indexes(indexes) 154 def summarize_indexes(indexes): --> 155 indexes_li = """".join( 156 f""
<li ...>{summarize_index(v, i)}</li>"" 157 for v, i in indexes.items() 158 ) 159 return f""<ul ...>{indexes_li}</ul>"" File ~/code/xarray/xarray/core/formatting_html.py:156, in <genexpr>(.0) 154 def summarize_indexes(indexes): 155 indexes_li = """".join( --> 156 f""<li ...>{summarize_index(v, i)}</li>"" 157 for v, i in indexes.items() 158 ) 159 return f""<ul ...>{indexes_li}</ul>"" File ~/code/xarray/xarray/core/formatting_html.py:140, in summarize_index(coord_names, index) 138 index_id = f""index-{uuid.uuid4()}"" 139 preview = escape(inline_index_repr(index)) --> 140 details = short_index_repr_html(index) 142 data_icon = _icon(""icon-database"") 144 return ( 145 f""<...>{name}</...>"" 146 f""<...>{preview}</...>"" (...) 150 f""<...>{details}</...>"" 151 ) File ~/code/xarray/xarray/core/formatting_html.py:132, in short_index_repr_html(index) 129 if hasattr(index, ""_repr_html_""): 130 return index._repr_html_() --> 132 return f""<pre>{escape(repr(index))}</pre>
    "" File ~/code/xarray/xarray/core/indexes.py:547, in PandasIndex.__repr__(self) 546 def __repr__(self): --> 547 return f""PandasIndex({repr(self.index)})"" File ~/code/xarray/xarray/coding/cftimeindex.py:353, in CFTimeIndex.__repr__(self) 345 end_str = format_times( 346 self.values[-REPR_ELLIPSIS_SHOW_ITEMS_FRONT_END:], 347 display_width, 348 offset=offset, 349 first_row_offset=offset, 350 ) 351 datastr = ""\n"".join([front_str, f""{' '*offset}..."", end_str]) --> 353 attrs_str = format_attrs(self) 354 # oneliner only if smaller than display_width 355 full_repr_str = f""{klass_name}([{datastr}], {attrs_str})"" File ~/code/xarray/xarray/coding/cftimeindex.py:272, in format_attrs(index, separator) 267 def format_attrs(index, separator="", ""): 268 """"""Format attributes of CFTimeIndex for __repr__."""""" 269 attrs = { 270 ""dtype"": f""'{index.dtype}'"", 271 ""length"": f""{len(index)}"", --> 272 ""calendar"": f""'{index.calendar}'"", 273 ""freq"": f""'{index.freq}'"" if len(index) >= 3 else None, 274 } 276 attrs_str = [f""{k}={v}"" for k, v in attrs.items()] 277 attrs_str = f""{separator}"".join(attrs_str) File ~/code/xarray/xarray/coding/cftimeindex.py:698, in CFTimeIndex.calendar(self) 695 """"""The calendar used by the datetimes in the index."""""" 696 from .times import infer_calendar_name --> 698 return infer_calendar_name(self) File ~/code/xarray/xarray/coding/times.py:374, in infer_calendar_name(dates) 371 return sample.calendar 373 # Error raise if dtype is neither datetime or ""O"", if cftime is not importable, and if element of 'O' dtype is not cftime. --> 374 raise ValueError(""Array does not contain datetime objects."") ValueError: Array does not contain datetime objects. ``` ### Anything else we need to know? Bisected to 7379923de756a2bcc59044d548f8ab7a68b91d4e use `_repr_inline_` for indexes that define it. ### Environment
    ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7298/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 2070561434,PR_kwDOAMm_X85je1rK,8598,small string fixes,10194086,closed,0,,,1,2024-01-08T14:20:56Z,2024-01-08T16:59:27Z,2024-01-08T16:53:00Z,MEMBER,,0,pydata/xarray/pulls/8598," ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8598/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 1959175248,I_kwDOAMm_X850xqRQ,8367,`da.xindexes` or `da.indexes` raises an error if there are none (in the repr),10194086,closed,0,,,1,2023-10-24T12:45:12Z,2023-12-06T17:06:16Z,2023-12-06T17:06:16Z,MEMBER,,,,"### What happened? `da.xindexes` or `da.indexes` raises an error when trying to generate the repr if there are no coords (indexes) ### What did you expect to happen? Displaying an empty Mappable? ### Minimal Complete Verifiable Example ```Python xr.DataArray([3, 5]).indexes xr.DataArray([3, 5]).xindexes ``` ### MVCE confirmation - [x] Minimal example — the example is as focused as reasonably possible to demonstrate the underlying issue in xarray. - [x] Complete example — the example is self-contained, including all data and the text of any traceback. - [x] Verifiable example — the example copy & pastes into an IPython prompt or [Binder notebook](https://mybinder.org/v2/gh/pydata/xarray/main?urlpath=lab/tree/doc/examples/blank_template.ipynb), returning the result. - [x] New issue — a search of GitHub Issues suggests this is not a duplicate. - [x] Recent environment — the issue occurs with the latest version of xarray and its dependencies. 
### Relevant log output ```Python Out[9]: --------------------------------------------------------------------------- ValueError Traceback (most recent call last) File ~/.conda/envs/xarray_dev/lib/python3.10/site-packages/IPython/core/formatters.py:708, in PlainTextFormatter.__call__(self, obj) 701 stream = StringIO() 702 printer = pretty.RepresentationPrinter(stream, self.verbose, 703 self.max_width, self.newline, 704 max_seq_length=self.max_seq_length, 705 singleton_pprinters=self.singleton_printers, 706 type_pprinters=self.type_printers, 707 deferred_pprinters=self.deferred_printers) --> 708 printer.pretty(obj) 709 printer.flush() 710 return stream.getvalue() File ~/.conda/envs/xarray_dev/lib/python3.10/site-packages/IPython/lib/pretty.py:410, in RepresentationPrinter.pretty(self, obj) 407 return meth(obj, self, cycle) 408 if cls is not object \ 409 and callable(cls.__dict__.get('__repr__')): --> 410 return _repr_pprint(obj, self, cycle) 412 return _default_pprint(obj, self, cycle) 413 finally: File ~/.conda/envs/xarray_dev/lib/python3.10/site-packages/IPython/lib/pretty.py:778, in _repr_pprint(obj, p, cycle) 776 """"""A pprint that just redirects to the normal repr function."""""" 777 # Find newlines and replace them with p.break_() --> 778 output = repr(obj) 779 lines = output.splitlines() 780 with p.group(): File ~/code/xarray/xarray/core/indexes.py:1659, in Indexes.__repr__(self) 1657 def __repr__(self): 1658 indexes = formatting._get_indexes_dict(self) -> 1659 return formatting.indexes_repr(indexes) File ~/code/xarray/xarray/core/formatting.py:474, in indexes_repr(indexes, max_rows) 473 def indexes_repr(indexes, max_rows: int | None = None) -> str: --> 474 col_width = _calculate_col_width(chain.from_iterable(indexes)) 476 return _mapping_repr( 477 indexes, 478 ""Indexes"", (...) 482 max_rows=max_rows, 483 ) File ~/code/xarray/xarray/core/formatting.py:341, in _calculate_col_width(col_items) 340 def _calculate_col_width(col_items): --> 341 max_name_length = max(len(str(s)) for s in col_items) if col_items else 0 342 col_width = max(max_name_length, 7) + 6 343 return col_width ValueError: max() arg is an empty sequence ``` ### Anything else we need to know? _No response_ ### Environment
    INSTALLED VERSIONS ------------------ commit: ccc8f9987b553809fb6a40c52fa1a8a8095c8c5f python: 3.10.12 | packaged by conda-forge | (main, Jun 23 2023, 22:40:32) [GCC 12.3.0] python-bits: 64 OS: Linux OS-release: 6.2.0-35-generic machine: x86_64 processor: x86_64 byteorder: little LC_ALL: None LANG: en_US.UTF-8 LOCALE: ('en_US', 'UTF-8') libhdf5: 1.14.2 libnetcdf: 4.9.2 xarray: 2023.9.1.dev8+gf6d69a1f pandas: 2.1.1 numpy: 1.24.4 scipy: 1.11.3 netCDF4: 1.6.4 pydap: installed h5netcdf: 1.2.0 h5py: 3.9.0 Nio: None zarr: 2.16.1 cftime: 1.6.2 nc_time_axis: 1.4.1 PseudoNetCDF: 3.2.2 iris: 3.7.0 bottleneck: 1.3.7 dask: 2023.9.2 distributed: None matplotlib: 3.8.0 cartopy: 0.22.0 seaborn: 0.12.2 numbagg: 0.2.2 fsspec: 2023.9.2 cupy: None pint: 0.20.1 sparse: 0.14.0 flox: 0.7.2 numpy_groupies: 0.10.1 setuptools: 68.2.2 pip: 23.2.1 conda: None pytest: 7.4.2 mypy: 1.5.1 IPython: 8.15.0 sphinx: None
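As a rough sketch (not necessarily the fix that was merged), the helper in the last frame could tolerate an empty mapping by using the `default` argument of `max` instead of truth-testing the chain object:

```python
from itertools import chain

def _calculate_col_width(col_items):
    # col_items is a chain object above, so `if col_items` cannot detect
    # emptiness; `default=0` covers the no-indexes case instead
    max_name_length = max((len(str(s)) for s in col_items), default=0)
    return max(max_name_length, 7) + 6

print(_calculate_col_width(chain.from_iterable({})))  # 13 instead of a ValueError
```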
    ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8367/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 1917660013,PR_kwDOAMm_X85bc7Pv,8246,update pytest config and un-xfail some tests,10194086,closed,0,,,1,2023-09-28T14:21:58Z,2023-09-30T01:26:39Z,2023-09-30T01:26:35Z,MEMBER,,0,pydata/xarray/pulls/8246," - [ ] Towards #8239 - [ ] Tests added - [ ] User visible changes (including notable bug fixes) are documented in `whats-new.rst` - [ ] New functions/methods are listed in `api.rst` This partly updates the pytest config as suggested in #8239 and un-xfails some tests (or xfails the tests more precisely). See https://github.com/pydata/xarray/issues/8239#issuecomment-1739363809 for why we cannot exactly follow the suggestions given in #8239 ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8246/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 1371397741,I_kwDOAMm_X85Rvd5t,7027,"don't apply `weighted`, `groupby`, etc. to `DataArray` without `dims`?",10194086,open,0,,,1,2022-09-13T12:44:34Z,2023-08-26T19:13:39Z,,MEMBER,,,,"### What is your issue? Applying e.g. `ds.weighted(weights).mean()` applies the operation over all `DataArray` objects - even if they don't have the dimensions over which it is applied (or is a scalar variable). I don't think this is wanted. ```python import xarray as xr air = xr.tutorial.open_dataset(""air_temperature"") air.attrs = {} # add variable without dims air[""foo""] = 5 print(""resample"") print(air.resample(time=""MS"").mean(dim=""time"").foo.dims) print(""groupby"") print(air.groupby(""time.year"").mean(dim=""time"").foo.dims) print(""weighted"") print(air.weighted(weights=air.time.dt.year).mean(""lat"").foo.dims) print(""where"") print(air.where(air.air > 5).foo.dims) ``` Results ``` resample ('time',) groupby ('year',) weighted ('time',) ``` Related #6952 - I am sure there are other issues, but couldn't find them quickly... `rolling` and `coarsen` don't seem to do this. ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7027/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,issue 1466191758,PR_kwDOAMm_X85Dylku,7326,fix doctests: supress urllib3 warning,10194086,closed,0,,,1,2022-11-28T10:40:46Z,2022-12-05T20:11:16Z,2022-11-28T19:31:03Z,MEMBER,,0,pydata/xarray/pulls/7326," - [x] Closes #7322 ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7326/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 1118802352,PR_kwDOAMm_X84xzhTi,6212,better warning filter for assert_*,10194086,closed,0,,,1,2022-01-31T00:22:37Z,2022-09-05T07:52:09Z,2022-09-05T07:52:06Z,MEMBER,,0,pydata/xarray/pulls/6212,"In #4864 I added a a decorator for the `xarray.testing.assert_*` functions to ensure warnings that were to errors (`pytest.mark.filterwarnings(""error"")`) do not error in `assert_*` (see https://github.com/pydata/xarray/pull/4760#issuecomment-774101639). As a solution I added https://github.com/pydata/xarray/blob/5470d933452d88deb17cc9294a164c4a03f55dec/xarray/testing.py#L32 However, this is sub-optimal because this now removes all `ignore` filters! 
As dask stuff only gets evaluated in `assert_*` filters like `warnings.filterwarnings(""ignore"", ""Mean of empty slice"")` don't work for dask arrays! I thought of setting ```python warnings.simplefilter(""ignore"") ``` but this could suppress warnings we want to keep. So now I remove all `""error""` warning filters and keep the rest. Note that the original filters get restored after `with warnings.catch_warnings():`. (). --- I am not sure I expressed myself very clearly... let me know and I can try again. @keewis you had a look at #4864 maybe you can review this PR as well? ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/6212/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 1355361572,PR_kwDOAMm_X84-Brev,6966,enable pydap in tests again,10194086,closed,0,,,1,2022-08-30T08:18:07Z,2022-09-01T10:16:05Z,2022-09-01T10:16:03Z,MEMBER,,0,pydata/xarray/pulls/6966," #5844 excluded pydap from our tests - but the new version has been released in the meantime (on conda not on pypi, though, pydap/pydap#268) - so let's see if this still works.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/6966/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 685739084,MDU6SXNzdWU2ODU3MzkwODQ=,4375,allow using non-dimension coordinates in polyfit,10194086,open,0,,,1,2020-08-25T19:40:55Z,2022-04-09T02:58:48Z,,MEMBER,,,," `polyfit` currently only allows to fit along a dimension and not along a non-dimension coordinate (or a virtual coordinate) Example: ```python da = xr.DataArray( [1, 3, 2], dims=[""x""], coords=dict(x=[""a"", ""b"", ""c""], y=(""x"", [0, 1, 2])) ) print(da) da.polyfit(""y"", 1) ``` Output: ```python array([1, 3, 2]) Coordinates: * x (x) in 5 print(da) 6 ----> 7 da.polyfit(""y"", 1) ~/.conda/envs/ipcc_ar6/lib/python3.7/site-packages/xarray/core/dataarray.py in polyfit(self, dim, deg, skipna, rcond, w, full, cov) 3507 """""" 3508 return self._to_temp_dataset().polyfit( -> 3509 dim, deg, skipna=skipna, rcond=rcond, w=w, full=full, cov=cov 3510 ) 3511 ~/.conda/envs/ipcc_ar6/lib/python3.7/site-packages/xarray/core/dataset.py in polyfit(self, dim, deg, skipna, rcond, w, full, cov) 6005 skipna_da = skipna 6006 -> 6007 x = get_clean_interp_index(self, dim, strict=False) 6008 xname = ""{}_"".format(self[dim].name) 6009 order = int(deg) + 1 ~/.conda/envs/ipcc_ar6/lib/python3.7/site-packages/xarray/core/missing.py in get_clean_interp_index(arr, dim, use_coordinate, strict) 246 247 if use_coordinate is True: --> 248 index = arr.get_index(dim) 249 250 else: # string ~/.conda/envs/ipcc_ar6/lib/python3.7/site-packages/xarray/core/common.py in get_index(self, key) 378 """""" 379 if key not in self.dims: --> 380 raise KeyError(key) 381 382 try: KeyError: 'y' ``` **Describe the solution you'd like** Would be nice if that worked. **Describe alternatives you've considered** One could just set the non-dimension coordinate as index, e.g.: `da = da.set_index(x=""y"")` **Additional context** Allowing this *may* be as easy as replacing https://github.com/pydata/xarray/blob/9c85dd5f792805bea319f01f08ee51b83bde0f3b/xarray/core/missing.py#L248 by ``` index = arr[dim] ``` but I might be missing something. Or probably a `use_coordinate` must be threaded through to `get_clean_interp_index` (although I am a bit confused by this argument). 
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/4375/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,issue 1150251120,I_kwDOAMm_X85Ej3Bw,6304,add join argument to xr.broadcast?,10194086,open,0,,,1,2022-02-25T09:52:14Z,2022-02-25T21:50:16Z,,MEMBER,,,,"### Is your feature request related to a problem? `xr.broadcast` always does an outer join: https://github.com/pydata/xarray/blob/de965f342e1c9c5de92ab135fbc4062e21e72453/xarray/core/alignment.py#L702 https://github.com/pydata/xarray/blob/de965f342e1c9c5de92ab135fbc4062e21e72453/xarray/core/alignment.py#L768 This is not how the (default) broadcasting (arithmetic join) works, e.g. the following first does an inner join and then broadcasts: ```python import xarray as xr da1 = xr.DataArray([[0, 1, 2]], dims=(""y"", ""x""), coords={""x"": [0, 1, 2]}) da2 = xr.DataArray([0, 1, 2, 3, 4], dims=""x"", coords={""x"": [0, 1, 2, 3, 4]}) da1 + da2 ``` ``` array([[0, 2, 4]]) Coordinates: * x (x) int64 0 1 2 Dimensions without coordinates: y ``` ### Describe the solution you'd like Add a `join` argument to `xr.broadcast`. I would propose to leave the default as is ```python def broadcast(*args, exclude=None, join=""outer""): args = align(*args, join=join, copy=False, exclude=exclude) ``` ### Describe alternatives you've considered - We could make `broadcast` respect `options -> arithmetic_join` but that would be a breaking change and I am not sure how the deprecation should/ would be handled... - We could leave it as is. ### Additional context - `xr.broadcast` should not be used often because this is should happen automatically in most cases - in #6059 I use `broadcast` because I couldn't get it to work otherwise (maybe there is a better way?). However, the ""outer elements"" are immediately discarded again - so it's kind of pointless to do an outer join. ```python import numpy as np import xarray as xr da = xr.DataArray(np.arange(6).reshape(3, 2), coords={""dim_0"": [0, 1, 2]}) w = xr.DataArray([1, 1, 1, 1, 1, 1], coords={""dim_0"": [0, 1, 2, 4, 5, 6]}) da.weighted(w).quantile(0.5) ```","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/6304/reactions"", ""total_count"": 4, ""+1"": 4, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,issue 1125661464,PR_kwDOAMm_X84yJ3Rz,6248,test bottleneck master in upstream CI [test-upstream] [skip-ci],10194086,closed,0,,,1,2022-02-07T08:25:35Z,2022-02-07T09:05:28Z,2022-02-07T09:05:24Z,MEMBER,,0,pydata/xarray/pulls/6248," - [x] Closes #6186 pydata/bottleneck#378 was merged - so this should work again.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/6248/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 1118836906,PR_kwDOAMm_X84xzojx,6213,fix or suppress test warnings,10194086,closed,0,,,1,2022-01-31T01:34:20Z,2022-02-01T09:40:15Z,2022-02-01T09:40:11Z,MEMBER,,0,pydata/xarray/pulls/6213,"Fixes or suppresses a number of warnings that turn up in our upstream CI. `pd.Index.is_monotonic` is an alias for `pd.Index.is_monotonic_increasing` and does _not_ stand for `pd.Index.is_monotonic_increasing or pd.Index.is_monotonic_decreasing`. 
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/6213/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 1088419434,PR_kwDOAMm_X84wQ-nD,6107,is_dask_collection: micro optimization,10194086,closed,0,,,1,2021-12-24T15:04:42Z,2022-01-26T08:41:28Z,2021-12-29T16:27:55Z,MEMBER,,0,pydata/xarray/pulls/6107,"In #6096 I realized that `DuckArrayModule(""dask"")` is called a lot in our tests - 145'835 times. Most of those are from `is_dask_collection` (`is_duck_dask_array`) This change avoids that the instance needs to be built every time. ```python import xarray as xr %timeit xr.core.pycompat.DuckArrayModule(""dask"").available %timeit xr.core.pycompat.dsk.available ``` ``` 18.9 µs ± 97.7 ns per loop (mean ± std. dev. of 7 runs, 100000 loops each) 77.1 ns ± 1.22 ns per loop (mean ± std. dev. of 7 runs, 10000000 loops each) ``` Which leads to an incredible speed up of our tests of about 2.7 s :grin: ((18.9 - 0.0771) * 145835 / 1e6).","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/6107/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 1107431006,PR_kwDOAMm_X84xOsZX,6171,unpin dask again,10194086,closed,0,,,1,2022-01-18T22:37:31Z,2022-01-26T08:41:02Z,2022-01-18T23:39:12Z,MEMBER,,0,pydata/xarray/pulls/6171," - dask 2022.01 is out, so we can remove the pin","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/6171/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 877166445,MDExOlB1bGxSZXF1ZXN0NjMxMTcwNzI4,5265,Warn ignored keep attrs,10194086,closed,0,,,1,2021-05-06T07:20:16Z,2021-10-18T14:06:37Z,2021-05-06T16:31:05Z,MEMBER,,0,pydata/xarray/pulls/5265," - [x] Part of #4513 - [x] Tests added - [x] Passes `pre-commit run --all-files` - [x] User visible changes (including notable bug fixes) are documented in `whats-new.rst` This PR warns when passing ` keep_attrs` to `resample(..., keep_attrs=True)` and `rolling_exp(..., keep_attrs=True)` as they have no effect (rightfully). 
Also removes `keep_attrs` from the docstring of `resample`.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/5265/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 794344392,MDExOlB1bGxSZXF1ZXN0NTYxODc2OTg5,4845,iris update doc url,10194086,closed,0,,,1,2021-01-26T15:51:18Z,2021-10-18T14:06:31Z,2021-01-26T17:30:20Z,MEMBER,,0,pydata/xarray/pulls/4845," iris moved its documentation to https://scitools-iris.readthedocs.io/en/stable/","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/4845/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 684430261,MDExOlB1bGxSZXF1ZXN0NDcyMzE4MzUw,4371,mention all ignored flake8 errors,10194086,closed,0,,,1,2020-08-24T07:17:03Z,2021-10-18T14:06:18Z,2020-08-24T10:45:05Z,MEMBER,,0,pydata/xarray/pulls/4371,"and put the comment on the same line ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/4371/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 913958248,MDExOlB1bGxSZXF1ZXN0NjYzOTE2NDQw,5451,Silence some test warnings,10194086,closed,0,,,1,2021-06-07T21:12:50Z,2021-06-09T17:55:48Z,2021-06-09T17:27:21Z,MEMBER,,0,pydata/xarray/pulls/5451," Silences a number of warnings that accumulated in our test suite (c.f. #3266). The changes are mostly unrelated but small. ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/5451/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 819884612,MDExOlB1bGxSZXF1ZXN0NTgyOTE5ODUy,4982,pin netCDF4=1.5.3 in min-all-deps,10194086,closed,0,,,1,2021-03-02T10:36:18Z,2021-03-08T09:10:20Z,2021-03-08T00:20:38Z,MEMBER,,0,pydata/xarray/pulls/4982," - [x] Closes #4970 The clean thing here would be to update `min_deps_check.py` so it works properly for this case. I am not sure it's really worth it.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/4982/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 817951965,MDU6SXNzdWU4MTc5NTE5NjU=,4970,minimum version and non-semantic versioning (netCDF4),10194086,closed,0,,,1,2021-02-27T15:33:48Z,2021-03-08T00:20:38Z,2021-03-08T00:20:38Z,MEMBER,,,,"We currently pin netCDF4 to [version 1.5](https://github.com/pydata/xarray/blob/48378c4b11c5c2672ff91396d4284743165b4fbe/ci/requirements/py37-min-all-deps.yml#L28). However, I think netCDF4 does not really follow semantic versioning, e.g. python 2 support was dropped in version 1.5.6. So they may actually be doing something like `1.major.minor[.patch]` - I asked about their versioning scheme in Unidata/netcdf4-python#1090. 
So I wonder if we would need to pin netCDF to version to version 1.5.4.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/4970/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 814806676,MDU6SXNzdWU4MTQ4MDY2NzY=,4945,Upstream CI failing silently,10194086,closed,0,,,1,2021-02-23T20:30:29Z,2021-02-24T08:14:00Z,2021-02-24T08:14:00Z,MEMBER,,,,"The last 5 days our Upstream CI failed silently with a timeout after 6h: https://github.com/pydata/xarray/actions/workflows/upstream-dev-ci.yaml?query=branch%3Amaster+event%3Aschedule This was probably caused by #4934. As mentioned in dask/dask#4934 this is probably dask/dask#6738 which was merged 5 days ago.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/4945/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 811379942,MDExOlB1bGxSZXF1ZXN0NTc1OTE3NjYz,4923,[skip-ci] doc: fix pynio warning,10194086,closed,0,,,1,2021-02-18T19:09:00Z,2021-02-18T19:23:23Z,2021-02-18T19:23:20Z,MEMBER,,0,pydata/xarray/pulls/4923," Small doc fix, see http://xarray.pydata.org/en/stable/io.html#formats-supported-by-pynio (the `..note::` did not get picked up) ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/4923/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 803760517,MDExOlB1bGxSZXF1ZXN0NTY5NjM1MTIx,4883,update pre-commit hooks (mypy),10194086,closed,0,,,1,2021-02-08T17:13:07Z,2021-02-08T17:47:18Z,2021-02-08T17:42:41Z,MEMBER,,0,pydata/xarray/pulls/4883," mypy v0.800 was ignoring our config (https://github.com/pydata/xarray/pull/4874#issuecomment-774771523)... Adding an (empty) `[mypy]` section to `setup.cfg` seems to do the trick, this is in contrast to the [documentation](https://mypy.readthedocs.io/en/stable/config_file.html#config-file-format) and may get changed again (c.f. python/mypy#9940), but it does not hurt either. @keewis ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/4883/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 778022297,MDExOlB1bGxSZXF1ZXN0NTQ4MTg2MzYw,4759,coords: retain str dtype,10194086,closed,0,,,1,2021-01-04T11:17:53Z,2021-01-13T17:18:28Z,2021-01-13T17:09:06Z,MEMBER,,0,pydata/xarray/pulls/4759," - [x] Closes #2658, closes #4543 - [x] Tests added - [x] Passes `isort . && black . && mypy . && flake8` - [x] User visible changes (including notable bug fixes) are documented in `whats-new.rst` `pd.Index(""a"")` has dtype `object`. Therefore string coords change their dtype on certain operations - e.g. `align`, `__setitem__` (& `assign`), `IndexVariable.concat`. This can be avoided by using the coords instead of the index in some cases but in two instances it was unavoidable to cast a `pd.Index` back to a `np.array`. I probably did not catch all of these conversions. What I am not sure: does this somehow contradict the index refactor? 
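For context, the dtype round-trip at the heart of this, in plain numpy/pandas (illustration only):

```python
import numpy as np
import pandas as pd

arr = np.array(['a', 'b', 'c'])
idx = pd.Index(arr)
print(arr.dtype)              # <U1 - fixed-width unicode
print(idx.dtype)              # object - pandas stores strings as object dtype
print(np.asarray(idx).dtype)  # object - the str dtype is lost on the round-trip
```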
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/4759/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 766450436,MDU6SXNzdWU3NjY0NTA0MzY=,4690,Linux py36-bare-minimum will likely fail,10194086,closed,0,,,1,2020-12-14T13:41:56Z,2020-12-14T19:47:59Z,2020-12-14T19:47:59Z,MEMBER,,,,"The Linux py36-bare-minimum will likely fail with a `TypeError: 'NoneType' object is not callable` due to python/importlib_metadata#270 This could be fixed by adding `typing_extensions` to the CI or maybe they fix that upstream.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/4690/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 745628177,MDExOlB1bGxSZXF1ZXN0NTIzMTU5MTYz,4590,suppress 'ambiguous reference date string' warning,10194086,closed,0,,,1,2020-11-18T12:44:52Z,2020-12-13T13:16:55Z,2020-11-19T16:46:39Z,MEMBER,,0,pydata/xarray/pulls/4590," Follow up to #4506 Suppress some `""Ambiguous reference date string""` warnings. See e.g.: https://dev.azure.com/xarray/xarray/_build/results?buildId=4233&view=logs&j=2280efed-fda1-53bd-9213-1fa8ec9b4fa8&t=175181ee-1928-5a6b-f537-168f7a8b7c2d&l=361 ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/4590/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 752959867,MDExOlB1bGxSZXF1ZXN0NTI5MTM5MjQ1,4617,weighted: de-parameterize tests,10194086,closed,0,,,1,2020-11-29T17:01:36Z,2020-12-01T09:06:34Z,2020-12-01T09:06:31Z,MEMBER,,0,pydata/xarray/pulls/4617,I feel I overdid it a bit with the parameterizations of the weighted tests. This brings the number of tests of `test_weighted.py` from about 1000 down to less than 200 without changing the coverage.,"{""url"": ""https://api.github.com/repos/pydata/xarray/issues/4617/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 749724093,MDExOlB1bGxSZXF1ZXN0NTI2NDkzNTUy,4606,update sphinx to v3.3,10194086,closed,0,,,1,2020-11-24T13:51:31Z,2020-11-24T15:02:00Z,2020-11-24T14:52:10Z,MEMBER,,0,pydata/xarray/pulls/4606," - [x] Closes #4487 Thanks for the pointer @keewis. Let's see if this works.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/4606/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 742479353,MDExOlB1bGxSZXF1ZXN0NTIwNjA1NDUy,4581,update mypy to 0.790,10194086,closed,0,,,1,2020-11-13T14:14:51Z,2020-11-13T20:09:17Z,2020-11-13T19:38:06Z,MEMBER,,0,pydata/xarray/pulls/4581," - [x] Closes #4571 ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/4581/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 738893294,MDExOlB1bGxSZXF1ZXN0NTE3NjUwMjk1,4568,pd.Index: replace set operations,10194086,closed,0,,,1,2020-11-09T10:18:32Z,2020-11-09T21:26:44Z,2020-11-09T19:08:36Z,MEMBER,,0,pydata/xarray/pulls/4568," - [x] Closes #4565 - [x] Passes `isort . && black . && mypy . 
&& flake8` - [x] User visible changes (including notable bug fixes) are documented in `whats-new.rst` Pandas will change `pd.Index.__or__` and `pd.Index.__and__`. Use `pd.Index.union` and `pd.Index.intersection` inseated. There should be no change in functionality. ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/4568/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 731209147,MDExOlB1bGxSZXF1ZXN0NTExMzYxMTYz,4546,maint: pandas can now index with np.timedelta64,10194086,closed,0,,,1,2020-10-28T08:11:20Z,2020-10-30T14:04:20Z,2020-10-30T10:32:50Z,MEMBER,,0,pydata/xarray/pulls/4546,"pandas-dev/pandas#20393 was resolved in pandas version 0.23, so this should no longer be necessary. The relevant test is https://github.com/pydata/xarray/blob/063606b90946d869e90a6273e2e18ed24bffb052/xarray/tests/test_dataarray.py#L955-L957 so let's see if this passes","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/4546/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 724019666,MDExOlB1bGxSZXF1ZXN0NTA1NDgxOTU1,4520,doc.yml: pin eccodes,10194086,closed,0,,,1,2020-10-18T14:30:04Z,2020-10-18T17:48:04Z,2020-10-18T15:29:47Z,MEMBER,,0,pydata/xarray/pulls/4520," Fixes the failing doc builds, see #4521 ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/4520/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 718869334,MDU6SXNzdWU3MTg4NjkzMzQ=,4502,dask dev upstream test failiures,10194086,closed,0,,,1,2020-10-11T16:12:13Z,2020-10-16T12:17:58Z,2020-10-16T12:17:57Z,MEMBER,,,,"Our upstream tests fail due to a change in dask. The likely culprit is dask/dask#6680 - `dask.array.zeros_like` does now take the `meta` keyword and thus it tries to coerce `str` to a `bool` which fails (and it didn't do before). 
The following should trigger the error: ```python import xarray as xr import dask da = xr.DataArray(dask.array.array("""")) xr.zeros_like(da, dtype=bool) ``` Note that `zeros_like` is called in `isnull` (which triggered the test failures): https://github.com/pydata/xarray/blob/080caf4246fe2f4d6aa0c5dcb65a99b376fa669b/xarray/core/duck_array_ops.py#L100-L102 **What happened**: ```python /home/vsts/work/1/s/xarray/tests/test_duck_array_ops.py:499: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /home/vsts/work/1/s/xarray/testing.py:139: in compat_variable return a.dims == b.dims and (a._data is b._data or equiv(a.data, b.data)) /home/vsts/work/1/s/xarray/testing.py:36: in _data_allclose_or_equiv return duck_array_ops.array_equiv(arr1, arr2) /home/vsts/work/1/s/xarray/core/duck_array_ops.py:246: in array_equiv flag_array = (arr1 == arr2) | (isnull(arr1) & isnull(arr2)) /home/vsts/work/1/s/xarray/core/duck_array_ops.py:104: in isnull return zeros_like(data, dtype=bool) /home/vsts/work/1/s/xarray/core/duck_array_ops.py:56: in f return wrapped(*args, **kwargs) /usr/share/miniconda/envs/xarray-tests/lib/python3.8/site-packages/dask/array/creation.py:174: in zeros_like return zeros( /usr/share/miniconda/envs/xarray-tests/lib/python3.8/site-packages/dask/array/wrap.py:78: in wrap_func_shape_as_first_arg return Array(dsk, name, chunks, dtype=dtype, meta=kwargs.get(""meta"", None)) /usr/share/miniconda/envs/xarray-tests/lib/python3.8/site-packages/dask/array/core.py:1083: in __new__ meta = meta_from_array(meta, dtype=dtype) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ x = array('', dtype=' meta = meta.astype(dtype) E ValueError: invalid literal for int() with base 10: '' ``` ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/4502/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 718926885,MDExOlB1bGxSZXF1ZXN0NTAxMjA2Mjk5,4503,combine_by_coords: error on differing calendars,10194086,closed,0,,,1,2020-10-11T21:06:37Z,2020-10-13T07:03:37Z,2020-10-13T07:03:24Z,MEMBER,,0,pydata/xarray/pulls/4503," - [x] Closes #4495 - [x] Tests added - [x] Passes `isort . && black . && mypy . && flake8` - [x] User visible changes (including notable bug fixes) are documented in `whats-new.rst` - [ ] New functions/methods are listed in `api.rst` Let's see if all tests pass or if the `numeric_only` keyword is new in pandas.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/4503/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 715012735,MDExOlB1bGxSZXF1ZXN0NDk3OTk0MDM5,4488,unpin matplotlib for docs again,10194086,closed,0,,,1,2020-10-05T17:06:07Z,2020-10-06T14:43:54Z,2020-10-06T14:43:50Z,MEMBER,,0,pydata/xarray/pulls/4488," - [x] Closes #4342 mpl 3.3.2 is out so we can try to unpin the version of the docs again. 
#4313","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/4488/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 681627420,MDU6SXNzdWU2ODE2Mjc0MjA=,4352,da.sum(min_count=1) errors for integer data,10194086,closed,0,,,1,2020-08-19T07:52:35Z,2020-10-02T09:28:27Z,2020-10-02T09:28:27Z,MEMBER,,,," **What happened**: `da.sum(min_count=1)` returns a `TypeError` if `da` has an integer dtype. Of course min_count is not necessary for integer data as it cannot contain `NaN`. **What you expected to happen**: `min_count` should be ignored **Minimal Complete Verifiable Example**: ```python import xarray as xr da = xr.DataArray([[1, 2, 3], [4, 5, 6]]) da.sum(min_count=1) ``` **Anything else we need to know?**: Full traceback
    ```python In [37]: da.sum(min_count=1) --------------------------------------------------------------------------- TypeError Traceback (most recent call last) in ----> 1 da.sum(min_count=1) ~/code/xarray/xarray/core/common.py in wrapped_func(self, dim, axis, skipna, **kwargs) 44 45 def wrapped_func(self, dim=None, axis=None, skipna=None, **kwargs): ---> 46 return self.reduce(func, dim, axis, skipna=skipna, **kwargs) 47 48 else: ~/code/xarray/xarray/core/dataarray.py in reduce(self, func, dim, axis, keep_attrs, keepdims, **kwargs) 2336 """""" 2337 -> 2338 var = self.variable.reduce(func, dim, axis, keep_attrs, keepdims, **kwargs) 2339 return self._replace_maybe_drop_dims(var) 2340 ~/code/xarray/xarray/core/variable.py in reduce(self, func, dim, axis, keep_attrs, keepdims, allow_lazy, **kwargs) 1591 data = func(input_data, axis=axis, **kwargs) 1592 else: -> 1593 data = func(input_data, **kwargs) 1594 1595 if getattr(data, ""shape"", ()) == self.shape: ~/code/xarray/xarray/core/duck_array_ops.py in f(values, axis, skipna, **kwargs) 310 311 try: --> 312 return func(values, axis=axis, **kwargs) 313 except AttributeError: 314 if not isinstance(values, dask_array_type): ~/code/xarray/xarray/core/duck_array_ops.py in f(*args, **kwargs) 46 else: 47 wrapped = getattr(eager_module, name) ---> 48 return wrapped(*args, **kwargs) 49 50 else: <__array_function__ internals> in sum(*args, **kwargs) TypeError: _sum_dispatcher() got an unexpected keyword argument 'min_count' ```
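What I would expect instead, as a rough sketch (hypothetical helper, not the actual reduction code in `duck_array_ops`): drop `min_count` before delegating when the dtype cannot hold NaN anyway.

```python
import numpy as np

def sum_with_min_count(values, min_count=None, axis=None):
    # hypothetical sketch: integer and boolean arrays cannot hold NaN, so
    # min_count has no meaning there and should be dropped, not forwarded
    if values.dtype.kind in 'ibu':
        min_count = None
    if min_count is None:
        return np.sum(values, axis=axis)
    result = np.nansum(values, axis=axis)
    n_valid = np.sum(~np.isnan(values), axis=axis)
    return np.where(n_valid >= min_count, result, np.nan)

print(sum_with_min_count(np.array([[1, 2, 3], [4, 5, 6]]), min_count=1))  # 21, no TypeError
```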
    **Environment**:
    Output of xr.show_versions() INSTALLED VERSIONS ------------------ commit: a7fb5a9fa1a2b829181ea9e4986b959f315350dd python: 3.7.3 | packaged by conda-forge | (default, Jul 1 2019, 21:52:21) [GCC 7.3.0] python-bits: 64 OS: Linux OS-release: 5.4.0-42-generic machine: x86_64 processor: x86_64 byteorder: little LC_ALL: None LANG: en_US.UTF-8 LOCALE: en_US.UTF-8 libhdf5: 1.10.6 libnetcdf: 4.7.4 xarray: 0.15.2.dev64+g2542a63f pandas: 0.25.3 numpy: 1.17.3 scipy: 1.3.1 netCDF4: 1.5.4 pydap: installed h5netcdf: 0.7.4 h5py: 2.10.0 Nio: None zarr: 2.3.2 cftime: 1.0.4.2 nc_time_axis: None PseudoNetCDF: installed rasterio: 1.1.0 cfgrib: 0.9.5.4 iris: None bottleneck: 1.2.1 dask: 2.6.0 distributed: 2.6.0 matplotlib: 3.3.1 cartopy: 0.18.0 seaborn: 0.9.0 numbagg: None pint: None setuptools: 49.6.0.post20200814 pip: 19.3.1 conda: None pytest: 5.2.2 IPython: 7.17.0 sphinx: None
    ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/4352/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 679852412,MDExOlB1bGxSZXF1ZXN0NDY4NTE1NTMz,4346,annotate concat,10194086,closed,0,,,1,2020-08-16T23:52:14Z,2020-08-19T20:32:42Z,2020-08-19T20:32:37Z,MEMBER,,0,pydata/xarray/pulls/4346," - [x] Closes #4238 - [x] Passes `isort . && black . && mypy . && flake8` - [x] User visible changes (including notable bug fixes) are documented in `whats-new.rst` The issue mentions `xr.Dataset.to_dataframe` and `xr.concat`. `xr.Dataset.to_dataframe` was [annotated](https://github.com/pydata/xarray/blob/26547d19d477cc77461c09b3aadd55f7eb8b4dbf/xarray/core/dataset.py#L4566) in #4333, so only `xr.concat` is left. ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/4346/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 592629388,MDU6SXNzdWU1OTI2MjkzODg=,3927,use level of a MultiIndex for plotting?,10194086,closed,0,,,1,2020-04-02T13:24:05Z,2020-05-25T16:32:15Z,2020-05-25T16:32:15Z,MEMBER,,,,"It would be nice to be able to use a level of a MultiIndex for plotting. #### MCVE Code Sample ```python import numpy as np import xarray as xr da = xr.DataArray( np.random.randn(10), dims=""x"", coords=dict( a=(""x"", np.arange(10, 20)), b=(""x"", np.arange(1, 11)) ) ) da = da.set_index(x=[""a"", ""b""]) da ``` This creates the following DataArray ```python array([-1.34516338, 0.97644817, -0.24032189, -0.70112418, -0.8686898 , -0.55607078, 0.56618151, 1.62847463, 0.84947296, -0.5775504 ]) Coordinates: * x (x) MultiIndex - a (x) int64 10 11 12 13 14 15 16 17 18 19 - b (x) int64 1 2 3 4 5 6 7 8 9 10 ``` Is there a way to plot a line using one of the levels of the MultiIindex? ```python da.plot(x=""a"") ``` returns ```python ValueError: x must be either None or one of ('x') ``` ```python da.plot() ``` returns ```python TypeError: Plotting requires coordinates to be numeric or dates of type np.datetime64, datetime.datetime, cftime.datetime or pd.Interval. ``` (which makes sense). If `da` is a 2D Variable the error is ```python ValueError: x and y must be coordinate variables ``` #### Expected Output A line plot #### Versions
    Output of `xr.show_versions()` INSTALLED VERSIONS ------------------ commit: b3bafeefbd6e6d70bce505ae1f0d9d5a2b015089 python: 3.7.3 | packaged by conda-forge | (default, Jul 1 2019, 21:52:21) [GCC 7.3.0] python-bits: 64 OS: Linux OS-release: 4.15.0-91-generic machine: x86_64 processor: x86_64 byteorder: little LC_ALL: None LANG: en_US.UTF-8 LOCALE: en_US.UTF-8 libhdf5: 1.10.5 libnetcdf: 4.7.1 xarray: 0.11.1+335.gb0c336f6 pandas: 0.25.3 numpy: 1.17.3 scipy: 1.3.1 netCDF4: 1.5.3 pydap: installed h5netcdf: 0.7.4 h5py: 2.10.0 Nio: None zarr: 2.3.2 cftime: 1.0.4.2 nc_time_axis: None PseudoNetCDF: installed rasterio: 1.1.0 cfgrib: 0.9.5.4 iris: None bottleneck: 1.2.1 dask: 2.6.0 distributed: 2.6.0 matplotlib: 3.1.2 cartopy: 0.17.0 seaborn: 0.9.0 numbagg: None pint: None setuptools: 41.6.0.post20191101 pip: 19.3.1 conda: None pytest: 5.2.2 IPython: 7.9.0 sphinx: None
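A workaround that sidesteps the check entirely (illustration only, not a fix): the levels are still reachable as coordinates, so pull the values out and plot them by hand.

```python
import matplotlib.pyplot as plt
import numpy as np
import xarray as xr

da = xr.DataArray(
    np.random.randn(10),
    dims='x',
    coords=dict(a=('x', np.arange(10, 20)), b=('x', np.arange(1, 11))),
)
da = da.set_index(x=['a', 'b'])

# 'a' is a level of the MultiIndex but still accessible as a coordinate
plt.plot(da['a'].values, da.values)
plt.xlabel('a')
plt.show()
```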
    ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/3927/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 539010474,MDU6SXNzdWU1MzkwMTA0NzQ=,3634,"""ValueError: Percentiles must be in the range [0, 100]""",10194086,closed,0,,,1,2019-12-17T11:34:35Z,2019-12-17T13:50:06Z,2019-12-17T13:50:06Z,MEMBER,,,,"#### MCVE Code Sample ```python import xarray as xr da = xr.DataArray([0, 1, 2]) da.quantile(q=50) >>> ValueError: Percentiles must be in the range [0, 100] ``` #### Expected Output ```python ValueError: Quantiles must be in the range [0, 1] ``` #### Problem Description By wrapping `np.nanpercentile` (xref: #3559) we also get the numpy error. However, the error message is wrong as xarray needs it to be in 0..1. BTW: thanks for #3559, makes my life easier! #### Output of ``xr.show_versions()`` --- Edit: uses `nanpercentile` internally.
    INSTALLED VERSIONS ------------------ commit: None python: 3.7.3 | packaged by conda-forge | (default, Jul 1 2019, 21:52:21) [GCC 7.3.0] python-bits: 64 OS: Linux OS-release: 4.12.14-lp151.28.36-default machine: x86_64 processor: x86_64 byteorder: little LC_ALL: None LANG: en_GB.UTF-8 LOCALE: en_US.UTF-8 libhdf5: 1.10.5 libnetcdf: 4.7.1 xarray: 0.14.1+28.gf2b2f9f6 (current master) pandas: 0.25.2 numpy: 1.17.3 scipy: 1.3.1 netCDF4: 1.5.3 pydap: None h5netcdf: 0.7.4 h5py: 2.10.0 Nio: None zarr: None cftime: 1.0.4.2 nc_time_axis: 1.2.0 PseudoNetCDF: None rasterio: 1.1.1 cfgrib: None iris: None bottleneck: 1.2.1 dask: 2.6.0 distributed: 2.6.0 matplotlib: 3.1.2 cartopy: 0.17.0 seaborn: 0.9.0 numbagg: None setuptools: 41.4.0 pip: 19.3.1 conda: None pytest: 5.2.2 IPython: 7.9.0 sphinx: 2.2.1
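A minimal sketch of how the message could be raised in xarray terms before handing off to numpy (hypothetical helper, not the actual patch):

```python
import numpy as np

def nanquantile(values, q, axis=None):
    q = np.asarray(q, dtype=float)
    # validate against the xarray convention (0..1) before scaling to percentiles
    if np.any(q < 0) or np.any(q > 1):
        raise ValueError('Quantiles must be in the range [0, 1]')
    return np.nanpercentile(values, q * 100, axis=axis)

print(nanquantile(np.array([0, 1, 2]), 0.5))  # 1.0
# nanquantile(np.array([0, 1, 2]), 50)        -> ValueError: Quantiles must be in the range [0, 1]
```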
    ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/3634/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 490229690,MDU6SXNzdWU0OTAyMjk2OTA=,3284,specifying list of colors does not work for me,10194086,closed,0,,,1,2019-09-06T09:36:08Z,2019-09-09T18:31:16Z,2019-09-09T18:31:16Z,MEMBER,,,,"#### MCVE Code Sample ```python import xarray as xr import numpy as np airtemps = xr.tutorial.load_dataset('air_temperature') air = airtemps.air.isel(time=0) levels = np.arange(225, 301, 25) colors = ['#ffffb2', '#fecc5c', '#fd8d3c', '#e31a1c'] # this does not work for me air.plot.pcolormesh(levels=levels, colors=colors) ``` #### Expected Output Should create a plot with the specified colors. According to the the docstring this should work. Or maybe I am doing something wrong here? #### Problem Description Instead I get the following error: ```python /usr/local/Miniconda3-envs/envs/2019/envs/iacpy3_2019/lib/python3.7/site-packages/xarray/plot/utils.py in _process_cmap_cbar_kwargs(func, kwargs, data) 683 # colors is only valid when levels is supplied or the plot is of type 684 # contour or contourf --> 685 if colors and (('contour' not in func.__name__) and (not levels)): 686 raise ValueError(""Can only specify colors with contour or levels"") 687 ValueError: The truth value of an array with more than one element is ambiguous. Use a.any() or a.all() ``` Instead I use the following, which works: ```python air.plot.pcolormesh(levels=levels, cmap=colors) ``` #### Output of ``xr.show_versions()``
    INSTALLED VERSIONS ------------------ commit: None python: 3.7.1 | packaged by conda-forge | (default, Feb 18 2019, 01:42:00) [GCC 7.3.0] python-bits: 64 OS: Linux OS-release: 4.12.14-lp151.28.13-default machine: x86_64 processor: x86_64 byteorder: little LC_ALL: None LANG: en_GB.UTF-8 LOCALE: en_US.UTF-8 libhdf5: 1.10.4 libnetcdf: 4.6.2 xarray: 0.12.1 pandas: 0.24.2 numpy: 1.16.2 scipy: 1.2.1 netCDF4: 1.5.0.1 pydap: installed h5netcdf: 0.7.1 h5py: 2.9.0 Nio: None zarr: 2.3.1 cftime: 1.0.3.4 nc_time_axis: 1.2.0 PseudonetCDF: None rasterio: 1.0.22 cfgrib: 0.9.7 iris: 2.2.0 bottleneck: 1.2.1 dask: 1.1.5 distributed: 1.26.1 matplotlib: 3.0.3 cartopy: 0.17.0 seaborn: 0.9.0 setuptools: 41.0.0 pip: 19.0.3 conda: None pytest: 4.4.0 IPython: 7.4.0 sphinx: 2.0.1
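For what it is worth, the ambiguity comes from truth-testing the `levels` array with `not levels`; a sketch of the kind of check that avoids it (not the actual fix):

```python
import numpy as np

def check_colors(func_name, colors=None, levels=None):
    # compare against None instead of truth-testing, so that a numpy array of
    # levels (whose truth value is ambiguous) does not break the check
    if colors is not None and 'contour' not in func_name and levels is None:
        raise ValueError('Can only specify colors with contour or levels')

# passes: levels are given, so specifying colors is fine
check_colors('pcolormesh', colors=['#ffffb2', '#fecc5c'], levels=np.arange(225, 301, 25))
```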
    ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/3284/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 84105613,MDExOlB1bGxSZXF1ZXN0MzY3NTA1NTM=,421,allow 'nearest' slice indexing,10194086,closed,0,,,1,2015-06-02T16:56:45Z,2015-06-12T22:10:03Z,2015-06-12T22:10:03Z,MEMBER,,0,pydata/xarray/pulls/421,"see #418 ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/421/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 67343193,MDU6SXNzdWU2NzM0MzE5Mw==,387,Accessing virtual_variables,10194086,closed,0,,,1,2015-04-09T11:41:08Z,2015-04-09T13:45:29Z,2015-04-09T13:45:29Z,MEMBER,,,,"Virtual variables such as time.date can not be accessed by dot notation. ``` In [81]: ts.time.date --------------------------------------------------------------------------- AttributeError Traceback (most recent call last) in () ----> 1 ts.time.date /home/mathause/.local/lib/python2.7/site-packages/xray/core/common.pyc in __getattr__(self, name) 115 pass 116 raise AttributeError(""%r object has no attribute %r"" % --> 117 (type(self).__name__, name)) 118 119 def __dir__(self): AttributeError: 'DataArray' object has no attribute 'date' ``` Compare ``` ts.__getattr__('time.date') ``` So maybe time.date is not a useful name. time_date? Or don't show these attributes to the user? ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/387/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue