id,node_id,number,title,user,state,locked,assignee,milestone,comments,created_at,updated_at,closed_at,author_association,active_lock_reason,draft,pull_request,body,reactions,performed_via_github_app,state_reason,repo,type 2123950388,PR_kwDOAMm_X85mT6XD,8720,groupby: Dispatch quantile to flox.,2448579,closed,0,,,7,2024-02-07T21:42:42Z,2024-03-26T15:08:32Z,2024-03-26T15:08:30Z,MEMBER,,0,pydata/xarray/pulls/8720,"- [x] User visible changes (including notable bug fixes) are documented in `whats-new.rst` @aulemahal would you be able to test against xclim's test suite. I imagine you're doing a bunch of grouped quantiles.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8720/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 2184830377,PR_kwDOAMm_X85pjN8A,8829,"Revert ""Do not attempt to broadcast when global option ``arithmetic_b…",2448579,closed,0,,,7,2024-03-13T20:27:12Z,2024-03-20T15:30:12Z,2024-03-15T03:59:07Z,MEMBER,,0,pydata/xarray/pulls/8829,"…roadcast=False`` (#8784)"" This reverts commit 11f89ecdd41226cf93da8d1e720d2710849cd23e. Reverting #8784 Sadly that PR broke a lot of tests by breaking `create_test_data` with ``` from xarray.tests import create_test_data create_test_data() ``` ``` --------------------------------------------------------------------------- AssertionError Traceback (most recent call last) Cell In[3], line 2 1 from xarray.tests import create_test_data ----> 2 create_test_data() File [~/repos/xarray/xarray/tests/__init__.py:329](http://localhost:8888/lab/workspaces/auto-P/tree/repos/devel/arraylake/~/repos/xarray/xarray/tests/__init__.py#line=328), in create_test_data(seed, add_attrs, dim_sizes) 327 obj.coords[""numbers""] = (""dim3"", numbers_values) 328 obj.encoding = {""foo"": ""bar""} --> 329 assert all(var.values.flags.writeable for var in obj.variables.values()) 330 return obj AssertionError: ``` Somehow that code changes whether `IndexVariable.values` returns a writeable numpy array. I spent some time debugging but couldn't figure it out. cc @etienneschalk ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8829/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 638947370,MDU6SXNzdWU2Mzg5NDczNzA=,4156,writing sparse to netCDF,2448579,open,0,,,7,2020-06-15T15:33:23Z,2024-01-09T10:14:00Z,,MEMBER,,,,"I haven't looked at this too closely but it appears that this is a way to save MultiIndexed datasets to netCDF. 
So we may be able to do `sparse -> multiindex -> netCDF` http://cfconventions.org/Data/cf-conventions/cf-conventions-1.8/cf-conventions.html#compression-by-gathering cc @fujiisoup ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/4156/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,issue 1976752481,PR_kwDOAMm_X85ekPdj,8412,Minimize duplication in `map_blocks` task graph,2448579,closed,0,,,7,2023-11-03T18:30:02Z,2024-01-03T04:10:17Z,2024-01-03T04:10:15Z,MEMBER,,0,pydata/xarray/pulls/8412,"Builds on #8560 - [x] Closes #8409 - [x] Tests added - [ ] User visible changes (including notable bug fixes) are documented in `whats-new.rst` cc @max-sixty ``` print(len(cloudpickle.dumps(da.chunk(lat=1, lon=1).map_blocks(lambda x: x)))) # 779354739 -> 47699827 print(len(cloudpickle.dumps(da.chunk(lat=1, lon=1).drop_vars(da.indexes).map_blocks(lambda x: x)))) # 15981508 ``` This is a quick attempt. I think we can generalize this to minimize duplication. The downside is that the graphs are not totally embarrassingly parallel any more. This PR: ![image](https://github.com/pydata/xarray/assets/2448579/6e10d00a-53d5-42b9-8564-2008c6b65fbb) vs main: ![image](https://github.com/pydata/xarray/assets/2448579/cb0c8c56-e636-45c5-9c0e-b37c64ed0c04) ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8412/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 1672288892,I_kwDOAMm_X85jrRp8,7764,Support opt_einsum in xr.dot,2448579,closed,0,,,7,2023-04-18T03:29:48Z,2023-10-28T03:31:06Z,2023-10-28T03:31:06Z,MEMBER,,,,"### Is your feature request related to a problem? Shall we support [opt_einsum](https://dgasmith.github.io/opt_einsum/) as an optional backend for `xr.dot`? `opt_einsum.contract` is a drop-in replacement for `np.einsum` so this monkey-patch works today ``` xr.core.duck_array_ops.einsum = opt_einsum.contract ``` ### Describe the solution you'd like Add a `backend` kwarg with options `""numpy""` and `""opt_einsum""`, with the default being `""numpy""` ### Describe alternatives you've considered We could create a new package but it seems a bit silly. ### Additional context _No response_","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7764/reactions"", ""total_count"": 3, ""+1"": 3, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 1916012703,I_kwDOAMm_X85yNAif,8239,Address repo-review suggestions,2448579,open,0,,,7,2023-09-27T17:18:40Z,2023-10-02T20:24:34Z,,MEMBER,,,,"### What is your issue? Here's the output from the Scientific Python [Repo Review](https://repo-review.readthedocs.io/) tool. There's an online version [here](https://learn.scientific-python.org/development/guides/repo-review/?repo=pydata%2Fxarray&branch=main). On mac I run ``` pipx run 'sp-repo-review[cli]' --format html --show err gh:pydata/xarray@main | pbcopy ``` A lot of these seem fairly easy to fix. I'll note that there's a large number of `mypy` config suggestions.

General

PY007 Supports an easy task runner (nox or tox)

Projects must have a noxfile.py or tox.ini to encourage new contributors.
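For illustration only, a minimal `noxfile.py` sketch of the sort of thing that would satisfy this check; the session name and install targets are assumptions, not taken from the xarray repo:

```python
# Minimal illustrative noxfile.py -- session name and install targets are assumptions.
import nox


@nox.session
def tests(session: nox.Session) -> None:
    """Install the package plus pytest and run the test suite."""
    session.install("-e", ".", "pytest")
    session.run("pytest", *session.posargs)
```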

PyProject

See https://github.com/pydata/xarray/issues/8239#issuecomment-1739363809
PP305 Specifies xfail_strict

xfail_strict should be set. You can manually specify if a check should be strict when setting each xfail.

```
[tool.pytest.ini_options]
xfail_strict = true
```
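To illustrate the per-test override mentioned above (the test itself is made up): with `xfail_strict = true` set globally, an individual marker can still opt back out via `strict=False`:

```python
import pytest


@pytest.mark.xfail(reason="known flaky upstream behaviour", strict=False)
def test_made_up_example():
    # strict=False means an unexpected pass is reported as XPASS instead of
    # failing the run, even though xfail_strict = true is the global default.
    assert 1 == 2
```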
PP308 Specifies useful pytest summary

-ra should be in addopts = [...] (print summary of all fails/errors).

```
[tool.pytest.ini_options]
addopts = ["-ra", "--strict-config", "--strict-markers"]
```

Pre-commit

PC110 Uses black

Use https://github.com/psf/black-pre-commit-mirror instead of https://github.com/psf/black in .pre-commit-config.yaml

PC160 Uses codespell

Must have https://github.com/codespell-project/codespell repo in .pre-commit-config.yaml

PC170 Uses PyGrep hooks (only needed if RST present)

Must have https://github.com/pre-commit/pygrep-hooks repo in .pre-commit-config.yaml

PC180 Uses prettier

Must have https://github.com/pre-commit/mirrors-prettier repo in .pre-commit-config.yaml

PC191 Ruff show fixes if fixes enabled

If --fix is present, --show-fixes must be too.

PC901 Custom pre-commit CI message

Should have something like this in .pre-commit-config.yaml:

```
ci:
  autoupdate_commit_msg: 'chore: update pre-commit hooks'
```

MyPy

MY101 MyPy strict mode

Must have strict in the mypy config. MyPy is best with strict or nearly strict configuration. If you are happy with the strictness of your settings already, ignore this check or set strict = false explicitly.

```
[tool.mypy]
strict = true
```
MY103 MyPy warn unreachable

Must have warn_unreachable = true to pass this check. There are occasionally false positives (often due to platform or Python version static checks), so it's okay to ignore this check. But try it first - it can catch real bugs too.

```
[tool.mypy]
warn_unreachable = true
```
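A small made-up example of the kind of real bug this can surface:

```python
def describe(x: int) -> str:
    if isinstance(x, int):
        return "int"
    # With warn_unreachable = true, mypy reports this line as unreachable:
    # the isinstance check above already covers every value x can take,
    # which usually means either the check or the annotation is wrong.
    return "something else"
```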
MY104 MyPy enables ignore-without-code

Must have "ignore-without-code" in enable_error_code = [...]. This will force all skips in your project to include the error code, which makes them more readable, and avoids skipping something unintended.

```
[tool.mypy]
enable_error_code = ["ignore-without-code", "redundant-expr", "truthy-bool"]
```
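As a made-up illustration of what this enforces:

```python
# Rejected once ignore-without-code is enabled -- the bare ignore says nothing
# about which error is being silenced:
#     value: int = "not an int"  # type: ignore
# Accepted form -- the error code documents exactly what is suppressed:
value: int = "not an int"  # type: ignore[assignment]
```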
MY105 MyPy enables redundant-expr

Must have "redundant-expr" in enable_error_code = [...]. This helps catch useless lines of code, like checking the same condition twice.

```
[tool.mypy]
enable_error_code = ["ignore-without-code", "redundant-expr", "truthy-bool"]
```
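A made-up example of the useless code this flags:

```python
def positive_ints(values: list[int]) -> list[int]:
    # The isinstance check can never be false for elements of list[int], so with
    # redundant-expr enabled mypy warns that the left operand of `and` is always true.
    return [v for v in values if isinstance(v, int) and v > 0]
```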
MY106 MyPy enables truthy-bool

Must have "truthy-bool" in enable_error_code = []. This catches mistakes in using a value as truthy if it cannot be falsey.

```
[tool.mypy]
enable_error_code = ["ignore-without-code", "redundant-expr", "truthy-bool"]
```
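A made-up example of the mistake this catches:

```python
class Flag:
    """Defines neither __bool__ nor __len__, so instances are always truthy."""


flag = Flag()

# With truthy-bool enabled, mypy warns that this condition is always true,
# which usually means the author meant to test some attribute of `flag` instead.
if flag:
    print("always reached")
```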

Ruff

RF101 Bugbear must be selected

Must select the flake8-bugbear B checks. Recommended:

```
[tool.ruff]
select = [
  "B",  # flake8-bugbear
]
```
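As an illustration (the function is made up), bugbear's B006 check catches the classic mutable-default-argument bug:

```python
# Flagged by flake8-bugbear (B006): the default list is created once at function
# definition time and shared across calls, so results leak between invocations.
def append_item(item, items=[]):
    items.append(item)
    return items


# The usual fix:
def append_item_fixed(item, items=None):
    if items is None:
        items = []
    items.append(item)
    return items
```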
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8239/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,issue 1409811164,I_kwDOAMm_X85UCALc,7162,copy of custom index does not align with original,2448579,closed,0,,,7,2022-10-14T20:17:22Z,2023-03-24T20:37:13Z,2023-03-24T20:37:12Z,MEMBER,,,,"### What happened? MY prototype CRSIndex is broken on the release version: https://github.com/dcherian/crsindex/blob/main/crsindex.ipynb under heading ""BROKEN: Successfully align with a copy of itself"" The cell's code is : ``` copy = newds.copy(deep=True) xr.align(copy, newds) ``` which should always work. @headtr1ck is https://github.com/pydata/xarray/pull/7140 to blame? ### Environment
INSTALLED VERSIONS
------------------
commit: None
python: 3.10.6 | packaged by conda-forge | (main, Aug 22 2022, 20:43:44) [Clang 13.0.1 ]
python-bits: 64
OS: Darwin
OS-release: 21.6.0
machine: x86_64
processor: i386
byteorder: little
LC_ALL: None
LANG: en_US.UTF-8
LOCALE: ('en_US', 'UTF-8')
libhdf5: 1.12.2
libnetcdf: 4.8.1
xarray: 2022.10.0
pandas: 1.5.0
numpy: 1.23.3
scipy: 1.9.1
netCDF4: 1.6.0
pydap: None
h5netcdf: 1.0.2
h5py: 3.7.0
Nio: None
zarr: 2.13.3
cftime: 1.6.2
nc_time_axis: 1.4.1
PseudoNetCDF: 3.2.2
rasterio: 1.3.2
cfgrib: 0.9.10.2
iris: 3.3.1
bottleneck: 1.3.5
dask: 2022.9.2
distributed: 2022.9.2
matplotlib: 3.6.1
cartopy: 0.21.0
seaborn: 0.12.0
numbagg: 0.2.1
fsspec: 2022.8.2
cupy: None
pint: 0.19.2
sparse: 0.13.0
flox: 0.6.0
numpy_groupies: 0.9.19
setuptools: 65.5.0
pip: 22.2.2
conda: None
pytest: 7.1.3
IPython: 8.5.0
sphinx: None
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7162/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 540451721,MDExOlB1bGxSZXF1ZXN0MzU1MjU4NjMy,3646,[WIP] GroupBy plotting,2448579,open,0,,,7,2019-12-19T17:26:39Z,2022-06-09T14:50:17Z,,MEMBER,,1,pydata/xarray/pulls/3646," - [x] Tests added - [x] Passes `black . && mypy . && flake8` - [ ] Fully documented, including `whats-new.rst` for all changes and `api.rst` for new API This adds plotting methods to GroupBy objects so that it's easy to plot each group as a facet. I'm finding this super helpful in my current research project. It's pretty self-contained, mostly just adding `map_groupby*` methods to `FacetGrid`. But that's because I make `GroupBy` mimic the underlying `DataArray` by adding `coords`, `attrs` and `__getitem__`. This still needs more tests but I would like feedback on the feature and the implementation. ## Example ``` python import numpy as np import xarray as xr time = np.arange(80) da = xr.DataArray(5 * np.sin(2*np.pi*time/10), coords={""time"": time}, dims=""time"") da[""period""] = da.time.where((time % 10) == 0).ffill(""time"")/10 da.plot() ``` ![image](https://user-images.githubusercontent.com/2448579/71194665-49f45c00-2284-11ea-96e5-9a5daec1b3a9.png) ``` python da.groupby(""period"").plot(col=""period"", col_wrap=4) ``` ![image](https://user-images.githubusercontent.com/2448579/107123905-a1290780-685d-11eb-9bae-831a7513aaed.png) ``` python da = da.expand_dims(y=10) da.groupby(""period"").plot(col=""period"", col_wrap=4, sharex=False, sharey=True, robust=True) ``` ![image](https://user-images.githubusercontent.com/2448579/71194716-5c6e9580-2284-11ea-832a-c4e7d9296390.png) ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/3646/reactions"", ""total_count"": 3, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 1, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 507599878,MDExOlB1bGxSZXF1ZXN0MzI4NTU4Mjg3,3406,Drop groups associated with nans in group variable,2448579,closed,0,,,7,2019-10-16T04:04:46Z,2022-01-05T18:57:07Z,2019-10-28T23:46:41Z,MEMBER,,0,pydata/xarray/pulls/3406," - [x] Closes #2383 - [x] Tests added - [x] Passes `black . && mypy . 
&& flake8` - [x] Fully documented, including `whats-new.rst` for all changes and `api.rst` for new API ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/3406/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 819003369,MDExOlB1bGxSZXF1ZXN0NTgyMTc1Mjg5,4977,Use numpy & dask sliding_window_view for rolling,2448579,closed,0,,,7,2021-03-01T15:54:22Z,2021-03-26T19:50:53Z,2021-03-26T19:50:50Z,MEMBER,,0,pydata/xarray/pulls/4977," Should merge after https://github.com/dask/dask/pull/7234 is merged - [x] Closes #3277, closes #2531, closes #2532, closes #2514 - [x] Tests added - [x] Passes `pre-commit run --all-files` - [x] User visible changes (including notable bug fixes) are documented in `whats-new.rst` ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/4977/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 573972855,MDExOlB1bGxSZXF1ZXN0MzgyMzgyODA5,3818,map_blocks: Allow passing dask-backed objects in args,2448579,closed,0,,,7,2020-03-02T13:26:12Z,2020-06-11T18:22:42Z,2020-06-07T16:13:35Z,MEMBER,,0,pydata/xarray/pulls/3818," - [x] Tests added - [x] Passes `isort -rc . && black . && mypy . && flake8` - [x] Fully documented, including `whats-new.rst` for all changes and `api.rst` for new API It parses `args` and breaks any xarray objects into appropriate blocks before passing them on to the user function. e.g. ```python da1 = xr.DataArray( np.ones((10, 20)), dims=[""x"", ""y""], coords={""x"": np.arange(10), ""y"": np.arange(20)} ).chunk({""x"": 5, ""y"": 4}) da1 def sumda(da1, da2): #print(da1.shape) #print(da2.shape) return da1 - da2 da3 = (da1 + 1).isel(x=1, drop=True).rename({""y"": ""k""}) mapped = xr.map_blocks(sumda, da1, args=[da3]) xr.testing.assert_equal(da1-da3, mapped) # passes ```","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/3818/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 570190199,MDU6SXNzdWU1NzAxOTAxOTk=,3796,RTD failing yet again,2448579,closed,0,,,7,2020-02-24T22:35:52Z,2020-03-24T22:23:00Z,2020-03-24T22:23:00Z,MEMBER,,,,"memory consumption errors as usual. @keewis I remember you had an idea for using pip instead of conda?","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/3796/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 558230156,MDExOlB1bGxSZXF1ZXN0MzY5NjYwMzg2,3737,Fix/rtd,2448579,closed,0,,,7,2020-01-31T16:22:38Z,2020-03-19T19:30:50Z,2020-01-31T17:10:02Z,MEMBER,,0,pydata/xarray/pulls/3737," 1. python 3.8 is not allowed on RTD (yet) 2. I pinned a few versions (based on the env solution obtained locally). This seems to have fixed the memory problem. 
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/3737/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 475730053,MDExOlB1bGxSZXF1ZXN0MzAzNDIzNjI0,3175,Add join='override',2448579,closed,0,,,7,2019-08-01T14:53:52Z,2019-08-16T22:26:54Z,2019-08-16T22:26:45Z,MEMBER,,0,pydata/xarray/pulls/3175,"This adds `join='override'` which checks that indexes along a dimension are of the same size and overwrites those indices with indices from the first object. Definitely need help, feedback. - [x] With #3102 this partially helps with #2039, #2217 - [x] Tests added - [x] Fully documented, including `whats-new.rst` for all changes and `api.rst` for new API ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/3175/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 322200856,MDExOlB1bGxSZXF1ZXN0MTg3MzkwNjcz,2118,"Add ""awesome xarray"" list to faq.",2448579,closed,0,,,7,2018-05-11T07:45:59Z,2018-05-14T21:19:51Z,2018-05-14T21:04:31Z,MEMBER,,0,pydata/xarray/pulls/2118,"partially addresses #1850 closes #946 I tried to make an ""awesome xarray"" list by doing a github search for 'xarray'. I only put packages that looked like they were intended for general use. Also, I moved the list from `internals.rst` to `faq.rst`. Let me know if there are any I've missed or any that should be added. ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/2118/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull