id,node_id,number,title,user,state,locked,assignee,milestone,comments,created_at,updated_at,closed_at,author_association,active_lock_reason,draft,pull_request,body,reactions,performed_via_github_app,state_reason,repo,type 2261855627,PR_kwDOAMm_X85togwQ,8969,CI: python 3.12 by default.,2448579,closed,0,,,2,2024-04-24T17:49:25Z,2024-04-29T16:21:20Z,2024-04-29T16:21:08Z,MEMBER,,0,pydata/xarray/pulls/8969,"1. Now that numba supports 3.12. 2. Disabled `pint` on the main environment since it doesn't work. Pint is still installed in the `all-but-dask` env, which is still runs python 3.11 for this reason. ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8969/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 1574694462,I_kwDOAMm_X85d2-4-,7513,"intermittent failures with h5netcdf, h5py on macos",2448579,closed,0,,,5,2023-02-07T16:58:43Z,2024-04-28T23:35:21Z,2024-04-28T23:35:21Z,MEMBER,,,,"### What is your issue? cc @hmaarrfk @kmuehlbauer Passed: https://github.com/pydata/xarray/actions/runs/4115923717/jobs/7105298426 Failed: https://github.com/pydata/xarray/actions/runs/4115946392/jobs/7105345290 Versions: ``` h5netcdf 1.1.0 pyhd8ed1ab_0 conda-forge h5py 3.8.0 nompi_py310h5555e59_100 conda-forge hdf4 4.2.15 h7aa5921_5 conda-forge hdf5 1.12.2 nompi_h48135f9_101 conda-forge ``` ``` =================================== FAILURES =================================== _____________ test_open_mfdataset_manyfiles[h5netcdf-20-True-5-5] ______________ [gw1] darwin -- Python 3.10.9 /Users/runner/micromamba-root/envs/xarray-tests/bin/python readengine = 'h5netcdf', nfiles = 20, parallel = True, chunks = 5 file_cache_maxsize = 5 @requires_dask @pytest.mark.filterwarnings(""ignore:use make_scale(name) instead"") def test_open_mfdataset_manyfiles( readengine, nfiles, parallel, chunks, file_cache_maxsize ): # skip certain combinations skip_if_not_engine(readengine) if ON_WINDOWS: pytest.skip(""Skipping on Windows"") randdata = np.random.randn(nfiles) original = Dataset({""foo"": (""x"", randdata)}) # test standard open_mfdataset approach with too many files with create_tmp_files(nfiles) as tmpfiles: writeengine = readengine if readengine != ""pynio"" else ""netcdf4"" # split into multiple sets of temp files for ii in original.x.values: subds = original.isel(x=slice(ii, ii + 1)) if writeengine != ""zarr"": subds.to_netcdf(tmpfiles[ii], engine=writeengine) else: # if writeengine == ""zarr"": subds.to_zarr(store=tmpfiles[ii]) # check that calculation on opened datasets works properly > with open_mfdataset( tmpfiles, combine=""nested"", concat_dim=""x"", engine=readengine, parallel=parallel, chunks=chunks if (not chunks and readengine != ""zarr"") else ""auto"", ) as actual: /Users/runner/work/xarray/xarray/xarray/tests/test_backends.py:3267: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /Users/runner/work/xarray/xarray/xarray/backends/api.py:991: in open_mfdataset datasets, closers = dask.compute(datasets, closers) /Users/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/dask/base.py:599: in compute results = schedule(dsk, keys, **kwargs) /Users/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/dask/threaded.py:89: in get results = get_async( /Users/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/dask/local.py:511: in get_async raise_exception(exc, tb) 
/Users/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/dask/local.py:319: in reraise raise exc /Users/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/dask/local.py:224: in execute_task result = _execute_task(task, data) /Users/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/dask/core.py:119: in _execute_task return func(*(_execute_task(a, cache) for a in args)) /Users/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/dask/utils.py:72: in apply return func(*args, **kwargs) /Users/runner/work/xarray/xarray/xarray/backends/api.py:526: in open_dataset backend_ds = backend.open_dataset( /Users/runner/work/xarray/xarray/xarray/backends/h5netcdf_.py:417: in open_dataset ds = store_entrypoint.open_dataset( /Users/runner/work/xarray/xarray/xarray/backends/store.py:32: in open_dataset vars, attrs = store.load() /Users/runner/work/xarray/xarray/xarray/backends/common.py:129: in load (_decode_variable_name(k), v) for k, v in self.get_variables().items() /Users/runner/work/xarray/xarray/xarray/backends/h5netcdf_.py:220: in get_variables return FrozenDict( /Users/runner/work/xarray/xarray/xarray/core/utils.py:471: in FrozenDict return Frozen(dict(*args, **kwargs)) /Users/runner/work/xarray/xarray/xarray/backends/h5netcdf_.py:221: in (k, self.open_store_variable(k, v)) for k, v in self.ds.variables.items() /Users/runner/work/xarray/xarray/xarray/backends/h5netcdf_.py:200: in open_store_variable elif var.compression is not None: /Users/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/h5netcdf/core.py:394: in compression return self._h5ds.compression _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <[AttributeError(""'NoneType' object has no attribute '_root'"") raised in repr()] Variable object at 0x151378970> @property def _h5ds(self): # Always refer to the root file and store not h5py object # subclasses: > return self._root._h5file[self._h5path] E AttributeError: 'NoneType' object has no attribute '_h5file' ```","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7513/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 2261844699,PR_kwDOAMm_X85toeXT,8968,Bump dependencies incl `pandas>=2`,2448579,closed,0,,,0,2024-04-24T17:42:19Z,2024-04-27T14:17:16Z,2024-04-27T14:17:16Z,MEMBER,,0,pydata/xarray/pulls/8968," - [ ] Closes #xxxx - [ ] Tests added - [ ] User visible changes (including notable bug fixes) are documented in `whats-new.rst` - [ ] New functions/methods are listed in `api.rst` ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8968/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 2261917442,PR_kwDOAMm_X85touYl,8971,Delete pynio backend.,2448579,closed,0,,,2,2024-04-24T18:25:26Z,2024-04-25T14:38:23Z,2024-04-25T14:23:59Z,MEMBER,,0,pydata/xarray/pulls/8971," - [x] User visible changes (including notable bug fixes) are documented in `whats-new.rst` ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8971/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 2228266052,PR_kwDOAMm_X85r24hE,8913,Update hypothesis action to always save the 
cache,2448579,closed,0,,,0,2024-04-05T15:09:35Z,2024-04-05T16:51:05Z,2024-04-05T16:51:03Z,MEMBER,,0,pydata/xarray/pulls/8913,Update the cache always.,"{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8913/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 2136709010,I_kwDOAMm_X85_W5eS,8753,Lazy Loading with `DataArray` vs. `Variable`,2448579,closed,0,,,0,2024-02-15T14:42:24Z,2024-04-04T16:46:54Z,2024-04-04T16:46:54Z,MEMBER,,,,"### Discussed in https://github.com/pydata/xarray/discussions/8751
Originally posted by **ilan-gold** February 15, 2024 My goal is to get a dataset from [custom io-zarr backend lazy-loaded](https://docs.xarray.dev/en/stable/internals/how-to-add-new-backend.html#how-to-support-lazy-loading). But when I declare a `DataArray` based on the `Variable` which uses `LazilyIndexedArray`, everything is read in. Is this expected? I specifically don't want to have to use dask if possible. I have seen https://github.com/aurghs/xarray-backend-tutorial/blob/main/2.Backend_with_Lazy_Loading.ipynb but it's a little bit different. While I have a custom backend array inheriting from `ZarrArrayWrapper`, this example using `ZarrArrayWrapper` directly still highlights the same unexpected behavior of everything being read in. ```python import zarr import xarray as xr from tempfile import mkdtemp import numpy as np from pathlib import Path from collections import defaultdict class AccessTrackingStore(zarr.DirectoryStore): def __init__(self, *args, **kwargs): super().__init__(*args, **kwargs) self._access_count = {} self._accessed = defaultdict(set) def __getitem__(self, key): for tracked in self._access_count: if tracked in key: self._access_count[tracked] += 1 self._accessed[tracked].add(key) return super().__getitem__(key) def get_access_count(self, key): return self._access_count[key] def set_key_trackers(self, keys_to_track): if isinstance(keys_to_track, str): keys_to_track = [keys_to_track] for k in keys_to_track: self._access_count[k] = 0 def get_subkeys_accessed(self, key): return self._accessed[key] orig_path = Path(mkdtemp()) z = zarr.group(orig_path / ""foo.zarr"") z['array'] = np.random.randn(1000, 1000) store = AccessTrackingStore(orig_path / ""foo.zarr"") store.set_key_trackers(['array']) z = zarr.group(store) arr = xr.backends.zarr.ZarrArrayWrapper(z['array']) lazy_arr = xr.core.indexing.LazilyIndexedArray(arr) # just `.zarray` var = xr.Variable(('x', 'y'), lazy_arr) print('Variable read in ', store.get_subkeys_accessed('array')) # now everything is read in da = xr.DataArray(var) print('DataArray read in ', store.get_subkeys_accessed('array')) ```
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8753/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 2136724736,PR_kwDOAMm_X85m_MtN,8754,Don't access data when creating DataArray from Variable.,2448579,closed,0,,,2,2024-02-15T14:48:32Z,2024-04-04T16:46:54Z,2024-04-04T16:46:53Z,MEMBER,,0,pydata/xarray/pulls/8754," - [x] Closes #8753 This seems to have been around since 2016-ish, so presumably our backend code path is passing arrays around, not Variables. cc @ilan-gold","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8754/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 2224300175,PR_kwDOAMm_X85rpG4S,8907,Trigger hypothesis stateful tests nightly,2448579,closed,0,,,0,2024-04-04T02:16:59Z,2024-04-04T02:17:49Z,2024-04-04T02:17:47Z,MEMBER,,0,pydata/xarray/pulls/8907," - [ ] Closes #xxxx - [ ] Tests added - [ ] User visible changes (including notable bug fixes) are documented in `whats-new.rst` - [ ] New functions/methods are listed in `api.rst` ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8907/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 2098659175,PR_kwDOAMm_X85k-T6b,8658,Stateful tests with Dataset,2448579,closed,0,,,8,2024-01-24T16:34:59Z,2024-04-03T21:29:38Z,2024-04-03T21:29:36Z,MEMBER,,0,pydata/xarray/pulls/8658,"I was curious to see if the hypothesis [stateful testing](https://hypothesis.readthedocs.io/en/latest/stateful.html) would catch an inconsistent sequence of index manipulation operations like #8646. Turns out `rename_vars` is basically broken? (filed #8659) :P PS: this [blog post](https://hypothesis.works/articles/how-not-to-die-hard-with-hypothesis/) is amazing. 
``` E state = DatasetStateMachine() E state.assert_invariants() E > === E E E Dimensions: () E Data variables: E *empty* E === E E E > vars: ('1', '1_') E state.add_dim_coord(var= E array([0], dtype=uint32)) E state.assert_invariants() E > === E E E Dimensions: (1: 1) E Coordinates: E * 1 (1) uint32 0 E Data variables: E 1_ (1) uint32 0 E === E E E > renaming 1 to 0 E state.rename_vars(newname='0') E state.assert_invariants() E > === E E E Dimensions: (1: 1) E Coordinates: E * 0 (1) uint32 0 E Dimensions without coordinates: 1 E Data variables: E 1_ (1) uint32 0 E === E E E state.teardown() ```","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8658/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 2000205407,PR_kwDOAMm_X85fzupc,8467,[skip-ci] dev whats-new,2448579,closed,0,,,0,2023-11-18T03:59:29Z,2024-04-03T21:08:45Z,2023-11-18T15:20:37Z,MEMBER,,0,pydata/xarray/pulls/8467," - [ ] Closes #xxxx - [ ] Tests added - [ ] User visible changes (including notable bug fixes) are documented in `whats-new.rst` - [ ] New functions/methods are listed in `api.rst` ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8467/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 1989233637,PR_kwDOAMm_X85fOdAk,8446,Remove PseudoNetCDF,2448579,closed,0,,,0,2023-11-12T04:29:50Z,2024-04-03T21:08:44Z,2023-11-13T21:53:56Z,MEMBER,,0,pydata/xarray/pulls/8446,"joining the party - [x] User visible changes (including notable bug fixes) are documented in `whats-new.rst` ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8446/reactions"", ""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 1, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 2064698904,PR_kwDOAMm_X85jLHsQ,8584,Silence a bunch of CachingFileManager warnings,2448579,closed,0,,,1,2024-01-03T21:57:07Z,2024-04-03T21:08:27Z,2024-01-03T22:52:58Z,MEMBER,,0,pydata/xarray/pulls/8584,,"{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8584/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 2102850331,PR_kwDOAMm_X85lMW8k,8674,Fix negative slicing of Zarr arrays,2448579,closed,0,,,0,2024-01-26T20:22:21Z,2024-04-03T21:08:26Z,2024-02-10T02:57:32Z,MEMBER,,0,pydata/xarray/pulls/8674,"Closes #8252 Closes #3921 - [x] User visible changes (including notable bug fixes) are documented in `whats-new.rst` ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8674/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 2148245262,PR_kwDOAMm_X85nmmqX,8777,Return a dataclass from Grouper.factorize,2448579,closed,0,,,0,2024-02-22T05:41:29Z,2024-04-03T21:08:25Z,2024-03-15T04:47:30Z,MEMBER,,0,pydata/xarray/pulls/8777,"Toward #8510, builds on #8776","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8777/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 2148164557,PR_kwDOAMm_X85nmU5w,8775,[skip-ci] NamedArray: Add lazy indexing array refactoring 
plan,2448579,closed,0,,,0,2024-02-22T04:25:49Z,2024-04-03T21:08:21Z,2024-02-23T22:20:09Z,MEMBER,,0,pydata/xarray/pulls/8775,"This adds a proposal for decoupling the lazy indexing array machinery, indexing adapter machinery, and Variable's setitem and getitem methods, so that the latter can be migrated to NamedArray. cc @andersy005 ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8775/reactions"", ""total_count"": 2, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 2, ""eyes"": 0}",,,13221727,pull 2198991054,PR_kwDOAMm_X85qTNFP,8861,upstream-dev CI: Fix interp and cumtrapz,2448579,closed,0,,,0,2024-03-21T02:49:40Z,2024-04-03T21:08:17Z,2024-03-21T04:16:45Z,MEMBER,,0,pydata/xarray/pulls/8861," - [x] xref #8844 ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8861/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 2215539648,PR_kwDOAMm_X85rLW_p,8891,2024.03.0: Add whats-new,2448579,closed,0,,,0,2024-03-29T15:01:35Z,2024-03-29T17:07:19Z,2024-03-29T17:07:17Z,MEMBER,,0,pydata/xarray/pulls/8891,,"{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8891/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 2206047573,PR_kwDOAMm_X85qrHyn,8875,Optimize writes to existing Zarr stores.,2448579,closed,0,,,0,2024-03-25T15:32:47Z,2024-03-29T14:35:30Z,2024-03-29T14:35:29Z,MEMBER,,0,pydata/xarray/pulls/8875,We need to read existing variables to make sure we append or write to a region with the right encoding. Currently we decode all arrays in a Zarr group. Instead only decode those arrays for which we require encoding information.,"{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8875/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 2066510805,I_kwDOAMm_X857LHPV,8589,"Don't overwrite indexes for region writes, always",2448579,closed,0,,,2,2024-01-04T23:52:18Z,2024-03-27T16:24:37Z,2024-03-27T16:24:36Z,MEMBER,,,,"### What happened? Currently we don't overwrite indexes when `region=""auto""` https://github.com/pydata/xarray/blob/e6ccedb56ed4bc8d0b7c1f16ab325795330fb19a/xarray/backends/api.py#L1769-L1770 I propose we do this for all region writes and completely disallow modifying indexes with a region write. This would match the `map_blocks` model, where all indexes are specified in the `template` and no changes by the mapped function are allowed. 
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8589/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 2206385638,PR_kwDOAMm_X85qsSKm,8877,Don't allow overwriting indexes with region writes,2448579,closed,0,,,0,2024-03-25T18:13:19Z,2024-03-27T16:24:37Z,2024-03-27T16:24:35Z,MEMBER,,0,pydata/xarray/pulls/8877,"- [x] Closes #8589 - [x] Tests added - [x] User visible changes (including notable bug fixes) are documented in `whats-new.rst` cc @slevang ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8877/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 2123950388,PR_kwDOAMm_X85mT6XD,8720,groupby: Dispatch quantile to flox.,2448579,closed,0,,,7,2024-02-07T21:42:42Z,2024-03-26T15:08:32Z,2024-03-26T15:08:30Z,MEMBER,,0,pydata/xarray/pulls/8720,"- [x] User visible changes (including notable bug fixes) are documented in `whats-new.rst` @aulemahal would you be able to test against xclim's test suite. I imagine you're doing a bunch of grouped quantiles.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8720/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 2184830377,PR_kwDOAMm_X85pjN8A,8829,"Revert ""Do not attempt to broadcast when global option ``arithmetic_b…",2448579,closed,0,,,7,2024-03-13T20:27:12Z,2024-03-20T15:30:12Z,2024-03-15T03:59:07Z,MEMBER,,0,pydata/xarray/pulls/8829,"…roadcast=False`` (#8784)"" This reverts commit 11f89ecdd41226cf93da8d1e720d2710849cd23e. Reverting #8784 Sadly that PR broke a lot of tests by breaking `create_test_data` with ``` from xarray.tests import create_test_data create_test_data() ``` ``` --------------------------------------------------------------------------- AssertionError Traceback (most recent call last) Cell In[3], line 2 1 from xarray.tests import create_test_data ----> 2 create_test_data() File [~/repos/xarray/xarray/tests/__init__.py:329](http://localhost:8888/lab/workspaces/auto-P/tree/repos/devel/arraylake/~/repos/xarray/xarray/tests/__init__.py#line=328), in create_test_data(seed, add_attrs, dim_sizes) 327 obj.coords[""numbers""] = (""dim3"", numbers_values) 328 obj.encoding = {""foo"": ""bar""} --> 329 assert all(var.values.flags.writeable for var in obj.variables.values()) 330 return obj AssertionError: ``` Somehow that code changes whether `IndexVariable.values` returns a writeable numpy array. I spent some time debugging but couldn't figure it out. 
cc @etienneschalk ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8829/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 2189750364,PR_kwDOAMm_X85p0Epw,8847,pandas 3 MultiIndex fixes,2448579,closed,0,,,0,2024-03-16T03:51:06Z,2024-03-20T15:00:20Z,2024-03-20T15:00:18Z,MEMBER,,0,pydata/xarray/pulls/8847,"xref #8844 Closes https://github.com/xarray-contrib/flox/issues/342","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8847/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 2189738090,PR_kwDOAMm_X85p0CKq,8846,Support pandas copy-on-write behaviour,2448579,closed,0,,,2,2024-03-16T03:14:46Z,2024-03-18T16:00:15Z,2024-03-18T16:00:12Z,MEMBER,,0,pydata/xarray/pulls/8846,"- [x] Closes #8843 - [x] Tests added ```python import numpy as np import pandas as pd pd.set_option(""mode.copy_on_write"", True) from xarray.core.variable import _possibly_convert_objects string_var = np.array([""a"", ""bc"", ""def""], dtype=object) datetime_var = np.array( [""2019-01-01"", ""2019-01-02"", ""2019-01-03""], dtype=""datetime64[ns]"" ) assert _possibly_convert_objects(string_var).flags.writeable assert _possibly_convert_objects(datetime_var).flags.writeable ``` The core issue is that we now get read-only arrays back from pandas here: https://github.com/pydata/xarray/blob/fbcac7611bf9a16750678f93483d3dbe0e261a0a/xarray/core/variable.py#L197-L212 @phofl is this expected?","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8846/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 2188936276,I_kwDOAMm_X86CeIRU,8843,Get ready for pandas 3 copy-on-write,2448579,closed,0,,,2,2024-03-15T15:51:36Z,2024-03-18T16:00:14Z,2024-03-18T16:00:14Z,MEMBER,,,,"### What is your issue? This line fails with `pd.set_options(""mode.copy_on_write"", True)` https://github.com/pydata/xarray/blob/c9d3084e98d38a7a9488380789a8d0acfde3256f/xarray/tests/__init__.py#L329 We'll need to fix this before Pandas 3 is released in April: https://github.com/pydata/xarray/blob/c9d3084e98d38a7a9488380789a8d0acfde3256f/xarray/tests/__init__.py#L329 Here's a test ```python def example(): obj = Dataset() obj[""dim2""] = (""dim2"", 0.5 * np.arange(9)) obj[""time""] = (""time"", pd.date_range(""2000-01-01"", periods=20) print({k: v.data.flags for k, v in obj.variables.items()}) return obj example() pd.set_options(""mode.copy_on_write"", True) example() ```","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8843/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 2098659703,I_kwDOAMm_X859FwF3,8659,renaming index variables with `rename_vars` seems buggy,2448579,closed,0,,,1,2024-01-24T16:35:18Z,2024-03-15T19:21:51Z,2024-03-15T19:21:51Z,MEMBER,,,,"### What happened? 
(xref #8658) I'm not sure what the expected behaviour is here: ```python import xarray as xr import numpy as np from xarray.testing import _assert_internal_invariants ds = xr.Dataset() ds.coords[""1""] = (""1"", np.array([1], dtype=np.uint32)) ds[""1_""] = (""1"", np.array([1], dtype=np.uint32)) ds = ds.rename_vars({""1"": ""0""}) ds ``` It looks like this sequence of operations creates a default index But then ```python from xarray.testing import _assert_internal_invariants _assert_internal_invariants(ds, check_default_indexes=True) ``` fails with ``` ... File ~/repos/xarray/xarray/testing/assertions.py:301, in _assert_indexes_invariants_checks(indexes, possible_coord_variables, dims, check_default) 299 if check_default: 300 defaults = default_indexes(possible_coord_variables, dims) --> 301 assert indexes.keys() == defaults.keys(), (set(indexes), set(defaults)) 302 assert all(v.equals(defaults[k]) for k, v in indexes.items()), ( 303 indexes, 304 defaults, 305 ) AssertionError: ({'0'}, set()) ``` ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8659/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 2187682227,PR_kwDOAMm_X85ps6tB,8839,[skip-ci] Fix upstream-dev env,2448579,closed,0,,,0,2024-03-15T04:08:58Z,2024-03-15T04:37:59Z,2024-03-15T04:37:58Z,MEMBER,,0,pydata/xarray/pulls/8839," upstream-dev env is broken - [x] Closes #8623 ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8839/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 2187646833,PR_kwDOAMm_X85psy9g,8837,Add dask-expr for windows envs,2448579,closed,0,,,0,2024-03-15T03:27:48Z,2024-03-15T04:06:05Z,2024-03-15T04:06:03Z,MEMBER,,0,pydata/xarray/pulls/8837," - [x] Closes #8830 - [ ] Tests added - [ ] User visible changes (including notable bug fixes) are documented in `whats-new.rst` - [ ] New functions/methods are listed in `api.rst` ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8837/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 2184871888,I_kwDOAMm_X86COn_Q,8830,"failing tests, all envs",2448579,closed,0,,,1,2024-03-13T20:56:34Z,2024-03-15T04:06:04Z,2024-03-15T04:06:04Z,MEMBER,,,,"### What happened? 
All tests are failing because of an error in `create_test_data` ``` from xarray.tests import create_test_data create_test_data() ``` ``` --------------------------------------------------------------------------- AssertionError Traceback (most recent call last) Cell In[3], line 2 1 from xarray.tests import create_test_data ----> 2 create_test_data() File [~/repos/xarray/xarray/tests/__init__.py:329](http://localhost:8888/lab/workspaces/auto-P/tree/repos/devel/arraylake/~/repos/xarray/xarray/tests/__init__.py#line=328), in create_test_data(seed, add_attrs, dim_sizes) 327 obj.coords[""numbers""] = (""dim3"", numbers_values) 328 obj.encoding = {""foo"": ""bar""} --> 329 assert all(var.values.flags.writeable for var in obj.variables.values()) 330 return obj AssertionError: ``` ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8830/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 2184606202,PR_kwDOAMm_X85picsD,8827,Add `dask-expr` to environment-3.12.yml,2448579,closed,0,,,0,2024-03-13T18:07:27Z,2024-03-13T20:20:46Z,2024-03-13T20:20:45Z,MEMBER,,0,pydata/xarray/pulls/8827,xref https://github.com/pydata/xarray/actions/runs/8269168819/job/22623800305?pr=8777,"{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8827/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 1308371056,I_kwDOAMm_X85N_Chw,6806,"New alignment option: ""exact"" without broadcasting OR Turn off automatic broadcasting",2448579,closed,0,,,9,2022-07-18T18:43:31Z,2024-03-13T15:36:35Z,2024-03-13T15:36:35Z,MEMBER,,,,"### Is your feature request related to a problem? If we have two objects with dims `x` and `x1`, then `xr.align(..., join=""exact"")` will pass because these dimensions are broadcastable. I'd like a stricter option (`join=""strict""`?) that disallows broadcasting. ### Describe the solution you'd like ```python xr.align( xr.DataArray([1], dims=""x""), xr.DataArray([1], dims=""x1""), join=""strict"", ) ``` would raise an error. It'd be nice to have this as a built-in option so we can use ``` python with xr.set_options(arithmetic_join=""strict""): ... ``` ### Describe alternatives you've considered An alternative would be to allow control over automatic broadcasting through the `set_options` context manager., but that seems like it would be more complicated to implement. ### Additional context This turns up in staggered grid calculations with xgcm where it is easy to mistakenly construct very high-dimensional arrays because of automatic broadcasting.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/6806/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 2148242023,PR_kwDOAMm_X85nml9d,8776,Refactor Grouper objects,2448579,closed,0,,,0,2024-02-22T05:38:09Z,2024-03-07T21:50:07Z,2024-03-07T21:50:04Z,MEMBER,,0,pydata/xarray/pulls/8776,"Some refactoring towards the Grouper refactor described in #8510 1. Rename to Resampler from ResampleGrouper 2. 
Refactor to a single ""ResolvedGrouper"" object that encapsulates the underling Grouper/Resampler object: UniqueGrouper, BinGrouper, or TimeResampler.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8776/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 2021858121,PR_kwDOAMm_X85g81wJ,8510,Grouper object design doc,2448579,closed,0,,,6,2023-12-02T04:56:54Z,2024-03-06T02:27:07Z,2024-03-06T02:27:04Z,MEMBER,,0,pydata/xarray/pulls/8510,"xref #8509, #6610 **Rendered version** [here](https://github.com/dcherian/xarray/blob/grouper-proposal/design_notes/grouper_objects.md) ----- @pydata/xarray I've been poking at this on and off for a year now and finally figured out how to do it cleanly (#8509). I wrote up a design doc for `Grouper` objects that allow custom conversion of DataArrays to integer group codes, following the NEP template (which is absolutely great!). Such Grouper objects allow us to generalize the GroupBy interface to a much larger class of problems, and eventually provide a nice path to grouping by multiple variables (#6610) #8509 implements two custom Groupers for you to try out :) ```python import xarray as xr from xarray.core.groupers import SeasonGrouper, SeasonResampler ds = xr.tutorial.open_dataset(""air_temperature"") # custom seasons! ds.air.groupby(time=SeasonGrouper([""JF"", ""MAM"", ""JJAS"", ""OND""])).mean() ds.air.resample(time=SeasonResampler([""DJF"", ""MAM"", ""JJAS"", ""ON""])).count() ``` All comments are welcome, 1. there are a couple of specific API and design decisions to be made. I'll make some comments pointing these out. 2. I'm also curious about what `Grouper` objects we should provide in Xarray. ----- cc @ilan-gold @ivirshup @aulemahal @tomvothecoder @jbusecke @katiedagon - it would be good to hear what ""Groupers"" would be useful for your work / projects. I bet you already have examples that fit this proposal","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8510/reactions"", ""total_count"": 8, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 8, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 2135011804,I_kwDOAMm_X85_QbHc,8748,release v2024.02.0,2448579,closed,0,14808389,,0,2024-02-14T19:08:38Z,2024-02-18T22:52:15Z,2024-02-18T22:52:15Z,MEMBER,,,,"### What is your issue? Thanks to @keewis for volunteering at today's meeting :()","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8748/reactions"", ""total_count"": 3, ""+1"": 0, ""-1"": 0, ""laugh"": 1, ""hooray"": 0, ""confused"": 0, ""heart"": 2, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 2102852029,PR_kwDOAMm_X85lMXU0,8675,Fix NetCDF4 C version detection,2448579,closed,0,,,1,2024-01-26T20:23:54Z,2024-01-27T01:28:51Z,2024-01-27T01:28:49Z,MEMBER,,0,pydata/xarray/pulls/8675,"This fixes the failure locally for me. 
cc @max-sixty ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8675/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 2098626592,PR_kwDOAMm_X85k-Mnt,8657,groupby: Don't set `method` by default on flox>=0.9,2448579,closed,0,,,0,2024-01-24T16:20:57Z,2024-01-26T16:54:25Z,2024-01-26T16:54:23Z,MEMBER,,0,pydata/xarray/pulls/8657," - [x] Tests added - [x] User visible changes (including notable bug fixes) are documented in `whats-new.rst`","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8657/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 2064313690,I_kwDOAMm_X857Cu1a,8580,add py3.12 CI and update pyproject.toml,2448579,closed,0,,,2,2024-01-03T16:26:47Z,2024-01-17T21:54:13Z,2024-01-17T21:54:13Z,MEMBER,,,,"### What is your issue? We haven't done this yet! https://github.com/pydata/xarray/blob/d87ba61c957fc3af77251ca6db0f6bccca1acb82/pyproject.toml#L11-L15","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8580/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 2086607437,I_kwDOAMm_X858XxpN,8616, new release 2024.01.0,2448579,closed,0,,,0,2024-01-17T17:03:20Z,2024-01-17T19:21:12Z,2024-01-17T19:21:12Z,MEMBER,,,,"### What is your issue? Thanks @TomNicholas for volunteering to drive this release!","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8616/reactions"", ""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 1, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 2021763059,PR_kwDOAMm_X85g8iNJ,8507,Deprecate `squeeze` in GroupBy.,2448579,closed,0,,,2,2023-12-02T00:21:43Z,2024-01-08T03:08:47Z,2024-01-08T01:05:23Z,MEMBER,,0,pydata/xarray/pulls/8507,"- [x] xref #2157, xref #1460 - [x] Closes #8518, closes #8263 - [x] Tests added - [x] User visible changes (including notable bug fixes) are documented in `whats-new.rst` ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8507/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 2065788086,PR_kwDOAMm_X85jOw74,8585,Enable Zarr V3 tests in all CI runs.,2448579,closed,0,,,0,2024-01-04T14:45:44Z,2024-01-05T17:53:08Z,2024-01-05T17:53:06Z,MEMBER,,0,pydata/xarray/pulls/8585,"🤦🏾‍♂️ Spotted in https://github.com/pydata/xarray/pull/8460 Builds on #8586 ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8585/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 2064420057,I_kwDOAMm_X857DIzZ,8581,bump min versions,2448579,closed,0,,,0,2024-01-03T17:45:10Z,2024-01-05T16:13:16Z,2024-01-05T16:13:15Z,MEMBER,,,,"### What is your issue? 
Looks like we can bump a number of min versions: ``` Package Required Policy Status ----------------- -------------------- -------------------- ------ cartopy 0.20 (2021-09-17) 0.21 (2022-09-10) < dask-core 2022.7 (2022-07-08) 2022.12 (2022-12-02) < distributed 2022.7 (2022-07-08) 2022.12 (2022-12-02) < flox 0.5 (2022-05-03) 0.6 (2022-10-12) < iris 3.2 (2022-02-15) 3.4 (2022-12-01) < matplotlib-base 3.5 (2021-11-18) 3.6 (2022-09-16) < numba 0.55 (2022-01-14) 0.56 (2022-09-28) < numpy 1.22 (2022-01-03) 1.23 (2022-06-23) < packaging 21.3 (2021-11-18) 22.0 (2022-12-08) < pandas 1.4 (2022-01-22) 1.5 (2022-09-19) < scipy 1.8 (2022-02-06) 1.9 (2022-07-30) < seaborn 0.11 (2020-09-08) 0.12 (2022-09-06) < typing_extensions 4.3 (2022-07-01) 4.4 (2022-10-07) < zarr 2.12 (2022-06-23) 2.13 (2022-09-27) < ```","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8581/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 2065809896,PR_kwDOAMm_X85jO1oX,8586,Bump min deps.,2448579,closed,0,,,0,2024-01-04T14:59:05Z,2024-01-05T16:13:16Z,2024-01-05T16:13:14Z,MEMBER,,0,pydata/xarray/pulls/8586," - [x] Closes #8581 - [x] User visible changes (including notable bug fixes) are documented in `whats-new.rst`","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8586/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 2066129022,PR_kwDOAMm_X85jP678,8587,Silence another warning in test_backends.py,2448579,closed,0,,,1,2024-01-04T18:20:49Z,2024-01-05T16:13:05Z,2024-01-05T16:13:03Z,MEMBER,,0,pydata/xarray/pulls/8587,"Using 255 as fillvalue for int8 arrays will not be allowed any more. Previously this overflowed to -1. Now specify that instead. On numpy 1.24.4 ``` >>> np.array([255], dtype=""i1"") DeprecationWarning: NumPy will stop allowing conversion of out-of-bound Python integers to integer arrays. The conversion of 255 to int8 will fail in the future. array([-1], dtype=int8) ``` ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8587/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 2052694433,PR_kwDOAMm_X85ilhQm,8565,Faster encoding functions.,2448579,closed,0,,,1,2023-12-21T16:05:02Z,2024-01-04T14:25:45Z,2024-01-04T14:25:43Z,MEMBER,,0,pydata/xarray/pulls/8565,"Spotted when profiling some write workloads. 1. Speeds up the check for multi-index 2. Speeds up one string encoder by not re-creating variables when not necessary. 
@benbovy is there a better way?","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8565/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 1976752481,PR_kwDOAMm_X85ekPdj,8412,Minimize duplication in `map_blocks` task graph,2448579,closed,0,,,7,2023-11-03T18:30:02Z,2024-01-03T04:10:17Z,2024-01-03T04:10:15Z,MEMBER,,0,pydata/xarray/pulls/8412,"Builds on #8560 - [x] Closes #8409 - [x] Tests added - [ ] User visible changes (including notable bug fixes) are documented in `whats-new.rst` cc @max-sixty ``` print(len(cloudpickle.dumps(da.chunk(lat=1, lon=1).map_blocks(lambda x: x)))) # 779354739 -> 47699827 print(len(cloudpickle.dumps(da.chunk(lat=1, lon=1).drop_vars(da.indexes).map_blocks(lambda x: x)))) # 15981508 ``` This is a quick attempt. I think we can generalize this to minimize duplication. The downside is that the graphs are not totally embarrassingly parallel any more. This PR: ![image](https://github.com/pydata/xarray/assets/2448579/6e10d00a-53d5-42b9-8564-2008c6b65fbb) vs main: ![image](https://github.com/pydata/xarray/assets/2448579/cb0c8c56-e636-45c5-9c0e-b37c64ed0c04) ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8412/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 2052610515,PR_kwDOAMm_X85ilOq9,8564,Fix mypy type ignore,2448579,closed,0,,,1,2023-12-21T15:15:26Z,2023-12-21T15:41:13Z,2023-12-21T15:24:52Z,MEMBER,,0,pydata/xarray/pulls/8564,,"{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8564/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 2047617215,PR_kwDOAMm_X85iUJ7y,8560,Adapt map_blocks to use new Coordinates API,2448579,closed,0,,,0,2023-12-18T23:11:55Z,2023-12-20T17:11:18Z,2023-12-20T17:11:16Z,MEMBER,,0,pydata/xarray/pulls/8560,"Fixes roundtripping of string dtype indexes - [x] Tests added ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8560/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 2033054792,PR_kwDOAMm_X85hi5U2,8532,Whats-new for 2023.12.0,2448579,closed,0,,,0,2023-12-08T17:29:47Z,2023-12-08T19:36:28Z,2023-12-08T19:36:26Z,MEMBER,,0,pydata/xarray/pulls/8532," - [ ] Closes #xxxx - [ ] Tests added - [ ] User visible changes (including notable bug fixes) are documented in `whats-new.rst` - [ ] New functions/methods are listed in `api.rst` ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8532/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 2021754904,PR_kwDOAMm_X85g8gnU,8506,Deprecate `squeeze` in GroupBy.,2448579,closed,0,,,1,2023-12-02T00:08:50Z,2023-12-02T00:13:36Z,2023-12-02T00:13:36Z,MEMBER,,0,pydata/xarray/pulls/8506,"- [x] Closes #2157 - [ ] Tests added - [x] User visible changes (including notable bug fixes) are documented in `whats-new.rst` Could use a close-ish review.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8506/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 1989588884,I_kwDOAMm_X852lreU,8448,mypy 1.7.0 
raising errors,2448579,closed,0,,,0,2023-11-12T21:41:43Z,2023-12-01T22:02:22Z,2023-12-01T22:02:22Z,MEMBER,,,,"### What happened? ``` xarray/namedarray/core.py:758: error: Value of type Never is not indexable [index] xarray/core/alignment.py:684: error: Unused ""type: ignore"" comment [unused-ignore] xarray/core/alignment.py:1156: error: Unused ""type: ignore"" comment [unused-ignore] xarray/core/dataset.py: note: In member ""sortby"" of class ""Dataset"": xarray/core/dataset.py:7967: error: Incompatible types in assignment (expression has type ""tuple[Alignable, ...]"", variable has type ""tuple[DataArray, ...]"") [assignment] xarray/core/dataset.py:7979: error: ""Alignable"" has no attribute ""isel"" [attr-defined] ``` ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8448/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 2003229011,PR_kwDOAMm_X85f9xPG,8472,Avoid duplicate Zarr array read,2448579,closed,0,,,0,2023-11-21T00:16:34Z,2023-12-01T02:58:22Z,2023-12-01T02:47:03Z,MEMBER,,0,pydata/xarray/pulls/8472,"We already get the underlying Zarr array in https://github.com/pydata/xarray/blob/bb8511e0894020e180d95d2edb29ed4036ac6447/xarray/backends/zarr.py#L529-L531 and then pass it to `open_store_variable`. Just pass that array down to `ZarrArrayWrapper` instead of reading it from the datastore again.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8472/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 2015530249,PR_kwDOAMm_X85gnO8L,8489,Minor to_zarr optimizations,2448579,closed,0,,,0,2023-11-28T23:56:32Z,2023-12-01T02:20:19Z,2023-12-01T02:18:18Z,MEMBER,,0,pydata/xarray/pulls/8489,Avoid repeatedly pinging a remote store by requesting keys at one go.,"{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8489/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 1615596004,I_kwDOAMm_X85gTAnk,7596,illustrate time offset arithmetic,2448579,closed,0,,,2,2023-03-08T16:54:15Z,2023-11-29T01:31:45Z,2023-11-29T01:31:45Z,MEMBER,,,,"### Is your feature request related to a problem? We should document changing the time vector using pandas date offsets [here](https://docs.xarray.dev/en/stable/user-guide/time-series.html#time-series-data) This is particularly useful for centering the time stamps after a resampling operation. 
Related: - CFTime offsets: https://github.com/pydata/xarray/issues/5687 - `loffset` deprecation: https://github.com/pydata/xarray/pull/7444 ### Describe the solution you'd like _No response_ ### Describe alternatives you've considered _No response_ ### Additional context _No response_","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7596/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 1997656427,PR_kwDOAMm_X85frEdb,8461,2023.11.0 Whats-new,2448579,closed,0,,,0,2023-11-16T19:55:12Z,2023-11-17T21:02:22Z,2023-11-17T21:02:20Z,MEMBER,,0,pydata/xarray/pulls/8461,,"{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8461/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 1997136566,PR_kwDOAMm_X85fpRL3,8458,Pin mypy < 1.7,2448579,closed,0,,,0,2023-11-16T15:31:26Z,2023-11-16T17:29:04Z,2023-11-16T17:29:03Z,MEMBER,,0,pydata/xarray/pulls/8458,"xref #8448 get back to green checks for now.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8458/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 1995186323,PR_kwDOAMm_X85finmE,8452,[skip-ci] Small updates to IO docs.,2448579,closed,0,,,0,2023-11-15T17:05:47Z,2023-11-16T15:19:59Z,2023-11-16T15:19:57Z,MEMBER,,0,pydata/xarray/pulls/8452,Also fixes the RTD failure on main,"{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8452/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 1989227042,PR_kwDOAMm_X85fObtL,8445,Pin pint to >=0.22,2448579,closed,0,,,3,2023-11-12T03:58:40Z,2023-11-13T19:39:54Z,2023-11-13T19:39:53Z,MEMBER,,0,pydata/xarray/pulls/8445,"- [x] Closes https://github.com/pydata/xarray/issues/7971 - [x] Closes https://github.com/pydata/xarray/issues/8437. We were previously pinned to `<0.21` Removing the pin didn't change the env but with `>=0.21` we get `0.22` which works. 
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8445/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 1989212292,PR_kwDOAMm_X85fOYwT,8444,Remove keep_attrs from resample signature,2448579,closed,0,,,1,2023-11-12T02:57:59Z,2023-11-12T22:53:36Z,2023-11-12T22:53:35Z,MEMBER,,0,pydata/xarray/pulls/8444," - [x] User visible changes (including notable bug fixes) are documented in `whats-new.rst`","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8444/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 1635949876,PR_kwDOAMm_X85MpxlL,7659,Redo whats-new for 2023.03.0,2448579,closed,0,,,0,2023-03-22T15:02:38Z,2023-11-06T04:25:54Z,2023-03-22T15:42:49Z,MEMBER,,0,pydata/xarray/pulls/7659,,"{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7659/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 1630533356,PR_kwDOAMm_X85MXo4e,7643,Whats-new for release 2023.03.0,2448579,closed,0,,,0,2023-03-18T19:14:55Z,2023-11-06T04:25:53Z,2023-03-20T15:57:36Z,MEMBER,,0,pydata/xarray/pulls/7643,,"{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7643/reactions"", ""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 1, ""eyes"": 0}",,,13221727,pull 1471673992,PR_kwDOAMm_X85EFDiU,7343,Fix mypy failures,2448579,closed,0,,,1,2022-12-01T17:16:44Z,2023-11-06T04:25:52Z,2022-12-01T18:25:07Z,MEMBER,,0,pydata/xarray/pulls/7343,,"{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7343/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 1533942791,PR_kwDOAMm_X85HahUq,7440,v2023.01.0 whats-new,2448579,closed,0,,,0,2023-01-15T18:20:28Z,2023-11-06T04:25:52Z,2023-01-18T21:18:49Z,MEMBER,,0,pydata/xarray/pulls/7440,Should update the date and delete empty sections before merging,"{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7440/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 1942666419,PR_kwDOAMm_X85cxojc,8304,Move Variable aggregations to NamedArray,2448579,closed,0,,,6,2023-10-13T21:31:01Z,2023-11-06T04:25:43Z,2023-10-17T19:14:12Z,MEMBER,,0,pydata/xarray/pulls/8304,"- [x] fix breaking attrs test - [x] Look at `numeric_only` in NAMED_ARRAY_OBJECT - xref #8238 - [ ] Tests added","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8304/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 1689364566,PR_kwDOAMm_X85PbeOv,7796,Speed up .dt accessor by preserving Index objects.,2448579,closed,0,,,1,2023-04-29T04:22:10Z,2023-11-06T04:25:42Z,2023-05-16T17:55:48Z,MEMBER,,0,pydata/xarray/pulls/7796," - [ ] Closes #xxxx - [ ] Tests added - [ ] User visible changes (including notable bug fixes) are documented in `whats-new.rst` - [ ] New functions/methods are listed in `api.rst` ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7796/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, 
""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 1973472421,PR_kwDOAMm_X85eZF4x,8400,Better attr diff for `testing.assert_identical`,2448579,closed,0,,,2,2023-11-02T04:15:09Z,2023-11-04T20:25:37Z,2023-11-04T20:25:36Z,MEMBER,,0,pydata/xarray/pulls/8400,"- [ ] User visible changes (including notable bug fixes) are documented in `whats-new.rst` This gives us better reprs where only differing attributes are shown in the diff. On main: ``` ... Differing coordinates: L * x (x) - [x] Closes #7764 - [x] Closes #8017 - [x] User visible changes (including notable bug fixes) are documented in `whats-new.rst` ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8373/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 1672288892,I_kwDOAMm_X85jrRp8,7764,Support opt_einsum in xr.dot,2448579,closed,0,,,7,2023-04-18T03:29:48Z,2023-10-28T03:31:06Z,2023-10-28T03:31:06Z,MEMBER,,,,"### Is your feature request related to a problem? Shall we support [opt_einsum](https://dgasmith.github.io/opt_einsum/) as an optional backend for `xr.dot`? `opt_einsum.contract` is a drop-in replacement for `np.einsum` so this monkey-patch works today ``` xr.core.duck_array_ops.einsum = opt_einsum.contract ``` ### Describe the solution you'd like Add a `backend` kwarg with options `""numpy""` and `""opt_einsum""`, with the default being `""numpy""` ### Describe alternatives you've considered We could create a new package but it seems a bit silly. ### Additional context _No response_","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7764/reactions"", ""total_count"": 3, ""+1"": 3, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 1954535213,PR_kwDOAMm_X85dZT47,8351,"[skip-ci] Add benchmarks for Dataset binary ops, chunk",2448579,closed,0,,,1,2023-10-20T15:31:36Z,2023-10-20T18:08:40Z,2023-10-20T18:08:38Z,MEMBER,,0,pydata/xarray/pulls/8351,"xref #8339 xref #8350 ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8351/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 1954360112,PR_kwDOAMm_X85dYtpz,8349,[skip-ci] dev whats-new,2448579,closed,0,,,1,2023-10-20T14:02:07Z,2023-10-20T17:28:19Z,2023-10-20T14:54:30Z,MEMBER,,0,pydata/xarray/pulls/8349," - [ ] Closes #xxxx - [ ] Tests added - [ ] User visible changes (including notable bug fixes) are documented in `whats-new.rst` - [ ] New functions/methods are listed in `api.rst` ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8349/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 1950480317,PR_kwDOAMm_X85dLkAj,8334,Whats-new: 2023.10.0,2448579,closed,0,,,1,2023-10-18T19:22:06Z,2023-10-19T16:00:00Z,2023-10-19T15:59:58Z,MEMBER,,0,pydata/xarray/pulls/8334,,"{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8334/reactions"", ""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 1, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 1944347086,PR_kwDOAMm_X85c2nyz,8316,Enable numbagg for reductions,2448579,closed,0,,,3,2023-10-16T04:46:10Z,2023-10-18T14:54:48Z,2023-10-18T10:39:30Z,MEMBER,,0,pydata/xarray/pulls/8316," - [ ] Tests added - check bottleneck tests - [ ] User visible changes 
(including notable bug fixes) are documented in `whats-new.rst`","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8316/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 1861954973,PR_kwDOAMm_X85YhnBZ,8100,Remove np.asarray in formatting.py,2448579,closed,0,,,2,2023-08-22T18:08:33Z,2023-10-18T13:31:25Z,2023-10-18T10:40:38Z,MEMBER,,0,pydata/xarray/pulls/8100,,"{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8100/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 1942673099,PR_kwDOAMm_X85cxp-D,8305,Update labeler.yml to add NamedArray,2448579,closed,0,,,0,2023-10-13T21:39:56Z,2023-10-14T06:47:08Z,2023-10-14T06:47:07Z,MEMBER,,0,pydata/xarray/pulls/8305," - [ ] Closes #xxxx - [ ] Tests added - [ ] User visible changes (including notable bug fixes) are documented in `whats-new.rst` - [ ] New functions/methods are listed in `api.rst` ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8305/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 1908084109,I_kwDOAMm_X85xuw2N,8223,release 2023.09.0,2448579,closed,0,,,6,2023-09-22T02:29:30Z,2023-09-26T08:12:46Z,2023-09-26T08:12:46Z,MEMBER,,,,"We've accumulated a nice number of changes. Can someone volunteer to do a release in the next few days? ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8223/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 1874773066,PR_kwDOAMm_X85ZMtUP,8126,Allow creating DataArrays with nD coordinate variables,2448579,closed,0,,,0,2023-08-31T04:40:37Z,2023-09-22T12:48:38Z,2023-09-22T12:48:34Z,MEMBER,,0,pydata/xarray/pulls/8126,"- [x] Closes #8106 - [x] Tests added - [ ] User visible changes (including notable bug fixes) are documented in `whats-new.rst` cc @blaylockbk ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8126/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 1175093771,I_kwDOAMm_X85GCoIL,6391,apply_ufunc and Datasets with variables without the core dimension,2448579,closed,0,,,5,2022-03-21T09:13:02Z,2023-09-17T08:20:15Z,2023-09-17T08:20:14Z,MEMBER,,,,"### Is your feature request related to a problem? Consider this example ```python ds = xr.Dataset({""a"": (""x"", [1, 2, 3]), ""b"": (""y"", [1, 2, 3])}) xr.apply_ufunc(np.mean, ds, input_core_dims=[[""x""]]) ``` This raises ``` ValueError: operand to apply_ufunc has required core dimensions ['x'], but some of these dimensions are absent on an input variable: ['x'] ``` because core dimension `x` is missing on variable `b`. This behaviour makes it annoying to use `apply_ufunc` on Datasets. ### Describe the solution you'd like Add a new kwarg to `apply_ufunc` called `missing_core_dim` that controls how to handle variables without *all* input core dimensions. This kwarg could take one of two values: 1. `""raise""` - raise an error, current behaviour 2. `""copy""` - skip applying the function and copy the variable from input to output. 3. `""drop""`- skip applying the function and drop the variable. 
### Describe alternatives you've considered _No response_ ### Additional context _No response_","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/6391/reactions"", ""total_count"": 3, ""+1"": 3, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 1874695065,I_kwDOAMm_X85vvZOZ,8125,failing tests with pandas 2.1,2448579,closed,0,,,10,2023-08-31T02:42:32Z,2023-09-15T13:12:02Z,2023-09-15T13:12:02Z,MEMBER,,,,"### What happened? See https://github.com/pydata/xarray/pull/8101 ``` FAILED xarray/tests/test_missing.py::test_interpolate_pd_compat - ValueError: 'fill_value' is not a valid keyword for DataFrame.interpolate FAILED xarray/tests/test_missing.py::test_interpolate_pd_compat_non_uniform_index - ValueError: 'fill_value' is not a valid keyword for DataFrame.interpolate ``` and this doctest ``` FAILED xarray/core/dataarray.py::xarray.core.dataarray.DataArray.to_unstacked_dataset ``` @pydata/xarray can someone take a look please?","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8125/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 1812504689,I_kwDOAMm_X85sCKBx,8006,Fix documentation about datetime_unit of xarray.DataArray.differentiate,2448579,closed,0,,,0,2023-07-19T18:31:10Z,2023-09-01T09:37:15Z,2023-09-01T09:37:15Z,MEMBER,,,,"Should say that `Y` and `M` cannot be supported with `datetime64` ### Discussed in https://github.com/pydata/xarray/discussions/8000
Originally posted by **jesieleo** July 19, 2023 I have a piece of data that looks like this ``` Dimensions: (time: 612, LEV: 15, latitude: 20, longitude: 357) Coordinates: * time (time) datetime64[ns] 1960-01-15 1960-02-15 ... 2010-12-15 * LEV (LEV) float64 5.01 15.07 25.28 35.76 ... 149.0 171.4 197.8 229.5 * latitude (latitude) float64 -4.75 -4.25 -3.75 -3.25 ... 3.75 4.25 4.75 * longitude (longitude) float64 114.2 114.8 115.2 115.8 ... 291.2 291.8 292.2 Data variables: u (time, LEV, latitude, longitude) float32 ... Attributes: (12/30) cdm_data_type: Grid Conventions: COARDS, CF-1.6, ACDD-1.3 creator_email: chepurin@umd.edu creator_name: APDRC creator_type: institution creator_url: https://www.atmos.umd.edu/~ocean/ ... ... standard_name_vocabulary: CF Standard Name Table v29 summary: Simple Ocean Data Assimilation (SODA) soda po... time_coverage_end: 2010-12-15T00:00:00Z time_coverage_start: 1983-01-15T00:00:00Z title: SODA soda pop2.2.4 [TIME][LEV][LAT][LON] Westernmost_Easting: 118.25 ``` when i try to use xarray.DataArray.differentiate `data.u.differentiate('time',datetime_unit='M')` will appear ``` Traceback (most recent call last): File """", line 1, in File ""D:\Anaconda3\lib\site-packages\xarray\core\dataarray.py"", line 3609, in differentiate ds = self._to_temp_dataset().differentiate(coord, edge_order, datetime_unit) File ""D:\Anaconda3\lib\site-packages\xarray\core\dataset.py"", line 6372, in differentiate coord_var = coord_var._to_numeric(datetime_unit=datetime_unit) File ""D:\Anaconda3\lib\site-packages\xarray\core\variable.py"", line 2428, in _to_numeric numeric_array = duck_array_ops.datetime_to_numeric( File ""D:\Anaconda3\lib\site-packages\xarray\core\duck_array_ops.py"", line 466, in datetime_to_numeric array = array / np.timedelta64(1, datetime_unit) TypeError: Cannot get a common metadata divisor for Numpy datatime metadata [ns] and [M] because they have incompatible nonlinear base time units. ``` Would you please told me is this a BUG?
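To make the documentation point concrete, a minimal sketch of the restriction (synthetic time axis instead of the SODA data above; only the unit changes between the two calls):

```python
import numpy as np
import xarray as xr

times = np.array(['2000-01-15', '2000-02-15', '2000-03-15', '2000-04-15'], dtype='datetime64[ns]')
da = xr.DataArray(np.arange(4.0), dims='time', coords={'time': times})

da.differentiate('time', datetime_unit='s')  # fine: seconds are a linear unit
da.differentiate('time', datetime_unit='M')  # TypeError: months have a nonlinear base time unit
```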
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8006/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 1861692451,PR_kwDOAMm_X85YgtYD,8098,[skip-ci] dev whats-new,2448579,closed,0,,,0,2023-08-22T15:20:54Z,2023-08-22T20:46:29Z,2023-08-22T20:46:29Z,MEMBER,,0,pydata/xarray/pulls/8098,,"{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8098/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 1603957501,I_kwDOAMm_X85fmnL9,7573,Add optional min versions to conda-forge recipe (`run_constrained`),2448579,closed,0,,,4,2023-02-28T23:12:15Z,2023-08-21T16:12:34Z,2023-08-21T16:12:21Z,MEMBER,,,,"### Is your feature request related to a problem? I opened this PR to add minimum versions for our optional dependencies: https://github.com/conda-forge/xarray-feedstock/pull/84/files to prevent issues like #7467 I think we'd need a policy to choose which ones to list. Here's the current list: ``` run_constrained: - bottleneck >=1.3 - cartopy >=0.20 - cftime >=1.5 - dask-core >=2022.1 - distributed >=2022.1 - flox >=0.5 - h5netcdf >=0.13 - h5py >=3.6 - hdf5 >=1.12 - iris >=3.1 - matplotlib-base >=3.5 - nc-time-axis >=1.4 - netcdf4 >=1.5.7 - numba >=0.55 - pint >=0.18 - scipy >=1.7 - seaborn >=0.11 - sparse >=0.13 - toolz >=0.11 - zarr >=2.10 ``` Some examples to think about: 1. `iris` seems like a bad one to force. It seems like people might use Iris and Xarray independently and Xarray shouldn't force a minimum version. 2. For backends, I arbitrarily kept `netcdf4`, `h5netcdf` and `zarr`. 3. It seems like we should keep array types: so `dask`, `sparse`, `pint`. 
### Describe the solution you'd like _No response_ ### Describe alternatives you've considered _No response_ ### Additional context _No response_","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7573/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 1855338426,PR_kwDOAMm_X85YLRQH,8081,Add 2023.08.0 whats-new,2448579,closed,0,,,0,2023-08-17T16:36:06Z,2023-08-18T20:12:27Z,2023-08-18T20:12:25Z,MEMBER,,0,pydata/xarray/pulls/8081,,"{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8081/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 1829952467,PR_kwDOAMm_X85W1yq4,8033,Reduce pre-commit update frequency to monthly from weekly.,2448579,closed,0,,,0,2023-07-31T20:16:05Z,2023-08-01T16:48:12Z,2023-08-01T16:48:10Z,MEMBER,,0,pydata/xarray/pulls/8033,We could even go down to `quarterly`,"{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8033/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 1642301775,PR_kwDOAMm_X85M-3H-,7684,Automatically chunk `other` in GroupBy binary ops.,2448579,closed,0,,,2,2023-03-27T15:15:22Z,2023-07-28T03:12:20Z,2023-07-27T16:41:33Z,MEMBER,,0,pydata/xarray/pulls/7684,"- [x] Closes #7683 - [x] Tests added - [ ] User visible changes (including notable bug fixes) are documented in `whats-new.rst` ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7684/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 1642299599,I_kwDOAMm_X85h44DP,7683,automatically chunk in groupby binary ops,2448579,closed,0,,,0,2023-03-27T15:14:09Z,2023-07-27T16:41:35Z,2023-07-27T16:41:34Z,MEMBER,,,,"### What happened? From https://discourse.pangeo.io/t/xarray-unable-to-allocate-memory-how-to-size-up-problem/3233/4 Consider ``` python # ds is dataset with big dask arrays mean = ds.groupby(""time.day"").mean() mean.to_netcdf() mean = xr.open_dataset(...) ds.groupby(""time.day"") - mean ``` In `GroupBy._binary_op` https://github.com/pydata/xarray/blob/39caafae4452f5327a7cd671b18d4bb3eb3785ba/xarray/core/groupby.py#L616 we will eagerly construct `other` that is of the same size as `ds`. ### What did you expect to happen? I think the only solution is to automatically chunk if `ds` has dask arrays, and `other` (or `mean`) isn't backed by dask arrays. A chunk size of `1` seems sensible. ### Minimal Complete Verifiable Example _No response_ ### MVCE confirmation - [ ] Minimal example — the example is as focused as reasonably possible to demonstrate the underlying issue in xarray. - [ ] Complete example — the example is self-contained, including all data and the text of any traceback. - [ ] Verifiable example — the example copy & pastes into an IPython prompt or [Binder notebook](https://mybinder.org/v2/gh/pydata/xarray/main?urlpath=lab/tree/doc/examples/blank_template.ipynb), returning the result. - [ ] New issue — a search of GitHub Issues suggests this is not a duplicate. ### Relevant log output _No response_ ### Anything else we need to know? _No response_ ### Environment
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7683/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 1789989152,I_kwDOAMm_X85qsREg,7962,Better chunk manager error,2448579,closed,0,,,4,2023-07-05T17:27:25Z,2023-07-24T22:26:14Z,2023-07-24T22:26:13Z,MEMBER,,,,"### What happened? I just ran in to this error in an environment without dask. ``` TypeError: Could not find a Chunk Manager which recognises type ``` I think we could easily recommend the user to install a package that provides `dask` by looking at `type(array).__name__`. This would make the message a lot friendlier ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7962/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 1675073096,PR_kwDOAMm_X85OrnNQ,7769,Fix groupby_bins when labels are specified,2448579,closed,0,,,2,2023-04-19T14:49:23Z,2023-07-22T01:01:34Z,2023-04-20T17:17:16Z,MEMBER,,0,pydata/xarray/pulls/7769,"- [x] Closes #7766 - [x] Tests added - [x] User visible changes (including notable bug fixes) are documented in `whats-new.rst` @gsieros can you try this out please? ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7769/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 1812646094,PR_kwDOAMm_X85V7g7q,8007,Update copyright year in README,2448579,closed,0,,,0,2023-07-19T20:00:50Z,2023-07-20T21:13:27Z,2023-07-20T21:13:26Z,MEMBER,,0,pydata/xarray/pulls/8007,,"{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8007/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 1806239984,PR_kwDOAMm_X85Vl5Ch,7989,Allow opening datasets with nD dimenson coordinate variables.,2448579,closed,0,,,5,2023-07-15T17:33:18Z,2023-07-19T19:06:25Z,2023-07-19T18:25:33Z,MEMBER,,0,pydata/xarray/pulls/7989," - [x] Closes #2233 - [x] Tests added - [x] User visible changes (including notable bug fixes) are documented in `whats-new.rst` Avoid automatic creating of Index variable when nD variable shares name with one of its dimensions. Closes #2233 ```python url = ""http://www.smast.umassd.edu:8080/thredds/dodsC/FVCOM/NECOFS/Forecasts/NECOFS_GOM3_FORECAST.nc"" ds = xr.open_dataset(url, engine=""netcdf4"") display(ds) xr.testing._assert_internal_invariants(ds, check_default_indexes=False) ! no raise on #7368 ``` ~The internal invariants assert fails on `main` but succeeds on #7368~. EDIT: now fixed the invariants check. ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7989/reactions"", ""total_count"": 2, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 1, ""rocket"": 1, ""eyes"": 0}",,,13221727,pull 1797636782,I_kwDOAMm_X85rJcKu,7976,Explore updating colormap code,2448579,closed,0,,,0,2023-07-10T21:51:30Z,2023-07-11T13:49:54Z,2023-07-11T13:49:53Z,MEMBER,,,,"### What is your issue? 
See https://github.com/matplotlib/matplotlib/issues/16296 Looks like the MPL API may have advanced enough that we can delete some of our use of private attributes.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7976/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 1736542260,PR_kwDOAMm_X85R6fac,7888,"Add cfgrib,ipywidgets to doc env",2448579,closed,0,,,3,2023-06-01T15:11:10Z,2023-06-16T14:14:01Z,2023-06-16T14:13:59Z,MEMBER,,0,pydata/xarray/pulls/7888," - [x] Closes #7841 - [x] Closes #7892 ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7888/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 1600382587,PR_kwDOAMm_X85Kyh9V,7561,Introduce Grouper objects internally,2448579,closed,0,,,4,2023-02-27T03:11:36Z,2023-06-14T21:18:26Z,2023-05-04T02:35:57Z,MEMBER,,0,pydata/xarray/pulls/7561,"Builds on the refactoring in #7206 - [x] xref #6610 - [x] Use TimeResampleGrouper ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7561/reactions"", ""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 1, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 1701070898,PR_kwDOAMm_X85QCzA1,7830,Fix .groupby(multi index level),2448579,closed,0,,,0,2023-05-08T23:16:07Z,2023-06-06T00:21:36Z,2023-06-06T00:21:31Z,MEMBER,,0,pydata/xarray/pulls/7830,"- [x] Closes #6836 - [x] Tests added - [x] User visible changes (including notable bug fixes) are documented in `whats-new.rst` ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7830/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 1639732867,PR_kwDOAMm_X85M2fjy,7670,Delete built-in cfgrib backend,2448579,closed,0,,,5,2023-03-24T16:53:56Z,2023-06-01T15:22:33Z,2023-03-29T15:19:51Z,MEMBER,,0,pydata/xarray/pulls/7670,"- [x] Closes #7199 - [x] Tests ~added~ deleted - [x] User visible changes (including notable bug fixes) are documented in `whats-new.rst` ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7670/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 1688781350,PR_kwDOAMm_X85PZf3R,7795,"[skip-ci] Add cftime groupby, resample benchmarks",2448579,closed,0,,,8,2023-04-28T15:49:39Z,2023-05-24T16:07:58Z,2023-05-02T15:56:55Z,MEMBER,,0,pydata/xarray/pulls/7795,"xref #7730 ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7795/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 1692597701,I_kwDOAMm_X85k4v3F,7808,Default behaviour of `min_count` wrong with flox,2448579,closed,0,,,0,2023-05-02T15:04:11Z,2023-05-10T02:39:45Z,2023-05-10T02:39:45Z,MEMBER,,,,"### What happened? 
```python with xr.set_options(display_style=""text"", use_flox=False): with xr.set_options(use_flox=False): display( xr.DataArray( data=np.array([np.nan, 1, 1, np.nan, 1, 1]), dims=""x"", coords={""labels"": (""x"", np.array([1, 2, 3, 1, 2, 3]))}, ) .groupby(""labels"") .sum() ) with xr.set_options(use_flox=True): display( xr.DataArray( data=np.array([np.nan, 1, 1, np.nan, 1, 1]), dims=""x"", coords={""labels"": (""x"", np.array([1, 2, 3, 1, 2, 3]))}, ) .groupby(""labels"") .sum() ) ``` ``` # without flox array([0., 2., 2.]) Coordinates: * labels (labels) int64 1 2 3 # with flox array([nan, 2., 2.]) Coordinates: * labels (labels) int64 1 2 3 ``` ### What did you expect to happen? The same answer. We should set `min_count=0` when `min_count is None` ### Minimal Complete Verifiable Example _No response_ ### MVCE confirmation - [ ] Minimal example — the example is as focused as reasonably possible to demonstrate the underlying issue in xarray. - [ ] Complete example — the example is self-contained, including all data and the text of any traceback. - [ ] Verifiable example — the example copy & pastes into an IPython prompt or [Binder notebook](https://mybinder.org/v2/gh/pydata/xarray/main?urlpath=lab/tree/doc/examples/blank_template.ipynb), returning the result. - [ ] New issue — a search of GitHub Issues suggests this is not a duplicate. ### Relevant log output _No response_ ### Anything else we need to know? _No response_ ### Environment
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7808/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 1692612622,PR_kwDOAMm_X85PmMOy,7809,Fix `min_count` behaviour with flox.,2448579,closed,0,,,0,2023-05-02T15:13:17Z,2023-05-10T02:39:45Z,2023-05-10T02:39:43Z,MEMBER,,0,pydata/xarray/pulls/7809," - [x] Closes #7808 - [x] Tests added - [x] User visible changes (including notable bug fixes) are documented in `whats-new.rst`","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7809/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 1689773381,PR_kwDOAMm_X85PctlP,7798,Fix groupby binary ops when grouped array is subset relative to other,2448579,closed,0,,,3,2023-04-30T04:14:14Z,2023-05-03T12:58:35Z,2023-05-02T14:48:42Z,MEMBER,,0,pydata/xarray/pulls/7798," - [x] Closes #7797 - [x] Tests added - [x] User visible changes (including notable bug fixes) are documented in `whats-new.rst` cc @slevang ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7798/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 1632422255,PR_kwDOAMm_X85Md6iW,7650,Pin pandas < 2,2448579,closed,0,,,3,2023-03-20T16:03:58Z,2023-04-25T13:42:48Z,2023-03-22T14:53:53Z,MEMBER,,0,pydata/xarray/pulls/7650,"Pandas is expecting to release v2 in two weeks (pandas-dev/pandas#46776 (comment)). But we are still incompatible with their main branch: - #7441 - #7420 This PR pins pandas to `<2`","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7650/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 1677167290,PR_kwDOAMm_X85Oyokd,7775,[skip-ci] dev whats-new,2448579,closed,0,,,0,2023-04-20T17:54:27Z,2023-04-20T21:08:14Z,2023-04-20T21:08:11Z,MEMBER,,0,pydata/xarray/pulls/7775,,"{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7775/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 1677161134,PR_kwDOAMm_X85OynVg,7774,[skip-ci] Release 2023.04.2,2448579,closed,0,,,0,2023-04-20T17:49:46Z,2023-04-20T18:26:39Z,2023-04-20T18:26:37Z,MEMBER,,0,pydata/xarray/pulls/7774,,"{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7774/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull