id,node_id,number,title,user,state,locked,assignee,milestone,comments,created_at,updated_at,closed_at,author_association,active_lock_reason,draft,pull_request,body,reactions,performed_via_github_app,state_reason,repo,type
2064698904,PR_kwDOAMm_X85jLHsQ,8584,Silence a bunch of CachingFileManager warnings,2448579,closed,0,,,1,2024-01-03T21:57:07Z,2024-04-03T21:08:27Z,2024-01-03T22:52:58Z,MEMBER,,0,pydata/xarray/pulls/8584,,"{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8584/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
2098659703,I_kwDOAMm_X859FwF3,8659,renaming index variables with `rename_vars` seems buggy,2448579,closed,0,,,1,2024-01-24T16:35:18Z,2024-03-15T19:21:51Z,2024-03-15T19:21:51Z,MEMBER,,,,"### What happened?
(xref #8658)
I'm not sure what the expected behaviour is here:
```python
import xarray as xr
import numpy as np
from xarray.testing import _assert_internal_invariants
ds = xr.Dataset()
ds.coords[""1""] = (""1"", np.array([1], dtype=np.uint32))
ds[""1_""] = (""1"", np.array([1], dtype=np.uint32))
ds = ds.rename_vars({""1"": ""0""})
ds
```
It looks like this sequence of operations creates a default index.
But then
```python
from xarray.testing import _assert_internal_invariants
_assert_internal_invariants(ds, check_default_indexes=True)
```
fails with
```
...
File ~/repos/xarray/xarray/testing/assertions.py:301, in _assert_indexes_invariants_checks(indexes, possible_coord_variables, dims, check_default)
299 if check_default:
300 defaults = default_indexes(possible_coord_variables, dims)
--> 301 assert indexes.keys() == defaults.keys(), (set(indexes), set(defaults))
302 assert all(v.equals(defaults[k]) for k, v in indexes.items()), (
303 indexes,
304 defaults,
305 )
AssertionError: ({'0'}, set())
```
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8659/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue
2184871888,I_kwDOAMm_X86COn_Q,8830,"failing tests, all envs",2448579,closed,0,,,1,2024-03-13T20:56:34Z,2024-03-15T04:06:04Z,2024-03-15T04:06:04Z,MEMBER,,,,"### What happened?
All tests are failing because of an error in `create_test_data`
```
from xarray.tests import create_test_data
create_test_data()
```
```
---------------------------------------------------------------------------
AssertionError Traceback (most recent call last)
Cell In[3], line 2
1 from xarray.tests import create_test_data
----> 2 create_test_data()
File ~/repos/xarray/xarray/tests/__init__.py:329, in create_test_data(seed, add_attrs, dim_sizes)
327 obj.coords[""numbers""] = (""dim3"", numbers_values)
328 obj.encoding = {""foo"": ""bar""}
--> 329 assert all(var.values.flags.writeable for var in obj.variables.values())
330 return obj
AssertionError:
```
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8830/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue
2102852029,PR_kwDOAMm_X85lMXU0,8675,Fix NetCDF4 C version detection,2448579,closed,0,,,1,2024-01-26T20:23:54Z,2024-01-27T01:28:51Z,2024-01-27T01:28:49Z,MEMBER,,0,pydata/xarray/pulls/8675,"This fixes the failure locally for me.
cc @max-sixty ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8675/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
2066129022,PR_kwDOAMm_X85jP678,8587,Silence another warning in test_backends.py,2448579,closed,0,,,1,2024-01-04T18:20:49Z,2024-01-05T16:13:05Z,2024-01-05T16:13:03Z,MEMBER,,0,pydata/xarray/pulls/8587,"Using 255 as fillvalue for int8 arrays will not be allowed any more. Previously this overflowed to -1. Now specify that instead.
On numpy 1.24.4
```
>>> np.array([255], dtype=""i1"")
DeprecationWarning: NumPy will stop allowing conversion of out-of-bound Python integers to integer arrays. The conversion of 255 to int8 will fail in the future.
array([-1], dtype=int8)
```
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8587/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
2052694433,PR_kwDOAMm_X85ilhQm,8565,Faster encoding functions.,2448579,closed,0,,,1,2023-12-21T16:05:02Z,2024-01-04T14:25:45Z,2024-01-04T14:25:43Z,MEMBER,,0,pydata/xarray/pulls/8565,"Spotted when profiling some write workloads.
1. Speeds up the check for multi-index
2. Speeds up one string encoder by not re-creating variables when not necessary.
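For the second point, the pattern is roughly the following (an illustrative sketch with a made-up helper name, not the actual diff): return the input `Variable` unchanged unless a conversion is really needed, instead of always building a new one.
```python
import numpy as np
import xarray as xr

def encode_strings(var: xr.Variable) -> xr.Variable:
    # fast path: nothing to encode, so hand back the same object
    if var.dtype.kind != ""U"":
        return var
    data = np.char.encode(var.values, ""utf-8"")
    return xr.Variable(var.dims, data, var.attrs, var.encoding)

v = xr.Variable(""x"", np.array([1.0, 2.0]))
assert encode_strings(v) is v   # no new Variable allocated
```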
@benbovy is there a better way?","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8565/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
2052610515,PR_kwDOAMm_X85ilOq9,8564,Fix mypy type ignore,2448579,closed,0,,,1,2023-12-21T15:15:26Z,2023-12-21T15:41:13Z,2023-12-21T15:24:52Z,MEMBER,,0,pydata/xarray/pulls/8564,,"{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8564/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
2021754904,PR_kwDOAMm_X85g8gnU,8506,Deprecate `squeeze` in GroupBy.,2448579,closed,0,,,1,2023-12-02T00:08:50Z,2023-12-02T00:13:36Z,2023-12-02T00:13:36Z,MEMBER,,0,pydata/xarray/pulls/8506,"- [x] Closes #2157
- [ ] Tests added
- [x] User visible changes (including notable bug fixes) are documented in `whats-new.rst`
Could use a close-ish review.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8506/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
1989212292,PR_kwDOAMm_X85fOYwT,8444,Remove keep_attrs from resample signature,2448579,closed,0,,,1,2023-11-12T02:57:59Z,2023-11-12T22:53:36Z,2023-11-12T22:53:35Z,MEMBER,,0,pydata/xarray/pulls/8444,"
- [x] User visible changes (including notable bug fixes) are documented in `whats-new.rst`","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8444/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
1471673992,PR_kwDOAMm_X85EFDiU,7343,Fix mypy failures,2448579,closed,0,,,1,2022-12-01T17:16:44Z,2023-11-06T04:25:52Z,2022-12-01T18:25:07Z,MEMBER,,0,pydata/xarray/pulls/7343,,"{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7343/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
1689364566,PR_kwDOAMm_X85PbeOv,7796,Speed up .dt accessor by preserving Index objects.,2448579,closed,0,,,1,2023-04-29T04:22:10Z,2023-11-06T04:25:42Z,2023-05-16T17:55:48Z,MEMBER,,0,pydata/xarray/pulls/7796,"
- [ ] Closes #xxxx
- [ ] Tests added
- [ ] User visible changes (including notable bug fixes) are documented in `whats-new.rst`
- [ ] New functions/methods are listed in `api.rst`
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7796/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
1954535213,PR_kwDOAMm_X85dZT47,8351,"[skip-ci] Add benchmarks for Dataset binary ops, chunk",2448579,closed,0,,,1,2023-10-20T15:31:36Z,2023-10-20T18:08:40Z,2023-10-20T18:08:38Z,MEMBER,,0,pydata/xarray/pulls/8351,"xref #8339
xref #8350
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8351/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
1954360112,PR_kwDOAMm_X85dYtpz,8349,[skip-ci] dev whats-new,2448579,closed,0,,,1,2023-10-20T14:02:07Z,2023-10-20T17:28:19Z,2023-10-20T14:54:30Z,MEMBER,,0,pydata/xarray/pulls/8349,"
- [ ] Closes #xxxx
- [ ] Tests added
- [ ] User visible changes (including notable bug fixes) are documented in `whats-new.rst`
- [ ] New functions/methods are listed in `api.rst`
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8349/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
1950480317,PR_kwDOAMm_X85dLkAj,8334,Whats-new: 2023.10.0,2448579,closed,0,,,1,2023-10-18T19:22:06Z,2023-10-19T16:00:00Z,2023-10-19T15:59:58Z,MEMBER,,0,pydata/xarray/pulls/8334,,"{""url"": ""https://api.github.com/repos/pydata/xarray/issues/8334/reactions"", ""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 1, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
1666853925,PR_kwDOAMm_X85OQT4o,7753,Add benchmark against latest release on main.,2448579,closed,0,,,1,2023-04-13T17:35:33Z,2023-04-18T22:08:58Z,2023-04-18T22:08:56Z,MEMBER,,0,pydata/xarray/pulls/7753,This adds a benchmark of `HEAD` vs the latest tag on `main`.,"{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7753/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
1658287592,PR_kwDOAMm_X85N0Ad4,7735,Avoid recasting a CFTimeIndex,2448579,closed,0,,,1,2023-04-07T02:45:55Z,2023-04-11T21:12:07Z,2023-04-11T21:12:05Z,MEMBER,,0,pydata/xarray/pulls/7735,"xref #7730
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7735/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
1232835029,PR_kwDOAMm_X843qWEU,6592,Restore old MultiIndex dropping behaviour,2448579,closed,0,,,1,2022-05-11T15:26:44Z,2022-10-18T19:15:42Z,2022-05-11T18:04:41Z,MEMBER,,0,pydata/xarray/pulls/6592,"
- [x] Closes #6505
- [x] Tests added","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/6592/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
1315480779,I_kwDOAMm_X85OaKTL,6817,wrong mean of complex values,2448579,closed,0,,,1,2022-07-22T23:09:47Z,2022-07-23T02:03:11Z,2022-07-23T02:03:11Z,MEMBER,,,,"### What happened?
Seen in #4972
``` python
import xarray as xr
import numpy as np
array = np.array([0. +0.j, 0.+np.nan * 1j], dtype=np.complex64)
var = xr.Variable(""x"", array)
print(var.mean().data)
print(array.mean())
```
```
0j
(nan+nanj)
```
### What did you expect to happen?
_No response_
### Minimal Complete Verifiable Example
_No response_
### MVCE confirmation
- [ ] Minimal example — the example is as focused as reasonably possible to demonstrate the underlying issue in xarray.
- [ ] Complete example — the example is self-contained, including all data and the text of any traceback.
- [ ] Verifiable example — the example copy & pastes into an IPython prompt or [Binder notebook](https://mybinder.org/v2/gh/pydata/xarray/main?urlpath=lab/tree/doc/examples/blank_template.ipynb), returning the result.
- [ ] New issue — a search of GitHub Issues suggests this is not a duplicate.
### Relevant log output
_No response_
### Anything else we need to know?
_No response_
### Environment
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/6817/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue
1306904506,PR_kwDOAMm_X847g-W3,6798,Drop multi-indexes when assigning to a multi-indexed variable,2448579,closed,0,,,1,2022-07-16T21:13:05Z,2022-07-21T14:46:59Z,2022-07-21T14:46:58Z,MEMBER,,0,pydata/xarray/pulls/6798,"- [x] Closes #6505
- [x] Tests added","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/6798/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
1221258144,PR_kwDOAMm_X843FiC3,6539,Direct usage questions to GH discussions,2448579,closed,0,,,1,2022-04-29T16:55:22Z,2022-04-30T02:03:46Z,2022-04-30T02:03:45Z,MEMBER,,0,pydata/xarray/pulls/6539,,"{""url"": ""https://api.github.com/repos/pydata/xarray/issues/6539/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
1207159549,I_kwDOAMm_X85H88r9,6497,restrict stale bot,2448579,closed,0,,,1,2022-04-18T15:25:56Z,2022-04-18T16:11:11Z,2022-04-18T16:11:11Z,MEMBER,,,,"### What is your issue?
We have some stale issues, but not that many.
Can we restrict the bot to only those issues that are untagged, tagged as ""usage question"", or not assigned to a ""project""? This might reduce a lot of the noise.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/6497/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue
1200810062,PR_kwDOAMm_X842C5t3,6477,Propagate MultiIndex variables in broadcast,2448579,closed,0,,,1,2022-04-12T01:58:39Z,2022-04-13T14:49:35Z,2022-04-13T14:49:24Z,MEMBER,,0,pydata/xarray/pulls/6477,"xref #6293
- [x] Closes #6430
- [x] Tests added","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/6477/reactions"", ""total_count"": 3, ""+1"": 3, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
1188406993,I_kwDOAMm_X85G1abR,6430,Bug in broadcasting with multi-indexes,2448579,closed,0,,,1,2022-03-31T17:25:57Z,2022-04-13T14:49:23Z,2022-04-13T14:49:23Z,MEMBER,,,,"### What happened?
``` python
import numpy as np
import xarray as xr
ds = xr.Dataset(
{""foo"": ((""x"", ""y"", ""z""), np.ones((3, 4, 2)))},
{""x"": [""a"", ""b"", ""c""], ""y"": [1, 2, 3, 4]},
)
expected = ds.sum(""z"")
stacked = ds.stack(space=[""x"", ""y""])
broadcasted, _ = xr.broadcast(stacked, stacked.space)
stacked.sum(""z"").unstack(""space"") # works
broadcasted.sum(""z"").unstack(""space"") # error
```
```
---------------------------------------------------------------------------
ValueError Traceback (most recent call last)
Input In [13], in
10 broadcasted, _ = xr.broadcast(stacked, stacked.space)
11 stacked.sum(""z"").unstack(""space"")
---> 12 broadcasted.sum(""z"").unstack(""space"")
File ~/work/python/xarray/xarray/core/dataset.py:4332, in Dataset.unstack(self, dim, fill_value, sparse)
4330 non_multi_dims = set(dims) - set(stacked_indexes)
4331 if non_multi_dims:
-> 4332 raise ValueError(
4333 ""cannot unstack dimensions that do not ""
4334 f""have exactly one multi-index: {tuple(non_multi_dims)}""
4335 )
4337 result = self.copy(deep=False)
4339 # we want to avoid allocating an object-dtype ndarray for a MultiIndex,
4340 # so we can't just access self.variables[v].data for every variable.
4341 # We only check the non-index variables.
4342 # https://github.com/pydata/xarray/issues/5902
ValueError: cannot unstack dimensions that do not have exactly one multi-index: ('space',)
```
### What did you expect to happen?
This should work.
### Minimal Complete Verifiable Example
_No response_
### Relevant log output
_No response_
### Anything else we need to know?
_No response_
### Environment
xarray main after the flexible indexes refactor","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/6430/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue
1193704369,I_kwDOAMm_X85HJnux,6444,xr.where with scalar as second argument fails with keep_attrs=True,2448579,closed,0,,,1,2022-04-05T20:51:18Z,2022-04-12T02:12:39Z,2022-04-12T02:12:39Z,MEMBER,,,,"### What happened?
``` python
import xarray as xr
xr.where(xr.DataArray([1, 2, 3]) > 0, 1, 0)
```
fails with
```
1809 if keep_attrs is True:
1810 # keep the attributes of x, the second parameter, by default to
1811 # be consistent with the `where` method of `DataArray` and `Dataset`
-> 1812 keep_attrs = lambda attrs, context: attrs[1]
1814 # alignment for three arguments is complicated, so don't support it yet
1815 return apply_ufunc(
1816 duck_array_ops.where,
1817 cond,
(...)
1823 keep_attrs=keep_attrs,
1824 )
IndexError: list index out of range
```
The workaround is to pass `keep_attrs=False`
### What did you expect to happen?
_No response_
### Minimal Complete Verifiable Example
_No response_
### Relevant log output
_No response_
### Anything else we need to know?
_No response_
### Environment
xarray 2022.3.0","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/6444/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue
528168017,MDU6SXNzdWU1MjgxNjgwMTc=,3573,rasterio test failure,2448579,closed,0,,,1,2019-11-25T15:40:19Z,2022-04-09T01:17:32Z,2022-04-09T01:17:32Z,MEMBER,,,,"version
```
rasterio 1.1.1 py36h900e953_0 conda-forge
```
```
=================================== FAILURES ===================================
________________________ TestRasterio.test_rasterio_vrt ________________________
self =
def test_rasterio_vrt(self):
import rasterio
# tmp_file default crs is UTM: CRS({'init': 'epsg:32618'}
with create_tmp_geotiff() as (tmp_file, expected):
with rasterio.open(tmp_file) as src:
with rasterio.vrt.WarpedVRT(src, crs=""epsg:4326"") as vrt:
expected_shape = (vrt.width, vrt.height)
expected_crs = vrt.crs
expected_res = vrt.res
# Value of single pixel in center of image
lon, lat = vrt.xy(vrt.width // 2, vrt.height // 2)
> expected_val = next(vrt.sample([(lon, lat)]))
xarray/tests/test_backends.py:3966:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/share/miniconda/envs/xarray-tests/lib/python3.6/site-packages/rasterio/sample.py:43: in sample_gen
data = read(indexes, window=window, masked=masked, boundless=True)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> ???
E ValueError: WarpedVRT does not permit boundless reads
rasterio/_warp.pyx:978: ValueError
```","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/3573/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue
1001197796,I_kwDOAMm_X847rRDk,5804,vectorized groupby binary ops,2448579,closed,0,,,1,2021-09-20T17:04:47Z,2022-03-29T07:11:28Z,2022-03-29T07:11:28Z,MEMBER,,,,"By switching to `numpy_groupies` we are vectorizing our groupby reductions. I think we can do the same for groupby's binary ops.
Here's an example array
``` python
import numpy as np
import xarray as xr
%load_ext memory_profiler
N = 4 * 2000
da = xr.DataArray(
np.random.random((N, N)),
dims=(""x"", ""y""),
coords={""labels"": (""x"", np.repeat([""a"", ""b"", ""c"", ""d"", ""e"", ""f"", ""g"", ""h""], repeats=N//8))},
)
```
Consider this ""anomaly"" calculation, anomaly defined relative to the group mean
``` python
def anom_current(da):
grouped = da.groupby(""labels"")
mean = grouped.mean()
anom = grouped - mean
return anom
```
With this approach, we loop over each group and apply the binary operation:
https://github.com/pydata/xarray/blob/a1635d324753588e353e4e747f6058936fa8cf1e/xarray/core/computation.py#L502-L525
This saves some memory, but becomes slow for a large number of groups.
We could instead do
```
def anom_vectorized(da):
mean = da.groupby(""labels"").mean()
mean_expanded = mean.sel(labels=da.labels)
anom = da - mean_expanded
return anom
```
Now we are faster, but construct an extra array as big as the original array (I think this is an OK tradeoff).
```
%timeit anom_current(da)
# 1.4 s ± 20.5 ms per loop (mean ± std. dev. of 7 runs, 1 loop each)
%timeit anom_vectorized(da)
# 937 ms ± 5.26 ms per loop (mean ± std. dev. of 7 runs, 1 loop each)
```
------
(I haven't experimented with dask yet, so the following is just a theory).
I think the real benefit comes with dask. Depending on where the groups are located relative to chunking, we could end up creating a lot of tiny chunks by splitting up existing chunks. With the vectorized approach we can do better.
Ideally we would reindex the ""mean"" dask array with a numpy-array-of-repeated-ints such that the chunking of `mean_expanded` exactly matches the chunking of `da` along the grouped dimension.
~In practice, [dask.array.take](https://docs.dask.org/en/latest/_modules/dask/array/routines.html#take) doesn't allow specifying ""output chunks"" so we'd end up chunking ""mean_expanded"" based on dask's automatic heuristics, and then rechunking again for the binary operation.~
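Here's a rough sketch of that indexing idea (illustrative only; it reuses the `da` from above and I haven't tried it with dask):
``` python
def anom_indexed(da):
    mean = da.groupby(""labels"").mean()
    # integer position of each element's group in `mean`, i.e. the
    # array of repeated ints mentioned above
    idx = mean.indexes[""labels""].get_indexer(da[""labels""].values)
    # vectorized indexing expands the group means back out along `x`
    mean_expanded = mean.isel(labels=xr.DataArray(idx, dims=""x""))
    return da - mean_expanded
```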
Thoughts?
cc @rabernat","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/5804/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue
1174177534,I_kwDOAMm_X85F_Ib-,6381,vectorized indexing with DataArray should not preserve IndexVariable,2448579,closed,0,,,1,2022-03-19T05:08:39Z,2022-03-21T04:47:47Z,2022-03-21T04:47:47Z,MEMBER,,,,"### What happened?
After vectorized indexing of a DataArray with dim `x` by a DataArray with dim `z`, we get a DataArray with dim `z` and `x` as a non-dim coordinate. But `x` is still an IndexVariable, not a normal variable.
### What did you expect to happen?
`x` should be a normal variable.
### Minimal Complete Verifiable Example
```python
import xarray as xr
xr.set_options(display_style=""text"")
da = xr.DataArray([1, 2, 3], dims=""x"", coords={""x"": [0, 1, 2]})
idxr = xr.DataArray([1], dims=""z"", name=""x"", coords={""z"": (""z"", [""a""])})
da.sel(x=idxr)
```
```
<xarray.DataArray (z: 1)>
array([2])
Coordinates:
    x        (z) int64 1
  * z        (z) <U1 'a'
<xarray.IndexVariable 'x' (z: 1)>
array([1])
```
### Relevant log output
_No response_
### Anything else we need to know?
_No response_
### Environment
xarray main but this bug was present prior to the explicit indexes refactor.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/6381/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue
539171948,MDExOlB1bGxSZXF1ZXN0MzU0MTk0MDE0,3637,concat keeps attrs from first variable.,2448579,closed,0,,,1,2019-12-17T16:20:22Z,2022-01-05T18:57:38Z,2019-12-24T13:37:04Z,MEMBER,,0,pydata/xarray/pulls/3637," - [x] Closes #2060, closes #2575, xref #1614
- [x] Tests added
- [x] Passes `black . && mypy . && flake8`
- [x] Fully documented, including `whats-new.rst` for all changes and `api.rst` for new API
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/3637/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
664568885,MDExOlB1bGxSZXF1ZXN0NDU1Nzg5Mjk2,4259,Improve some error messages: apply_ufunc & set_options.,2448579,closed,0,,,1,2020-07-23T15:23:57Z,2022-01-05T18:57:23Z,2020-07-25T23:04:55Z,MEMBER,,0,pydata/xarray/pulls/4259,"Makes some error messages clearer
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/4259/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
687325506,MDExOlB1bGxSZXF1ZXN0NDc0NzcwMjEy,4383,Dask/cleanup,2448579,closed,0,,,1,2020-08-27T15:14:19Z,2022-01-05T18:57:23Z,2020-09-02T20:03:03Z,MEMBER,,0,pydata/xarray/pulls/4383,"Some dask array cleanups
1. switch to using `dask.array.map_blocks` instead of `Array.map_blocks` (duck dask array compatibility)
2. Stop vendoring `meta_from_array` and `median`","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/4383/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
589835599,MDExOlB1bGxSZXF1ZXN0Mzk1MjgzNzU4,3916,facetgrid: fix case when vmin == vmax,2448579,closed,0,,,1,2020-03-29T16:59:14Z,2022-01-05T18:57:20Z,2020-04-03T19:48:55Z,MEMBER,,0,pydata/xarray/pulls/3916,"
- [x] Closes #3734
- [x] Tests added
- [x] Passes `isort -rc . && black . && mypy . && flake8`
- [x] Fully documented, including `whats-new.rst` for all changes and `api.rst` for new API
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/3916/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
490514191,MDExOlB1bGxSZXF1ZXN0MzE1MTA2NzA0,3288,Remove deprecated concat kwargs.,2448579,closed,0,,,1,2019-09-06T20:41:31Z,2022-01-05T18:57:02Z,2019-09-09T18:34:14Z,MEMBER,,0,pydata/xarray/pulls/3288,"
- [x] Passes `black . && mypy . && flake8`
- [x] Fully documented, including `whats-new.rst` for all changes and `api.rst` for new API
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/3288/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
1038409453,PR_kwDOAMm_X84tyqhR,5905,[skip-ci] v0.20.0: whats-new for release,2448579,closed,0,,,1,2021-10-28T11:35:00Z,2022-01-05T18:56:55Z,2021-11-01T21:15:22Z,MEMBER,,0,pydata/xarray/pulls/5905,"Whats-new fixes for the release.
Feel free to push to this branch with more improvements.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/5905/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
1094502107,PR_kwDOAMm_X84wkWcB,6141,"Revert ""Deprecate bool(ds) (#6126)""",2448579,closed,0,,,1,2022-01-05T15:58:27Z,2022-01-05T16:57:33Z,2022-01-05T16:57:32Z,MEMBER,,0,pydata/xarray/pulls/6141,"This reverts commit d6ee8caa84b27d4635ec3384b1a06ef4ddf2d998.
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/6141/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
1045266457,PR_kwDOAMm_X84uHoD5,5943,whats-new for 0.20.1,2448579,closed,0,,,1,2021-11-04T22:25:49Z,2021-11-05T17:00:24Z,2021-11-05T17:00:23Z,MEMBER,,0,pydata/xarray/pulls/5943,,"{""url"": ""https://api.github.com/repos/pydata/xarray/issues/5943/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
970815025,MDExOlB1bGxSZXF1ZXN0NzEyNzA3OTY2,5708,Add .git-blame-ignore-revs,2448579,closed,0,,,1,2021-08-14T04:04:10Z,2021-08-23T16:42:11Z,2021-08-23T16:42:09Z,MEMBER,,0,pydata/xarray/pulls/5708,"I found it useful to ignore big reformatting commits in git blame. See https://www.michaelheap.com/git-ignore-rev/
It's opt-in using a command-line flag, or you can set
```
git config --global blame.ignoreRevsFile .git-blame-ignore-revs
```
Thoughts on adding it to the repo? If so, are there more commits we can add?
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/5708/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
959380582,MDExOlB1bGxSZXF1ZXN0NzAyNTAzNTA0,5670,Flexible Indexes: Avoid len(index) in map_blocks,2448579,closed,0,,,1,2021-08-03T18:30:18Z,2021-08-05T13:28:48Z,2021-08-05T08:08:48Z,MEMBER,,0,pydata/xarray/pulls/5670,"xref https://github.com/pydata/xarray/pull/5636/files#r679823542
avoid `len(index)` in two places.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/5670/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
926421732,MDExOlB1bGxSZXF1ZXN0Njc0NzI4MzAw,5506,Refactor dataset groupby tests,2448579,closed,0,,,1,2021-06-21T17:04:34Z,2021-06-22T16:26:16Z,2021-06-22T16:00:15Z,MEMBER,,0,pydata/xarray/pulls/5506,"
- xref #5409
Just moves the tests out, in preparation for the numpy_groupies work.
There are a few tests for `.assign` and `.fillna` (for example) still present in `test_dataset`.
The DataArray tests are not a simple copy and paste :(
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/5506/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
891240764,MDU6SXNzdWU4OTEyNDA3NjQ=,5299,failing RTD build,2448579,closed,0,,,1,2021-05-13T17:50:37Z,2021-05-14T01:04:22Z,2021-05-14T01:04:22Z,MEMBER,,,,"The RTD build is failing on all PRs with
```
Sphinx parallel build error:
nbsphinx.NotebookError: UndefinedError in examples/ERA5-GRIB-example.ipynb:
'nbformat.notebooknode.NotebookNode object' has no attribute 'tags'
```
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/5299/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue
850473442,MDU6SXNzdWU4NTA0NzM0NDI=,5113,docs sidebar formatting has changed,2448579,closed,0,,,1,2021-04-05T16:06:43Z,2021-04-19T02:35:34Z,2021-04-19T02:35:34Z,MEMBER,,,,"
**What happened**:
The formatting of section headings ""for users"", ""community"" etc. has changed: https://xarray.pydata.org/en/latest/

","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/5113/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue
819241806,MDU6SXNzdWU4MTkyNDE4MDY=,4980,fix bottleneck + Dask 1D rolling operations,2448579,closed,0,,,1,2021-03-01T20:38:34Z,2021-03-01T20:39:28Z,2021-03-01T20:39:27Z,MEMBER,,,,"Just as a reminder.
Right now all rolling operations with dask arrays use `.construct().reduce()`.
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/4980/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue
674414304,MDU6SXNzdWU2NzQ0MTQzMDQ=,4320,html repr doesn't work in some sphinx themes,2448579,closed,0,,,1,2020-08-06T15:45:54Z,2021-01-31T03:34:55Z,2021-01-31T03:34:54Z,MEMBER,,,,"Downstream issue: https://github.com/xarray-contrib/cf-xarray/issues/57
Example: no reprs displayed in https://cf-xarray.readthedocs.io/en/latest/examples/introduction.html
@benbovy's diagnosis:
> It looks like bootstrap 4 (used by sphinx-book-theme) forces all html elements with hidden attributes to be actually hidden (source), so the hack in pydata/xarray#4053 does not work here (the result is even worse).
> I guess that a workaround would be to add some custom CSS such as .xr-wrap { display: block !important }, assuming that custom CSS is loaded after Bootstrap's CSS. Not ideal, though, it looks like a hack on top of another hack.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/4320/reactions"", ""total_count"": 3, ""+1"": 3, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue
776595030,MDExOlB1bGxSZXF1ZXN0NTQ3MDUzOTM5,4744,Speed up Dataset._construct_dataarray,2448579,closed,0,,,1,2020-12-30T19:03:05Z,2021-01-05T17:32:16Z,2021-01-05T17:32:13Z,MEMBER,,0,pydata/xarray/pulls/4744,"
- [ ] Tests added
- [x] Passes `isort . && black . && mypy . && flake8`
- [x] User visible changes (including notable bug fixes) are documented in `whats-new.rst`
Significantly speeds up `_construct_dataarray` by iterating over `._coord_names` instead of `.coords`. This avoids unnecessarily constructing a `DatasetCoordinates` object and massively speeds up repr construction for datasets with large numbers of variables.
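The gist of the change, as a rough sketch (it leans on the private `_variables` / `_coord_names` attributes, so treat it as illustrative rather than the actual diff):
```python
def coords_for_variable(ds, name):
    # Gather only the coordinate variables whose dims are a subset of the
    # target variable's dims, looking them up by name instead of
    # constructing ds.coords (a DatasetCoordinates object) for the
    # whole Dataset.
    var = ds._variables[name]
    needed_dims = set(var.dims)
    return {
        k: ds._variables[k]
        for k in ds._coord_names
        if set(ds._variables[k].dims) <= needed_dims
    }
```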
Construct a 2000 variable dataset
```python
import numpy as np
import xarray as xr
a = np.arange(0, 2000)
b = np.core.defchararray.add(""long_variable_name"", a.astype(str))
coords = dict(time=np.array([0, 1]))
data_vars = dict()
for v in b:
data_vars[v] = xr.DataArray(
name=v,
data=np.array([3, 4]),
dims=[""time""],
coords=coords
)
ds0 = xr.Dataset(data_vars)
```
Before:
```
%timeit ds0['long_variable_name1999']
%timeit ds0.__repr__()
1.33 ms ± 23 µs per loop (mean ± std. dev. of 7 runs, 1000 loops each)
2.66 s ± 52.7 ms per loop (mean ± std. dev. of 7 runs, 1 loop each)
```
After:
```
%timeit ds0['long_variable_name1999']
%timeit ds0.__repr__()
10.5 µs ± 203 ns per loop (mean ± std. dev. of 7 runs, 100000 loops each)
84.2 ms ± 1.28 ms per loop (mean ± std. dev. of 7 runs, 10 loops each)
```","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/4744/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
773121631,MDExOlB1bGxSZXF1ZXN0NTQ0MjYzMjMw,4722,Add Zenodo DOI badge,2448579,closed,0,,,1,2020-12-22T17:31:33Z,2020-12-23T17:07:09Z,2020-12-23T17:06:59Z,MEMBER,,0,pydata/xarray/pulls/4722,,"{""url"": ""https://api.github.com/repos/pydata/xarray/issues/4722/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
722539100,MDU6SXNzdWU3MjI1MzkxMDA=,4515,show dimension coordinates at top of coordinates repr,2448579,closed,0,,,1,2020-10-15T17:44:28Z,2020-11-06T18:49:55Z,2020-11-06T18:49:55Z,MEMBER,,,,"
**Is your feature request related to a problem? Please describe.**
I have datasets with lots of non-dim coord variables. It's annoying to search through them to find the dimension coordinates and get an idea of what subset of data I am looking at.

**Describe the solution you'd like**
I think we should show dimension coordinate variables at the top of the coordinates repr.
Example code
``` python
ds = xr.Dataset()
ds.coords[""as""] = 10
ds[""var""] = xr.DataArray(np.ones((10,)), dims=""x"", coords={""x"": np.arange(10)})
ds
```

Related #4409
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/4515/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue
677231206,MDExOlB1bGxSZXF1ZXN0NDY2MzgxNjY5,4335,Add @mathause to current core developers.,2448579,closed,0,,,1,2020-08-11T22:10:45Z,2020-08-11T22:51:35Z,2020-08-11T22:51:06Z,MEMBER,,0,pydata/xarray/pulls/4335,"
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/4335/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
663833847,MDU6SXNzdWU2NjM4MzM4NDc=,4249,RTD PR builds are timing out,2448579,closed,0,,,1,2020-07-22T15:04:22Z,2020-07-22T21:17:59Z,2020-07-22T21:17:59Z,MEMBER,,,,"See https://readthedocs.org/projects/xray/builds/
There's no useful information in the logs AFAICT: e.g. https://readthedocs.org/projects/xray/builds/11504571/","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/4249/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue
617579699,MDU6SXNzdWU2MTc1Nzk2OTk=,4056,flake8 failure,2448579,closed,0,,,1,2020-05-13T16:16:20Z,2020-05-13T17:35:46Z,2020-05-13T17:35:46Z,MEMBER,,,,"flake8 is failing on master (https://dev.azure.com/xarray/xarray/_build/results?buildId=2820&view=logs&jobId=a577607c-d99b-546f-eeb4-2341e9a21630&j=a577607c-d99b-546f-eeb4-2341e9a21630&t=7308a173-bf34-5af1-b6d9-30c4d79bebeb) with
```
========================== Starting Command Output ===========================
/bin/bash --noprofile --norc /home/vsts/work/_temp/e6322963-dd1c-4887-ba6a-2aa7ec888f4c.sh
./xarray/backends/memory.py:43:32: E741 ambiguous variable name 'l'
./xarray/backends/common.py:244:32: E741 ambiguous variable name 'l'
##[error]Bash exited with code '1'.
Finishing: flake8 lint checks
```","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/4056/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue
602771945,MDExOlB1bGxSZXF1ZXN0NDA1NzAwOTk4,3983,Better chunking error messages for zarr backend,2448579,closed,0,,,1,2020-04-19T17:19:53Z,2020-04-22T19:28:03Z,2020-04-22T19:27:59Z,MEMBER,,0,pydata/xarray/pulls/3983,Make some zarr error messages more helpful.,"{""url"": ""https://api.github.com/repos/pydata/xarray/issues/3983/reactions"", ""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 1, ""eyes"": 0}",,,13221727,pull
589833027,MDExOlB1bGxSZXF1ZXN0Mzk1MjgxODgw,3913,Use divergent colormap if lowest and highest level span 0,2448579,closed,0,,,1,2020-03-29T16:45:56Z,2020-04-07T15:59:12Z,2020-04-05T13:41:25Z,MEMBER,,0,pydata/xarray/pulls/3913,"
- [x] Closes #3524
- [x] Tests added
- [x] Passes `isort -rc . && black . && mypy . && flake8`
- [x] Fully documented, including `whats-new.rst` for all changes and `api.rst` for new API
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/3913/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
573964634,MDExOlB1bGxSZXF1ZXN0MzgyMzc1OTcz,3817,map_blocks: allow user function to add new unindexed dimension.,2448579,closed,0,,,1,2020-03-02T13:12:25Z,2020-03-21T19:51:12Z,2020-03-21T19:51:07Z,MEMBER,,0,pydata/xarray/pulls/3817," - [x] Tests added
- [x] Passes `isort -rc . && black . && mypy . && flake8`
- [x] Fully documented, including `whats-new.rst` for all changes and `api.rst` for new API
Small change that makes `map_blocks` apply functions that add new unindexed dimensions.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/3817/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
582474355,MDU6SXNzdWU1ODI0NzQzNTU=,3861,CI not running?,2448579,closed,0,,,1,2020-03-16T17:23:13Z,2020-03-17T13:18:07Z,2020-03-17T13:18:07Z,MEMBER,,,,"Looks like the last run was on Thursday: https://dev.azure.com/xarray/xarray/_build?definitionId=1&_a=summary&view=runs
No tests have been run for PRs #3826 #3836 #3858 and #3807 despite these having been updated recently.
There is a workaround posted at this Azure issue: https://status.dev.azure.com/_event/179641421 but it looks like a fix is coming soon.
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/3861/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue
564780280,MDExOlB1bGxSZXF1ZXN0Mzc0OTQ3OTE3,3769,concat now handles non-dim coordinates only present in one dataset,2448579,closed,0,,,1,2020-02-13T15:55:30Z,2020-02-23T20:48:19Z,2020-02-23T19:45:18Z,MEMBER,,0,pydata/xarray/pulls/3769,"
- [x] Tests added
- [x] Passes `isort -rc . && black . && mypy . && flake8`
- [x] Fully documented, including `whats-new.rst` for all changes and `api.rst` for new API
```
da1 = xr.DataArray([1, 2, 3], dims=""x"", coords={""x"": [1, 2, 3], ""y"": 1})
da2 = xr.DataArray([4, 5, 6], dims=""x"", coords={""x"": [4, 5, 6]})
xr.concat([da1, da2], dim=""x"")
```
This use case is quite common since you can get `da1` from something like `bigger_da1.sel(y=1)`.
On master this raises an uninformative `KeyError` because `'y'` is not present in all datasets. This is because `coords=""different""` by default, which means that we are checking for equality. However, `coords=""different""` (and the equality check) is meaningless when the variable is only present in one of the objects to be concatenated.
This PR skips equality checking when a variable is only present in one dataset and raises a nicer error message when it is present in more than one but not all datasets.
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/3769/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
534450028,MDExOlB1bGxSZXF1ZXN0MzUwMzQ1NDQ0,3605,fix dask master test,2448579,closed,0,,,1,2019-12-07T20:42:35Z,2019-12-09T15:40:38Z,2019-12-09T15:40:34Z,MEMBER,,0,pydata/xarray/pulls/3605,"
- [x] Closes #3603
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/3605/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
530744399,MDExOlB1bGxSZXF1ZXN0MzQ3MzM3Njg5,3585,Add bottleneck & rasterio git tip to upstream-dev CI,2448579,closed,0,,,1,2019-12-01T14:57:02Z,2019-12-01T18:57:06Z,2019-12-01T18:57:03Z,MEMBER,,0,pydata/xarray/pulls/3585,,"{""url"": ""https://api.github.com/repos/pydata/xarray/issues/3585/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
526930276,MDExOlB1bGxSZXF1ZXN0MzQ0MzA2NjA0,3559,Reimplement quantile with apply_ufunc,2448579,closed,0,,,1,2019-11-22T01:16:29Z,2019-11-25T15:58:06Z,2019-11-25T15:57:49Z,MEMBER,,0,pydata/xarray/pulls/3559,"Adds support for dask arrays.
- [x] Closes #3326
- [x] Tests added
- [x] Passes `black . && mypy . && flake8`
- [x] Fully documented, including `whats-new.rst` for all changes and `api.rst` for new API
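The general pattern looks something like this (a sketch of the idea written against the current `apply_ufunc` signature, not the code in this PR; the helper name and `q` handling are made up):
```python
import numpy as np
import xarray as xr

def dask_quantile(da, q, dim):
    # np.quantile puts the quantile axis first; move it to the end so it
    # lines up with the new output core dimension.
    def _quantile(arr):
        return np.moveaxis(np.quantile(arr, q, axis=-1), 0, -1)

    return xr.apply_ufunc(
        _quantile,
        da,
        input_core_dims=[[dim]],            # reduce over this dimension
        output_core_dims=[[""quantile""]],   # new dimension holding the quantiles
        dask=""parallelized"",               # run lazily on each dask chunk
        output_dtypes=[np.float64],
        dask_gufunc_kwargs={""output_sizes"": {""quantile"": len(q)}},
    ).assign_coords(quantile=list(q))
```
(For `dask=""parallelized""` the reduced dimension has to be in a single chunk, and `q` is assumed to be a list of quantiles.)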
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/3559/reactions"", ""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 1, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
522308594,MDExOlB1bGxSZXF1ZXN0MzQwNTMyNzUx,3519,"propagate indexes in to_dataset, from_dataset",2448579,closed,0,,,1,2019-11-13T15:49:35Z,2019-11-22T15:47:22Z,2019-11-22T15:47:18Z,MEMBER,,0,pydata/xarray/pulls/3519,happy to make changes!,"{""url"": ""https://api.github.com/repos/pydata/xarray/issues/3519/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
507515103,MDExOlB1bGxSZXF1ZXN0MzI4NDkyMjY0,3403,Another groupby.reduce bugfix.,2448579,closed,0,,,1,2019-10-15T22:30:23Z,2019-10-25T21:01:16Z,2019-10-25T21:01:12Z,MEMBER,,0,pydata/xarray/pulls/3403," - [x] Closes #3402
- [x] Tests added
- [x] Passes `black . && mypy . && flake8`
- [x] Fully documented, including `whats-new.rst` for all changes and `api.rst` for new API
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/3403/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
433874617,MDU6SXNzdWU0MzM4NzQ2MTc=,2901,Link to dask documentation on chunks,2448579,closed,0,,,1,2019-04-16T16:29:13Z,2019-10-04T17:04:37Z,2019-10-04T17:04:37Z,MEMBER,,,,It would be good to link to https://docs.dask.org/en/latest/array-chunks.html in https://xarray.pydata.org/en/stable/dask.html#chunking-and-performance,"{""url"": ""https://api.github.com/repos/pydata/xarray/issues/2901/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue
500371226,MDExOlB1bGxSZXF1ZXN0MzIyODU3OTYy,3357,Add how do I ... section,2448579,closed,0,,,1,2019-09-30T16:00:34Z,2019-09-30T21:12:28Z,2019-09-30T21:12:23Z,MEMBER,,0,pydata/xarray/pulls/3357,"
- [x] Fully documented, including `whats-new.rst` for all changes and `api.rst` for new API
Thoughts on adding something like this?

","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/3357/reactions"", ""total_count"": 4, ""+1"": 3, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 1, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
494681363,MDExOlB1bGxSZXF1ZXN0MzE4Mzg2NjI2,3314,move auto_combine deprecation to 0.14,2448579,closed,0,,,1,2019-09-17T15:04:38Z,2019-09-17T18:50:09Z,2019-09-17T18:50:06Z,MEMBER,,0,pydata/xarray/pulls/3314,"- [x] Closes #3280
This undoes the `auto_combine` deprecation until we figure out the best way to proceed.
(I am not very familiar with `auto_combine` so someone else should look over this. The tests all pass though...)","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/3314/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
385452676,MDExOlB1bGxSZXF1ZXN0MjM0NDIwNzYx,2581,fix examples,2448579,closed,0,,,1,2018-11-28T20:54:44Z,2019-08-15T15:33:09Z,2018-11-28T22:30:36Z,MEMBER,,0,pydata/xarray/pulls/2581," - [x] Closes #2580
Use `open_dataset.load()` instead of `load_dataset()`","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/2581/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
328573601,MDExOlB1bGxSZXF1ZXN0MTkyMDc1OTM2,2210,Remove height=12in from facetgrid example plots.,2448579,closed,0,,,1,2018-06-01T15:56:52Z,2019-08-15T15:33:03Z,2018-06-01T16:15:50Z,MEMBER,,0,pydata/xarray/pulls/2210," - [x] Closes #2208 (remove if there is no corresponding issue, which should only be the case for minor changes)
This fixes it for me locally. The height was forced to be 12in, while width is 100%.
I'm not sure why this was added in the first place.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/2210/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
322365966,MDExOlB1bGxSZXF1ZXN0MTg3NTE3MTUx,2120,Prevent Inf from screwing colorbar scale.,2448579,closed,0,,,1,2018-05-11T16:55:34Z,2019-08-15T15:32:51Z,2018-05-12T06:36:37Z,MEMBER,,0,pydata/xarray/pulls/2120," - [x] Tests added (for all bug fixes or enhancements)
- [x] Tests passed (for all non-documentation changes)
- [x] Fully documented, including `whats-new.rst` for all changes and `api.rst` for new API (remove if this change should not be visible to users, e.g., if it is an internal clean-up, or if this is part of a larger project that will be documented later)
The current version uses `pd.isnull` to remove invalid values from input data when making a colorbar. `pd.isnull([np.inf])` is False, which means `_determine_cmap_params` returns Inf for colorbar limits, which screws everything up. This PR changes `pd.isnull` to `np.isfinite`.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/2120/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
445776616,MDExOlB1bGxSZXF1ZXN0MjgwMTQwODcz,2973,More support for missing_value.,2448579,closed,0,,,1,2019-05-19T03:41:56Z,2019-06-12T15:32:32Z,2019-06-12T15:32:27Z,MEMBER,,0,pydata/xarray/pulls/2973," - [x] Closes #2871
- [x] Tests added
- [x] Fully documented, including `whats-new.rst` for all changes and `api.rst` for new API
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/2973/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
331795471,MDExOlB1bGxSZXF1ZXN0MTk0NDM5NDgz,2229,Bugfix for faceting line plots.,2448579,closed,0,,,1,2018-06-13T00:04:43Z,2019-04-12T16:31:18Z,2018-06-20T16:26:37Z,MEMBER,,0,pydata/xarray/pulls/2229,"Closes #2239
Fixes a broken doc image: http://xarray.pydata.org/en/stable/plotting.html#id4
The tests passed previously because there was no metadata associated with the test DataArray. I've assigned some now, that should be good enough.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/2229/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
391295532,MDExOlB1bGxSZXF1ZXN0MjM4ODU1NzM4,2608,.resample now supports loffset.,2448579,closed,0,,,1,2018-12-14T22:07:06Z,2019-04-12T16:29:09Z,2018-12-19T05:12:59Z,MEMBER,,0,pydata/xarray/pulls/2608," - [x] Tests added
- [x] Fully documented, including `whats-new.rst` for all changes and `api.rst` for new API
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/2608/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
391440525,MDExOlB1bGxSZXF1ZXN0MjM4OTQ5ODI2,2611,doc fixes.,2448579,closed,0,,,1,2018-12-16T06:47:07Z,2018-12-17T21:57:36Z,2018-12-17T21:57:36Z,MEMBER,,0,pydata/xarray/pulls/2611," - [x] Closes #2610
Quickfixes to make things work locally. ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/2611/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
389865283,MDU6SXNzdWUzODk4NjUyODM=,2600,Tests are failing on dask-dev,2448579,closed,0,,,1,2018-12-11T17:09:57Z,2018-12-12T03:13:30Z,2018-12-12T03:13:30Z,MEMBER,,,,"Sample error from https://travis-ci.org/pydata/xarray/jobs/466431752
```
_______________________ test_dataarray_with_dask_coords ________________________
def test_dataarray_with_dask_coords():
import toolz
x = xr.Variable('x', da.arange(8, chunks=(4,)))
y = xr.Variable('y', da.arange(8, chunks=(4,)) * 2)
data = da.random.random((8, 8), chunks=(4, 4)) + 1
array = xr.DataArray(data, dims=['x', 'y'])
array.coords['xx'] = x
array.coords['yy'] = y
assert dict(array.__dask_graph__()) == toolz.merge(data.__dask_graph__(),
x.__dask_graph__(),
y.__dask_graph__())
> (array2,) = dask.compute(array)
xarray/tests/test_dask.py:824:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../../../miniconda/envs/test_env/lib/python3.6/site-packages/dask/base.py:395: in compute
dsk = collections_to_dsk(collections, optimize_graph, **kwargs)
../../../miniconda/envs/test_env/lib/python3.6/site-packages/dask/base.py:187: in collections_to_dsk
for opt, val in groups.items()}
../../../miniconda/envs/test_env/lib/python3.6/site-packages/dask/base.py:187: in
for opt, val in groups.items()}
../../../miniconda/envs/test_env/lib/python3.6/site-packages/dask/base.py:212: in _extract_graph_and_keys
graph = merge(*graphs)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
dicts = , kwargs = {}
factory = , rv = {}
d = ('arange-36f53ab1e6153a63bbf7f4f8ff56693c', 0)
def merge(*dicts, **kwargs):
"""""" Merge a collection of dictionaries
>>> merge({1: 'one'}, {2: 'two'})
{1: 'one', 2: 'two'}
Later dictionaries have precedence
>>> merge({1: 2, 3: 4}, {3: 3, 4: 4})
{1: 2, 3: 3, 4: 4}
See Also:
merge_with
""""""
if len(dicts) == 1 and not isinstance(dicts[0], dict):
dicts = dicts[0]
factory = _get_factory(merge, kwargs)
rv = factory()
for d in dicts:
> rv.update(d)
E ValueError: dictionary update sequence element #0 has length 39; 2 is required
../../../miniconda/envs/test_env/lib/python3.6/site-packages/toolz/dicttoolz.py:39: ValueError
```","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/2600/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue
374183531,MDExOlB1bGxSZXF1ZXN0MjI1OTQ0MTg4,2513,Make sure datetime object arrays are converted to datetime64,2448579,closed,0,,,1,2018-10-26T00:33:07Z,2018-10-27T16:34:58Z,2018-10-27T16:34:54Z,MEMBER,,0,pydata/xarray/pulls/2513," - [x] Closes #2512 (remove if there is no corresponding issue, which should only be the case for minor changes)
- [x] Tests added (for all bug fixes or enhancements)
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/2513/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
364419418,MDExOlB1bGxSZXF1ZXN0MjE4NjA5NTgz,2444,facetgrid: properly support cbar_kwargs.,2448579,closed,0,,,1,2018-09-27T10:59:44Z,2018-10-25T16:06:57Z,2018-10-25T16:06:54Z,MEMBER,,0,pydata/xarray/pulls/2444," - [x] Closes #1504, closes #1717, closes #1735
- [x] Tests added (for all bug fixes or enhancements)
- [x] Tests passed (for all non-documentation changes)
- [x] Fully documented, including `whats-new.rst` for all changes and `api.rst` for new API (remove if this change should not be visible to users, e.g., if it is an internal clean-up, or if this is part of a larger project that will be documented later)
#1735 is stalled, so I jumped in.
I've added an error if `cbar_ax` is provided as an option for `FacetGrid`. I don't think it's really needed.
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/2444/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
356381494,MDExOlB1bGxSZXF1ZXN0MjEyNjU3NTU1,2395,Properly support user-provided norm.,2448579,closed,0,,,1,2018-09-03T07:04:45Z,2018-09-05T06:53:35Z,2018-09-05T06:53:30Z,MEMBER,,0,pydata/xarray/pulls/2395," - [x] Closes #2381 (remove if there is no corresponding issue, which should only be the case for minor changes)
- [x] Tests added (for all bug fixes or enhancements)
- [x] Tests passed (for all non-documentation changes)
- [x] Fully documented, including `whats-new.rst` for all changes and `api.rst` for new API (remove if this change should not be visible to users, e.g., if it is an internal clean-up, or if this is part of a larger project that will be documented later)
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/2395/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
282344188,MDExOlB1bGxSZXF1ZXN0MTU4NTQwODI1,1786,_plot2d: Rotate x-axis ticks if datetime subtype,2448579,closed,0,,,1,2017-12-15T07:43:19Z,2018-05-10T05:12:19Z,2018-01-03T16:37:56Z,MEMBER,,0,pydata/xarray/pulls/1786,"Rotate x-axis dateticks by default, just as for plot.line()
- [x] Tests added (for all bug fixes or enhancements)
- [x] Tests passed (for all non-documentation changes)
- [x] Passes ``git diff upstream/master **/*py | flake8 --diff`` (remove if you did not edit any Python files)
- [x] Fully documented, including `whats-new.rst` for all changes and `api.rst` for new API (remove if this change should not be visible to users, e.g., if it is an internal clean-up, or if this is part of a larger project that will be documented later)
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1786/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
308107438,MDExOlB1bGxSZXF1ZXN0MTc3MTI5MTA1,2012,Add weighted mean docs.,2448579,closed,0,,,1,2018-03-23T16:57:29Z,2018-03-23T22:55:32Z,2018-03-23T22:51:57Z,MEMBER,,0,pydata/xarray/pulls/2012,"I like @fujiisoup's weighted mean demo in this stack overflow example:
https://stackoverflow.com/questions/48510784/xarray-rolling-mean-with-weights
I thought it'd be a useful addition to the docs on rolling.
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/2012/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull