issues

87 rows where comments = 1 and user = 2448579 sorted by updated_at descending

type: pull 59, issue 28 · state: closed 74, open 13 · repo: xarray 87
id node_id number title user state locked assignee milestone comments created_at updated_at ▲ closed_at author_association active_lock_reason draft pull_request body reactions performed_via_github_app state_reason repo type
2278499376 PR_kwDOAMm_X85uhFke 8997 Zarr: Optimize `region="auto"` detection dcherian 2448579 open 0     1 2024-05-03T22:13:18Z 2024-05-04T21:47:39Z   MEMBER   0 pydata/xarray/pulls/8997
  1. This moves the region detection code into ZarrStore so we only open the store once.
  2. Instead of opening the store as a dataset, construct a pd.Index directly to "auto"-infer the region.

The diff is large mostly because a bunch of code moved from backends/api.py to backends/zarr.py
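A rough sketch of the idea, with illustrative names that are not xarray's actual internals: given the coordinate values already on disk, the write region along a dimension can be inferred by locating the new values in a `pd.Index`, without round-tripping through a full Dataset.

```python
import numpy as np
import pandas as pd

def infer_region(on_disk_values, new_values):
    """Return the integer slice of `on_disk_values` covered by `new_values`.

    Assumes the new coordinate values are a contiguous run of the on-disk
    values (illustrative helper, not xarray's API).
    """
    idx = pd.Index(on_disk_values)
    start = idx.get_loc(new_values[0])
    stop = idx.get_loc(new_values[-1]) + 1
    return slice(start, stop)

# writing values 3..5 into a store whose coordinate is 0..9
region = infer_region(np.arange(10), np.arange(3, 6))
```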

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/8997/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
2248614324 I_kwDOAMm_X86GByG0 8952 `isel(multi_index_level_name = MultiIndex.level)` corrupts the MultiIndex dcherian 2448579 open 0     1 2024-04-17T15:41:39Z 2024-04-18T13:14:46Z   MEMBER      

What happened?

From https://github.com/pydata/xarray/discussions/8951

If d is a MultiIndex-ed dataset with levels (x, y, z), and m is a dataset with a single coord x, then m.isel(x=d.x) builds a dataset with a MultiIndex with levels (y, z). This seems like it should work.

cc @benbovy

What did you expect to happen?

No response

Minimal Complete Verifiable Example

```Python
import pandas as pd, xarray as xr, numpy as np

xr.set_options(use_flox=True)

test = pd.DataFrame()
test["x"] = np.arange(100) % 10
test["y"] = np.arange(100)
test["z"] = np.arange(100)
test["v"] = np.arange(100)

d = xr.Dataset.from_dataframe(test)
d = d.set_index(index=["x", "y", "z"])
print(d)

m = d.groupby("x").mean()
print(m)

print(d.xindexes)
print(m.isel(x=d.x).xindexes)

xr.align(d, m.isel(x=d.x))

res = d.groupby("x") - m

print(res)
```

```
<xarray.Dataset>
Dimensions:  (index: 100)
Coordinates:
  * index    (index) object MultiIndex
  * x        (index) int64 0 1 2 3 4 5 6 7 8 9 0 1 2 ... 8 9 0 1 2 3 4 5 6 7 8 9
  * y        (index) int64 0 1 2 3 4 5 6 7 8 9 ... 90 91 92 93 94 95 96 97 98 99
  * z        (index) int64 0 1 2 3 4 5 6 7 8 9 ... 90 91 92 93 94 95 96 97 98 99
Data variables:
    v        (index) int64 0 1 2 3 4 5 6 7 8 9 ... 90 91 92 93 94 95 96 97 98 99

<xarray.Dataset>
Dimensions:  (x: 10)
Coordinates:
  * x        (x) int64 0 1 2 3 4 5 6 7 8 9
Data variables:
    v        (x) float64 45.0 46.0 47.0 48.0 49.0 50.0 51.0 52.0 53.0 54.0

Indexes:
  ┌ index    PandasMultiIndex
  │ x
  │ y
  └ z

Indexes:
  ┌ index    PandasMultiIndex
  │ y
  └ z

ValueError...
```

MVCE confirmation

  • [x] Minimal example — the example is as focused as reasonably possible to demonstrate the underlying issue in xarray.
  • [x] Complete example — the example is self-contained, including all data and the text of any traceback.
  • [x] Verifiable example — the example copy & pastes into an IPython prompt or Binder notebook, returning the result.
  • [x] New issue — a search of GitHub Issues suggests this is not a duplicate.
  • [x] Recent environment — the issue occurs with the latest version of xarray and its dependencies.

Relevant log output

No response

Anything else we need to know?

No response

Environment

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/8952/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 issue
2215762637 PR_kwDOAMm_X85rMHpN 8893 Avoid extra read from disk when creating Pandas Index. dcherian 2448579 open 0     1 2024-03-29T17:44:52Z 2024-04-08T18:55:09Z   MEMBER   0 pydata/xarray/pulls/8893
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/8893/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
2224297504 PR_kwDOAMm_X85rpGUH 8906 Add invariant check for IndexVariable.name dcherian 2448579 open 0     1 2024-04-04T02:13:33Z 2024-04-05T07:12:54Z   MEMBER   1 pydata/xarray/pulls/8906

@benbovy this seems to be the root cause of #8646: the variable name in Dataset._variables does not match IndexVariable.name.

A good number of tests seem to fail though, so I'm not sure if this is a good check.

  • [ ] Closes #xxxx
  • [ ] Tests added
  • [ ] User visible changes (including notable bug fixes) are documented in whats-new.rst
  • [ ] New functions/methods are listed in api.rst
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/8906/reactions",
    "total_count": 2,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 2,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
2064698904 PR_kwDOAMm_X85jLHsQ 8584 Silence a bunch of CachingFileManager warnings dcherian 2448579 closed 0     1 2024-01-03T21:57:07Z 2024-04-03T21:08:27Z 2024-01-03T22:52:58Z MEMBER   0 pydata/xarray/pulls/8584  
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/8584/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
2213636579 I_kwDOAMm_X86D8Wnj 8887 resetting multiindex may be buggy dcherian 2448579 open 0     1 2024-03-28T16:23:38Z 2024-03-29T07:59:22Z   MEMBER      

What happened?

Resetting a MultiIndex dim coordinate preserves the MultiIndex levels as IndexVariables. We should either also reset the indexes for the MultiIndex level variables, or warn asking the user to do so.

This seems to be the root cause exposed by https://github.com/pydata/xarray/pull/8809

cc @benbovy

What did you expect to happen?

No response

Minimal Complete Verifiable Example

```Python
import numpy as np
import xarray as xr

# ND DataArray that gets stacked along a multiindex
da = xr.DataArray(np.ones((3, 3)), coords={"dim1": [1, 2, 3], "dim2": [4, 5, 6]})
da = da.stack(feature=["dim1", "dim2"])

# Extract just the stacked coordinates for saving in a dataset
ds = xr.Dataset(data_vars={"feature": da.feature})

xr.testing.assertions._assert_internal_invariants(
    ds.reset_index(["feature", "dim1", "dim2"]), check_default_indexes=False
)  # succeeds
xr.testing.assertions._assert_internal_invariants(
    ds.reset_index(["feature"]), check_default_indexes=False
)  # fails, but no warning either
```

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/8887/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 issue
2098659703 I_kwDOAMm_X859FwF3 8659 renaming index variables with `rename_vars` seems buggy dcherian 2448579 closed 0     1 2024-01-24T16:35:18Z 2024-03-15T19:21:51Z 2024-03-15T19:21:51Z MEMBER      

What happened?

(xref #8658)

I'm not sure what the expected behaviour is here:

```python
import xarray as xr
import numpy as np
from xarray.testing import _assert_internal_invariants

ds = xr.Dataset()
ds.coords["1"] = ("1", np.array([1], dtype=np.uint32))
ds["1_"] = ("1", np.array([1], dtype=np.uint32))
ds = ds.rename_vars({"1": "0"})
ds
```

It looks like this sequence of operations creates a default index.

But then

```python
from xarray.testing import _assert_internal_invariants

_assert_internal_invariants(ds, check_default_indexes=True)
```

fails with

```
...
File ~/repos/xarray/xarray/testing/assertions.py:301, in _assert_indexes_invariants_checks(indexes, possible_coord_variables, dims, check_default)
    299 if check_default:
    300     defaults = default_indexes(possible_coord_variables, dims)
--> 301     assert indexes.keys() == defaults.keys(), (set(indexes), set(defaults))
    302 assert all(v.equals(defaults[k]) for k, v in indexes.items()), (
    303     indexes,
    304     defaults,
    305 )

AssertionError: ({'0'}, set())
```

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/8659/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
2184871888 I_kwDOAMm_X86COn_Q 8830 failing tests, all envs dcherian 2448579 closed 0     1 2024-03-13T20:56:34Z 2024-03-15T04:06:04Z 2024-03-15T04:06:04Z MEMBER      

What happened?

All tests are failing because of an error in `create_test_data`:

```python
from xarray.tests import create_test_data
create_test_data()
```

```
AssertionError                            Traceback (most recent call last)
Cell In[3], line 2
      1 from xarray.tests import create_test_data
----> 2 create_test_data()

File ~/repos/xarray/xarray/tests/__init__.py:329, in create_test_data(seed, add_attrs, dim_sizes)
    327 obj.coords["numbers"] = ("dim3", numbers_values)
    328 obj.encoding = {"foo": "bar"}
--> 329 assert all(var.values.flags.writeable for var in obj.variables.values())
    330 return obj

AssertionError:
```

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/8830/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
2102852029 PR_kwDOAMm_X85lMXU0 8675 Fix NetCDF4 C version detection dcherian 2448579 closed 0     1 2024-01-26T20:23:54Z 2024-01-27T01:28:51Z 2024-01-27T01:28:49Z MEMBER   0 pydata/xarray/pulls/8675

This fixes the failure locally for me.

cc @max-sixty

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/8675/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
2066129022 PR_kwDOAMm_X85jP678 8587 Silence another warning in test_backends.py dcherian 2448579 closed 0     1 2024-01-04T18:20:49Z 2024-01-05T16:13:05Z 2024-01-05T16:13:03Z MEMBER   0 pydata/xarray/pulls/8587

Using 255 as the fill value for int8 arrays will not be allowed any more. Previously this overflowed to -1; now specify that instead.

On numpy 1.24.4:

```
>>> np.array([255], dtype="i1")
DeprecationWarning: NumPy will stop allowing conversion of out-of-bound Python integers to integer arrays. The conversion of 255 to int8 will fail in the future.
array([-1], dtype=int8)
```

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/8587/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
2052694433 PR_kwDOAMm_X85ilhQm 8565 Faster encoding functions. dcherian 2448579 closed 0     1 2023-12-21T16:05:02Z 2024-01-04T14:25:45Z 2024-01-04T14:25:43Z MEMBER   0 pydata/xarray/pulls/8565

Spotted when profiling some write workloads.

  1. Speeds up the check for a multi-index.
  2. Speeds up one string encoder by not re-creating variables when not necessary.

@benbovy is there a better way?

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/8565/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
2064480451 I_kwDOAMm_X857DXjD 8582 Adopt SPEC 0 instead of NEP-29 dcherian 2448579 open 0     1 2024-01-03T18:36:24Z 2024-01-03T20:12:05Z   MEMBER      

What is your issue?

https://docs.xarray.dev/en/stable/getting-started-guide/installing.html#minimum-dependency-versions says that we follow NEP-29, and I think our min versions script also does that.

I propose we follow https://scientific-python.org/specs/spec-0000/

In practice, I think this means we mostly drop Python versions earlier.
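SPEC 0 recommends dropping support for a Python version 3 years after its release (vs. NEP-29's 42 months), which is where the earlier drops come from. A quick sketch of the resulting drop date (the helper name and the 30-day month approximation are illustrative):

```python
from datetime import date, timedelta

def spec0_drop_date(release: date, months: int = 36) -> date:
    # SPEC 0: drop support for a Python version ~3 years (36 months) after release
    return release + timedelta(days=months * 30)

# Python 3.9 was released 2020-10-05, so SPEC 0 would drop it in late 2023
spec0_drop_date(date(2020, 10, 5))
```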

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/8582/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 issue
2052610515 PR_kwDOAMm_X85ilOq9 8564 Fix mypy type ignore dcherian 2448579 closed 0     1 2023-12-21T15:15:26Z 2023-12-21T15:41:13Z 2023-12-21T15:24:52Z MEMBER   0 pydata/xarray/pulls/8564  
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/8564/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
2021754904 PR_kwDOAMm_X85g8gnU 8506 Deprecate `squeeze` in GroupBy. dcherian 2448579 closed 0     1 2023-12-02T00:08:50Z 2023-12-02T00:13:36Z 2023-12-02T00:13:36Z MEMBER   0 pydata/xarray/pulls/8506
  • [x] Closes #2157
  • [ ] Tests added
  • [x] User visible changes (including notable bug fixes) are documented in whats-new.rst

Could use a close-ish review.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/8506/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1989212292 PR_kwDOAMm_X85fOYwT 8444 Remove keep_attrs from resample signature dcherian 2448579 closed 0     1 2023-11-12T02:57:59Z 2023-11-12T22:53:36Z 2023-11-12T22:53:35Z MEMBER   0 pydata/xarray/pulls/8444
  • [x] User visible changes (including notable bug fixes) are documented in whats-new.rst
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/8444/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1471673992 PR_kwDOAMm_X85EFDiU 7343 Fix mypy failures dcherian 2448579 closed 0     1 2022-12-01T17:16:44Z 2023-11-06T04:25:52Z 2022-12-01T18:25:07Z MEMBER   0 pydata/xarray/pulls/7343  
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/7343/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1689364566 PR_kwDOAMm_X85PbeOv 7796 Speed up .dt accessor by preserving Index objects. dcherian 2448579 closed 0     1 2023-04-29T04:22:10Z 2023-11-06T04:25:42Z 2023-05-16T17:55:48Z MEMBER   0 pydata/xarray/pulls/7796
  • [ ] Closes #xxxx
  • [ ] Tests added
  • [ ] User visible changes (including notable bug fixes) are documented in whats-new.rst
  • [ ] New functions/methods are listed in api.rst
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/7796/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1954535213 PR_kwDOAMm_X85dZT47 8351 [skip-ci] Add benchmarks for Dataset binary ops, chunk dcherian 2448579 closed 0     1 2023-10-20T15:31:36Z 2023-10-20T18:08:40Z 2023-10-20T18:08:38Z MEMBER   0 pydata/xarray/pulls/8351

xref #8339 xref #8350

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/8351/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1954360112 PR_kwDOAMm_X85dYtpz 8349 [skip-ci] dev whats-new dcherian 2448579 closed 0     1 2023-10-20T14:02:07Z 2023-10-20T17:28:19Z 2023-10-20T14:54:30Z MEMBER   0 pydata/xarray/pulls/8349
  • [ ] Closes #xxxx
  • [ ] Tests added
  • [ ] User visible changes (including notable bug fixes) are documented in whats-new.rst
  • [ ] New functions/methods are listed in api.rst
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/8349/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1950480317 PR_kwDOAMm_X85dLkAj 8334 Whats-new: 2023.10.0 dcherian 2448579 closed 0     1 2023-10-18T19:22:06Z 2023-10-19T16:00:00Z 2023-10-19T15:59:58Z MEMBER   0 pydata/xarray/pulls/8334  
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/8334/reactions",
    "total_count": 1,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 1,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1943543755 I_kwDOAMm_X85z2B_L 8310 pydata/xarray as monorepo for Xarray and NamedArray dcherian 2448579 open 0     1 2023-10-14T20:34:51Z 2023-10-14T21:29:11Z   MEMBER      

What is your issue?

As we work through refactoring for NamedArray, it's pretty clear that Xarray will depend pretty closely on many files in namedarray/. For example various utils.py, pycompat.py, *ops.py, formatting.py, formatting_html.py at least. This promises to be quite painful if we did break NamedArray out into its own repo (particularly around typing, e.g. https://github.com/pydata/xarray/pull/8309)

I propose we use pydata/xarray as a monorepo that serves two packages: NamedArray and Xarray.

  • We can move as much as is needed to have NamedArray be independent of Xarray, but Xarray will depend quite closely on many utility functions in NamedArray.
  • We can release both at the same time, similar to dask and distributed.
  • We can re-evaluate if and when NamedArray grows its own community.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/8310/reactions",
    "total_count": 4,
    "+1": 4,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 issue
923355397 MDExOlB1bGxSZXF1ZXN0NjcyMTI5NzY4 5480 Implement weighted groupby dcherian 2448579 open 0     1 2021-06-17T02:57:17Z 2023-07-27T18:09:55Z   MEMBER   1 pydata/xarray/pulls/5480
  • xref #3937
  • [ ] Tests added
  • [ ] Passes pre-commit run --all-files
  • [ ] User visible changes (including notable bug fixes) are documented in whats-new.rst
  • [ ] New functions/methods are listed in api.rst

Initial proof-of-concept. Suggestions to improve this are very welcome.

Here's some convenient testing code:

``` python
import xarray as xr

ds = xr.tutorial.open_dataset('rasm').load()
month_length = ds.time.dt.days_in_month
weights = month_length.groupby('time.season') / month_length.groupby('time.season').sum()

actual = ds.weighted(month_length).groupby("time.season").mean()
expected = (ds * weights).groupby('time.season').sum(skipna=False)
xr.testing.assert_allclose(actual, expected)
```

I've added info to the repr:

```python
>>> ds.weighted(month_length).groupby("time.season")
WeightedDatasetGroupBy, grouped over 'season'
4 groups with labels 'DJF', 'JJA', 'MAM', 'SON'.
weighted along dimensions: time by 'days_in_month'
```

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/5480/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1822982776 I_kwDOAMm_X85sqIJ4 8023 Possible autoray integration dcherian 2448579 open 0     1 2023-07-26T18:57:59Z 2023-07-26T19:26:05Z   MEMBER      

I'm opening this issue for discussion really.

I stumbled on autoray (GitHub) by @jcmgray, which provides an abstract interface to a number of array types.

What struck me was the very general lazy compute system. This opens up the possibility of lazy-but-not-dask computation.
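As a toy illustration of such a lazy compute system (purely illustrative; this is not autoray's API), one can record an expression graph and only execute it on an explicit compute call:

```python
import operator

class Lazy:
    """Toy deferred-computation node: records a function and its inputs."""

    def __init__(self, fn, *args):
        self.fn, self.args = fn, args

    def compute(self):
        # recursively evaluate any lazy inputs, then apply the recorded function
        return self.fn(*(a.compute() if isinstance(a, Lazy) else a for a in self.args))

# (2 * 3) + 4, built without executing anything yet
expr = Lazy(operator.add, Lazy(operator.mul, 2, 3), 4)
```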

Related: https://github.com/pydata/xarray/issues/2298 https://github.com/pydata/xarray/issues/1725 https://github.com/pydata/xarray/issues/5081

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/8023/reactions",
    "total_count": 2,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 2
}
    xarray 13221727 issue
1666853925 PR_kwDOAMm_X85OQT4o 7753 Add benchmark against latest release on main. dcherian 2448579 closed 0     1 2023-04-13T17:35:33Z 2023-04-18T22:08:58Z 2023-04-18T22:08:56Z MEMBER   0 pydata/xarray/pulls/7753

This adds a benchmark of HEAD vs the latest tag on main.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/7753/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1658287592 PR_kwDOAMm_X85N0Ad4 7735 Avoid recasting a CFTimeIndex dcherian 2448579 closed 0     1 2023-04-07T02:45:55Z 2023-04-11T21:12:07Z 2023-04-11T21:12:05Z MEMBER   0 pydata/xarray/pulls/7735

xref #7730

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/7735/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1119647191 I_kwDOAMm_X85CvHXX 6220 [FEATURE]: Use fast path when grouping by unique monotonic decreasing variable dcherian 2448579 open 0     1 2022-01-31T16:24:29Z 2023-01-09T16:48:58Z   MEMBER      

Is your feature request related to a problem?

See https://github.com/pydata/xarray/pull/6213/files#r795716713

We check whether the by variable for groupby is unique and monotonically increasing. But the fast path would also apply to unique and monotonically decreasing variables.

Describe the solution you'd like

Update the condition to `is_monotonic_increasing or is_monotonic_decreasing` and add a test.
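The proposed condition amounts to the following sketch; the properties mirror pandas `Index` attributes, and the helper name is illustrative, not xarray's internal code:

```python
import pandas as pd

def can_use_fast_path(values) -> bool:
    # fast path applies when the grouping variable is unique and monotonic
    # in either direction
    idx = pd.Index(values)
    return idx.is_unique and (idx.is_monotonic_increasing or idx.is_monotonic_decreasing)

can_use_fast_path([1, 2, 3])  # increasing: fast path
can_use_fast_path([3, 2, 1])  # decreasing: currently misses the fast path
can_use_fast_path([1, 3, 2])  # non-monotonic: slow path
```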

Describe alternatives you've considered

No response

Additional context

No response

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/6220/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 issue
1194945072 I_kwDOAMm_X85HOWow 6447 allow merging datasets where a variable might be a coordinate variable only in a subset of datasets dcherian 2448579 open 0     1 2022-04-06T17:53:51Z 2022-11-16T03:46:56Z   MEMBER      

Is your feature request related to a problem?

Here are two datasets; in one, a is a data_var, in the other, a is a coordinate variable. The following fails:

``` python
import xarray as xr

ds1 = xr.Dataset({"a": ('x', [1, 2, 3])})
ds2 = ds1.set_coords("a")
ds2.update(ds1)
```

with

```
    649 ambiguous_coords = coord_names.intersection(noncoord_names)
    650 if ambiguous_coords:
--> 651     raise MergeError(
    652         "unable to determine if these variables should be "
    653         f"coordinates or not in the merged result: {ambiguous_coords}"
    654     )
    656 attrs = merge_attrs(
    657     [var.attrs for var in coerced if isinstance(var, (Dataset, DataArray))],
    658     combine_attrs,
    659 )
    661 return _MergeResult(variables, coord_names, dims, out_indexes, attrs)

MergeError: unable to determine if these variables should be coordinates or not in the merged result: {'a'}
```

Describe the solution you'd like

I think we should replace this error with a warning and arbitrarily choose to either convert a to a coordinate variable or a data variable.
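A simplified sketch of what the relaxed check could look like; the function and variable names are illustrative, not xarray's merge internals:

```python
import warnings

def resolve_ambiguous(coord_names, noncoord_names):
    """Warn and pick a role for ambiguous variables instead of raising."""
    ambiguous = coord_names & noncoord_names
    if ambiguous:
        warnings.warn(
            f"arbitrarily treating {sorted(ambiguous)} as coordinates in the merged result"
        )
        # choose the coordinate role; dropping from noncoord_names resolves the conflict
        noncoord_names = noncoord_names - ambiguous
    return coord_names, noncoord_names

coords, noncoords = resolve_ambiguous({"a", "x"}, {"a", "v"})
```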

Describe alternatives you've considered

No response

Additional context

No response

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/6447/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 issue
1232835029 PR_kwDOAMm_X843qWEU 6592 Restore old MultiIndex dropping behaviour dcherian 2448579 closed 0     1 2022-05-11T15:26:44Z 2022-10-18T19:15:42Z 2022-05-11T18:04:41Z MEMBER   0 pydata/xarray/pulls/6592
  • [x] Closes #6505
  • [x] Tests added
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/6592/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1315480779 I_kwDOAMm_X85OaKTL 6817 wrong mean of complex values dcherian 2448579 closed 0     1 2022-07-22T23:09:47Z 2022-07-23T02:03:11Z 2022-07-23T02:03:11Z MEMBER      

What happened?

Seen in #4972

``` python
import xarray as xr
import numpy as np

array = np.array([0. + 0.j, 0. + np.nan * 1j], dtype=np.complex64)
var = xr.Variable("x", array)
print(var.mean().data)
print(array.mean())
```

This prints:

```
0j
(nan+nanj)
```

What did you expect to happen?

No response

Minimal Complete Verifiable Example

No response

MVCE confirmation

  • [ ] Minimal example — the example is as focused as reasonably possible to demonstrate the underlying issue in xarray.
  • [ ] Complete example — the example is self-contained, including all data and the text of any traceback.
  • [ ] Verifiable example — the example copy & pastes into an IPython prompt or Binder notebook, returning the result.
  • [ ] New issue — a search of GitHub Issues suggests this is not a duplicate.

Relevant log output

No response

Anything else we need to know?

No response

Environment

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/6817/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
1306904506 PR_kwDOAMm_X847g-W3 6798 Drop multi-indexes when assigning to a multi-indexed variable dcherian 2448579 closed 0     1 2022-07-16T21:13:05Z 2022-07-21T14:46:59Z 2022-07-21T14:46:58Z MEMBER   0 pydata/xarray/pulls/6798
  • [x] Closes #6505
  • [x] Tests added
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/6798/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1221258144 PR_kwDOAMm_X843FiC3 6539 Direct usage questions to GH discussions dcherian 2448579 closed 0     1 2022-04-29T16:55:22Z 2022-04-30T02:03:46Z 2022-04-30T02:03:45Z MEMBER   0 pydata/xarray/pulls/6539  
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/6539/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1207159549 I_kwDOAMm_X85H88r9 6497 restrict stale bot dcherian 2448579 closed 0     1 2022-04-18T15:25:56Z 2022-04-18T16:11:11Z 2022-04-18T16:11:11Z MEMBER      

What is your issue?

We have some stale issues, but not that many.

Can we restrict the bot to only issues that are untagged, tagged as "usage question", or not assigned to a "project"? This might reduce a lot of the noise.
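With the actions/stale GitHub Action, such a restriction could look roughly like this. This is a sketch: the label names are examples, and `any-of-labels` limits the bot to issues carrying at least one of the listed labels (it cannot express "untagged" directly):

```yaml
# Sketch, assuming the actions/stale action (label names are examples)
- uses: actions/stale@v9
  with:
    any-of-labels: "usage question,needs triage"
    days-before-stale: 365
    days-before-close: 30
```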

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/6497/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
1200810062 PR_kwDOAMm_X842C5t3 6477 Propagate MultiIndex variables in broadcast dcherian 2448579 closed 0     1 2022-04-12T01:58:39Z 2022-04-13T14:49:35Z 2022-04-13T14:49:24Z MEMBER   0 pydata/xarray/pulls/6477

xref #6293

  • [x] Closes #6430
  • [x] Tests added
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/6477/reactions",
    "total_count": 3,
    "+1": 3,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1188406993 I_kwDOAMm_X85G1abR 6430 Bug in broadcasting with multi-indexes dcherian 2448579 closed 0     1 2022-03-31T17:25:57Z 2022-04-13T14:49:23Z 2022-04-13T14:49:23Z MEMBER      

What happened?

``` python
import numpy as np
import xarray as xr

ds = xr.Dataset(
    {"foo": (("x", "y", "z"), np.ones((3, 4, 2)))},
    {"x": ["a", "b", "c"], "y": [1, 2, 3, 4]},
)
expected = ds.sum("z")
stacked = ds.stack(space=["x", "y"])

broadcasted, _ = xr.broadcast(stacked, stacked.space)

stacked.sum("z").unstack("space")  # works
broadcasted.sum("z").unstack("space")  # error
```

```
ValueError                                Traceback (most recent call last)
Input In [13], in <module>
     10 broadcasted, _ = xr.broadcast(stacked, stacked.space)
     11 stacked.sum("z").unstack("space")
---> 12 broadcasted.sum("z").unstack("space")

File ~/work/python/xarray/xarray/core/dataset.py:4332, in Dataset.unstack(self, dim, fill_value, sparse)
   4330 non_multi_dims = set(dims) - set(stacked_indexes)
   4331 if non_multi_dims:
-> 4332     raise ValueError(
   4333         "cannot unstack dimensions that do not "
   4334         f"have exactly one multi-index: {tuple(non_multi_dims)}"
   4335     )
   4337 result = self.copy(deep=False)
   4339 # we want to avoid allocating an object-dtype ndarray for a MultiIndex,
   4340 # so we can't just access self.variables[v].data for every variable.
   4341 # We only check the non-index variables.
   4342 # https://github.com/pydata/xarray/issues/5902

ValueError: cannot unstack dimensions that do not have exactly one multi-index: ('space',)
```

What did you expect to happen?

This should work.

Minimal Complete Verifiable Example

No response

Relevant log output

No response

Anything else we need to know?

No response

Environment

xarray main after the flexible indexes refactor

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/6430/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
1193704369 I_kwDOAMm_X85HJnux 6444 xr.where with scalar as second argument fails with keep_attrs=True dcherian 2448579 closed 0     1 2022-04-05T20:51:18Z 2022-04-12T02:12:39Z 2022-04-12T02:12:39Z MEMBER      

What happened?

``` python
import xarray as xr

xr.where(xr.DataArray([1, 2, 3]) > 0, 1, 0)
```

fails with

```
   1809 if keep_attrs is True:
   1810     # keep the attributes of x, the second parameter, by default to
   1811     # be consistent with the `where` method of `DataArray` and `Dataset`
-> 1812     keep_attrs = lambda attrs, context: attrs[1]
   1814 # alignment for three arguments is complicated, so don't support it yet
   1815 return apply_ufunc(
   1816     duck_array_ops.where,
   1817     cond,
   (...)
   1823     keep_attrs=keep_attrs,
   1824 )

IndexError: list index out of range
```

The workaround is to pass keep_attrs=False.

What did you expect to happen?

No response

Minimal Complete Verifiable Example

No response

Relevant log output

No response

Anything else we need to know?

No response

Environment

xarray 2022.3.0

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/6444/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
528168017 MDU6SXNzdWU1MjgxNjgwMTc= 3573 rasterio test failure dcherian 2448579 closed 0     1 2019-11-25T15:40:19Z 2022-04-09T01:17:32Z 2022-04-09T01:17:32Z MEMBER      

Version: rasterio 1.1.1 py36h900e953_0 (conda-forge)

```
=================================== FAILURES ===================================
________________________ TestRasterio.test_rasterio_vrt ________________________

self = <xarray.tests.test_backends.TestRasterio object at 0x7fc8355c8f60>

def test_rasterio_vrt(self):
    import rasterio

    # tmp_file default crs is UTM: CRS({'init': 'epsg:32618'}
    with create_tmp_geotiff() as (tmp_file, expected):
        with rasterio.open(tmp_file) as src:
            with rasterio.vrt.WarpedVRT(src, crs="epsg:4326") as vrt:
                expected_shape = (vrt.width, vrt.height)
                expected_crs = vrt.crs
                expected_res = vrt.res
                # Value of single pixel in center of image
                lon, lat = vrt.xy(vrt.width // 2, vrt.height // 2)
              expected_val = next(vrt.sample([(lon, lat)]))

xarray/tests/test_backends.py:3966:


/usr/share/miniconda/envs/xarray-tests/lib/python3.6/site-packages/rasterio/sample.py:43: in sample_gen
    data = read(indexes, window=window, masked=masked, boundless=True)


???
E   ValueError: WarpedVRT does not permit boundless reads

rasterio/_warp.pyx:978: ValueError ```

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3573/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
1001197796 I_kwDOAMm_X847rRDk 5804 vectorized groupby binary ops dcherian 2448579 closed 0     1 2021-09-20T17:04:47Z 2022-03-29T07:11:28Z 2022-03-29T07:11:28Z MEMBER      

By switching to numpy_groupies we are vectorizing our groupby reductions. I think we can do the same for groupby's binary ops.

Here's an example array:

```python
import numpy as np
import xarray as xr

%load_ext memory_profiler

N = 4 * 2000
da = xr.DataArray(
    np.random.random((N, N)),
    dims=("x", "y"),
    coords={"labels": ("x", np.repeat(["a", "b", "c", "d", "e", "f", "g", "h"], repeats=N // 8))},
)
```

Consider this "anomaly" calculation, anomaly defined relative to the group mean

```python
def anom_current(da):
    grouped = da.groupby("labels")
    mean = grouped.mean()
    anom = grouped - mean
    return anom
```

With this approach, we loop over each group and apply the binary operation: https://github.com/pydata/xarray/blob/a1635d324753588e353e4e747f6058936fa8cf1e/xarray/core/computation.py#L502-L525

This saves some memory, but becomes slow for large number of groups.

We could instead do:

```python
def anom_vectorized(da):
    mean = da.groupby("labels").mean()
    mean_expanded = mean.sel(labels=da.labels)
    anom = da - mean_expanded
    return anom
```

Now we are faster, but construct an extra array as big as the original array (I think this is an OK tradeoff).

```
%timeit anom_current(da)
# 1.4 s ± 20.5 ms per loop (mean ± std. dev. of 7 runs, 1 loop each)

%timeit anom_vectorized(da)
# 937 ms ± 5.26 ms per loop (mean ± std. dev. of 7 runs, 1 loop each)
```
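As a self-contained sanity check, here is a plain-numpy sketch (hypothetical small data, not from the issue) showing that the loop-over-groups approach and the broadcast-back-of-group-means approach produce identical anomalies:

```python
import numpy as np

# labels sorted into contiguous groups, as in the example above
data = np.array([1.0, 3.0, 2.0, 6.0])
labels = np.array(["a", "a", "b", "b"])

# loop-based: subtract each group's mean within that group
loop_anom = data.copy()
for lab in np.unique(labels):
    mask = labels == lab
    loop_anom[mask] -= data[mask].mean()

# vectorized: compute all group means once, then broadcast them
# back to the original shape via the inverse index
uniq, inverse = np.unique(labels, return_inverse=True)
group_means = np.bincount(inverse, weights=data) / np.bincount(inverse)
vec_anom = data - group_means[inverse]
```

Both arrays come out as `[-1, 1, -2, 2]`; the vectorized version trades the per-group loop for one full-size intermediate array.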

(I haven't experimented with dask yet, so the following is just a theory).

I think the real benefit comes with dask. Depending on where the groups are located relative to chunking, we could end up creating a lot of tiny chunks by splitting up existing chunks. With the vectorized approach we can do better.

Ideally we would reindex the "mean" dask array with a numpy-array-of-repeated-ints such that the chunking of mean_expanded exactly matches the chunking of da along the grouped dimension.

~In practice, dask.array.take doesn't allow specifying "output chunks" so we'd end up chunking "mean_expanded" based on dask's automatic heuristics, and then rechunking again for the binary operation.~

Thoughts?

cc @rabernat

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/5804/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
1174177534 I_kwDOAMm_X85F_Ib- 6381 vectorized indexing with DataArray should not preserve IndexVariable dcherian 2448579 closed 0     1 2022-03-19T05:08:39Z 2022-03-21T04:47:47Z 2022-03-21T04:47:47Z MEMBER      

What happened?

After vectorized indexing a DataArray with dim x by a DataArray with dim z, we get a DataArray with dim z and x as a non-dim coordinate. But x is still an IndexVariable, not a normal variable.

What did you expect to happen?

x should be a normal variable.

Minimal Complete Verifiable Example

```python
import xarray as xr

xr.set_options(display_style="text")

da = xr.DataArray([1, 2, 3], dims="x", coords={"x": [0, 1, 2]})
idxr = xr.DataArray([1], dims="z", name="x", coords={"z": ("z", ["a"])})
da.sel(x=idxr)
```

```
<xarray.DataArray (z: 1)>
array([2])
Coordinates:
    x        (z) int64 1
  * z        (z) <U1 'a'
```

x is a non-dim coordinate but is backed by an IndexVariable with the wrong name!

```python
da.sel(x=idxr).x.variable
```

```
<xarray.IndexVariable 'z' (z: 1)>
array([1])
```

Relevant log output

No response

Anything else we need to know?

No response

Environment

xarray main but this bug was present prior to the explicit indexes refactor.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/6381/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
539171948 MDExOlB1bGxSZXF1ZXN0MzU0MTk0MDE0 3637 concat keeps attrs from first variable. dcherian 2448579 closed 0     1 2019-12-17T16:20:22Z 2022-01-05T18:57:38Z 2019-12-24T13:37:04Z MEMBER   0 pydata/xarray/pulls/3637
  • [x] Closes #2060, closes #2575, xref #1614
  • [x] Tests added
  • [x] Passes black . && mypy . && flake8
  • [x] Fully documented, including whats-new.rst for all changes and api.rst for new API
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3637/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
664568885 MDExOlB1bGxSZXF1ZXN0NDU1Nzg5Mjk2 4259 Improve some error messages: apply_ufunc & set_options. dcherian 2448579 closed 0     1 2020-07-23T15:23:57Z 2022-01-05T18:57:23Z 2020-07-25T23:04:55Z MEMBER   0 pydata/xarray/pulls/4259

Makes some error messages clearer

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4259/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
687325506 MDExOlB1bGxSZXF1ZXN0NDc0NzcwMjEy 4383 Dask/cleanup dcherian 2448579 closed 0     1 2020-08-27T15:14:19Z 2022-01-05T18:57:23Z 2020-09-02T20:03:03Z MEMBER   0 pydata/xarray/pulls/4383

Some dask array cleanups

  1. switch to using dask.array.map_blocks instead of Array.map_blocks (duck dask array compatibility)
  2. Stop vendoring meta_from_array and median
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4383/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
589835599 MDExOlB1bGxSZXF1ZXN0Mzk1MjgzNzU4 3916 facetgrid: fix case when vmin == vmax dcherian 2448579 closed 0     1 2020-03-29T16:59:14Z 2022-01-05T18:57:20Z 2020-04-03T19:48:55Z MEMBER   0 pydata/xarray/pulls/3916
  • [x] Closes #3734
  • [x] Tests added
  • [x] Passes isort -rc . && black . && mypy . && flake8
  • [x] Fully documented, including whats-new.rst for all changes and api.rst for new API
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3916/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
490514191 MDExOlB1bGxSZXF1ZXN0MzE1MTA2NzA0 3288 Remove deprecated concat kwargs. dcherian 2448579 closed 0     1 2019-09-06T20:41:31Z 2022-01-05T18:57:02Z 2019-09-09T18:34:14Z MEMBER   0 pydata/xarray/pulls/3288
  • [x] Passes black . && mypy . && flake8
  • [x] Fully documented, including whats-new.rst for all changes and api.rst for new API
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3288/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1038409453 PR_kwDOAMm_X84tyqhR 5905 [skip-ci] v0.20.0: whats-new for release dcherian 2448579 closed 0     1 2021-10-28T11:35:00Z 2022-01-05T18:56:55Z 2021-11-01T21:15:22Z MEMBER   0 pydata/xarray/pulls/5905

Whats-new fixes for the release.

Feel free to push to this branch with more improvements.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/5905/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1094502107 PR_kwDOAMm_X84wkWcB 6141 Revert "Deprecate bool(ds) (#6126)" dcherian 2448579 closed 0     1 2022-01-05T15:58:27Z 2022-01-05T16:57:33Z 2022-01-05T16:57:32Z MEMBER   0 pydata/xarray/pulls/6141

This reverts commit d6ee8caa84b27d4635ec3384b1a06ef4ddf2d998.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/6141/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
514716299 MDU6SXNzdWU1MTQ3MTYyOTk= 3468 failure when roundtripping empty dataset to pandas dcherian 2448579 open 0     1 2019-10-30T14:28:31Z 2021-11-13T14:54:09Z   MEMBER      

see https://github.com/pydata/xarray/pull/3285

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3468/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 issue
1045266457 PR_kwDOAMm_X84uHoD5 5943 whats-new for 0.20.1 dcherian 2448579 closed 0     1 2021-11-04T22:25:49Z 2021-11-05T17:00:24Z 2021-11-05T17:00:23Z MEMBER   0 pydata/xarray/pulls/5943  
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/5943/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
970815025 MDExOlB1bGxSZXF1ZXN0NzEyNzA3OTY2 5708 Add .git-blame-ignore-revs dcherian 2448579 closed 0     1 2021-08-14T04:04:10Z 2021-08-23T16:42:11Z 2021-08-23T16:42:09Z MEMBER   0 pydata/xarray/pulls/5708

I found it useful to ignore big reformatting commits in git blame. See https://www.michaelheap.com/git-ignore-rev/

it's opt-in using a command-line flag or you can set git config --global blame.ignoreRevsFile .git-blame-ignore-revs

Thoughts on adding it to the repo? If so, are there more commits we can add?
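For reference, the file is just a list of full commit hashes, one per line, with `#` comment lines allowed — a sketch of the format (the hashes below are placeholders, not real xarray commits):

```
# .git-blame-ignore-revs
# Black reformatting of the whole repository
aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa

# isort import reordering
bbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbb
```

It can then be used per-invocation with `git blame --ignore-revs-file .git-blame-ignore-revs`, or opted into permanently via the `blame.ignoreRevsFile` config setting mentioned above.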

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/5708/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
959380582 MDExOlB1bGxSZXF1ZXN0NzAyNTAzNTA0 5670 Flexible Indexes: Avoid len(index) in map_blocks dcherian 2448579 closed 0     1 2021-08-03T18:30:18Z 2021-08-05T13:28:48Z 2021-08-05T08:08:48Z MEMBER   0 pydata/xarray/pulls/5670

xref https://github.com/pydata/xarray/pull/5636/files#r679823542

avoid len(index) in two places.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/5670/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
520079199 MDU6SXNzdWU1MjAwNzkxOTk= 3497 how should xarray handle pandas attrs dcherian 2448579 open 0     1 2019-11-08T15:32:36Z 2021-07-04T03:31:02Z   MEMBER      

Continuing discussion form #3491.

Pandas has added attrs to their objects. We should decide on what to do with them in the DataArray constructor. Many tests fail if we don't handle this case explicitly.

@dcherian:

Not sure what we want to do about these attributes in the long term. One option would be to pop the name attribute, assign to DataArray.name and keep the rest as DataArray.attrs? But what if name clashes with the provided name?

@max-sixty:

Agree! I think we could prioritize the supplied name above that in attrs. Another option would be raising an error if both were supplied.
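For context, a minimal illustration of the pandas feature under discussion — both `name` and the `attrs` dict live on the pandas object, and both are candidates for mapping onto `DataArray.name` / `DataArray.attrs`:

```python
import pandas as pd

# pandas >= 1.0 exposes an `attrs` dict on Series/DataFrame
s = pd.Series([1, 2, 3], name="temperature")
s.attrs["units"] = "degC"

# the open question: when building a DataArray from `s`, should
# s.name become DataArray.name, and s.attrs become DataArray.attrs,
# and what happens if an explicit name is also supplied?
```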

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3497/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 issue
926421732 MDExOlB1bGxSZXF1ZXN0Njc0NzI4MzAw 5506 Refactor dataset groupby tests dcherian 2448579 closed 0     1 2021-06-21T17:04:34Z 2021-06-22T16:26:16Z 2021-06-22T16:00:15Z MEMBER   0 pydata/xarray/pulls/5506
  • xref #5409

Just moves the tests out, in preparation for numpy_groupies work

There are a few tests for .assign and .fillna (for e.g.) still present in test_dataset

The DataArray tests are not a simple copy and paste :(

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/5506/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
891240764 MDU6SXNzdWU4OTEyNDA3NjQ= 5299 failing RTD build dcherian 2448579 closed 0     1 2021-05-13T17:50:37Z 2021-05-14T01:04:22Z 2021-05-14T01:04:22Z MEMBER      

The RTD build is failing on all PRs with

Sphinx parallel build error: nbsphinx.NotebookError: UndefinedError in examples/ERA5-GRIB-example.ipynb: 'nbformat.notebooknode.NotebookNode object' has no attribute 'tags'

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/5299/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
850473442 MDU6SXNzdWU4NTA0NzM0NDI= 5113 docs sidebar formatting has changed dcherian 2448579 closed 0     1 2021-04-05T16:06:43Z 2021-04-19T02:35:34Z 2021-04-19T02:35:34Z MEMBER      

What happened: The formatting of section headings "for users", "community" etc. has changed: https://xarray.pydata.org/en/latest/

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/5113/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
819241806 MDU6SXNzdWU4MTkyNDE4MDY= 4980 fix bottleneck + Dask 1D rolling operations dcherian 2448579 closed 0     1 2021-03-01T20:38:34Z 2021-03-01T20:39:28Z 2021-03-01T20:39:27Z MEMBER      

Just as a reminder.

Right now all rolling operations with dask arrays use .construct().reduce().

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4980/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
674414304 MDU6SXNzdWU2NzQ0MTQzMDQ= 4320 html repr doesn't work in some sphinx themes dcherian 2448579 closed 0     1 2020-08-06T15:45:54Z 2021-01-31T03:34:55Z 2021-01-31T03:34:54Z MEMBER      

Downstream issue: https://github.com/xarray-contrib/cf-xarray/issues/57 Example: no reprs displayed in https://cf-xarray.readthedocs.io/en/latest/examples/introduction.html

@benbovy's diagnosis:

It looks like bootstrap 4 (used by sphinx-book-theme) forces all html elements with hidden attributes to be actually hidden (source), so the hack in pydata/xarray#4053 does not work here (the result is even worse). I guess that a workaround would be to add some custom CSS such as .xr-wrap { display: block !important }, assuming that custom CSS is loaded after Bootstrap's CSS. Not ideal, though, it looks like a hack on top of another hack.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4320/reactions",
    "total_count": 3,
    "+1": 3,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
776595030 MDExOlB1bGxSZXF1ZXN0NTQ3MDUzOTM5 4744 Speed up Dataset._construct_dataarray dcherian 2448579 closed 0     1 2020-12-30T19:03:05Z 2021-01-05T17:32:16Z 2021-01-05T17:32:13Z MEMBER   0 pydata/xarray/pulls/4744
  • [ ] Tests added
  • [x] Passes isort . && black . && mypy . && flake8
  • [x] User visible changes (including notable bug fixes) are documented in whats-new.rst

Significantly speeds up _construct_dataarray by iterating over ._coord_names instead of .coords. This avoids unnecessarily constructing a DatasetCoordinates object and massively speeds up repr construction for datasets with large numbers of variables.

Construct a 2000 variable dataset:

```python
import numpy as np
import xarray as xr

a = np.arange(0, 2000)
b = np.core.defchararray.add("long_variable_name", a.astype(str))
coords = dict(time=np.array([0, 1]))
data_vars = dict()
for v in b:
    data_vars[v] = xr.DataArray(
        name=v, data=np.array([3, 4]), dims=["time"], coords=coords
    )
ds0 = xr.Dataset(data_vars)
```

Before:

```
%timeit ds0['long_variable_name1999']
%timeit ds0.__repr__()

1.33 ms ± 23 µs per loop (mean ± std. dev. of 7 runs, 1000 loops each)
2.66 s ± 52.7 ms per loop (mean ± std. dev. of 7 runs, 1 loop each)
```

After:

```
%timeit ds0['long_variable_name1999']
%timeit ds0.__repr__()

10.5 µs ± 203 ns per loop (mean ± std. dev. of 7 runs, 100000 loops each)
84.2 ms ± 1.28 ms per loop (mean ± std. dev. of 7 runs, 10 loops each)
```

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4744/reactions",
    "total_count": 2,
    "+1": 2,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
773121631 MDExOlB1bGxSZXF1ZXN0NTQ0MjYzMjMw 4722 Add Zenodo DOI badge dcherian 2448579 closed 0     1 2020-12-22T17:31:33Z 2020-12-23T17:07:09Z 2020-12-23T17:06:59Z MEMBER   0 pydata/xarray/pulls/4722
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4722/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
722539100 MDU6SXNzdWU3MjI1MzkxMDA= 4515 show dimension coordinates at top of coordinates repr dcherian 2448579 closed 0     1 2020-10-15T17:44:28Z 2020-11-06T18:49:55Z 2020-11-06T18:49:55Z MEMBER      

Is your feature request related to a problem? Please describe. I have datasets with lots of non-dim coord variables. It's annoying to search through them for the dimension coordinates to get an idea of what subset of data I am looking at.

Describe the solution you'd like I think we should show dimension coordinate variables at the top of the coordinates repr.

Example code

```python
ds = xr.Dataset()
ds.coords["as"] = 10
ds["var"] = xr.DataArray(np.ones((10,)), dims="x", coords={"x": np.arange(10)})
ds
```

Related #4409

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4515/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
677231206 MDExOlB1bGxSZXF1ZXN0NDY2MzgxNjY5 4335 Add @mathause to current core developers. dcherian 2448579 closed 0     1 2020-08-11T22:10:45Z 2020-08-11T22:51:35Z 2020-08-11T22:51:06Z MEMBER   0 pydata/xarray/pulls/4335
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4335/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
663833847 MDU6SXNzdWU2NjM4MzM4NDc= 4249 RTD PR builds are timing out dcherian 2448579 closed 0     1 2020-07-22T15:04:22Z 2020-07-22T21:17:59Z 2020-07-22T21:17:59Z MEMBER      

See https://readthedocs.org/projects/xray/builds/

There's no useful information in the logs AFAICT: e.g. https://readthedocs.org/projects/xray/builds/11504571/

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4249/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
617579699 MDU6SXNzdWU2MTc1Nzk2OTk= 4056 flake8 failure dcherian 2448579 closed 0     1 2020-05-13T16:16:20Z 2020-05-13T17:35:46Z 2020-05-13T17:35:46Z MEMBER      

flake8 is failing on master (https://dev.azure.com/xarray/xarray/_build/results?buildId=2820&view=logs&jobId=a577607c-d99b-546f-eeb4-2341e9a21630&j=a577607c-d99b-546f-eeb4-2341e9a21630&t=7308a173-bf34-5af1-b6d9-30c4d79bebeb) with

```
========================== Starting Command Output ===========================
/bin/bash --noprofile --norc /home/vsts/work/_temp/e6322963-dd1c-4887-ba6a-2aa7ec888f4c.sh
./xarray/backends/memory.py:43:32: E741 ambiguous variable name 'l'
./xarray/backends/common.py:244:32: E741 ambiguous variable name 'l'

[error]Bash exited with code '1'.

Finishing: flake8 lint checks
```

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4056/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
602771945 MDExOlB1bGxSZXF1ZXN0NDA1NzAwOTk4 3983 Better chunking error messages for zarr backend dcherian 2448579 closed 0     1 2020-04-19T17:19:53Z 2020-04-22T19:28:03Z 2020-04-22T19:27:59Z MEMBER   0 pydata/xarray/pulls/3983

Make some zarr error messages more helpful.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3983/reactions",
    "total_count": 1,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 1,
    "eyes": 0
}
    xarray 13221727 pull
589833027 MDExOlB1bGxSZXF1ZXN0Mzk1MjgxODgw 3913 Use divergent colormap if lowest and highest level span 0 dcherian 2448579 closed 0     1 2020-03-29T16:45:56Z 2020-04-07T15:59:12Z 2020-04-05T13:41:25Z MEMBER   0 pydata/xarray/pulls/3913
  • [x] Closes #3524
  • [x] Tests added
  • [x] Passes isort -rc . && black . && mypy . && flake8
  • [x] Fully documented, including whats-new.rst for all changes and api.rst for new API
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3913/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
573964634 MDExOlB1bGxSZXF1ZXN0MzgyMzc1OTcz 3817 map_blocks: allow user function to add new unindexed dimension. dcherian 2448579 closed 0     1 2020-03-02T13:12:25Z 2020-03-21T19:51:12Z 2020-03-21T19:51:07Z MEMBER   0 pydata/xarray/pulls/3817
  • [x] Tests added
  • [x] Passes isort -rc . && black . && mypy . && flake8
  • [x] Fully documented, including whats-new.rst for all changes and api.rst for new API

Small change that allows map_blocks to apply functions that add new unindexed dimensions.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3817/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
582474355 MDU6SXNzdWU1ODI0NzQzNTU= 3861 CI not running? dcherian 2448579 closed 0     1 2020-03-16T17:23:13Z 2020-03-17T13:18:07Z 2020-03-17T13:18:07Z MEMBER      

Looks like the last run was on Thursday: https://dev.azure.com/xarray/xarray/_build?definitionId=1&_a=summary&view=runs

No tests have been run for PRs #3826 #3836 #3858 and #3807 despite these having been updated recently.

There is a workaround posted at this Azure issue: https://status.dev.azure.com/_event/179641421 but it looks like a fix is coming soon.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3861/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
564780280 MDExOlB1bGxSZXF1ZXN0Mzc0OTQ3OTE3 3769 concat now handles non-dim coordinates only present in one dataset dcherian 2448579 closed 0     1 2020-02-13T15:55:30Z 2020-02-23T20:48:19Z 2020-02-23T19:45:18Z MEMBER   0 pydata/xarray/pulls/3769
  • [x] Tests added
  • [x] Passes isort -rc . && black . && mypy . && flake8
  • [x] Fully documented, including whats-new.rst for all changes and api.rst for new API

```python
da1 = xr.DataArray([1, 2, 3], dims="x", coords={"x": [1, 2, 3], "y": 1})
da2 = xr.DataArray([4, 5, 6], dims="x", coords={"x": [4, 5, 6]})
xr.concat([da1, da2], dim="x")
```

This use case is quite common since you can get da1 from something like bigger_da1.sel(y=1)

On master this raises an uninformative KeyError because 'y' is not present in all datasets. This is because coords="different" by default, which means that we are checking for equality. However, coords="different" (and the equality check) is meaningless when the variable is only present in one of the objects to be concatenated.

This PR skips equality checking when a variable is only present in one dataset and raises a nicer error message when it is present in more than one but not all datasets.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3769/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
534450028 MDExOlB1bGxSZXF1ZXN0MzUwMzQ1NDQ0 3605 fix dask master test dcherian 2448579 closed 0     1 2019-12-07T20:42:35Z 2019-12-09T15:40:38Z 2019-12-09T15:40:34Z MEMBER   0 pydata/xarray/pulls/3605
  • [x] Closes #3603
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3605/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
530744399 MDExOlB1bGxSZXF1ZXN0MzQ3MzM3Njg5 3585 Add bottleneck & rasterio git tip to upstream-dev CI dcherian 2448579 closed 0     1 2019-12-01T14:57:02Z 2019-12-01T18:57:06Z 2019-12-01T18:57:03Z MEMBER   0 pydata/xarray/pulls/3585
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3585/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
526930276 MDExOlB1bGxSZXF1ZXN0MzQ0MzA2NjA0 3559 Reimplement quantile with apply_ufunc dcherian 2448579 closed 0     1 2019-11-22T01:16:29Z 2019-11-25T15:58:06Z 2019-11-25T15:57:49Z MEMBER   0 pydata/xarray/pulls/3559

Adds support for dask arrays.

  • [x] Closes #3326
  • [x] Tests added
  • [x] Passes black . && mypy . && flake8
  • [x] Fully documented, including whats-new.rst for all changes and api.rst for new API
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3559/reactions",
    "total_count": 1,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 1,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
522308594 MDExOlB1bGxSZXF1ZXN0MzQwNTMyNzUx 3519 propagate indexes in to_dataset, from_dataset dcherian 2448579 closed 0     1 2019-11-13T15:49:35Z 2019-11-22T15:47:22Z 2019-11-22T15:47:18Z MEMBER   0 pydata/xarray/pulls/3519

happy to make changes!

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3519/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
507515103 MDExOlB1bGxSZXF1ZXN0MzI4NDkyMjY0 3403 Another groupby.reduce bugfix. dcherian 2448579 closed 0     1 2019-10-15T22:30:23Z 2019-10-25T21:01:16Z 2019-10-25T21:01:12Z MEMBER   0 pydata/xarray/pulls/3403
  • [x] Closes #3402
  • [x] Tests added
  • [x] Passes black . && mypy . && flake8
  • [x] Fully documented, including whats-new.rst for all changes and api.rst for new API
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3403/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
433874617 MDU6SXNzdWU0MzM4NzQ2MTc= 2901 Link to dask documentation on chunks dcherian 2448579 closed 0     1 2019-04-16T16:29:13Z 2019-10-04T17:04:37Z 2019-10-04T17:04:37Z MEMBER      

It would be good to link to https://docs.dask.org/en/latest/array-chunks.html in https://xarray.pydata.org/en/stable/dask.html#chunking-and-performance

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2901/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
500371226 MDExOlB1bGxSZXF1ZXN0MzIyODU3OTYy 3357 Add how do I ... section dcherian 2448579 closed 0     1 2019-09-30T16:00:34Z 2019-09-30T21:12:28Z 2019-09-30T21:12:23Z MEMBER   0 pydata/xarray/pulls/3357
  • [x] Fully documented, including whats-new.rst for all changes and api.rst for new API

Thoughts on adding something like this?

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3357/reactions",
    "total_count": 4,
    "+1": 3,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 1,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
494681363 MDExOlB1bGxSZXF1ZXN0MzE4Mzg2NjI2 3314 move auto_combine deprecation to 0.14 dcherian 2448579 closed 0     1 2019-09-17T15:04:38Z 2019-09-17T18:50:09Z 2019-09-17T18:50:06Z MEMBER   0 pydata/xarray/pulls/3314
  • [x] Closes #3280

This undoes the auto_combine deprecation until we figure out the best way to proceed.

(I am not very familiar with auto_combine so someone else should look over this. The tests all pass though...)

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3314/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
385452676 MDExOlB1bGxSZXF1ZXN0MjM0NDIwNzYx 2581 fix examples dcherian 2448579 closed 0     1 2018-11-28T20:54:44Z 2019-08-15T15:33:09Z 2018-11-28T22:30:36Z MEMBER   0 pydata/xarray/pulls/2581
  • [x] Closes #2580

Use open_dataset(...).load() instead of load_dataset()

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2581/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
328573601 MDExOlB1bGxSZXF1ZXN0MTkyMDc1OTM2 2210 Remove height=12in from facetgrid example plots. dcherian 2448579 closed 0     1 2018-06-01T15:56:52Z 2019-08-15T15:33:03Z 2018-06-01T16:15:50Z MEMBER   0 pydata/xarray/pulls/2210
  • [x] Closes #2208 (remove if there is no corresponding issue, which should only be the case for minor changes)

This fixes it for me locally. The height was forced to be 12in, while width is 100%.

I'm not sure why this was added in the first place.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2210/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
322365966 MDExOlB1bGxSZXF1ZXN0MTg3NTE3MTUx 2120 Prevent Inf from screwing colorbar scale. dcherian 2448579 closed 0     1 2018-05-11T16:55:34Z 2019-08-15T15:32:51Z 2018-05-12T06:36:37Z MEMBER   0 pydata/xarray/pulls/2120
  • [x] Tests added (for all bug fixes or enhancements)
  • [x] Tests passed (for all non-documentation changes)
  • [x] Fully documented, including whats-new.rst for all changes and api.rst for new API (remove if this change should not be visible to users, e.g., if it is an internal clean-up, or if this is part of a larger project that will be documented later)

The current version uses pd.isnull to remove invalid values from input data when making a colorbar. pd.isnull([np.inf]) is False which means _determine_cmap_params returns Inf for colorbar limits which screws everything up. This PR changes pd.isnull to np.isfinite.
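A small self-contained illustration of the difference (hypothetical data, not from the PR): `pd.isnull` treats `inf` as a valid value, so it leaks into the colorbar limits, while `np.isfinite` masks both `nan` and `inf`.

```python
import numpy as np
import pandas as pd

data = np.array([0.5, np.inf, np.nan])

# pd.isnull flags only nan, so inf survives the mask
valid_isnull = data[~pd.isnull(data)]      # [0.5, inf]

# np.isfinite drops both nan and inf
valid_isfinite = data[np.isfinite(data)]   # [0.5]
```

Computing vmin/vmax from `valid_isnull` would give an infinite limit; from `valid_isfinite` it stays finite.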

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2120/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
445776616 MDExOlB1bGxSZXF1ZXN0MjgwMTQwODcz 2973 More support for missing_value. dcherian 2448579 closed 0     1 2019-05-19T03:41:56Z 2019-06-12T15:32:32Z 2019-06-12T15:32:27Z MEMBER   0 pydata/xarray/pulls/2973
  • [x] Closes #2871
  • [x] Tests added
  • [x] Fully documented, including whats-new.rst for all changes and api.rst for new API
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2973/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
331795471 MDExOlB1bGxSZXF1ZXN0MTk0NDM5NDgz 2229 Bugfix for faceting line plots. dcherian 2448579 closed 0     1 2018-06-13T00:04:43Z 2019-04-12T16:31:18Z 2018-06-20T16:26:37Z MEMBER   0 pydata/xarray/pulls/2229

Closes #2239

Fixes a broken doc image: http://xarray.pydata.org/en/stable/plotting.html#id4

The tests passed previously because there was no metadata associated with the test DataArray. I've assigned some now, that should be good enough.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2229/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
391295532 MDExOlB1bGxSZXF1ZXN0MjM4ODU1NzM4 2608 .resample now supports loffset. dcherian 2448579 closed 0     1 2018-12-14T22:07:06Z 2019-04-12T16:29:09Z 2018-12-19T05:12:59Z MEMBER   0 pydata/xarray/pulls/2608
  • [x] Tests added
  • [x] Fully documented, including whats-new.rst for all changes and api.rst for new API
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2608/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
391440525 MDExOlB1bGxSZXF1ZXN0MjM4OTQ5ODI2 2611 doc fixes. dcherian 2448579 closed 0     1 2018-12-16T06:47:07Z 2018-12-17T21:57:36Z 2018-12-17T21:57:36Z MEMBER   0 pydata/xarray/pulls/2611
  • [x] Closes #2610

Quickfixes to make things work locally.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2611/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
389865283 MDU6SXNzdWUzODk4NjUyODM= 2600 Tests are failing on dask-dev dcherian 2448579 closed 0     1 2018-12-11T17:09:57Z 2018-12-12T03:13:30Z 2018-12-12T03:13:30Z MEMBER      

Sample error from https://travis-ci.org/pydata/xarray/jobs/466431752

```
_________________ test_dataarray_with_dask_coords _________________

    def test_dataarray_with_dask_coords():
        import toolz
        x = xr.Variable('x', da.arange(8, chunks=(4,)))
        y = xr.Variable('y', da.arange(8, chunks=(4,)) * 2)
        data = da.random.random((8, 8), chunks=(4, 4)) + 1
        array = xr.DataArray(data, dims=['x', 'y'])
        array.coords['xx'] = x
        array.coords['yy'] = y

        assert dict(array.__dask_graph__()) == toolz.merge(data.__dask_graph__(),
                                                           x.__dask_graph__(),
                                                           y.__dask_graph__())

>       (array2,) = dask.compute(array)

xarray/tests/test_dask.py:824:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

../../../miniconda/envs/test_env/lib/python3.6/site-packages/dask/base.py:395: in compute
    dsk = collections_to_dsk(collections, optimize_graph, **kwargs)
../../../miniconda/envs/test_env/lib/python3.6/site-packages/dask/base.py:187: in collections_to_dsk
    for opt, val in groups.items()}
../../../miniconda/envs/test_env/lib/python3.6/site-packages/dask/base.py:187: in <dictcomp>
    for opt, val in groups.items()}
../../../miniconda/envs/test_env/lib/python3.6/site-packages/dask/base.py:212: in _extract_graph_and_keys
    graph = merge(graphs)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

dicts = <dask.sharedict.ShareDict object at 0x7f307d29a128>, kwargs = {}
factory = <class 'dict'>, rv = {}
d = ('arange-36f53ab1e6153a63bbf7f4f8ff56693c', 0)

    def merge(*dicts, **kwargs):
        """ Merge a collection of dictionaries

        >>> merge({1: 'one'}, {2: 'two'})
        {1: 'one', 2: 'two'}

        Later dictionaries have precedence

        >>> merge({1: 2, 3: 4}, {3: 3, 4: 4})
        {1: 2, 3: 3, 4: 4}

        See Also:
            merge_with
        """
        if len(dicts) == 1 and not isinstance(dicts[0], dict):
            dicts = dicts[0]
        factory = _get_factory(merge, kwargs)

        rv = factory()
        for d in dicts:
>           rv.update(d)
E           ValueError: dictionary update sequence element #0 has length 39; 2 is required

../../../miniconda/envs/test_env/lib/python3.6/site-packages/toolz/dicttoolz.py:39: ValueError
```
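The final ValueError is what `dict.update()` raises when handed an iterable whose elements are not `(key, value)` pairs; here the merge loop ended up iterating over bare 39-character dask graph keys instead of items. A minimal reproduction (the key string is copied from the traceback; this only illustrates the error message, not dask's graph handling):

```python
# dict.update() treats each element of a non-mapping argument as a
# (key, value) pair; a bare 39-character string key fails the same way.
rv = {}
try:
    rv.update(["arange-36f53ab1e6153a63bbf7f4f8ff56693c"])
except ValueError as err:
    message = str(err)
# message: "dictionary update sequence element #0 has length 39; 2 is required"
```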

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2600/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
374183531 MDExOlB1bGxSZXF1ZXN0MjI1OTQ0MTg4 2513 Make sure datetime object arrays are converted to datetime64 dcherian 2448579 closed 0     1 2018-10-26T00:33:07Z 2018-10-27T16:34:58Z 2018-10-27T16:34:54Z MEMBER   0 pydata/xarray/pulls/2513
  • [x] Closes #2512 (remove if there is no corresponding issue, which should only be the case for minor changes)
  • [x] Tests added (for all bug fixes or enhancements)
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2513/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
364419418 MDExOlB1bGxSZXF1ZXN0MjE4NjA5NTgz 2444 facetgrid: properly support cbar_kwargs. dcherian 2448579 closed 0     1 2018-09-27T10:59:44Z 2018-10-25T16:06:57Z 2018-10-25T16:06:54Z MEMBER   0 pydata/xarray/pulls/2444
  • [x] Closes #1504, closes #1717, closes #1735
  • [x] Tests added (for all bug fixes or enhancements)
  • [x] Tests passed (for all non-documentation changes)
  • [x] Fully documented, including whats-new.rst for all changes and api.rst for new API (remove if this change should not be visible to users, e.g., if it is an internal clean-up, or if this is part of a larger project that will be documented later)

#1735 is stalled, so I jumped in.

I've added an error if cbar_ax is provided as an option for FacetGrid. Don't think it's really needed.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2444/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
356381494 MDExOlB1bGxSZXF1ZXN0MjEyNjU3NTU1 2395 Properly support user-provided norm. dcherian 2448579 closed 0     1 2018-09-03T07:04:45Z 2018-09-05T06:53:35Z 2018-09-05T06:53:30Z MEMBER   0 pydata/xarray/pulls/2395
  • [x] Closes #2381 (remove if there is no corresponding issue, which should only be the case for minor changes)
  • [x] Tests added (for all bug fixes or enhancements)
  • [x] Tests passed (for all non-documentation changes)
  • [x] Fully documented, including whats-new.rst for all changes and api.rst for new API (remove if this change should not be visible to users, e.g., if it is an internal clean-up, or if this is part of a larger project that will be documented later)
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2395/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
282344188 MDExOlB1bGxSZXF1ZXN0MTU4NTQwODI1 1786 _plot2d: Rotate x-axis ticks if datetime subtype dcherian 2448579 closed 0     1 2017-12-15T07:43:19Z 2018-05-10T05:12:19Z 2018-01-03T16:37:56Z MEMBER   0 pydata/xarray/pulls/1786

Rotate x-axis dateticks by default, just as for plot.line()

  • [x] Tests added (for all bug fixes or enhancements)
  • [x] Tests passed (for all non-documentation changes)
  • [x] Passes git diff upstream/master **/*py | flake8 --diff (remove if you did not edit any Python files)
  • [x] Fully documented, including whats-new.rst for all changes and api.rst for new API (remove if this change should not be visible to users, e.g., if it is an internal clean-up, or if this is part of a larger project that will be documented later)
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/1786/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
308107438 MDExOlB1bGxSZXF1ZXN0MTc3MTI5MTA1 2012 Add weighted mean docs. dcherian 2448579 closed 0     1 2018-03-23T16:57:29Z 2018-03-23T22:55:32Z 2018-03-23T22:51:57Z MEMBER   0 pydata/xarray/pulls/2012

I like @fujiisoup's weighted mean demo in this stack overflow example: https://stackoverflow.com/questions/48510784/xarray-rolling-mean-with-weights

I thought it'd be a useful addition to the docs on rolling.
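The idea being documented (a rolling window combined with a weight vector) can be sketched without xarray; the helper name and trailing-window convention below are illustrative only, not the xarray API, which expresses this via `rolling(...).construct(...)` and a dot product:

```python
def weighted_rolling_mean(values, weights):
    # Trailing window of len(weights); each window is dotted with the
    # weights and normalized by their sum.
    n = len(weights)
    total = sum(weights)
    return [
        sum(v * w for v, w in zip(values[i - n + 1 : i + 1], weights)) / total
        for i in range(n - 1, len(values))
    ]

result = weighted_rolling_mean([1.0, 2.0, 3.0, 4.0], [1, 1, 2])
# windows [1,2,3] and [2,3,4] -> [2.25, 3.25]
```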

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2012/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull

CREATE TABLE [issues] (
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [number] INTEGER,
   [title] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [state] TEXT,
   [locked] INTEGER,
   [assignee] INTEGER REFERENCES [users]([id]),
   [milestone] INTEGER REFERENCES [milestones]([id]),
   [comments] INTEGER,
   [created_at] TEXT,
   [updated_at] TEXT,
   [closed_at] TEXT,
   [author_association] TEXT,
   [active_lock_reason] TEXT,
   [draft] INTEGER,
   [pull_request] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [state_reason] TEXT,
   [repo] INTEGER REFERENCES [repos]([id]),
   [type] TEXT
);
CREATE INDEX [idx_issues_repo]
    ON [issues] ([repo]);
CREATE INDEX [idx_issues_milestone]
    ON [issues] ([milestone]);
CREATE INDEX [idx_issues_assignee]
    ON [issues] ([assignee]);
CREATE INDEX [idx_issues_user]
    ON [issues] ([user]);
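The filter driving this page ("comments = 1 and user = 2448579", ordered by updated_at descending) can be run against a trimmed-down copy of the schema above with the stdlib sqlite3 module. The two sample rows are taken from this table; the column subset is an abbreviation for brevity:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Abbreviated version of the [issues] schema shown above
conn.execute("""
    CREATE TABLE issues (
        id INTEGER PRIMARY KEY,
        number INTEGER,
        title TEXT,
        [user] INTEGER,
        comments INTEGER,
        updated_at TEXT,
        type TEXT
    )
""")
conn.executemany(
    "INSERT INTO issues VALUES (?, ?, ?, ?, ?, ?, ?)",
    [
        (2278499376, 8997, 'Zarr: Optimize region="auto" detection',
         2448579, 1, "2024-05-04T21:47:39Z", "pull"),
        (389865283, 2600, "Tests are failing on dask-dev",
         2448579, 1, "2018-12-12T03:13:30Z", "issue"),
    ],
)
rows = conn.execute(
    "SELECT number, title FROM issues "
    "WHERE comments = 1 AND [user] = 2448579 "
    "ORDER BY updated_at DESC"
).fetchall()
# Most recently updated first: PR 8997 precedes issue 2600
```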
Powered by Datasette · About: xarray-datasette