issues

26 rows where comments = 2, state = "closed" and user = 35968931 sorted by updated_at descending


type

  • pull 20
  • issue 6

state

  • closed 26

repo

  • xarray 26
id node_id number title user state locked assignee milestone comments created_at updated_at ▲ closed_at author_association active_lock_reason draft pull_request body reactions performed_via_github_app state_reason repo type
2254350395 PR_kwDOAMm_X85tPTua 8960 Option to not auto-create index during expand_dims TomNicholas 35968931 closed 0     2 2024-04-20T03:27:23Z 2024-04-27T16:48:30Z 2024-04-27T16:48:24Z MEMBER   0 pydata/xarray/pulls/8960
  • [x] Solves part of #8871 by pulling out part of https://github.com/pydata/xarray/pull/8872#issuecomment-2027571714
  • [x] Tests added
  • [x] User visible changes (including notable bug fixes) are documented in whats-new.rst
  • [ ] ~~New functions/methods are listed in api.rst~~

TODO:

  • [x] Add new kwarg to DataArray.expand_dims
  • [ ] Add examples to docstrings?
  • [x] Check it actually solves the problem in #8872
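For orientation, a sketch of how the new keyword might be used — the kwarg name create_index_for_new_dim is an assumption here, not confirmed by this PR body, so check the released signature before relying on it:

```python
import xarray as xr

da = xr.DataArray([1, 2, 3], dims="x")

# Default behaviour: expanding along a new dimension also creates a coordinate
# (and, historically, an index) for it.
expanded = da.expand_dims(time=[0])

# The opt-out this PR adds; the kwarg name is assumed, not confirmed here.
no_index = da.expand_dims(time=[0], create_index_for_new_dim=False)
```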

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/8960/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
2198196326 I_kwDOAMm_X86DBdBm 8860 Ugly error in constructor when no data passed TomNicholas 35968931 closed 0     2 2024-03-20T17:55:52Z 2024-04-10T22:46:55Z 2024-04-10T22:46:54Z MEMBER      

What happened?

Passing no data to the Dataset constructor can result in a very unhelpful "tuple index out of range" error, even though this is a clear case of malformed input that we should be able to catch.

What did you expect to happen?

An error more like "tuple must be of form (dims, data[, attrs])"

Minimal Complete Verifiable Example

```Python
xr.Dataset({"t": ()})
```

MVCE confirmation

  • [X] Minimal example — the example is as focused as reasonably possible to demonstrate the underlying issue in xarray.
  • [X] Complete example — the example is self-contained, including all data and the text of any traceback.
  • [X] Verifiable example — the example copy & pastes into an IPython prompt or Binder notebook, returning the result.
  • [ ] New issue — a search of GitHub Issues suggests this is not a duplicate.
  • [X] Recent environment — the issue occurs with the latest version of xarray and its dependencies.

Relevant log output

```Python
IndexError                                Traceback (most recent call last)
Cell In[2], line 1
----> 1 xr.Dataset({"t": ()})

File ~/Documents/Work/Code/xarray/xarray/core/dataset.py:693, in Dataset.__init__(self, data_vars, coords, attrs)
    690 if isinstance(coords, Dataset):
    691     coords = coords._variables
--> 693 variables, coord_names, dims, indexes, _ = merge_data_and_coords(
    694     data_vars, coords
    695 )
    697 self._attrs = dict(attrs) if attrs else None
    698 self._close = None

File ~/Documents/Work/Code/xarray/xarray/core/dataset.py:422, in merge_data_and_coords(data_vars, coords)
    418 coords = create_coords_with_default_indexes(coords, data_vars)
    420 # exclude coords from alignment (all variables in a Coordinates object should
    421 # already be aligned together) and use coordinates' indexes to align data_vars
--> 422 return merge_core(
    423     [data_vars, coords],
    424     compat="broadcast_equals",
    425     join="outer",
    426     explicit_coords=tuple(coords),
    427     indexes=coords.xindexes,
    428     priority_arg=1,
    429     skip_align_args=[1],
    430 )

File ~/Documents/Work/Code/xarray/xarray/core/merge.py:718, in merge_core(objects, compat, join, combine_attrs, priority_arg, explicit_coords, indexes, fill_value, skip_align_args)
    715 for pos, obj in skip_align_objs:
    716     aligned.insert(pos, obj)
--> 718 collected = collect_variables_and_indexes(aligned, indexes=indexes)
    719 prioritized = _get_priority_vars_and_indexes(aligned, priority_arg, compat=compat)
    720 variables, out_indexes = merge_collected(
    721     collected, prioritized, compat=compat, combine_attrs=combine_attrs
    722 )

File ~/Documents/Work/Code/xarray/xarray/core/merge.py:358, in collect_variables_and_indexes(list_of_mappings, indexes)
    355 indexes_.pop(name, None)
    356 append_all(coords_, indexes_)
--> 358 variable = as_variable(variable, name=name, auto_convert=False)
    359 if name in indexes:
    360     append(name, variable, indexes[name])

File ~/Documents/Work/Code/xarray/xarray/core/variable.py:126, in as_variable(obj, name, auto_convert)
    124     obj = obj.copy(deep=False)
    125 elif isinstance(obj, tuple):
--> 126     if isinstance(obj[1], DataArray):
    127         raise TypeError(
    128             f"Variable {name!r}: Using a DataArray object to construct a variable is"
    129             " ambiguous, please extract the data using the .data property."
    130         )
    131     try:

IndexError: tuple index out of range
```
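A guard along these lines, run before the tuple is indexed in as_variable, would produce the clearer error suggested above (a hedged sketch only; the helper name is invented):

```python
def _check_tuple_is_well_formed(obj, name):
    # A (dims, data[, attrs[, encoding]]) tuple needs at least two elements.
    if isinstance(obj, tuple) and len(obj) < 2:
        raise ValueError(
            f"Variable {name!r}: tuple values must be of the form "
            f"(dims, data[, attrs[, encoding]]), got {obj!r}"
        )
```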

Anything else we need to know?

No response

Environment

Xarray main

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/8860/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
2213406564 PR_kwDOAMm_X85rEF-X 8886 Allow multidimensional variable with same name as dim when constructing dataset via coords TomNicholas 35968931 closed 0     2 2024-03-28T14:37:27Z 2024-03-28T17:07:10Z 2024-03-28T16:28:09Z MEMBER   0 pydata/xarray/pulls/8886

Supersedes #8884 as a way to close #8883, in light of me having learnt that this is now allowed! https://github.com/pydata/xarray/issues/8883#issuecomment-2024645815. So this is really a follow-up to #7989.

  • [x] Closes #8883
  • [x] Tests added
  • [x] User visible changes (including notable bug fixes) are documented in whats-new.rst
  • [ ] ~~New functions/methods are listed in api.rst~~
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/8886/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
2212186122 I_kwDOAMm_X86D20gK 8883 Coordinates object permits invalid state TomNicholas 35968931 closed 0     2 2024-03-28T01:49:21Z 2024-03-28T16:28:11Z 2024-03-28T16:28:11Z MEMBER      

What happened?

It is currently possible to create a Coordinates object where a variable shares a name with a dimension, but the variable is not 1D. This is explicitly forbidden by the xarray data model.

What did you expect to happen?

If you try to pass the resulting object into the Dataset constructor you get the expected error telling you that this is forbidden, but that error should have been raised by Coordinates.__init__.

Minimal Complete Verifiable Example

```Python
In [1]: from xarray.core.coordinates import Coordinates

In [2]: from xarray.core.variable import Variable

In [4]: import numpy as np

In [5]: var = Variable(data=np.arange(6).reshape(2, 3), dims=['x', 'y'])

In [6]: var
Out[6]:
<xarray.Variable (x: 2, y: 3)> Size: 48B
array([[0, 1, 2],
       [3, 4, 5]])

In [7]: coords = Coordinates(coords={'x': var}, indexes={})

In [8]: coords
Out[8]:
Coordinates:
    x        (x, y) int64 48B 0 1 2 3 4 5

In [10]: import xarray as xr

In [11]: ds = xr.Dataset(coords=coords)

MergeError                                Traceback (most recent call last)
Cell In[11], line 1
----> 1 ds = xr.Dataset(coords=coords)

File ~/Documents/Work/Code/xarray/xarray/core/dataset.py:693, in Dataset.__init__(self, data_vars, coords, attrs)
    690 if isinstance(coords, Dataset):
    691     coords = coords._variables
--> 693 variables, coord_names, dims, indexes, _ = merge_data_and_coords(
    694     data_vars, coords
    695 )
    697 self._attrs = dict(attrs) if attrs else None
    698 self._close = None

File ~/Documents/Work/Code/xarray/xarray/core/dataset.py:422, in merge_data_and_coords(data_vars, coords)
    418 coords = create_coords_with_default_indexes(coords, data_vars)
    420 # exclude coords from alignment (all variables in a Coordinates object should
    421 # already be aligned together) and use coordinates' indexes to align data_vars
--> 422 return merge_core(
    423     [data_vars, coords],
    424     compat="broadcast_equals",
    425     join="outer",
    426     explicit_coords=tuple(coords),
    427     indexes=coords.xindexes,
    428     priority_arg=1,
    429     skip_align_args=[1],
    430 )

File ~/Documents/Work/Code/xarray/xarray/core/merge.py:731, in merge_core(objects, compat, join, combine_attrs, priority_arg, explicit_coords, indexes, fill_value, skip_align_args)
    729 coord_names.intersection_update(variables)
    730 if explicit_coords is not None:
--> 731     assert_valid_explicit_coords(variables, dims, explicit_coords)
    732 coord_names.update(explicit_coords)
    733 for dim, size in dims.items():

File ~/Documents/Work/Code/xarray/xarray/core/merge.py:577, in assert_valid_explicit_coords(variables, dims, explicit_coords)
    575 for coord_name in explicit_coords:
    576     if coord_name in dims and variables[coord_name].dims != (coord_name,):
--> 577         raise MergeError(
    578             f"coordinate {coord_name} shares a name with a dataset dimension, but is "
    579             "not a 1D variable along that dimension. This is disallowed "
    580             "by the xarray data model."
    581         )

MergeError: coordinate x shares a name with a dataset dimension, but is not a 1D variable along that dimension. This is disallowed by the xarray data model.
```
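The same check could plausibly run up-front in Coordinates.__init__; a minimal sketch (function name and error wording invented for illustration):

```python
def _assert_coords_valid(coords):
    # Reject any coordinate that shares a name with a dimension but is not 1D
    # along that dimension, mirroring the check that merge_core performs later.
    all_dims = {dim for var in coords.values() for dim in var.dims}
    for name, var in coords.items():
        if name in all_dims and var.dims != (name,):
            raise ValueError(
                f"coordinate {name!r} shares a name with a dimension but is not "
                "a 1D variable along that dimension; this is disallowed by the "
                "xarray data model"
            )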

MVCE confirmation

  • [X] Minimal example — the example is as focused as reasonably possible to demonstrate the underlying issue in xarray.
  • [X] Complete example — the example is self-contained, including all data and the text of any traceback.
  • [X] Verifiable example — the example copy & pastes into an IPython prompt or Binder notebook, returning the result.
  • [x] New issue — a search of GitHub Issues suggests this is not a duplicate.
  • [X] Recent environment — the issue occurs with the latest version of xarray and its dependencies.

Relevant log output

No response

Anything else we need to know?

I noticed this whilst working on #8872

Environment

main

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/8883/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
2212211084 PR_kwDOAMm_X85rABMo 8884 Forbid invalid Coordinates object TomNicholas 35968931 closed 0     2 2024-03-28T02:14:01Z 2024-03-28T14:38:43Z 2024-03-28T14:38:03Z MEMBER   0 pydata/xarray/pulls/8884
  • [x] Closes #8883
  • [x] Tests added
  • [ ] ~~User visible changes (including notable bug fixes) are documented in whats-new.rst~~
  • [ ] ~~New functions/methods are listed in api.rst~~
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/8884/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
2092346228 PR_kwDOAMm_X85ko-Y2 8632 Pin sphinx-book-theme to 1.0.1 to try to deal with #8619 TomNicholas 35968931 closed 0     2 2024-01-21T02:18:49Z 2024-01-23T20:16:13Z 2024-01-23T18:28:35Z MEMBER   0 pydata/xarray/pulls/8632
  • [x] Hopefully closes #8619
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/8632/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1820788594 PR_kwDOAMm_X85WW40r 8019 Generalize cumulative reduction (scan) to non-dask types TomNicholas 35968931 closed 0     2 2023-07-25T17:22:07Z 2023-12-18T19:30:18Z 2023-12-18T19:30:18Z MEMBER   0 pydata/xarray/pulls/8019
  • [x] Needed for https://github.com/tomwhite/cubed/issues/277#issuecomment-1648567431 - should have been added in #7019
  • [ ] ~~Tests added~~ (would go in cubed-xarray)
  • [x] User visible changes (including notable bug fixes) are documented in whats-new.rst
  • [x] New functions/methods are listed in api.rst (new ABC method will be documented on chunked array types page automatically)
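For context, a rough sketch of the kind of abstraction involved — class and method names here are illustrative, not necessarily xarray's actual entrypoint API:

```python
from abc import ABC, abstractmethod

class ChunkManager(ABC):
    """Illustrative stand-in for a chunked-array entrypoint class."""

    @abstractmethod
    def scan(self, func, binop, ident, arr, axis=None, dtype=None, **kwargs):
        """Generalized cumulative reduction (cumsum-like) along ``axis``."""
        ...

class DaskManager(ChunkManager):
    def scan(self, func, binop, ident, arr, axis=None, dtype=None, **kwargs):
        # dask's generic cumulative reduction; other chunked array libraries
        # (e.g. cubed) would register a manager with their own implementation.
        from dask.array.reductions import cumreduction

        return cumreduction(func, binop, ident, arr, axis=axis, dtype=dtype, **kwargs)
```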
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/8019/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1036473974 PR_kwDOAMm_X84tsaL3 5900 Add .chunksizes property TomNicholas 35968931 closed 0     2 2021-10-26T15:51:09Z 2023-10-20T16:00:15Z 2021-10-29T18:12:22Z MEMBER   0 pydata/xarray/pulls/5900

Adds a new .chunksizes property to Dataset, DataArray and Variable, which returns a mapping from dimension names to chunk sizes in all cases.

Supersedes #5846 because this PR is backwards-compatible.
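A minimal usage sketch (requires a chunked backend such as dask to be installed):

```python
import numpy as np
import xarray as xr

ds = xr.Dataset({"a": (("x", "y"), np.zeros((4, 6)))}).chunk({"x": 2, "y": 3})

# Mapping from dimension name to the chunk sizes along that dimension,
# e.g. {'x': (2, 2), 'y': (3, 3)}; the same property exists on DataArray and Variable.
print(ds.chunksizes)
print(ds["a"].chunksizes)
```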

  • [x] Closes #5843
  • [x] Tests added
  • [x] Passes pre-commit run --all-files
  • [x] User visible changes (including notable bug fixes) are documented in whats-new.rst
  • [x] New functions/methods are listed in api.rst
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/5900/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1083507645 PR_kwDOAMm_X84wBDeq 6083 Manifest as variables attribute TomNicholas 35968931 closed 0     2 2021-12-17T18:14:26Z 2023-09-14T15:37:38Z 2023-09-14T15:37:37Z MEMBER   1 pydata/xarray/pulls/6083

Another attempt like #5961

@shoyer

  • [ ] Closes #xxxx
  • [ ] Tests added
  • [ ] Passes pre-commit run --all-files
  • [ ] User visible changes (including notable bug fixes) are documented in whats-new.rst
  • [ ] New functions/methods are listed in api.rst
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/6083/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1695244129 PR_kwDOAMm_X85PvJSS 7815 Array API fixes for astype TomNicholas 35968931 closed 0     2 2023-05-04T04:33:52Z 2023-05-18T20:10:48Z 2023-05-18T20:10:43Z MEMBER   0 pydata/xarray/pulls/7815

While it's common for duck arrays to have a .astype method, this doesn't exist in the new array API standard. We now have duck_array_ops.astype to deal with this, but for some reason changing it in just a couple more places broke practically every pint test in test_units.py :confused: @keewis

Builds on top of #7019 with just one extra commit to separate out this issue.
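A hedged sketch of the kind of dispatch such a helper has to do — illustrative only, not xarray's exact implementation:

```python
import numpy as np

def astype(data, dtype, **kwargs):
    if hasattr(data, "__array_namespace__"):
        # Arrays following the array API standard may lack an .astype method;
        # newer versions of the standard expose astype() as a namespace function.
        xp = data.__array_namespace__()
        return xp.astype(data, dtype)
    # Most duck arrays (numpy, dask, pint, ...) expose .astype directly.
    return data.astype(dtype, **kwargs)

print(astype(np.arange(3), np.float64).dtype)  # float64
```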

  • [ ] Closes #xxxx
  • [ ] Tests added
  • [ ] User visible changes (including notable bug fixes) are documented in whats-new.rst
  • [ ] New functions/methods are listed in api.rst
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/7815/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1337587854 PR_kwDOAMm_X849HJCV 6912 Automatic PR labeler TomNicholas 35968931 closed 0     2 2022-08-12T18:40:27Z 2022-08-12T19:52:49Z 2022-08-12T19:47:19Z MEMBER   0 pydata/xarray/pulls/6912

GH action to automatically label new PRs according to which files they touch.

Idea stolen from dask, see https://github.com/dask/dask/pull/7506 . Their PR labelling by file/module is specified here.

(My first use of this bot so might well be a mistake.)

@max-sixty you will probably enjoy this extra automation :robot:

  • [ ] User visible changes (including notable bug fixes) are documented in whats-new.rst
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/6912/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1157289286 PR_kwDOAMm_X84z1Xnf 6319 v2022.03.0 release notes TomNicholas 35968931 closed 0     2 2022-03-02T14:43:34Z 2022-03-02T19:49:25Z 2022-03-02T15:49:23Z MEMBER   0 pydata/xarray/pulls/6319  
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/6319/reactions",
    "total_count": 4,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 4,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1041675013 PR_kwDOAMm_X84t8yv7 5924 v0.20 Release notes TomNicholas 35968931 closed 0     2 2021-11-01T21:53:29Z 2021-11-02T19:22:46Z 2021-11-02T16:37:45Z MEMBER   0 pydata/xarray/pulls/5924

@pydata/xarray the release notes for your approval

5889

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/5924/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1039714252 PR_kwDOAMm_X84t25p8 5916 Update open_rasterio deprecation version number TomNicholas 35968931 closed 0     2 2021-10-29T15:56:04Z 2021-11-02T18:03:59Z 2021-11-02T18:03:58Z MEMBER   0 pydata/xarray/pulls/5916  
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/5916/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
959317311 MDExOlB1bGxSZXF1ZXN0NzAyNDQ5NDg1 5669 Combine='by_coords' and concat dim deprecation in open_mfdataset TomNicholas 35968931 closed 0     2 2021-08-03T17:03:44Z 2021-10-01T18:52:00Z 2021-10-01T18:52:00Z MEMBER   0 pydata/xarray/pulls/5669

Noticed this hadn't been completed in https://github.com/pydata/xarray/discussions/5659

  • [x] Tests added
  • [x] Passes pre-commit run --all-files
  • [x] User visible changes (including notable bug fixes) are documented in whats-new.rst
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/5669/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
911663002 MDU6SXNzdWU5MTE2NjMwMDI= 5438 Add Union Operators for Dataset TomNicholas 35968931 closed 0     2 2021-06-04T16:21:06Z 2021-06-04T16:35:36Z 2021-06-04T16:35:36Z MEMBER      

As of Python 3.9, dictionaries now support being merged via `c = a | b` and updated in place via `a |= b` (see PEP 584).

xarray.Dataset is dict-like, so it would make sense to support the same syntax for merging. The way to achieve that is by adding new dunder methods to xarray.Dataset, something like

```python
def __or__(self, other):
    if not isinstance(other, xr.Dataset):
        return NotImplemented
    new = xr.merge([self, other])
    return new

def __ror__(self, other):
    if not isinstance(other, xr.Dataset):
        return NotImplemented
    new = xr.merge([other, self])
    return new

def __ior__(self, other):
    self.merge(other)
    return self
```

The distinction between the intent of these different operators is whether a new object is returned or the original object is updated.

This would allow things like (ds1 | ds2).to_netcdf()

(This feature doesn't require python 3.9, it merely echoes a feature that is only available in 3.9+)
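For comparison, the proposed behaviour is already spelled xr.merge today; a minimal example:

```python
import xarray as xr

ds1 = xr.Dataset({"a": ("x", [1, 2, 3])})
ds2 = xr.Dataset({"b": ("x", [4, 5, 6])})

merged = xr.merge([ds1, ds2])   # today's spelling
# The proposal would allow simply:  merged = ds1 | ds2
print(list(merged.data_vars))   # ['a', 'b']
```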

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/5438/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
902830027 MDExOlB1bGxSZXF1ZXN0NjU0MTU2NTA5 5383 Corrected reference to blockwise to refer to apply_gufunc instead TomNicholas 35968931 closed 0     2 2021-05-26T19:23:53Z 2021-05-26T21:34:06Z 2021-05-26T21:34:06Z MEMBER   0 pydata/xarray/pulls/5383

I noticed that the apply_ufunc tutorial notebook says that xarray.apply_ufunc uses dask.array.blockwise, but that's no longer true as of PR #4060 .

  • [x] Passes pre-commit run --all-files
  • [ ] User visible changes (including notable bug fixes) are documented in whats-new.rst
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/5383/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
887597884 MDExOlB1bGxSZXF1ZXN0NjQwODE5NTMz 5289 Explained what a deprecation cycle is TomNicholas 35968931 closed 0     2 2021-05-11T15:15:08Z 2021-05-13T16:38:19Z 2021-05-13T16:38:19Z MEMBER   0 pydata/xarray/pulls/5289

Inspired by a question asked in #4696, but does not close that issue - [x] Passes pre-commit run --all-files - [x] User visible changes (including notable bug fixes) are documented in whats-new.rst

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/5289/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
871234354 MDExOlB1bGxSZXF1ZXN0NjI2Mjg2ODQy 5237 Add deprecation warnings for lock kwarg TomNicholas 35968931 closed 0     2 2021-04-29T16:45:45Z 2021-05-04T19:17:31Z 2021-05-04T19:17:31Z MEMBER   0 pydata/xarray/pulls/5237

Does this need a test?

  • [x] Closes #5073
  • [ ] Tests added
  • [x] Passes pre-commit run --all-files
  • [x] User visible changes (including notable bug fixes) are documented in whats-new.rst
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/5237/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
871111282 MDU6SXNzdWU4NzExMTEyODI= 5236 Error collecting tests due to optional pint import TomNicholas 35968931 closed 0     2 2021-04-29T15:01:13Z 2021-04-29T15:32:08Z 2021-04-29T15:32:08Z MEMBER      

When I try to run xarray's test suite locally with pytest I've suddenly started getting this weird error:

```
(xarray-dev) tegn500@fusion192:~/Documents/Work/Code/xarray$ pytest xarray/tests/test_backends.py
==================================== test session starts ====================================
platform linux -- Python 3.9.2, pytest-6.2.3, py-1.10.0, pluggy-0.13.1
rootdir: /home/tegn500/Documents/Work/Code/xarray, configfile: setup.cfg
collected 0 items / 1 error

=========================================== ERRORS ==========================================
_____________________ ERROR collecting xarray/tests/test_backends.py ________________________
../../../../anaconda3/envs/xarray-dev/lib/python3.9/importlib/__init__.py:127: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
<frozen importlib._bootstrap>:1030: in _gcd_import
    ???
<frozen importlib._bootstrap>:1007: in _find_and_load
    ???
<frozen importlib._bootstrap>:972: in _find_and_load_unlocked
    ???
<frozen importlib._bootstrap>:228: in _call_with_frames_removed
    ???
<frozen importlib._bootstrap>:1030: in _gcd_import
    ???
<frozen importlib._bootstrap>:1007: in _find_and_load
    ???
<frozen importlib._bootstrap>:986: in _find_and_load_unlocked
    ???
<frozen importlib._bootstrap>:680: in _load_unlocked
    ???
<frozen importlib._bootstrap_external>:790: in exec_module
    ???
<frozen importlib._bootstrap>:228: in _call_with_frames_removed
    ???
xarray/tests/__init__.py:84: in <module>
    has_pint_0_15, requires_pint_0_15 = _importorskip("pint", minversion="0.15")
xarray/tests/__init__.py:46: in _importorskip
    if LooseVersion(mod.__version__) < LooseVersion(minversion):
E   AttributeError: module 'pint' has no attribute '__version__'
================================== short test summary info ==================================
ERROR xarray/tests/test_backends.py - AttributeError: module 'pint' has no attribute '__version__'
!!!!!!!!!!!!!!!!!!!!!!! Interrupted: 1 error during collection !!!!!!!!!!!!!!!!!!!!!!!
====================================== 1 error in 0.88s =====================================
```

I'm not sure whether this is my fault or a problem with xarray somehow. @keewis have you seen this happen before? This is with a fresh conda environment, running locally on my laptop, and on python 3.9.2. Pint isn't even in this environment. I can force it to proceed with the tests by also catching the attribute error, i.e.

```python
def _importorskip(modname, minversion=None):
    try:
        mod = importlib.import_module(modname)
        has = True
        if minversion is not None:
            if LooseVersion(mod.__version__) < LooseVersion(minversion):
                raise ImportError("Minimum version not satisfied")
    except (ImportError, AttributeError):
        has = False
```

but I obviously shouldn't need to do that. Any ideas?

Environment:

Output of `xr.show_versions()`:

INSTALLED VERSIONS
------------------
commit: a5e72c9aacbf26936844840b75dd59fe7d13f1e6
python: 3.9.2 | packaged by conda-forge | (default, Feb 21 2021, 05:02:46) [GCC 9.3.0]
python-bits: 64
OS: Linux
OS-release: 4.8.10-040810-generic
machine: x86_64
processor: x86_64
byteorder: little
LC_ALL: None
LANG: en_GB.UTF-8
LOCALE: en_GB.UTF-8
libhdf5: 1.10.6
libnetcdf: 4.8.0
xarray: 0.15.2.dev545+ga5e72c9
pandas: 1.2.4
numpy: 1.20.2
scipy: 1.6.3
netCDF4: 1.5.6
pydap: None
h5netcdf: None
h5py: None
Nio: None
zarr: 2.8.1
cftime: 1.4.1
nc_time_axis: None
PseudoNetCDF: None
rasterio: None
cfgrib: None
iris: None
bottleneck: 1.3.2
dask: 2021.04.1
distributed: 2021.04.1
matplotlib: 3.4.1
cartopy: installed
seaborn: None
numbagg: None
pint: installed
setuptools: 49.6.0.post20210108
pip: 21.1
conda: None
pytest: 6.2.3
IPython: None
sphinx: None

Conda Environment:

Output of <tt>conda list</tt> # packages in environment at /home/tegn500/anaconda3/envs/xarray-dev: # # Name Version Build Channel _libgcc_mutex 0.1 conda_forge conda-forge _openmp_mutex 4.5 1_gnu conda-forge alsa-lib 1.2.3 h516909a_0 conda-forge asciitree 0.3.3 py_2 conda-forge attrs 20.3.0 pyhd3deb0d_0 conda-forge bokeh 2.3.1 py39hf3d152e_0 conda-forge bottleneck 1.3.2 py39hce5d2b2_3 conda-forge bzip2 1.0.8 h7f98852_4 conda-forge c-ares 1.17.1 h7f98852_1 conda-forge ca-certificates 2020.12.5 ha878542_0 conda-forge certifi 2020.12.5 py39hf3d152e_1 conda-forge cftime 1.4.1 py39hce5d2b2_0 conda-forge click 7.1.2 pyh9f0ad1d_0 conda-forge cloudpickle 1.6.0 py_0 conda-forge curl 7.76.1 h979ede3_1 conda-forge cycler 0.10.0 py_2 conda-forge cytoolz 0.11.0 py39h3811e60_3 conda-forge dask 2021.4.1 pyhd8ed1ab_0 conda-forge dask-core 2021.4.1 pyhd8ed1ab_0 conda-forge dbus 1.13.6 h48d8840_2 conda-forge distributed 2021.4.1 py39hf3d152e_0 conda-forge expat 2.3.0 h9c3ff4c_0 conda-forge fasteners 0.14.1 py_3 conda-forge fontconfig 2.13.1 hba837de_1005 conda-forge freetype 2.10.4 h0708190_1 conda-forge fsspec 2021.4.0 pyhd8ed1ab_0 conda-forge gettext 0.19.8.1 h0b5b191_1005 conda-forge glib 2.68.1 h9c3ff4c_0 conda-forge glib-tools 2.68.1 h9c3ff4c_0 conda-forge gst-plugins-base 1.18.4 hf529b03_2 conda-forge gstreamer 1.18.4 h76c114f_2 conda-forge hdf4 4.2.13 h10796ff_1005 conda-forge hdf5 1.10.6 nompi_h6a2412b_1114 conda-forge heapdict 1.0.1 py_0 conda-forge icu 68.1 h58526e2_0 conda-forge iniconfig 1.1.1 pyh9f0ad1d_0 conda-forge jinja2 2.11.3 pyh44b312d_0 conda-forge jpeg 9d h36c2ea0_0 conda-forge kiwisolver 1.3.1 py39h1a9c180_1 conda-forge krb5 1.17.2 h926e7f8_0 conda-forge lcms2 2.12 hddcbb42_0 conda-forge ld_impl_linux-64 2.35.1 hea4e1c9_2 conda-forge libblas 3.9.0 8_openblas conda-forge libcblas 3.9.0 8_openblas conda-forge libclang 11.1.0 default_ha53f305_0 conda-forge libcurl 7.76.1 hc4aaa36_1 conda-forge libedit 3.1.20191231 he28a2e2_2 conda-forge libev 4.33 h516909a_1 conda-forge libevent 2.1.10 hcdb4288_3 conda-forge libffi 3.3 h58526e2_2 conda-forge libgcc-ng 9.3.0 h2828fa1_19 conda-forge libgfortran-ng 9.3.0 hff62375_19 conda-forge libgfortran5 9.3.0 hff62375_19 conda-forge libglib 2.68.1 h3e27bee_0 conda-forge libgomp 9.3.0 h2828fa1_19 conda-forge libiconv 1.16 h516909a_0 conda-forge liblapack 3.9.0 8_openblas conda-forge libllvm11 11.1.0 hf817b99_2 conda-forge libnetcdf 4.8.0 nompi_hfa85936_101 conda-forge libnghttp2 1.43.0 h812cca2_0 conda-forge libogg 1.3.4 h7f98852_1 conda-forge libopenblas 0.3.12 pthreads_h4812303_1 conda-forge libopus 1.3.1 h7f98852_1 conda-forge libpng 1.6.37 h21135ba_2 conda-forge libpq 13.2 hfd2b0eb_2 conda-forge libssh2 1.9.0 ha56f1ee_6 conda-forge libstdcxx-ng 9.3.0 h6de172a_19 conda-forge libtiff 4.2.0 hdc55705_1 conda-forge libuuid 2.32.1 h7f98852_1000 conda-forge libvorbis 1.3.7 h9c3ff4c_0 conda-forge libwebp-base 1.2.0 h7f98852_2 conda-forge libxcb 1.13 h7f98852_1003 conda-forge libxkbcommon 1.0.3 he3ba5ed_0 conda-forge libxml2 2.9.10 h72842e0_4 conda-forge libzip 1.7.3 h4de3113_0 conda-forge locket 0.2.0 py_2 conda-forge lz4-c 1.9.3 h9c3ff4c_0 conda-forge markupsafe 1.1.1 py39h3811e60_3 conda-forge matplotlib 3.4.1 py39hf3d152e_0 conda-forge matplotlib-base 3.4.1 py39h2fa2bec_0 conda-forge monotonic 1.5 py_0 conda-forge more-itertools 8.7.0 pyhd8ed1ab_1 conda-forge msgpack-python 1.0.2 py39h1a9c180_1 conda-forge mysql-common 8.0.23 ha770c72_1 conda-forge mysql-libs 8.0.23 h935591d_1 conda-forge ncurses 6.2 h58526e2_4 conda-forge netcdf4 1.5.6 
nompi_py39hc6dca20_103 conda-forge nspr 4.30 h9c3ff4c_0 conda-forge nss 3.64 hb5efdd6_0 conda-forge numcodecs 0.7.3 py39he80948d_0 conda-forge numpy 1.20.2 py39hdbf815f_0 conda-forge olefile 0.46 pyh9f0ad1d_1 conda-forge openjpeg 2.4.0 hf7af979_0 conda-forge openssl 1.1.1k h7f98852_0 conda-forge packaging 20.9 pyh44b312d_0 conda-forge pandas 1.2.4 py39hde0f152_0 conda-forge partd 1.2.0 pyhd8ed1ab_0 conda-forge pcre 8.44 he1b5a44_0 conda-forge pillow 8.1.2 py39hf95b381_1 conda-forge pip 21.1 pyhd8ed1ab_0 conda-forge pluggy 0.13.1 py39hf3d152e_4 conda-forge psutil 5.8.0 py39h3811e60_1 conda-forge pthread-stubs 0.4 h36c2ea0_1001 conda-forge py 1.10.0 pyhd3deb0d_0 conda-forge pyparsing 2.4.7 pyh9f0ad1d_0 conda-forge pyqt 5.12.3 py39hf3d152e_7 conda-forge pyqt-impl 5.12.3 py39h0fcd23e_7 conda-forge pyqt5-sip 4.19.18 py39he80948d_7 conda-forge pyqtchart 5.12 py39h0fcd23e_7 conda-forge pyqtwebengine 5.12.1 py39h0fcd23e_7 conda-forge pytest 6.2.3 py39hf3d152e_0 conda-forge python 3.9.2 hffdb5ce_0_cpython conda-forge python-dateutil 2.8.1 py_0 conda-forge python_abi 3.9 1_cp39 conda-forge pytz 2021.1 pyhd8ed1ab_0 conda-forge pyyaml 5.4.1 py39h3811e60_0 conda-forge qt 5.12.9 hda022c4_4 conda-forge readline 8.1 h46c0cb4_0 conda-forge scipy 1.6.3 py39hee8e79c_0 conda-forge setuptools 49.6.0 py39hf3d152e_3 conda-forge six 1.15.0 pyh9f0ad1d_0 conda-forge sortedcontainers 2.3.0 pyhd8ed1ab_0 conda-forge sqlite 3.35.5 h74cdb3f_0 conda-forge tblib 1.7.0 pyhd8ed1ab_0 conda-forge tk 8.6.10 h21135ba_1 conda-forge toml 0.10.2 pyhd8ed1ab_0 conda-forge toolz 0.11.1 py_0 conda-forge tornado 6.1 py39h3811e60_1 conda-forge typing_extensions 3.7.4.3 py_0 conda-forge tzdata 2021a he74cb21_0 conda-forge wheel 0.36.2 pyhd3deb0d_0 conda-forge xorg-libxau 1.0.9 h7f98852_0 conda-forge xorg-libxdmcp 1.1.3 h7f98852_0 conda-forge xz 5.2.5 h516909a_1 conda-forge yaml 0.2.5 h516909a_0 conda-forge zarr 2.8.1 pyhd8ed1ab_0 conda-forge zict 2.0.0 py_0 conda-forge zlib 1.2.11 h516909a_1010 conda-forge zstd 1.4.9 ha95c52a_0 conda-forge
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/5236/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
604265838 MDExOlB1bGxSZXF1ZXN0NDA2ODkyMzc3 3993 dim -> coord in DataArray.integrate TomNicholas 35968931 closed 0     2 2020-04-21T20:30:35Z 2021-02-05T21:48:39Z 2021-01-29T22:59:30Z MEMBER   0 pydata/xarray/pulls/3993
  • [x] Closes #3992
  • [x] Tests added
  • [x] Passes isort -rc . && black . && mypy . && flake8
  • [x] Fully documented, including whats-new.rst for all changes and api.rst for new API
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3993/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
332987740 MDU6SXNzdWUzMzI5ODc3NDA= 2235 Adding surface plot for 2D data TomNicholas 35968931 closed 0     2 2018-06-16T13:36:10Z 2020-06-17T04:49:50Z 2020-06-17T04:49:50Z MEMBER      

I am interested in adding the ability to plot surface plots of 2D xarray data using matplotlib's 3D plotting function plot_surface().

This would be nice because a surface in 3D is much more useful for showing certain features of 2D data than color plots are. For example, an outlier would appear as an obvious spike rather than just a single bright point as it would when using plot.imshow(). I'm not suggesting adding full 3D plotting capability, just the ability to visualise 2D data as a surface in 3D.

The code would end up allowing you to just call xr.Dataarray.plot.surface() to create something like this example from here (code here):

Obviously xarray would be used to automatically set the axes labels and title and so on.

As far as I can tell it wouldn't be too difficult to do; it would just be implemented as another 2D plotting method, the same way as the Dataarray.plot.imshow(), Dataarray.plot.contour() etc. methods currently are. It would require the imports `import matplotlib.pyplot as plt` and `from mpl_toolkits.mplot3d import Axes3D`, but these would only need to be imported if this type of plot was chosen.
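A minimal sketch of the underlying matplotlib call the proposed method would wrap (the axis labels are set by hand here; xarray would take them from dims and coords):

```python
import numpy as np
import matplotlib.pyplot as plt
import xarray as xr

da = xr.DataArray(np.random.randn(30, 40), dims=("x", "y"))

# Build coordinate grids matching the DataArray's shape and hand them to
# matplotlib's 3D surface plotter (the 3D projection is registered
# automatically in matplotlib >= 3.2).
X, Y = np.meshgrid(np.arange(da.sizes["y"]), np.arange(da.sizes["x"]))
fig = plt.figure()
ax = fig.add_subplot(projection="3d")
ax.plot_surface(X, Y, da.values)
ax.set_xlabel("y")
ax.set_ylabel("x")
plt.show()
```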

I would be interested in trying to add this myself, but I've never contributed to an open-source project before. Is this a reasonable thing for me to try? Can anyone see any immediate difficulties with this? Would I just need to have a go and then submit a pull request?

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2235/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
599597677 MDExOlB1bGxSZXF1ZXN0NDAzMjA0MTI5 3970 keep attrs in interpolate_na TomNicholas 35968931 closed 0     2 2020-04-14T14:02:33Z 2020-04-17T20:16:27Z 2020-04-17T20:16:27Z MEMBER   0 pydata/xarray/pulls/3970

dataarray.interpolate_na was dropping attrs: even though the internal apply_ufunc call was being passed keep_attrs=True, the arguments were passed in the order (index, da), so apply_ufunc tried to keep the (non-existent) attrs of the first argument. I just swapped them round, and added a globally-aware keep_attrs kwarg.
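A minimal usage sketch of the fixed behaviour, assuming the keep_attrs keyword described above:

```python
import numpy as np
import xarray as xr

da = xr.DataArray([0.0, np.nan, 2.0], dims="x", attrs={"units": "m"})

# With the fix the attributes survive interpolation; keep_attrs also respects
# xr.set_options(keep_attrs=...), i.e. the "globally-aware" part.
filled = da.interpolate_na(dim="x", keep_attrs=True)
print(filled.attrs)  # {'units': 'm'}
```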

  • [x] Closes #3968
  • [x] Tests added
  • [x] Passes isort -rc . && black . && mypy . && flake8
  • [x] Fully documented, including whats-new.rst for all changes and api.rst for new API
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3970/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
595602834 MDExOlB1bGxSZXF1ZXN0NDAwMDQ4MTEz 3943 Updated list of core developers TomNicholas 35968931 closed 0     2 2020-04-07T05:31:55Z 2020-04-07T19:28:37Z 2020-04-07T19:28:25Z MEMBER   0 pydata/xarray/pulls/3943

Names listed in order of date added.

  • [x] Closes #3892
  • [x] Fully documented, including whats-new.rst for all changes and api.rst for new API
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3943/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
460674994 MDExOlB1bGxSZXF1ZXN0MjkxNzU1NjA4 3043 Rename combine functions TomNicholas 35968931 closed 0     2 2019-06-25T22:31:40Z 2019-06-26T15:01:01Z 2019-06-26T15:00:38Z MEMBER   0 pydata/xarray/pulls/3043

Finishes #2616 by renaming combine_manual -> combine_nested and combine_auto -> combine_by_coords, and renaming the arguments to open_mfdataset: combine='manual' -> combine='nested' and combine='auto' -> combine='by_coords'.

  • [x] code changes
  • [x] docs changes
  • [x] what's new updated
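For reference, a usage sketch of the renamed API (file paths are placeholders):

```python
import xarray as xr

# Combine along inferred coordinates:
ds = xr.open_mfdataset("data/*.nc", combine="by_coords")

# Or specify the layout manually:
ds = xr.open_mfdataset(["a.nc", "b.nc"], combine="nested", concat_dim="time")

# The standalone functions follow the same naming:
# xr.combine_by_coords(datasets) and xr.combine_nested(datasets, concat_dim=...).
```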

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3043/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
354923742 MDU6SXNzdWUzNTQ5MjM3NDI= 2388 Test equality of DataArrays up to transposition TomNicholas 35968931 closed 0     2 2018-08-28T22:13:01Z 2018-10-08T12:25:46Z 2018-10-08T12:25:46Z MEMBER      

While writing some unit tests to check I had wrapped np.gradient correctly with xr.apply_ufunc, I came unstuck because my results were equivalent except for transposed dimensions. It took me a while to realise that xarray.testing.assert_equal considers two DataArrays equal only if their dimensions are in the same order, because intuitively that shouldn't matter in the context of xarray's data model.

A simple example to demonstrate what I mean:

```python
# Create two functionally-equivalent dataarrays
data = np.random.randn(4, 3)
da1 = xr.DataArray(data, dims=('x', 'y'))
da2 = xr.DataArray(data.T, dims=('y', 'x'))

# This test will fail
xarray.tests.assert_equal(da1, da2)
```

This test fails, with output

```
E       AssertionError: <xarray.DataArray (x: 4, y: 3)>
E       array([[ 0.761038,  0.121675,  0.443863],
E              [ 0.333674,  1.494079, -0.205158],
E              [ 0.313068, -0.854096, -2.55299 ],
E              [ 0.653619,  0.864436, -0.742165]])
E       Coordinates:
E         * x        (x) int64 5 7 9 11
E         * y        (y) int64 1 4 6
E       <xarray.DataArray (y: 3, x: 4)>
E       array([[ 0.761038,  0.333674,  0.313068,  0.653619],
E              [ 0.121675,  1.494079, -0.854096,  0.864436],
E              [ 0.443863, -0.205158, -2.55299 , -0.742165]])
E       Coordinates:
E         * x        (x) int64 5 7 9 11
E         * y        (y) int64 1 4 6
```

even though these two DataArrays are functionally-equivalent for all xarray operations you could perform with them.

It would make certain types of unit tests simpler and clearer to have a function like `xarray.tests.assert_equivalent(da1, da2)` which would return true if one DataArray can be formed from the other by transposition.

I would have thought that a test that does this would just transpose one into the shape of the other before comparison?
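A hedged sketch of what such a helper could look like, reusing the existing strict comparison:

```python
import numpy as np
import xarray as xr

def assert_equivalent(da1, da2):
    """Assert two DataArrays are equal up to transposition of their dimensions."""
    # Transpose the second array into the first array's dimension order,
    # then reuse the existing strict comparison.
    xr.testing.assert_equal(da1, da2.transpose(*da1.dims))

data = np.random.randn(4, 3)
da1 = xr.DataArray(data, dims=("x", "y"))
da2 = xr.DataArray(data.T, dims=("y", "x"))
assert_equivalent(da1, da2)  # passes, unlike assert_equal
```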

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2388/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue

CREATE TABLE [issues] (
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [number] INTEGER,
   [title] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [state] TEXT,
   [locked] INTEGER,
   [assignee] INTEGER REFERENCES [users]([id]),
   [milestone] INTEGER REFERENCES [milestones]([id]),
   [comments] INTEGER,
   [created_at] TEXT,
   [updated_at] TEXT,
   [closed_at] TEXT,
   [author_association] TEXT,
   [active_lock_reason] TEXT,
   [draft] INTEGER,
   [pull_request] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [state_reason] TEXT,
   [repo] INTEGER REFERENCES [repos]([id]),
   [type] TEXT
);
CREATE INDEX [idx_issues_repo]
    ON [issues] ([repo]);
CREATE INDEX [idx_issues_milestone]
    ON [issues] ([milestone]);
CREATE INDEX [idx_issues_assignee]
    ON [issues] ([assignee]);
CREATE INDEX [idx_issues_user]
    ON [issues] ([user]);