issues
46 rows where comments = 0, type = "issue" and user = 2448579 sorted by updated_at descending
id | node_id | number | title | user | state | locked | assignee | milestone | comments | created_at | updated_at ▲ | closed_at | author_association | active_lock_reason | draft | pull_request | body | reactions | performed_via_github_app | state_reason | repo | type |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
2228319306 | I_kwDOAMm_X86E0XRK | 8914 | swap_dims does not propagate indexes properly | dcherian 2448579 | open | 0 | 0 | 2024-04-05T15:36:26Z | 2024-04-05T15:36:27Z | MEMBER | What happened? Found by hypothesis:

```python
import numpy as np
import xarray as xr

var = xr.Variable(
    dims="2",
    data=np.array(
        [
            "1970-01-01T00:00:00.000000000",
            "1970-01-01T00:00:00.000000002",
            "1970-01-01T00:00:00.000000001",
        ],
        dtype="datetime64[ns]",
    ),
)
var1 = xr.Variable(data=np.array([0], dtype=np.uint32), dims=["1"], attrs={})

state = xr.Dataset()
state["2"] = var
state = state.stack({"0": ["2"]})
state["1"] = var1
state["1_"] = var1  # .copy(deep=True)
state = state.swap_dims({"1": "1_"})
xr.testing.assertions._assert_internal_invariants(state, False)
```

This swaps simple pandas-indexed dims, but the multi-index that is in the dataset and not affected by the swap_dims op ends up broken. cc @benbovy

What did you expect to happen? No response. Minimal Complete Verifiable Example: No response. MVCE confirmation
Relevant log output: No response. Anything else we need to know? No response. Environment |
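For contrast, a minimal sketch (not from the report; the dataset here is made up) of what `swap_dims` is expected to do when only simple single-level indexes are involved:

```python
import xarray as xr

# A small, hypothetical dataset: "y" is a non-dimension coordinate along "x".
ds = xr.Dataset(
    {"a": ("x", [1, 2, 3])},
    coords={"x": [10, 20, 30], "y": ("x", ["p", "q", "r"])},
)

# Swapping makes "y" the dimension coordinate, building a new index for it,
# while "x" is kept as a non-dimension coordinate.
swapped = ds.swap_dims({"x": "y"})
print("y" in swapped.indexes)
print(swapped.a.dims)
```

The bug above is that an unrelated multi-index elsewhere in the dataset does not survive this operation intact.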
2136709010 | I_kwDOAMm_X85_W5eS | 8753 | Lazy Loading with `DataArray` vs. `Variable` | dcherian 2448579 | closed | 0 | 0 | 2024-02-15T14:42:24Z | 2024-04-04T16:46:54Z | 2024-04-04T16:46:54Z | MEMBER | Discussed in https://github.com/pydata/xarray/discussions/8751
<sup>Originally posted by **ilan-gold** February 15, 2024</sup>
My goal is to get a dataset from [custom io-zarr backend lazy-loaded](https://docs.xarray.dev/en/stable/internals/how-to-add-new-backend.html#how-to-support-lazy-loading). But when I declare a `DataArray` based on the `Variable` which uses `LazilyIndexedArray`, everything is read in. Is this expected? I specifically don't want to have to use dask if possible. I have seen https://github.com/aurghs/xarray-backend-tutorial/blob/main/2.Backend_with_Lazy_Loading.ipynb but it's a little bit different.
While I have a custom backend array inheriting from `ZarrArrayWrapper`, this example using `ZarrArrayWrapper` directly still highlights the same unexpected behavior of everything being read in.
```python
import zarr
import xarray as xr
from tempfile import mkdtemp
import numpy as np
from pathlib import Path
from collections import defaultdict
class AccessTrackingStore(zarr.DirectoryStore):
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
self._access_count = {}
self._accessed = defaultdict(set)
def __getitem__(self, key):
for tracked in self._access_count:
if tracked in key:
self._access_count[tracked] += 1
self._accessed[tracked].add(key)
return super().__getitem__(key)
def get_access_count(self, key):
return self._access_count[key]
def set_key_trackers(self, keys_to_track):
if isinstance(keys_to_track, str):
keys_to_track = [keys_to_track]
for k in keys_to_track:
self._access_count[k] = 0
def get_subkeys_accessed(self, key):
return self._accessed[key]
orig_path = Path(mkdtemp())
z = zarr.group(orig_path / "foo.zarr")
z['array'] = np.random.randn(1000, 1000)
store = AccessTrackingStore(orig_path / "foo.zarr")
store.set_key_trackers(['array'])
z = zarr.group(store)
arr = xr.backends.zarr.ZarrArrayWrapper(z['array'])
lazy_arr = xr.core.indexing.LazilyIndexedArray(arr)
# just `.zarray`
var = xr.Variable(('x', 'y'), lazy_arr)
print('Variable read in ', store.get_subkeys_accessed('array'))
# now everything is read in
da = xr.DataArray(var)
print('DataArray read in ', store.get_subkeys_accessed('array'))
``` |
2135011804 | I_kwDOAMm_X85_QbHc | 8748 | release v2024.02.0 | dcherian 2448579 | closed | 0 | keewis 14808389 | 0 | 2024-02-14T19:08:38Z | 2024-02-18T22:52:15Z | 2024-02-18T22:52:15Z | MEMBER | What is your issue? Thanks to @keewis for volunteering at today's meeting :) |
2086607437 | I_kwDOAMm_X858XxpN | 8616 | new release 2024.01.0 | dcherian 2448579 | closed | 0 | 0 | 2024-01-17T17:03:20Z | 2024-01-17T19:21:12Z | 2024-01-17T19:21:12Z | MEMBER | What is your issue? Thanks @TomNicholas for volunteering to drive this release! |
2064420057 | I_kwDOAMm_X857DIzZ | 8581 | bump min versions | dcherian 2448579 | closed | 0 | 0 | 2024-01-03T17:45:10Z | 2024-01-05T16:13:16Z | 2024-01-05T16:13:15Z | MEMBER | What is your issue? Looks like we can bump a number of min versions:

```
Package            Required              Policy                 Status
cartopy            0.20 (2021-09-17)     0.21 (2022-09-10)      <
dask-core          2022.7 (2022-07-08)   2022.12 (2022-12-02)   <
distributed        2022.7 (2022-07-08)   2022.12 (2022-12-02)   <
flox               0.5 (2022-05-03)      0.6 (2022-10-12)       <
iris               3.2 (2022-02-15)      3.4 (2022-12-01)       <
matplotlib-base    3.5 (2021-11-18)      3.6 (2022-09-16)       <
numba              0.55 (2022-01-14)     0.56 (2022-09-28)      <
numpy              1.22 (2022-01-03)     1.23 (2022-06-23)      <
packaging          21.3 (2021-11-18)     22.0 (2022-12-08)      <
pandas             1.4 (2022-01-22)      1.5 (2022-09-19)       <
scipy              1.8 (2022-02-06)      1.9 (2022-07-30)       <
seaborn            0.11 (2020-09-08)     0.12 (2022-09-06)      <
typing_extensions  4.3 (2022-07-01)      4.4 (2022-10-07)       <
zarr               2.12 (2022-06-23)     2.13 (2022-09-27)      <
``` |
2052952379 | I_kwDOAMm_X856XZE7 | 8568 | Raise when assigning attrs to virtual variables (default coordinate arrays) | dcherian 2448579 | open | 0 | 0 | 2023-12-21T19:24:11Z | 2023-12-21T19:24:19Z | MEMBER | Discussed in https://github.com/pydata/xarray/discussions/8567
<sup>Originally posted by **matthew-brett** December 21, 2023</sup>
Sorry for the introductory question, but we (@ivanov and I) ran into this behavior while experimenting:
```python
import numpy as np
import xarray as xr
data = np.zeros((3, 4, 5))
ds = xr.DataArray(data, dims=('i', 'j', 'k'))
print(ds['k'].attrs)
```
This shows `{}` as we might reasonably expect. But then:
```python
ds['k'].attrs['foo'] = 'bar'
print(ds['k'].attrs)
```
This also gives `{}`, which we found surprising. We worked out why, after a little experimentation (the default coordinate arrays seem to get created on the fly and garbage-collected immediately). But it took us a little while. Is that as intended? Is there a way of making this less confusing?
Thanks for any help. |
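A sketch of the workaround implied above (not from the discussion): materialize the coordinate explicitly before assigning attrs, so it is stored on the object rather than generated on the fly. The coordinate values here are made up.

```python
import numpy as np
import xarray as xr

data = np.zeros((3, 4, 5))
da = xr.DataArray(data, dims=("i", "j", "k"))

# Assign a concrete coordinate for "k"; attrs set on it then persist,
# because the coordinate variable is now held by the DataArray.
da = da.assign_coords(k=np.arange(5))
da["k"].attrs["foo"] = "bar"
print(da["k"].attrs)
```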
1954809370 | I_kwDOAMm_X850hAYa | 8353 | Update benchmark suite for asv 0.6.1 | dcherian 2448579 | open | 0 | 0 | 2023-10-20T18:13:22Z | 2023-12-19T05:53:21Z | MEMBER | The new asv version comes with decorators for parameterizing and skipping, and the ability to use … See https://github.com/airspeed-velocity/asv/releases and https://asv.readthedocs.io/en/v0.6.1/writing_benchmarks.html#skipping-benchmarks. This might help us reduce benchmark times a bit, or at least simplify the code some. |
1989588884 | I_kwDOAMm_X852lreU | 8448 | mypy 1.7.0 raising errors | dcherian 2448579 | closed | 0 | 0 | 2023-11-12T21:41:43Z | 2023-12-01T22:02:22Z | 2023-12-01T22:02:22Z | MEMBER | What happened?
|
1942893480 | I_kwDOAMm_X85zzjOo | 8306 | keep_attrs for NamedArray | dcherian 2448579 | open | 0 | 0 | 2023-10-14T02:29:54Z | 2023-10-14T02:31:35Z | MEMBER | What is your issue? Copying over @max-sixty's comment from https://github.com/pydata/xarray/pull/8304#discussion_r1358873522
@pydata/xarray Should we just delete the |
1888576440 | I_kwDOAMm_X85wkWO4 | 8162 | Update group by multi index | dcherian 2448579 | open | 0 | 0 | 2023-09-09T04:50:29Z | 2023-09-09T04:50:39Z | MEMBER | ideally The goal is to avoid calling There are actually more general issues:
Originally posted by @benbovy in https://github.com/pydata/xarray/issues/8140#issuecomment-1709775666 |
1812504689 | I_kwDOAMm_X85sCKBx | 8006 | Fix documentation about datetime_unit of xarray.DataArray.differentiate | dcherian 2448579 | closed | 0 | 0 | 2023-07-19T18:31:10Z | 2023-09-01T09:37:15Z | 2023-09-01T09:37:15Z | MEMBER | Should say that Discussed in https://github.com/pydata/xarray/discussions/8000
<sup>Originally posted by **jesieleo** July 19, 2023</sup>
I have a piece of data that looks like this
```
<xarray.Dataset>
Dimensions: (time: 612, LEV: 15, latitude: 20, longitude: 357)
Coordinates:
* time (time) datetime64[ns] 1960-01-15 1960-02-15 ... 2010-12-15
* LEV (LEV) float64 5.01 15.07 25.28 35.76 ... 149.0 171.4 197.8 229.5
* latitude (latitude) float64 -4.75 -4.25 -3.75 -3.25 ... 3.75 4.25 4.75
* longitude (longitude) float64 114.2 114.8 115.2 115.8 ... 291.2 291.8 292.2
Data variables:
u (time, LEV, latitude, longitude) float32 ...
Attributes: (12/30)
cdm_data_type: Grid
Conventions: COARDS, CF-1.6, ACDD-1.3
creator_email: chepurin@umd.edu
creator_name: APDRC
creator_type: institution
creator_url: https://www.atmos.umd.edu/~ocean/
... ...
standard_name_vocabulary: CF Standard Name Table v29
summary: Simple Ocean Data Assimilation (SODA) soda po...
time_coverage_end: 2010-12-15T00:00:00Z
time_coverage_start: 1983-01-15T00:00:00Z
title: SODA soda pop2.2.4 [TIME][LEV][LAT][LON]
Westernmost_Easting: 118.25
```
When I try to use `xarray.DataArray.differentiate`:
`data.u.differentiate('time', datetime_unit='M')`
the following error appears:
```
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "D:\Anaconda3\lib\site-packages\xarray\core\dataarray.py", line 3609, in differentiate
ds = self._to_temp_dataset().differentiate(coord, edge_order, datetime_unit)
File "D:\Anaconda3\lib\site-packages\xarray\core\dataset.py", line 6372, in differentiate
coord_var = coord_var._to_numeric(datetime_unit=datetime_unit)
File "D:\Anaconda3\lib\site-packages\xarray\core\variable.py", line 2428, in _to_numeric
numeric_array = duck_array_ops.datetime_to_numeric(
File "D:\Anaconda3\lib\site-packages\xarray\core\duck_array_ops.py", line 466, in datetime_to_numeric
array = array / np.timedelta64(1, datetime_unit)
TypeError: Cannot get a common metadata divisor for Numpy datatime metadata [ns] and [M] because they have incompatible nonlinear base time units.
```
Would you please tell me, is this a bug? |
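As the traceback suggests, only linear time units can divide a `datetime64[ns]` axis; months and years are nonlinear. A small sketch with synthetic data (not the SODA dataset from the report):

```python
import numpy as np
import xarray as xr

time = np.array(["2000-01-01", "2000-01-02", "2000-01-03"], dtype="datetime64[ns]")
da = xr.DataArray([0.0, 1.0, 2.0], dims="time", coords={"time": time})

# Linear units such as "s", "h", or "D" work:
per_day = da.differentiate("time", datetime_unit="D")
print(per_day.values)

# "M" (months) has no fixed length in nanoseconds, so it raises TypeError:
raised = False
try:
    da.differentiate("time", datetime_unit="M")
except TypeError:
    raised = True
print(raised)
```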
1824824446 | I_kwDOAMm_X85sxJx- | 8025 | Support Groupby first, last with flox | dcherian 2448579 | open | 0 | 0 | 2023-07-27T17:07:51Z | 2023-07-27T19:08:06Z | MEMBER | Is your feature request related to a problem? flox recently added support for first, last, nanfirst, nanlast. So we should support that on the Xarray GroupBy object. |
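For reference, the existing (non-flox) code path already exposes these reductions on the GroupBy object; a tiny sketch with made-up data:

```python
import xarray as xr

da = xr.DataArray(
    [1.0, 2.0, 3.0, 4.0],
    dims="x",
    coords={"labels": ("x", ["a", "a", "b", "b"])},
)

# first/last pick the first/last element of each group, in group order.
gb = da.groupby("labels")
print(gb.first().values)
print(gb.last().values)
```

The feature request is about dispatching these to flox's new kernels for the dask-backed case.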
1642299599 | I_kwDOAMm_X85h44DP | 7683 | automatically chunk in groupby binary ops | dcherian 2448579 | closed | 0 | 0 | 2023-03-27T15:14:09Z | 2023-07-27T16:41:35Z | 2023-07-27T16:41:34Z | MEMBER | What happened? From https://discourse.pangeo.io/t/xarray-unable-to-allocate-memory-how-to-size-up-problem/3233/4. Consider:

```python
# ds is a dataset with big dask arrays
mean = ds.groupby("time.day").mean()
mean.to_netcdf()

mean = xr.open_dataset(...)
ds.groupby("time.day") - mean
```

In … we will eagerly construct … What did you expect to happen? I think the only solution is to automatically chunk if … Minimal Complete Verifiable Example: No response. MVCE confirmation
Relevant log output: No response. Anything else we need to know? No response. Environment |
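The pattern under discussion — subtracting a groupby reduction back from the original dataset — in a minimal in-memory (non-dask) sketch with fabricated data; the issue is about how the dask-backed version of this materializes memory:

```python
import numpy as np
import pandas as pd
import xarray as xr

time = pd.date_range("2000-01-01", periods=6, freq="D")
ds = xr.Dataset({"t": ("time", np.arange(6.0))}, coords={"time": time})

# Group by day-of-month, reduce, then subtract: xarray broadcasts the
# reduced "day" values back onto the original "time" axis.
clim = ds.groupby("time.day").mean()
anom = ds.groupby("time.day") - clim
print(anom.t.values)  # every group has exactly one member, so all zeros
```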
1797636782 | I_kwDOAMm_X85rJcKu | 7976 | Explore updating colormap code | dcherian 2448579 | closed | 0 | 0 | 2023-07-10T21:51:30Z | 2023-07-11T13:49:54Z | 2023-07-11T13:49:53Z | MEMBER | What is your issue? See https://github.com/matplotlib/matplotlib/issues/16296. Looks like the MPL API may have advanced enough that we can delete some of our use of private attributes. |
1692597701 | I_kwDOAMm_X85k4v3F | 7808 | Default behaviour of `min_count` wrong with flox | dcherian 2448579 | closed | 0 | 0 | 2023-05-02T15:04:11Z | 2023-05-10T02:39:45Z | 2023-05-10T02:39:45Z | MEMBER | What happened?

```python
import numpy as np
import xarray as xr

da = xr.DataArray(
    data=np.array([np.nan, 1, 1, np.nan, 1, 1]),
    dims="x",
    coords={"labels": ("x", np.array([1, 2, 3, 1, 2, 3]))},
)
with xr.set_options(display_style="text", use_flox=False):
    display(da.groupby("labels").sum())
with xr.set_options(display_style="text", use_flox=True):
    display(da.groupby("labels").sum())
```

```
# without flox
<xarray.DataArray (labels: 3)>
array([0., 2., 2.])
Coordinates:
  * labels   (labels) int64 1 2 3

# with flox
<xarray.DataArray (labels: 3)>
array([nan,  2.,  2.])
Coordinates:
  * labels   (labels) int64 1 2 3
```

What did you expect to happen? The same answer. We should set … Minimal Complete Verifiable Example: No response. MVCE confirmation
Relevant log output: No response. Anything else we need to know? No response. Environment |
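The non-flox behaviour shown above follows pandas-style `sum` semantics, where an all-NaN group collapses to 0 unless `min_count` is given. A sketch pinned to the numpy code path, so it is independent of which flox version is installed:

```python
import numpy as np
import xarray as xr

da = xr.DataArray(
    np.array([np.nan, 1, 1, np.nan, 1, 1]),
    dims="x",
    coords={"labels": ("x", np.array([1, 2, 3, 1, 2, 3]))},
)

with xr.set_options(use_flox=False):
    default = da.groupby("labels").sum()           # all-NaN group -> 0.0
    strict = da.groupby("labels").sum(min_count=1)  # all-NaN group -> NaN

print(default.values)
print(strict.values)
```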
1649611456 | I_kwDOAMm_X85iUxLA | 7704 | follow upstream scipy interpolation improvements | dcherian 2448579 | open | 0 | 0 | 2023-03-31T15:46:56Z | 2023-03-31T15:46:56Z | MEMBER | Is your feature request related to a problem? Scipy 1.10.0 has some great improvements to interpolation (release notes), particularly around the fancier methods like … It'd be good to see if we can simplify some of our code (or even enable using these options). Describe the solution you'd like: No response. Describe alternatives you've considered: No response. Additional context: No response. |
626591460 | MDU6SXNzdWU2MjY1OTE0NjA= | 4107 | renaming Variable to a dimension name does not convert to IndexVariable | dcherian 2448579 | closed | 0 | benbovy 4160723 | 0 | 2020-05-28T15:11:49Z | 2022-09-27T09:33:42Z | 2022-09-27T09:33:42Z | MEMBER | Seen in #4103. MCVE Code Sample:

```python
import xarray as xr
from xarray.tests import assert_identical

coord_1 = xr.DataArray([1, 2], dims=["coord_1"], attrs={"attrs": True})
da = xr.DataArray([1, 0], [coord_1])
obj = da.reset_index("coord_1").rename({"coord_1_": "coord_1"})
assert_identical(da, obj)
```

Expected Output: … Problem Description:

```
AssertionError                 Traceback (most recent call last)
<ipython-input-19-02ef6bd89884> in <module>
----> 1 assert_identical(da, obj)

~/work/python/xarray/xarray/tests/__init__.py in assert_identical(a, b)
    160     xarray.testing.assert_identical(a, b)
    161     xarray.testing._assert_internal_invariants(a)
--> 162     xarray.testing._assert_internal_invariants(b)

~/work/python/xarray/xarray/testing.py in _assert_internal_invariants(xarray_obj)
    265         _assert_variable_invariants(xarray_obj)
    266     elif isinstance(xarray_obj, DataArray):
--> 267         _assert_dataarray_invariants(xarray_obj)
    268     elif isinstance(xarray_obj, Dataset):
    269         _assert_dataset_invariants(xarray_obj)

~/work/python/xarray/xarray/testing.py in _assert_dataarray_invariants(da)
    210     assert all(
    211         isinstance(v, IndexVariable) for (k, v) in da._coords.items() if v.dims == (k,)
--> 212     ), {k: type(v) for k, v in da._coords.items()}
    213     for k, v in da._coords.items():
    214         _assert_variable_invariants(v, k)

AssertionError: {'coord_1': <class 'xarray.core.variable.Variable'>}
```

Versions: Output of <tt>xr.show_versions()</tt> |
1378174355 | I_kwDOAMm_X85SJUWT | 7055 | Use roundtrip context manager in distributed write tests | dcherian 2448579 | open | 0 | 0 | 2022-09-19T15:53:40Z | 2022-09-19T15:53:40Z | MEMBER | What is your issue? File roundtripping tests in … |
1289174987 | I_kwDOAMm_X85M1z_L | 6739 | "center" kwarg ignored when manually iterating over DataArrayRolling | dcherian 2448579 | closed | 0 | 0 | 2022-06-29T19:07:07Z | 2022-07-14T17:41:01Z | 2022-07-14T17:41:01Z | MEMBER | Discussed in https://github.com/pydata/xarray/discussions/6738
<sup>Originally posted by **ckingdon95** June 29, 2022</sup>
Hello, I am trying to manually iterate over a DataArrayRolling object, as described [here](https://docs.xarray.dev/en/stable/user-guide/computation.html#rolling-window-operations) in the documentation.
I am confused why the following two code chunks do not produce the same sequence of values. I would like to be able to manually iterate over a DataArrayRolling object, and still be given center-justified windows. Is there a way to do this?
```python
import xarray as xr
import numpy as np
my_data = xr.DataArray(np.arange(1,10), dims="x")
# Option 1: take a center-justified rolling average
result1 = my_data.rolling(x=3, center=True).mean().values
result1
```
This returns the following values, as expected:
```
array([nan, 2., 3., 4., 5., 6., 7., 8., nan])
```
Whereas when I do it manually, it is not equivalent:
```python
# Option 2: try to manually iterate, but the result is not centered
my_data_rolling = my_data.rolling(x=3, center=True)
result2 = [window.mean().values.item() for label, window in my_data_rolling]
result2
```
This returns
```
[nan, nan, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0]
```
Is this an issue with the window iterator? If it is not an issue, then is there a way for me to get the center-justified windows in the manual iteration? |
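One workaround (not from the discussion itself) is `rolling(...).construct`, which materializes the centered windows as an extra dimension, so a manual reduction over that dimension matches the built-in rolling reduction:

```python
import numpy as np
import xarray as xr

my_data = xr.DataArray(np.arange(1, 10), dims="x")

# construct() lays each (centered) window out along a new "window" dim,
# padding the edges with NaN; skipna=False then reproduces
# rolling(x=3, center=True).mean().
windows = my_data.rolling(x=3, center=True).construct("window")
result = windows.mean("window", skipna=False).values
print(result)  # [nan  2.  3.  4.  5.  6.  7.  8. nan]
```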
968977385 | MDU6SXNzdWU5Njg5NzczODU= | 5699 | describe options in documentation | dcherian 2448579 | closed | 0 | 0 | 2021-08-12T14:48:00Z | 2022-06-25T20:01:07Z | 2022-06-25T20:01:07Z | MEMBER | I think we only describe available options in the API reference for |
1178907807 | I_kwDOAMm_X85GRLSf | 6407 | Add backend tutorial material | dcherian 2448579 | closed | 0 | 0 | 2022-03-24T03:44:22Z | 2022-06-23T01:51:44Z | 2022-06-23T01:51:44Z | MEMBER | What is your issue? @aurghs developed some nice backend tutorial material for the Dask Summit: https://github.com/aurghs/xarray-backend-tutorial. It'd be nice to add it either to our main documentation or to https://github.com/xarray-contrib/xarray-tutorial. |
1238783899 | I_kwDOAMm_X85J1leb | 6616 | flox breaks multiindex groupby | dcherian 2448579 | closed | 0 | 0 | 2022-05-17T15:05:00Z | 2022-05-17T16:11:18Z | 2022-05-17T16:11:18Z | MEMBER | What happened? From @malmans2:

```python
import numpy as np
import xarray as xr

ds = xr.Dataset(
    dict(a=(("z",), np.ones(10))),
    coords=dict(b=(("z"), np.arange(2).repeat(5)), c=(("z"), np.arange(5).repeat(2))),
).set_index(bc=["b", "c"])
grouped = ds.groupby("bc")

with xr.set_options(use_flox=False):
    grouped.sum()  # OK

with xr.set_options(use_flox=True):
    grouped.sum()  # Error
```

What did you expect to happen? No response. Minimal Complete Verifiable Example: No response. MVCE confirmation
Relevant log output:

```
tests/test_xarray.py:329: in test_multi_index_groupby_sum
    actual = xarray_reduce(ds, "bc", func="sum")
flox/xarray.py:374: in xarray_reduce
    actual[k] = v.expand_dims(missing_group_dims)
../xarray/xarray/core/dataset.py:1427: in __setitem__
    self.update({key: value})
../xarray/xarray/core/dataset.py:4432: in update
    merge_result = dataset_update_method(self, other)
../xarray/xarray/core/merge.py:1070: in dataset_update_method
    return merge_core(
../xarray/xarray/core/merge.py:722: in merge_core
    aligned = deep_align(
../xarray/xarray/core/alignment.py:824: in deep_align
    aligned = align(
../xarray/xarray/core/alignment.py:761: in align
    aligner.align()
../xarray/xarray/core/alignment.py:550: in align
    self.assert_unindexed_dim_sizes_equal()
../xarray/xarray/core/alignment.py:450: in assert_unindexed_dim_sizes_equal
    raise ValueError(
E   ValueError: cannot reindex or align along dimension 'bc' because of conflicting dimension sizes: {10, 6} (note: an index is found along that dimension with size=10)
```

The `test_multi_index_groupby_sum[numpy]` variant fails with the same traceback. Anything else we need to know? No response. Environment |
1235494254 | I_kwDOAMm_X85JpCVu | 6606 | Fix benchmark CI | dcherian 2448579 | closed | 0 | 0 | 2022-05-13T17:18:32Z | 2022-05-14T23:06:44Z | 2022-05-14T23:06:44Z | MEMBER | What is your issue? It's failing during setup: https://github.com/pydata/xarray/runs/6424624397?check_suite_focus=true

```
· Discovering benchmarks
·· Uninstalling from conda-py3.8-bottleneck-dask-distributed-flox-netcdf4-numpy-numpy_groupies-pandas-scipy-sparse
·· Building dd20d07f for conda-py3.8-bottleneck-dask-distributed-flox-netcdf4-numpy-numpy_groupies-pandas-scipy-sparse
·· Error running /home/runner/work/xarray/xarray/asv_bench/.asv/env/e8ce5703538597037a298414451d04d2/bin/python -mpip wheel --no-deps --no-index -w /home/runner/work/xarray/xarray/asv_bench/.asv/env/e8ce5703538597037a298414451d04d2/asv-build-cache/dd20d07f4057a9e29222ca132c36cbaaf3fbb242 /home/runner/work/xarray/xarray/asv_bench/.asv/env/e8ce5703538597037a298414451d04d2/project (exit status 1)
   STDOUT -------->
   Processing /home/runner/work/xarray/xarray/asv_bench/.asv/env/e8ce5703538597037a298414451d04d2/project
   STDERR -------->
   ERROR: Some build dependencies for file:///home/runner/work/xarray/xarray/asv_bench/.asv/env/e8ce5703538597037a298414451d04d2/project are missing: 'setuptools_scm[toml]>=3.4', 'setuptools_scm_git_archive'.
·· Failed: trying different commit/environment
·· Uninstalling from conda-py3.8-bottleneck-dask-distributed-flox-netcdf4-numpy-numpy_groupies-pandas-scipy-sparse
·· Building c34ef8a6 for conda-py3.8-bottleneck-dask-distributed-flox-netcdf4-numpy-numpy_groupies-pandas-scipy-sparse
·· Error running /home/runner/work/xarray/xarray/asv_bench/.asv/env/e8ce5703538597037a298414451d04d2/bin/python -mpip wheel --no-deps --no-index -w /home/runner/work/xarray/xarray/asv_bench/.asv/env/e8ce5703538597037a298414451d04d2/asv-build-cache/c34ef8a60227720724e90aa11a6266c0026a812a /home/runner/work/xarray/xarray/asv_bench/.asv/env/e8ce5703538597037a298414451d04d2/project (exit status 1)
   STDOUT -------->
   Processing /home/runner/work/xarray/xarray/asv_bench/.asv/env/e8ce5703538597037a298414451d04d2/project
   STDERR -------->
   ERROR: Some build dependencies for file:///home/runner/work/xarray/xarray/asv_bench/.asv/env/e8ce5703538597037a298414451d04d2/project are missing: 'setuptools_scm[toml]>=3.4', 'setuptools_scm_git_archive'.
``` |
1180334986 | I_kwDOAMm_X85GWnuK | 6411 | Better dask support in polyval | dcherian 2448579 | closed | 0 | 0 | 2022-03-25T04:35:48Z | 2022-05-05T20:17:07Z | 2022-05-05T20:17:07Z | MEMBER | Is your feature request related to a problem? polyval does not handle dask inputs well.

```python
import dask.array
import xarray as xr

nt = 8772 // 4
ny = 489
nx = 655
# chunks like the data is stored on disk: small in time, big in space,
# because the chunk sizes are -1 along lat, lon; reshaping this array to
# (time, latlon) prior to fitting is pretty cheap
chunks = (8, -1, -1)

da = xr.DataArray(
    dask.array.random.random((nt, ny, nx), chunks=chunks),
    dims=("ocean_time", "eta_rho", "xi_rho"),
)
dim = "ocean_time"
deg = 1
p = da.polyfit(dim="ocean_time", deg=1, skipna=False)

# create a chunked version of the "ocean_time" dimension
chunked_dim = xr.DataArray(
    dask.array.from_array(da[dim].data, chunks=da.chunksizes[dim]), dims=dim, name=dim
)
xr.polyval(chunked_dim, p.polyfit_coefficients)
```
Describe the solution you'd like: Here's a partial solution. It does not handle datetime inputs (polyval handles this using …):

```python
def polyval(coord, coeffs, degree_dim="degree"):
    x = coord.data
    ...

polyval(chunked_dim, p.polyfit_coefficients)
```

This looks like what I expected. cc @aulemahal

Describe alternatives you've considered: No response. Additional context: No response. |
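For context, a minimal in-memory round trip of `polyfit`/`polyval` (no dask involved; the data here is made up):

```python
import numpy as np
import xarray as xr

x = xr.DataArray(np.arange(10.0), dims="x", coords={"x": np.arange(10.0)})
da = 2 * x + 1  # an exact line, so a degree-1 fit recovers it

# Fit along "x", then evaluate the fitted polynomial back on the coordinate.
p = da.polyfit(dim="x", deg=1)
fitted = xr.polyval(da["x"], p.polyfit_coefficients)
print(bool(np.allclose(fitted, da)))  # True
```

The issue above is about making the same evaluation lazy when the coefficients and coordinate are dask-backed.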
1203414243 | I_kwDOAMm_X85HuqTj | 6481 | refactor broadcast for flexible indexes | dcherian 2448579 | open | 0 | 0 | 2022-04-13T14:51:19Z | 2022-04-13T14:51:28Z | MEMBER | What is your issue?From @benbovy in https://github.com/pydata/xarray/pull/6477
|
1194790343 | I_kwDOAMm_X85HNw3H | 6445 | map removes non-dimensional coordinate variables | dcherian 2448579 | open | 0 | 0 | 2022-04-06T15:40:40Z | 2022-04-06T15:40:40Z | MEMBER | What happened?
Variables … What did you expect to happen? No response. Minimal Complete Verifiable Example: No response. Relevant log output: No response. Anything else we need to know? No response. Environment: xarray 2022.03.0 |
1171916710 | I_kwDOAMm_X85F2gem | 6372 | apply_ufunc + dask="parallelized" + no core dimensions should raise a nicer error about core dimensions being absent | dcherian 2448579 | open | 0 | 0 | 2022-03-17T04:25:37Z | 2022-03-17T05:10:16Z | MEMBER | What happened? From https://github.com/pydata/xarray/discussions/6370. Calling …
What did you expect to happen? With numpy data the apply_ufunc call does raise an error:
Minimal Complete Verifiable Example:

```python
import numpy as np
import xarray as xr

dt = xr.Dataset(
    data_vars=dict(
        value=(["x"], [1, 1, 2, 2, 2, 3, 3, 3, 3, 3]),
    ),
    coords=dict(
        lon=(["x"], np.linspace(0, 1, 10)),
    ),
).chunk(chunks={"x": (2, 3, 5)})  # three chunks of different size

xr.apply_ufunc(lambda x: np.mean(x), dt, dask="parallelized")
```

Relevant log output: No response. Anything else we need to know? No response. Environment: N/A |
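For comparison, declaring the reduced dimension as a core dimension makes the call well-defined; a sketch on the numpy (non-dask) path with the same values:

```python
import numpy as np
import xarray as xr

dt = xr.Dataset(
    data_vars=dict(value=(["x"], [1.0, 1, 2, 2, 2, 3, 3, 3, 3, 3])),
    coords=dict(lon=(["x"], np.linspace(0, 1, 10))),
)

# input_core_dims tells apply_ufunc that the function consumes "x",
# so the dimension-reducing result is handled correctly.
out = xr.apply_ufunc(np.mean, dt, input_core_dims=[["x"]])
print(float(out["value"]))  # 2.3
```

The issue asks for a clearer error message when such a reducing function is passed with `dask="parallelized"` but no core dimensions.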
1048856436 | I_kwDOAMm_X84-hEd0 | 5962 | Test resampling with dask arrays | dcherian 2448579 | open | 0 | 0 | 2021-11-09T17:02:45Z | 2021-11-09T17:02:45Z | MEMBER | I noticed that we don't test resampling with dask arrays (well just one). This could be a good opportunity to convert |
1043846371 | I_kwDOAMm_X84-N9Tj | 5934 | add test for custom backend entrypoint | dcherian 2448579 | open | 0 | 0 | 2021-11-03T16:57:14Z | 2021-11-03T16:57:21Z | MEMBER | From https://github.com/pydata/xarray/pull/5931 It would be good to add a test checking that custom backend entrypoints work. This might involve creating a dummy package that registers an entrypoint (https://github.com/pydata/xarray/pull/5931#issuecomment-959131968) |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/5934/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | issue | ||||||||
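For reference, a sketch of what such a dummy package's registration could look like. The package name `my_package`, module path, and class `MyBackendEntrypoint` are hypothetical placeholders; `xarray.backends` is the entry-point group xarray scans for custom engines.

```toml
# pyproject.toml of a hypothetical dummy test package.
# "my_package" / "MyBackendEntrypoint" are made-up names for illustration.
[project]
name = "my-package"
version = "0.1"

[project.entry-points."xarray.backends"]
my_engine = "my_package.backend:MyBackendEntrypoint"
```

Installing such a package in the test environment would let a test assert that `my_engine` shows up among the available engines.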
938141608 | MDU6SXNzdWU5MzgxNDE2MDg= | 5582 | Faster unstacking of dask arrays | dcherian 2448579 | open | 0 | 0 | 2021-07-06T18:12:05Z | 2021-07-06T18:54:40Z | MEMBER | Recent dask versions support assigning to a list of ints along one dimension. We can use this for unstacking (diff builds on #5577)

```diff
diff --git i/xarray/core/variable.py w/xarray/core/variable.py
index 222e8dab9..a50dfc574 100644
--- i/xarray/core/variable.py
+++ w/xarray/core/variable.py
@@ -1593,11 +1593,9 @@ class Variable(AbstractArray, NdimSizeLenMixin, VariableArithmetic):
         else:
             dtype = self.dtype
```
This should be what The annoying bit is figuring out when to use this version and what to do with things like dask wrapping sparse. I think we want to loop over each variable in cc @Illviljan if you're interested in implementing this |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/5582/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | issue | ||||||||
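The assignment-based unstack proposed above can be sketched with plain numpy (a hypothetical illustration; the real implementation must also handle dask arrays and dask wrapping sparse): allocate a full array of the fill value, assign the present values at their integer positions along one dimension, then reshape.

```python
import numpy as np

data = np.array([10., 20., 30.])  # stacked values
indexer = np.array([0, 2, 3])     # flat positions present in the multi-index
full_shape = (2, 2)               # shape after unstacking

# One assignment along a single dimension, then a reshape.
full = np.full(np.prod(full_shape), np.nan)
full[indexer] = data
unstacked = full.reshape(full_shape)

print(unstacked)  # [[10. nan], [20. 30.]]
```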
798586325 | MDU6SXNzdWU3OTg1ODYzMjU= | 4852 | mention HDF files in docs | dcherian 2448579 | open | 0 | 0 | 2021-02-01T18:05:23Z | 2021-07-04T01:24:22Z | MEMBER | This is such a common question that we should address it in the docs. Just saying that some hdf5 files can be opened with |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4852/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | issue | ||||||||
664595680 | MDU6SXNzdWU2NjQ1OTU2ODA= | 4260 | use matplotlib's categorical axis features | dcherian 2448579 | closed | 0 | 0 | 2020-07-23T16:01:13Z | 2021-06-21T17:45:39Z | 2021-06-21T17:45:39Z | MEMBER | Is your feature request related to a problem? Please describe. xarray currently doesn't allow plotting against coordinates with string labels for example. Describe the solution you'd like Use matplotlib's categorical axis support. Example: https://matplotlib.org/gallery/lines_bars_and_markers/categorical_variables.html This may be the only place a change is required: https://github.com/pydata/xarray/blob/4e893317240ed1a80e65ea2de107e9179bb65446/xarray/plot/utils.py#L572-L608 |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4260/reactions", "total_count": 8, "+1": 8, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
685590739 | MDU6SXNzdWU2ODU1OTA3Mzk= | 4373 | Add Dataset.plot.quiver | dcherian 2448579 | closed | 0 | 0 | 2020-08-25T15:39:37Z | 2021-02-19T14:21:45Z | 2021-02-19T14:21:45Z | MEMBER | I think it would be nice to add a quiver plot function. I got this far in my current project:

``` python
@xarray.plot.dataset_plot._dsplot
def quiver(ds, x, y, ax, u, v, **kwargs):
    from xarray import broadcast
```

The autoscaling logic is quite crude; I tried to copy what matplotlib does but got somewhat confused. To get faceting to work properly, we'll need to estimate |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4373/reactions", "total_count": 4, "+1": 4, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
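A numpy sketch of the crude autoscaling idea mentioned above (illustration only; matplotlib's own quiver autoscaling is more involved): compute a single scale from the full dataset's vector magnitudes so that faceted panels share comparable arrow lengths.

```python
import numpy as np

# Toy vector components over a 2x2 grid.
u = np.array([[0.0, 3.0], [4.0, 0.0]])
v = np.array([[4.0, 0.0], [3.0, 0.0]])

magnitude = np.hypot(u, v)     # per-point vector length
scale = np.nanmean(magnitude)  # one global estimate, shared across facets

print(scale)  # 3.0 for this toy data
```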
797053785 | MDU6SXNzdWU3OTcwNTM3ODU= | 4848 | simplify API reference presentation | dcherian 2448579 | open | 0 | 0 | 2021-01-29T17:23:41Z | 2021-01-29T17:23:46Z | MEMBER | Can we remove |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4848/reactions", "total_count": 3, "+1": 3, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | issue | ||||||||
787486472 | MDU6SXNzdWU3ODc0ODY0NzI= | 4817 | Add encoding to HTML repr | dcherian 2448579 | open | 0 | 0 | 2021-01-16T15:14:50Z | 2021-01-24T17:31:31Z | MEMBER | Is your feature request related to a problem? Please describe.
Describe the solution you'd like I think it'd be nice to add it to the HTML repr, collapsed by default. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4817/reactions", "total_count": 2, "+1": 2, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | issue | ||||||||
484955475 | MDU6SXNzdWU0ODQ5NTU0NzU= | 3263 | test_sparse doesn't work with pytest-xdist | dcherian 2448579 | closed | 0 | 0 | 2019-08-25T17:30:51Z | 2020-12-17T22:33:46Z | 2020-12-17T22:33:46Z | MEMBER |
``` ====================================================================================== ERRORS ====================================================================================== ___________ ERROR collecting gw0 ___________ Different tests were collected between gw1 and gw0. The difference is: --- gw1 +++ gw0 @@ -11,7 +11,7 @@ xarray/tests/test_sparse.py::test_variable_method[obj.any((), {})-False] xarray/tests/test_sparse.py::test_variable_method[obj.astype((), {'dtype': <class 'int'>})-True] xarray/tests/test_sparse.py::test_variable_method[obj.clip(*(), {'min': 0, 'max': 1})-True] -xarray/tests/test_sparse.py::test_variable_method[obj.coarsen((), {'windows': {'x': 2}, 'func': <function sum at 0x7fc7303f7d08>})-True] +xarray/tests/test_sparse.py::test_variable_method[obj.coarsen((), {'windows': {'x': 2}, 'func': <function sum at 0x7f6009fa4d08>})-True] xarray/tests/test_sparse.py::test_variable_method[obj.compute(*(), {})-True] xarray/tests/test_sparse.py::test_variable_method[obj.conj((), {})-True] xarray/tests/test_sparse.py::test_variable_method[obj.copy((), **{})-True] @@ -49,7 +49,7 @@ xarray/tests/test_sparse.py::test_variable_method[obj.prod((), {})-False] xarray/tests/test_sparse.py::test_variable_method[obj.quantile((), {'q': 0.5})-True] xarray/tests/test_sparse.py::test_variable_method[obj.rank(*(), {'dim': 'x'})-False] -xarray/tests/test_sparse.py::test_variable_method[obj.reduce((), {'func': <function sum at 0x7fc7303f7d08>, 'dim': 'x'})-True] +xarray/tests/test_sparse.py::test_variable_method[obj.reduce((), {'func': <function sum at 0x7f6009fa4d08>, 'dim': 'x'})-True] xarray/tests/test_sparse.py::test_variable_method[obj.rolling_window(*(), {'dim': 'x', 'window': 2, 'window_dim': 'x_win'})-True] xarray/tests/test_sparse.py::test_variable_method[obj.shift((), {'x': 2})-True] xarray/tests/test_sparse.py::test_variable_method[obj.std((), **{})-False] @@ -144,11 +144,11 @@ xarray/tests/test_sparse.py::test_dataarray_method[obj.median((), {})-False] 
xarray/tests/test_sparse.py::test_dataarray_method[obj.min((), {})-False] xarray/tests/test_sparse.py::test_dataarray_method[obj.notnull(*(), {})-False] -xarray/tests/test_sparse.py::test_dataarray_method[obj.pipe((<function sum at 0x7fc7303f7d08>,), {'axis': 1})-True] +xarray/tests/test_sparse.py::test_dataarray_method[obj.pipe((<function sum at 0x7f6009fa4d08>,), {'axis': 1})-True] xarray/tests/test_sparse.py::test_dataarray_method[obj.prod(*(), {})-False] xarray/tests/test_sparse.py::test_dataarray_method[obj.quantile((), {'q': 0.5})-False] xarray/tests/test_sparse.py::test_dataarray_method[obj.rank(('x',), {})-False] -xarray/tests/test_sparse.py::test_dataarray_method[obj.reduce(*(<function sum at 0x7fc7303f7d08>,), {'dim': 'x'})-False] +xarray/tests/test_sparse.py::test_dataarray_method[obj.reduce((<function sum at 0x7f6009fa4d08>,), {'dim': 'x'})-False] xarray/tests/test_sparse.py::test_dataarray_method[obj.reindex_like((<xarray.DataArray 'test' (x: 10, y: 5)>\n<COO: shape=(10, 5), dtype=float64, nnz=5, fill_value=0.0>\nCoordinates:\n * x (x) float64 0.5 1.5 2.5 3.5 4.5 5.5 6.5 7.5 8.5 9.5\n * y (y) float64 0.5 1.5 2.5 3.5 4.5,), {})-True] xarray/tests/test_sparse.py::test_dataarray_method[obj.roll(*(), {'x': 2})-True] xarray/tests/test_sparse.py::test_dataarray_method[obj.sel((), *{'x': [0, 1, 2], 'y': [2, 3]})-True] ___________ ERROR collecting gw2 ___________ Different tests were collected between gw1 and gw2. 
The difference is: --- gw1 +++ gw2 @@ -11,7 +11,7 @@ xarray/tests/test_sparse.py::test_variable_method[obj.any((), {})-False] xarray/tests/test_sparse.py::test_variable_method[obj.astype((), {'dtype': <class 'int'>})-True] xarray/tests/test_sparse.py::test_variable_method[obj.clip(*(), {'min': 0, 'max': 1})-True] -xarray/tests/test_sparse.py::test_variable_method[obj.coarsen((), {'windows': {'x': 2}, 'func': <function sum at 0x7fc7303f7d08>})-True] +xarray/tests/test_sparse.py::test_variable_method[obj.coarsen((), {'windows': {'x': 2}, 'func': <function sum at 0x7f657c314d08>})-True] xarray/tests/test_sparse.py::test_variable_method[obj.compute(*(), {})-True] xarray/tests/test_sparse.py::test_variable_method[obj.conj((), {})-True] xarray/tests/test_sparse.py::test_variable_method[obj.copy((), **{})-True] @@ -49,7 +49,7 @@ xarray/tests/test_sparse.py::test_variable_method[obj.prod((), {})-False] xarray/tests/test_sparse.py::test_variable_method[obj.quantile((), {'q': 0.5})-True] xarray/tests/test_sparse.py::test_variable_method[obj.rank(*(), {'dim': 'x'})-False] -xarray/tests/test_sparse.py::test_variable_method[obj.reduce((), {'func': <function sum at 0x7fc7303f7d08>, 'dim': 'x'})-True] +xarray/tests/test_sparse.py::test_variable_method[obj.reduce((), {'func': <function sum at 0x7f657c314d08>, 'dim': 'x'})-True] xarray/tests/test_sparse.py::test_variable_method[obj.rolling_window(*(), {'dim': 'x', 'window': 2, 'window_dim': 'x_win'})-True] xarray/tests/test_sparse.py::test_variable_method[obj.shift((), {'x': 2})-True] xarray/tests/test_sparse.py::test_variable_method[obj.std((), **{})-False] @@ -118,7 +118,7 @@ xarray/tests/test_sparse.py::test_dataarray_method[obj.sel((), {'x': [0, 1, 2]})-True] xarray/tests/test_sparse.py::test_dataarray_method[obj.shift((), {})-True] xarray/tests/test_sparse.py::test_dataarray_method[obj.sortby(*('x',), {'ascending': False})-True] -xarray/tests/test_sparse.py::test_dataarray_method[obj.stack((), {'z': {'y', 'x'}})-True] 
+xarray/tests/test_sparse.py::test_dataarray_method[obj.stack((), {'z': {'x', 'y'}})-True] xarray/tests/test_sparse.py::test_dataarray_method[obj.transpose(*(), {})-True] xarray/tests/test_sparse.py::test_dataarray_method[obj.broadcast_equals((<xarray.Variable (x: 10, y: 5)>\n<COO: shape=(10, 5), dtype=float64, nnz=5, fill_value=0.0>,), {})-False] xarray/tests/test_sparse.py::test_dataarray_method[obj.equals((<xarray.Variable (x: 10, y: 5)>\n<COO: shape=(10, 5), dtype=float64, nnz=5, fill_value=0.0>,), **{})-False] @@ -144,11 +144,11 @@ xarray/tests/test_sparse.py::test_dataarray_method[obj.median((), {})-False] xarray/tests/test_sparse.py::test_dataarray_method[obj.min((), {})-False] xarray/tests/test_sparse.py::test_dataarray_method[obj.notnull(*(), {})-False] -xarray/tests/test_sparse.py::test_dataarray_method[obj.pipe((<function sum at 0x7fc7303f7d08>,), {'axis': 1})-True] +xarray/tests/test_sparse.py::test_dataarray_method[obj.pipe((<function sum at 0x7f657c314d08>,), {'axis': 1})-True] xarray/tests/test_sparse.py::test_dataarray_method[obj.prod(*(), {})-False] xarray/tests/test_sparse.py::test_dataarray_method[obj.quantile((), {'q': 0.5})-False] xarray/tests/test_sparse.py::test_dataarray_method[obj.rank(('x',), {})-False] -xarray/tests/test_sparse.py::test_dataarray_method[obj.reduce(*(<function sum at 0x7fc7303f7d08>,), {'dim': 'x'})-False] +xarray/tests/test_sparse.py::test_dataarray_method[obj.reduce((<function sum at 0x7f657c314d08>,), {'dim': 'x'})-False] xarray/tests/test_sparse.py::test_dataarray_method[obj.reindex_like((<xarray.DataArray 'test' (x: 10, y: 5)>\n<COO: shape=(10, 5), dtype=float64, nnz=5, fill_value=0.0>\nCoordinates:\n * x (x) float64 0.5 1.5 2.5 3.5 4.5 5.5 6.5 7.5 8.5 9.5\n * y (y) float64 0.5 1.5 2.5 3.5 4.5,), {})-True] xarray/tests/test_sparse.py::test_dataarray_method[obj.roll(*(), {'x': 2})-True] xarray/tests/test_sparse.py::test_dataarray_method[obj.sel((), *{'x': [0, 1, 2], 'y': [2, 3]})-True] ___________ ERROR collecting gw3 
___________ Different tests were collected between gw1 and gw3. The difference is: --- gw1 +++ gw3 @@ -11,7 +11,7 @@ xarray/tests/test_sparse.py::test_variable_method[obj.any((), {})-False] xarray/tests/test_sparse.py::test_variable_method[obj.astype((), {'dtype': <class 'int'>})-True] xarray/tests/test_sparse.py::test_variable_method[obj.clip(*(), {'min': 0, 'max': 1})-True] -xarray/tests/test_sparse.py::test_variable_method[obj.coarsen((), {'windows': {'x': 2}, 'func': <function sum at 0x7fc7303f7d08>})-True] +xarray/tests/test_sparse.py::test_variable_method[obj.coarsen((), {'windows': {'x': 2}, 'func': <function sum at 0x7f6f284e3d08>})-True] xarray/tests/test_sparse.py::test_variable_method[obj.compute(*(), {})-True] xarray/tests/test_sparse.py::test_variable_method[obj.conj((), {})-True] xarray/tests/test_sparse.py::test_variable_method[obj.copy((), **{})-True] @@ -49,7 +49,7 @@ xarray/tests/test_sparse.py::test_variable_method[obj.prod((), {})-False] xarray/tests/test_sparse.py::test_variable_method[obj.quantile((), {'q': 0.5})-True] xarray/tests/test_sparse.py::test_variable_method[obj.rank(*(), {'dim': 'x'})-False] -xarray/tests/test_sparse.py::test_variable_method[obj.reduce((), {'func': <function sum at 0x7fc7303f7d08>, 'dim': 'x'})-True] +xarray/tests/test_sparse.py::test_variable_method[obj.reduce((), {'func': <function sum at 0x7f6f284e3d08>, 'dim': 'x'})-True] xarray/tests/test_sparse.py::test_variable_method[obj.rolling_window(*(), {'dim': 'x', 'window': 2, 'window_dim': 'x_win'})-True] xarray/tests/test_sparse.py::test_variable_method[obj.shift((), {'x': 2})-True] xarray/tests/test_sparse.py::test_variable_method[obj.std((), **{})-False] @@ -144,11 +144,11 @@ xarray/tests/test_sparse.py::test_dataarray_method[obj.median((), {})-False] xarray/tests/test_sparse.py::test_dataarray_method[obj.min((), {})-False] xarray/tests/test_sparse.py::test_dataarray_method[obj.notnull(*(), {})-False] 
-xarray/tests/test_sparse.py::test_dataarray_method[obj.pipe((<function sum at 0x7fc7303f7d08>,), {'axis': 1})-True] +xarray/tests/test_sparse.py::test_dataarray_method[obj.pipe((<function sum at 0x7f6f284e3d08>,), {'axis': 1})-True] xarray/tests/test_sparse.py::test_dataarray_method[obj.prod(*(), {})-False] xarray/tests/test_sparse.py::test_dataarray_method[obj.quantile((), {'q': 0.5})-False] xarray/tests/test_sparse.py::test_dataarray_method[obj.rank(('x',), {})-False] -xarray/tests/test_sparse.py::test_dataarray_method[obj.reduce(*(<function sum at 0x7fc7303f7d08>,), {'dim': 'x'})-False] +xarray/tests/test_sparse.py::test_dataarray_method[obj.reduce((<function sum at 0x7f6f284e3d08>,), {'dim': 'x'})-False] xarray/tests/test_sparse.py::test_dataarray_method[obj.reindex_like((<xarray.DataArray 'test' (x: 10, y: 5)>\n<COO: shape=(10, 5), dtype=float64, nnz=5, fill_value=0.0>\nCoordinates:\n * x (x) float64 0.5 1.5 2.5 3.5 4.5 5.5 6.5 7.5 8.5 9.5\n * y (y) float64 0.5 1.5 2.5 3.5 4.5,), {})-True] xarray/tests/test_sparse.py::test_dataarray_method[obj.roll(*(), {'x': 2})-True] xarray/tests/test_sparse.py::test_dataarray_method[obj.sel((), *{'x': [0, 1, 2], 'y': [2, 3]})-True] ================================================================================= warnings summary ================================================================================= /home/deepak/miniconda3/envs/dcpy/lib/python3.7/site-packages/distributed/client.py:2 /home/deepak/miniconda3/envs/dcpy/lib/python3.7/site-packages/distributed/client.py:2 /home/deepak/miniconda3/envs/dcpy/lib/python3.7/site-packages/distributed/client.py:2 /home/deepak/miniconda3/envs/dcpy/lib/python3.7/site-packages/distributed/client.py:2 /home/deepak/miniconda3/envs/dcpy/lib/python3.7/site-packages/distributed/client.py:2: DeprecationWarning: Using or importing the ABCs from 'collections' instead of from 'collections.abc' is deprecated, and in 3.8 it will stop working from collections import defaultdict, Iterator 
/home/deepak/miniconda3/envs/dcpy/lib/python3.7/site-packages/distributed/publish.py:1 /home/deepak/miniconda3/envs/dcpy/lib/python3.7/site-packages/distributed/publish.py:1 /home/deepak/miniconda3/envs/dcpy/lib/python3.7/site-packages/distributed/publish.py:1 /home/deepak/miniconda3/envs/dcpy/lib/python3.7/site-packages/distributed/publish.py:1 /home/deepak/miniconda3/envs/dcpy/lib/python3.7/site-packages/distributed/publish.py:1: DeprecationWarning: Using or importing the ABCs from 'collections' instead of from 'collections.abc' is deprecated, and in 3.8 it will stop working from collections import MutableMapping /home/deepak/miniconda3/envs/dcpy/lib/python3.7/site-packages/distributed/scheduler.py:2 /home/deepak/miniconda3/envs/dcpy/lib/python3.7/site-packages/distributed/scheduler.py:2 /home/deepak/miniconda3/envs/dcpy/lib/python3.7/site-packages/distributed/scheduler.py:2 /home/deepak/miniconda3/envs/dcpy/lib/python3.7/site-packages/distributed/scheduler.py:2 /home/deepak/miniconda3/envs/dcpy/lib/python3.7/site-packages/distributed/scheduler.py:2 /home/deepak/miniconda3/envs/dcpy/lib/python3.7/site-packages/distributed/scheduler.py:2 /home/deepak/miniconda3/envs/dcpy/lib/python3.7/site-packages/distributed/scheduler.py:2 /home/deepak/miniconda3/envs/dcpy/lib/python3.7/site-packages/distributed/scheduler.py:2 /home/deepak/miniconda3/envs/dcpy/lib/python3.7/site-packages/distributed/scheduler.py:2: DeprecationWarning: Using or importing the ABCs from 'collections' instead of from 'collections.abc' is deprecated, and in 3.8 it will stop working from collections import defaultdict, deque, OrderedDict, Mapping, Set -- Docs: https://docs.pytest.org/en/latest/warnings.html``` |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/3263/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
648250671 | MDU6SXNzdWU2NDgyNTA2NzE= | 4189 | List supported options for `backend_kwargs` in `open_dataset` | dcherian 2448579 | open | 0 | 0 | 2020-06-30T15:01:31Z | 2020-12-15T04:28:04Z | MEMBER | We should list supported options for xref #4187 |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4189/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | issue | ||||||||
526754376 | MDU6SXNzdWU1MjY3NTQzNzY= | 3558 | optimizing xarray operations for lazy array equality test | dcherian 2448579 | closed | 0 | 0 | 2019-11-21T18:01:51Z | 2020-02-24T18:26:30Z | 2020-02-24T18:26:30Z | MEMBER | TLDR: I think we want Currently if I do
Questions:
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/3558/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
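The shortcut this issue asks for can be sketched in plain Python (hypothetical; the actual lazy-equivalence logic in xarray/dask differs): answer the equality question only when it is free, and report "unknown" otherwise so the caller decides whether the expensive elementwise compare is worth triggering.

```python
import numpy as np

def lazy_equiv(a, b):
    if a is b:
        return True    # identical objects: equal with zero computation
    if a.shape != b.shape:
        return False   # cheap metadata disagreement: unequal, no data touched
    return None        # unknown without computing -- caller's choice

a = np.arange(3)
print(lazy_equiv(a, a))                    # True: no compute triggered
print(lazy_equiv(a, np.arange(4)))         # False: shapes differ
print(lazy_equiv(a, np.array([0, 1, 2])))  # None: needs a real comparison
```

With this shape of API, `da == da.copy()` style round-trips can short-circuit whenever the underlying data object is shared.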
556998262 | MDU6SXNzdWU1NTY5OTgyNjI= | 3725 | pip warning on CI | dcherian 2448579 | closed | 0 | 0 | 2020-01-29T17:07:49Z | 2020-01-29T23:39:39Z | 2020-01-29T23:39:39Z | MEMBER | We're getting this warning on CI (e.g. https://dev.azure.com/xarray/xarray/_build/results?buildId=2047&view=logs&j=e9c23135-6f4c-5980-91c2-81d28ce70c9b&t=bd167cdf-93c0-5e40-5b11-b5faeb5dc22f)
cc @crusaderky |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/3725/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
538521723 | MDU6SXNzdWU1Mzg1MjE3MjM= | 3630 | reviewnb for example notebooks? | dcherian 2448579 | open | 0 | 0 | 2019-12-16T16:34:28Z | 2019-12-16T16:34:28Z | MEMBER | What do people think of adding ReviewNB https://www.reviewnb.com/ to facilitate easy reviewing of example notebooks? |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/3630/reactions", "total_count": 3, "+1": 3, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | issue | ||||||||
499481465 | MDU6SXNzdWU0OTk0ODE0NjU= | 3351 | custom "coordinates" attribute | dcherian 2448579 | closed | 0 | 0 | 2019-09-27T14:32:58Z | 2019-12-10T16:02:01Z | 2019-12-10T16:02:01Z | MEMBER | This set of lines prevents users from writing files with a handcrafted It seems to me like we should only automatically set "coordinates" when What do you think? |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/3351/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
533578755 | MDU6SXNzdWU1MzM1Nzg3NTU= | 3599 | map_blocks graph construction bug | dcherian 2448579 | closed | 0 | 0 | 2019-12-05T20:23:06Z | 2019-12-07T04:30:19Z | 2019-12-07T04:30:19Z | MEMBER | Just making a new issue for #3598 The tests for https://github.com/pydata/xarray/pull/3584 fail on

``` python
import dask
import xarray as xr

ds = xr.Dataset({'x': (('y',), dask.array.ones(10, chunks=(3,)))})
mapped = ds.map_blocks(lambda x: x)

mapped.compute()  # works
xr.testing.assert_equal(mapped, ds)  # does not work
xr.testing.assert_equal(mapped, ds.compute())  # works
xr.testing.assert_equal(mapped.compute(), ds)  # works
xr.testing.assert_equal(mapped.compute(), ds.compute())  # works
```

The traceback is

```
~/miniconda3/envs/dcpy/lib/python3.7/site-packages/dask/array/optimization.py in optimize(dsk, keys, fuse_keys, fast_functions, inline_functions_fast_functions, rename_fused_keys, **kwargs)
     41     if isinstance(dsk, HighLevelGraph):
     42         dsk = optimize_blockwise(dsk, keys=keys)
---> 43     dsk = fuse_roots(dsk, keys=keys)
     44
     45     # Low level task optimizations

~/miniconda3/envs/dcpy/lib/python3.7/site-packages/dask/blockwise.py in fuse_roots(graph, keys)
    819         isinstance(layer, Blockwise)
    820         and len(deps) > 1
--> 821         and not any(dependencies[dep] for dep in deps)  # no need to fuse if 0 or 1
    822         and all(len(dependents[dep]) == 1 for dep in deps)
    823     ):

~/miniconda3/envs/dcpy/lib/python3.7/site-packages/dask/blockwise.py in <genexpr>(.0)
    819         isinstance(layer, Blockwise)
    820         and len(deps) > 1
--> 821         and not any(dependencies[dep] for dep in deps)  # no need to fuse if 0 or 1
    822         and all(len(dependents[dep]) == 1 for dep in deps)
    823     ):

KeyError: 'lambda-6720ab0e3639d5c63fc06dfc66a3ce47-x'
```

This key is not in
I'm not sure whether this is a bug in |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/3599/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
494095795 | MDU6SXNzdWU0OTQwOTU3OTU= | 3311 | optimize compatibility checks in merge.unique_variable | dcherian 2448579 | closed | 0 | 0 | 2019-09-16T14:43:39Z | 2019-11-05T15:41:15Z | 2019-11-05T15:41:15Z | MEMBER | Currently
One solution would be to loop through once checking attrs, shapes and |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/3311/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
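The two-pass ordering proposed above can be sketched with numpy arrays standing in for xarray Variables (a hypothetical sketch; the real `unique_variable` also compares attrs and can use lazy equivalence): run every cheap metadata check before any value comparison, so a metadata mismatch short-circuits without triggering computation.

```python
import numpy as np

def unique_variable_sketch(variables):
    first = variables[0]
    for var in variables[1:]:          # pass 1: cheap checks only
        if var.shape != first.shape:
            raise ValueError("conflicting shapes")
    for var in variables[1:]:          # pass 2: expensive value comparison
        if not np.array_equal(var, first):
            raise ValueError("conflicting values")
    return first

out = unique_variable_sketch([np.arange(3), np.arange(3)])
print(out)  # the single agreed-upon variable
```

For dask-backed variables, pass 2 is where all the compute cost lives, which is why deferring it behind the cheap checks pays off.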
491898181 | MDU6SXNzdWU0OTE4OTgxODE= | 3300 | test for h5netcdf invalid_netcdf warning is failing | dcherian 2448579 | closed | 0 | 0 | 2019-09-10T21:05:48Z | 2019-09-13T15:39:41Z | 2019-09-13T15:39:41Z | MEMBER | This passes locally with ``` =================================== FAILURES =================================== ___ TestH5NetCDFData.test_complex[None-FutureWarning-1] ______ self = <xarray.tests.test_backends.TestH5NetCDFData object at 0x7f4509fac358> invalid_netcdf = None, warns = <class 'FutureWarning'>, num_warns = 1
xarray/tests/test_backends.py:2185: AssertionError ___ TestH5NetCDFData.test_complex[False-FutureWarning-1] _____ self = <xarray.tests.test_backends.TestH5NetCDFData object at 0x7f450a96c748> invalid_netcdf = False, warns = <class 'FutureWarning'>, num_warns = 1
xarray/tests/test_backends.py:2185: AssertionError ___ TestH5NetCDFData.testcomplex[True-None-0] ____ self = <xarray.tests.test_backends.TestH5NetCDFData object at 0x7f450aa69be0> invalid_netcdf = True, warns = None, num_warns = 0
xarray/tests/test_backends.py:2185: AssertionError _ TestH5NetCDFFileObject.testcomplex[None-FutureWarning-1] ___ self = <xarray.tests.test_backends.TestH5NetCDFFileObject object at 0x7f4509e1a8d0> invalid_netcdf = None, warns = <class 'FutureWarning'>, num_warns = 1
xarray/tests/test_backends.py:2185: AssertionError _ TestH5NetCDFFileObject.testcomplex[False-FutureWarning-1] __ self = <xarray.tests.test_backends.TestH5NetCDFFileObject object at 0x7f4509c749e8> invalid_netcdf = False, warns = <class 'FutureWarning'>, num_warns = 1
xarray/tests/test_backends.py:2185: AssertionError __ TestH5NetCDFFileObject.test_complex[True-None-0] __ self = <xarray.tests.test_backends.TestH5NetCDFFileObject object at 0x7f450afe69b0> invalid_netcdf = True, warns = None, num_warns = 0
xarray/tests/test_backends.py:2185: AssertionError _ TestH5NetCDFViaDaskData.testcomplex[None-FutureWarning-1] __ self = <xarray.tests.test_backends.TestH5NetCDFViaDaskData object at 0x7f4509fa0c18> invalid_netcdf = None, warns = <class 'FutureWarning'>, num_warns = 1
xarray/tests/test_backends.py:2185: AssertionError ___ TestH5NetCDFViaDaskData.test_complex[False-FutureWarning-1] ____ self = <xarray.tests.test_backends.TestH5NetCDFViaDaskData object at 0x7f4509e45e10> invalid_netcdf = False, warns = <class 'FutureWarning'>, num_warns = 1
xarray/tests/test_backends.py:2185: AssertionError __ TestH5NetCDFViaDaskData.test_complex[True-None-0] ___ self = <xarray.tests.test_backends.TestH5NetCDFViaDaskData object at 0x7f450aa3f668> invalid_netcdf = True, warns = None, num_warns = 0
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/3300/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
435787982 | MDU6SXNzdWU0MzU3ODc5ODI= | 2913 | Document xarray data model | dcherian 2448579 | open | 0 | 0 | 2019-04-22T16:23:41Z | 2019-04-22T16:23:41Z | MEMBER | It would be nice to have a separate page that detailed this for users unfamiliar with netCDF. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/2913/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | issue | ||||||||
391440480 | MDU6SXNzdWUzOTE0NDA0ODA= | 2610 | docs build errors | dcherian 2448579 | closed | 0 | 0 | 2018-12-16T06:45:58Z | 2018-12-17T21:57:36Z | 2018-12-17T21:57:36Z | MEMBER | We're seeing a lot of errors in the docs build on Travis though the build is reported successful?! |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/2610/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue |
``` sql
CREATE TABLE [issues] (
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [number] INTEGER,
   [title] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [state] TEXT,
   [locked] INTEGER,
   [assignee] INTEGER REFERENCES [users]([id]),
   [milestone] INTEGER REFERENCES [milestones]([id]),
   [comments] INTEGER,
   [created_at] TEXT,
   [updated_at] TEXT,
   [closed_at] TEXT,
   [author_association] TEXT,
   [active_lock_reason] TEXT,
   [draft] INTEGER,
   [pull_request] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [state_reason] TEXT,
   [repo] INTEGER REFERENCES [repos]([id]),
   [type] TEXT
);
CREATE INDEX [idx_issues_repo] ON [issues] ([repo]);
CREATE INDEX [idx_issues_milestone] ON [issues] ([milestone]);
CREATE INDEX [idx_issues_assignee] ON [issues] ([assignee]);
CREATE INDEX [idx_issues_user] ON [issues] ([user]);
```