issues
425 rows where state = "closed" and user = 2448579 sorted by updated_at descending
id | node_id | number | title | user | state | locked | assignee | milestone | comments | created_at | updated_at ▲ | closed_at | author_association | active_lock_reason | draft | pull_request | body | reactions | performed_via_github_app | state_reason | repo | type |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
2261855627 | PR_kwDOAMm_X85togwQ | 8969 | CI: python 3.12 by default. | dcherian 2448579 | closed | 0 | 2 | 2024-04-24T17:49:25Z | 2024-04-29T16:21:20Z | 2024-04-29T16:21:08Z | MEMBER | 0 | pydata/xarray/pulls/8969 |
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8969/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1574694462 | I_kwDOAMm_X85d2-4- | 7513 | intermittent failures with h5netcdf, h5py on macos | dcherian 2448579 | closed | 0 | 5 | 2023-02-07T16:58:43Z | 2024-04-28T23:35:21Z | 2024-04-28T23:35:21Z | MEMBER | What is your issue?cc @hmaarrfk @kmuehlbauer Passed: https://github.com/pydata/xarray/actions/runs/4115923717/jobs/7105298426 Failed: https://github.com/pydata/xarray/actions/runs/4115946392/jobs/7105345290 Versions:
```
=================================== FAILURES ===================================
___ test_open_mfdataset_manyfiles[h5netcdf-20-True-5-5] ______
[gw1] darwin -- Python 3.10.9 /Users/runner/micromamba-root/envs/xarray-tests/bin/python

readengine = 'h5netcdf', nfiles = 20, parallel = True, chunks = 5
file_cache_maxsize = 5

/Users/runner/work/xarray/xarray/xarray/tests/test_backends.py:3267:
/Users/runner/work/xarray/xarray/xarray/backends/api.py:991: in open_mfdataset
    datasets, closers = dask.compute(datasets, closers)
/Users/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/dask/base.py:599: in compute
    results = schedule(dsk, keys, **kwargs)
/Users/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/dask/threaded.py:89: in get
    results = get_async(
/Users/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/dask/local.py:511: in get_async
    raise_exception(exc, tb)
/Users/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/dask/local.py:319: in reraise
    raise exc
/Users/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/dask/local.py:224: in execute_task
    result = _execute_task(task, data)
/Users/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/dask/core.py:119: in _execute_task
    return func(*(_execute_task(a, cache) for a in args))
/Users/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/dask/utils.py:72: in apply
    return func(*args, **kwargs)
/Users/runner/work/xarray/xarray/xarray/backends/api.py:526: in open_dataset
    backend_ds = backend.open_dataset(
/Users/runner/work/xarray/xarray/xarray/backends/h5netcdf_.py:417: in open_dataset
    ds = store_entrypoint.open_dataset(
/Users/runner/work/xarray/xarray/xarray/backends/store.py:32: in open_dataset
    vars, attrs = store.load()
/Users/runner/work/xarray/xarray/xarray/backends/common.py:129: in load
    (_decode_variable_name(k), v) for k, v in self.get_variables().items()
/Users/runner/work/xarray/xarray/xarray/backends/h5netcdf_.py:220: in get_variables
    return FrozenDict(
/Users/runner/work/xarray/xarray/xarray/core/utils.py:471: in FrozenDict
    return Frozen(dict(*args, **kwargs))
/Users/runner/work/xarray/xarray/xarray/backends/h5netcdf_.py:221: in <genexpr>
    (k, self.open_store_variable(k, v)) for k, v in self.ds.variables.items()
/Users/runner/work/xarray/xarray/xarray/backends/h5netcdf_.py:200: in open_store_variable
    elif var.compression is not None:
/Users/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/h5netcdf/core.py:394: in compression
    return self._h5ds.compression

self = <[AttributeError("'NoneType' object has no attribute '_root'") raised in repr()] Variable object at 0x151378970>
``` |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/7513/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
2261844699 | PR_kwDOAMm_X85toeXT | 8968 | Bump dependencies incl `pandas>=2` | dcherian 2448579 | closed | 0 | 0 | 2024-04-24T17:42:19Z | 2024-04-27T14:17:16Z | 2024-04-27T14:17:16Z | MEMBER | 0 | pydata/xarray/pulls/8968 |
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8968/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
2261917442 | PR_kwDOAMm_X85touYl | 8971 | Delete pynio backend. | dcherian 2448579 | closed | 0 | 2 | 2024-04-24T18:25:26Z | 2024-04-25T14:38:23Z | 2024-04-25T14:23:59Z | MEMBER | 0 | pydata/xarray/pulls/8971 |
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8971/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
2228266052 | PR_kwDOAMm_X85r24hE | 8913 | Update hypothesis action to always save the cache | dcherian 2448579 | closed | 0 | 0 | 2024-04-05T15:09:35Z | 2024-04-05T16:51:05Z | 2024-04-05T16:51:03Z | MEMBER | 0 | pydata/xarray/pulls/8913 | Update the cache always. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8913/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
2136709010 | I_kwDOAMm_X85_W5eS | 8753 | Lazy Loading with `DataArray` vs. `Variable` | dcherian 2448579 | closed | 0 | 0 | 2024-02-15T14:42:24Z | 2024-04-04T16:46:54Z | 2024-04-04T16:46:54Z | MEMBER | Discussed in https://github.com/pydata/xarray/discussions/8751
<sup>Originally posted by **ilan-gold** February 15, 2024</sup>
My goal is to get a lazily loaded dataset from a [custom io zarr backend](https://docs.xarray.dev/en/stable/internals/how-to-add-new-backend.html#how-to-support-lazy-loading). But when I create a `DataArray` from a `Variable` that uses a `LazilyIndexedArray`, everything is read in. Is this expected? I specifically don't want to have to use dask if possible. I have seen https://github.com/aurghs/xarray-backend-tutorial/blob/main/2.Backend_with_Lazy_Loading.ipynb but it's a little bit different.
While I have a custom backend array inheriting from `ZarrArrayWrapper`, this example using `ZarrArrayWrapper` directly still highlights the same unexpected behavior of everything being read in.
```python
import zarr
import xarray as xr
from tempfile import mkdtemp
import numpy as np
from pathlib import Path
from collections import defaultdict
class AccessTrackingStore(zarr.DirectoryStore):
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
self._access_count = {}
self._accessed = defaultdict(set)
def __getitem__(self, key):
for tracked in self._access_count:
if tracked in key:
self._access_count[tracked] += 1
self._accessed[tracked].add(key)
return super().__getitem__(key)
def get_access_count(self, key):
return self._access_count[key]
def set_key_trackers(self, keys_to_track):
if isinstance(keys_to_track, str):
keys_to_track = [keys_to_track]
for k in keys_to_track:
self._access_count[k] = 0
def get_subkeys_accessed(self, key):
return self._accessed[key]
orig_path = Path(mkdtemp())
z = zarr.group(orig_path / "foo.zarr")
z['array'] = np.random.randn(1000, 1000)
store = AccessTrackingStore(orig_path / "foo.zarr")
store.set_key_trackers(['array'])
z = zarr.group(store)
arr = xr.backends.zarr.ZarrArrayWrapper(z['array'])
lazy_arr = xr.core.indexing.LazilyIndexedArray(arr)
# just `.zarray`
var = xr.Variable(('x', 'y'), lazy_arr)
print('Variable read in ', store.get_subkeys_accessed('array'))
# now everything is read in
da = xr.DataArray(var)
print('DataArray read in ', store.get_subkeys_accessed('array'))
``` |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8753/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
2136724736 | PR_kwDOAMm_X85m_MtN | 8754 | Don't access data when creating DataArray from Variable. | dcherian 2448579 | closed | 0 | 2 | 2024-02-15T14:48:32Z | 2024-04-04T16:46:54Z | 2024-04-04T16:46:53Z | MEMBER | 0 | pydata/xarray/pulls/8754 |
This seems to have been around since 2016-ish, so presumably our backend code path is passing arrays around, not Variables. cc @ilan-gold |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8754/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
2224300175 | PR_kwDOAMm_X85rpG4S | 8907 | Trigger hypothesis stateful tests nightly | dcherian 2448579 | closed | 0 | 0 | 2024-04-04T02:16:59Z | 2024-04-04T02:17:49Z | 2024-04-04T02:17:47Z | MEMBER | 0 | pydata/xarray/pulls/8907 |
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8907/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
2098659175 | PR_kwDOAMm_X85k-T6b | 8658 | Stateful tests with Dataset | dcherian 2448579 | closed | 0 | 8 | 2024-01-24T16:34:59Z | 2024-04-03T21:29:38Z | 2024-04-03T21:29:36Z | MEMBER | 0 | pydata/xarray/pulls/8658 | I was curious to see if hypothesis stateful testing would catch an inconsistent sequence of index manipulation operations like #8646. Turns out it can (a minimal sketch follows this row). PS: this blog post is amazing.
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8658/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
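For the stateful-testing PR above (#8658), here is a minimal, hypothetical sketch of what a hypothesis `RuleBasedStateMachine` over a `Dataset` can look like. The rule names, the starting dataset, and the use of `_assert_internal_invariants` as the invariant are illustrative assumptions, not the PR's actual test suite.

```python
# Minimal sketch (not the actual xarray test suite) of hypothesis stateful testing
# applied to Dataset index manipulation. Assumes hypothesis and xarray are installed.
import numpy as np
import xarray as xr
from hypothesis.stateful import RuleBasedStateMachine, invariant, rule


class DatasetStateMachine(RuleBasedStateMachine):
    def __init__(self):
        super().__init__()
        # Start from a tiny dataset with one indexed dimension coordinate.
        self.ds = xr.Dataset({"a": ("x", np.arange(3.0))}, coords={"x": [10, 20, 30]})

    @rule()
    def rename_roundtrip(self):
        # Rename the indexed coordinate away and back again.
        self.ds = self.ds.rename({"x": "y"}).rename({"y": "x"})

    @rule()
    def drop_and_recreate_index(self):
        # Drop the index but keep the coordinate, then rebuild an index for it.
        if "x" in self.ds.xindexes:
            self.ds = self.ds.drop_indexes("x").set_xindex("x")

    @invariant()
    def internal_invariants_hold(self):
        # The consistency check that catches bugs like #8646 / #8659.
        xr.testing._assert_internal_invariants(self.ds, check_default_indexes=False)


# Collected by pytest as a regular test case.
DatasetTest = DatasetStateMachine.TestCase
```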
2000205407 | PR_kwDOAMm_X85fzupc | 8467 | [skip-ci] dev whats-new | dcherian 2448579 | closed | 0 | 0 | 2023-11-18T03:59:29Z | 2024-04-03T21:08:45Z | 2023-11-18T15:20:37Z | MEMBER | 0 | pydata/xarray/pulls/8467 |
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8467/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1989233637 | PR_kwDOAMm_X85fOdAk | 8446 | Remove PseudoNetCDF | dcherian 2448579 | closed | 0 | 0 | 2023-11-12T04:29:50Z | 2024-04-03T21:08:44Z | 2023-11-13T21:53:56Z | MEMBER | 0 | pydata/xarray/pulls/8446 | joining the party
- [x] User visible changes (including notable bug fixes) are documented in |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8446/reactions", "total_count": 1, "+1": 0, "-1": 0, "laugh": 1, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
2064698904 | PR_kwDOAMm_X85jLHsQ | 8584 | Silence a bunch of CachingFileManager warnings | dcherian 2448579 | closed | 0 | 1 | 2024-01-03T21:57:07Z | 2024-04-03T21:08:27Z | 2024-01-03T22:52:58Z | MEMBER | 0 | pydata/xarray/pulls/8584 | { "url": "https://api.github.com/repos/pydata/xarray/issues/8584/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | ||||||
2102850331 | PR_kwDOAMm_X85lMW8k | 8674 | Fix negative slicing of Zarr arrays | dcherian 2448579 | closed | 0 | 0 | 2024-01-26T20:22:21Z | 2024-04-03T21:08:26Z | 2024-02-10T02:57:32Z | MEMBER | 0 | pydata/xarray/pulls/8674 | Closes #8252 Closes #3921
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8674/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
2148245262 | PR_kwDOAMm_X85nmmqX | 8777 | Return a dataclass from Grouper.factorize | dcherian 2448579 | closed | 0 | 0 | 2024-02-22T05:41:29Z | 2024-04-03T21:08:25Z | 2024-03-15T04:47:30Z | MEMBER | 0 | pydata/xarray/pulls/8777 | Toward #8510, builds on #8776 |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8777/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
2148164557 | PR_kwDOAMm_X85nmU5w | 8775 | [skip-ci] NamedArray: Add lazy indexing array refactoring plan | dcherian 2448579 | closed | 0 | 0 | 2024-02-22T04:25:49Z | 2024-04-03T21:08:21Z | 2024-02-23T22:20:09Z | MEMBER | 0 | pydata/xarray/pulls/8775 | This adds a proposal for decoupling the lazy indexing array machinery, indexing adapter machinery, and Variable's setitem and getitem methods, so that the latter can be migrated to NamedArray. cc @andersy005 |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8775/reactions", "total_count": 2, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 2, "eyes": 0 } |
xarray 13221727 | pull | |||||
2198991054 | PR_kwDOAMm_X85qTNFP | 8861 | upstream-dev CI: Fix interp and cumtrapz | dcherian 2448579 | closed | 0 | 0 | 2024-03-21T02:49:40Z | 2024-04-03T21:08:17Z | 2024-03-21T04:16:45Z | MEMBER | 0 | pydata/xarray/pulls/8861 |
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8861/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
2215539648 | PR_kwDOAMm_X85rLW_p | 8891 | 2024.03.0: Add whats-new | dcherian 2448579 | closed | 0 | 0 | 2024-03-29T15:01:35Z | 2024-03-29T17:07:19Z | 2024-03-29T17:07:17Z | MEMBER | 0 | pydata/xarray/pulls/8891 | { "url": "https://api.github.com/repos/pydata/xarray/issues/8891/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | ||||||
2206047573 | PR_kwDOAMm_X85qrHyn | 8875 | Optimize writes to existing Zarr stores. | dcherian 2448579 | closed | 0 | 0 | 2024-03-25T15:32:47Z | 2024-03-29T14:35:30Z | 2024-03-29T14:35:29Z | MEMBER | 0 | pydata/xarray/pulls/8875 | We need to read existing variables to make sure we append or write to a region with the right encoding. Currently we decode all arrays in a Zarr group. Instead only decode those arrays for which we require encoding information. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8875/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
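As context for the Zarr-write optimization above (#8875), this is a small, hypothetical usage sketch of appending to an existing Zarr store; the store path and variable names are made up, and the encoding read the PR optimizes happens inside `to_zarr`, not in user code.

```python
# Sketch of an append to an existing Zarr store (store path and data are made up).
import numpy as np
import pandas as pd
import xarray as xr

store = "example-append.zarr"

times = pd.date_range("2000-01-01", periods=5)
ds = xr.Dataset({"a": ("time", np.arange(5.0))}, coords={"time": times})
ds.to_zarr(store, mode="w")

# Appending along "time" requires xarray to read encoding metadata from the
# existing store so the new chunks are written consistently; #8875 narrows that
# read to only the variables being appended.
more = ds.assign_coords(time=times + pd.Timedelta(days=5))
more.to_zarr(store, mode="a", append_dim="time")
```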
2066510805 | I_kwDOAMm_X857LHPV | 8589 | Don't overwrite indexes for region writes, always | dcherian 2448579 | closed | 0 | 2 | 2024-01-04T23:52:18Z | 2024-03-27T16:24:37Z | 2024-03-27T16:24:36Z | MEMBER | What happened? Currently we don't overwrite indexes when writing with `region="auto"`. I propose we do this for all region writes and completely disallow modifying indexes with a region write. This would match the |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8589/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
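For the region-write behaviour discussed in #8589 above (and implemented in #8877), here is a minimal, hypothetical sketch of a region write to an existing Zarr store; the store path and slices are made up, and whether the on-disk index chunks get rewritten is exactly the behaviour those changes pin down.

```python
# Sketch of a Zarr region write (store path and data are made up).
import numpy as np
import xarray as xr

store = "example-region.zarr"

ds = xr.Dataset({"a": ("time", np.zeros(10))}, coords={"time": np.arange(10)})
ds.to_zarr(store, mode="w")

# Overwrite only part of "a". With region="auto" xarray infers the slice from
# the "time" coordinate; per #8589 / #8877 the index itself is not rewritten.
update = ds.isel(time=slice(2, 5)) + 1
update.to_zarr(store, region="auto", mode="r+")
```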
2206385638 | PR_kwDOAMm_X85qsSKm | 8877 | Don't allow overwriting indexes with region writes | dcherian 2448579 | closed | 0 | 0 | 2024-03-25T18:13:19Z | 2024-03-27T16:24:37Z | 2024-03-27T16:24:35Z | MEMBER | 0 | pydata/xarray/pulls/8877 |
cc @slevang |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8877/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
2123950388 | PR_kwDOAMm_X85mT6XD | 8720 | groupby: Dispatch quantile to flox. | dcherian 2448579 | closed | 0 | 7 | 2024-02-07T21:42:42Z | 2024-03-26T15:08:32Z | 2024-03-26T15:08:30Z | MEMBER | 0 | pydata/xarray/pulls/8720 |
@aulemahal would you be able to test against xclim's test suite. I imagine you're doing a bunch of grouped quantiles. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8720/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
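For the flox-dispatch PR above (#8720), a small usage sketch of the kind of grouped quantile it targets; the data is synthetic and flox is only used when it is installed.

```python
# Sketch of a grouped quantile; with flox installed, #8720 dispatches this to flox.
import numpy as np
import pandas as pd
import xarray as xr

da = xr.DataArray(
    np.random.rand(365),
    coords={"time": pd.date_range("2000-01-01", periods=365)},
    dims="time",
)
monthly_quantiles = da.groupby("time.month").quantile(q=[0.5, 0.9])
```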
2184830377 | PR_kwDOAMm_X85pjN8A | 8829 | Revert "Do not attempt to broadcast when global option ``arithmetic_b… | dcherian 2448579 | closed | 0 | 7 | 2024-03-13T20:27:12Z | 2024-03-20T15:30:12Z | 2024-03-15T03:59:07Z | MEMBER | 0 | pydata/xarray/pulls/8829 | …roadcast=False`` (#8784)" This reverts commit 11f89ecdd41226cf93da8d1e720d2710849cd23e, reverting #8784. Sadly that PR broke a lot of tests by breaking `create_test_data`:

```
AssertionError                            Traceback (most recent call last)
Cell In[3], line 2
      1 from xarray.tests import create_test_data
----> 2 create_test_data()

File ~/repos/xarray/xarray/tests/__init__.py:329, in create_test_data(seed, add_attrs, dim_sizes)
    327 obj.coords["numbers"] = ("dim3", numbers_values)
    328 obj.encoding = {"foo": "bar"}
--> 329 assert all(var.values.flags.writeable for var in obj.variables.values())
    330 return obj

AssertionError:
```

Somehow that code changes whether the test data arrays are writeable. cc @etienneschalk |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8829/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
2189750364 | PR_kwDOAMm_X85p0Epw | 8847 | pandas 3 MultiIndex fixes | dcherian 2448579 | closed | 0 | 0 | 2024-03-16T03:51:06Z | 2024-03-20T15:00:20Z | 2024-03-20T15:00:18Z | MEMBER | 0 | pydata/xarray/pulls/8847 | xref #8844 Closes https://github.com/xarray-contrib/flox/issues/342 |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8847/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
2189738090 | PR_kwDOAMm_X85p0CKq | 8846 | Support pandas copy-on-write behaviour | dcherian 2448579 | closed | 0 | 2 | 2024-03-16T03:14:46Z | 2024-03-18T16:00:15Z | 2024-03-18T16:00:12Z | MEMBER | 0 | pydata/xarray/pulls/8846 |
```python
import numpy as np
import pandas as pd

pd.set_option("mode.copy_on_write", True)

from xarray.core.variable import _possibly_convert_objects

string_var = np.array(["a", "bc", "def"], dtype=object)
datetime_var = np.array(
    ["2019-01-01", "2019-01-02", "2019-01-03"], dtype="datetime64[ns]"
)
assert _possibly_convert_objects(string_var).flags.writeable
assert _possibly_convert_objects(datetime_var).flags.writeable
```

The core issue is that we now get read-only arrays back from pandas here: https://github.com/pydata/xarray/blob/fbcac7611bf9a16750678f93483d3dbe0e261a0a/xarray/core/variable.py#L197-L212 @phofl is this expected? |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8846/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
2188936276 | I_kwDOAMm_X86CeIRU | 8843 | Get ready for pandas 3 copy-on-write | dcherian 2448579 | closed | 0 | 2 | 2024-03-15T15:51:36Z | 2024-03-18T16:00:14Z | 2024-03-18T16:00:14Z | MEMBER | What is your issue? This line fails when pandas copy-on-write mode is enabled. We'll need to fix this before pandas 3 is released in April: https://github.com/pydata/xarray/blob/c9d3084e98d38a7a9488380789a8d0acfde3256f/xarray/tests/__init__.py#L329 Here's a test:

```python
def example():
    obj = Dataset()
    obj["dim2"] = ("dim2", 0.5 * np.arange(9))
    obj["time"] = ("time", pd.date_range("2000-01-01", periods=20))
    print({k: v.data.flags for k, v in obj.variables.items()})
    return obj


example()

pd.set_option("mode.copy_on_write", True)
example()
``` |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8843/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
2098659703 | I_kwDOAMm_X859FwF3 | 8659 | renaming index variables with `rename_vars` seems buggy | dcherian 2448579 | closed | 0 | 1 | 2024-01-24T16:35:18Z | 2024-03-15T19:21:51Z | 2024-03-15T19:21:51Z | MEMBER | What happened? (xref #8658) I'm not sure what the expected behaviour is here:

```python
import xarray as xr
import numpy as np
from xarray.testing import _assert_internal_invariants

ds = xr.Dataset()
ds.coords["1"] = ("1", np.array([1], dtype=np.uint32))
ds["1_"] = ("1", np.array([1], dtype=np.uint32))
ds = ds.rename_vars({"1": "0"})
ds
```

It looks like this sequence of operations creates a default index. But then the internal-invariants check fails:

```python
from xarray.testing import _assert_internal_invariants

_assert_internal_invariants(ds, check_default_indexes=True)
```

```
AssertionError: ({'0'}, set())
``` |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8659/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
2187682227 | PR_kwDOAMm_X85ps6tB | 8839 | [skip-ci] Fix upstream-dev env | dcherian 2448579 | closed | 0 | 0 | 2024-03-15T04:08:58Z | 2024-03-15T04:37:59Z | 2024-03-15T04:37:58Z | MEMBER | 0 | pydata/xarray/pulls/8839 | upstream-dev env is broken
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8839/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
2187646833 | PR_kwDOAMm_X85psy9g | 8837 | Add dask-expr for windows envs | dcherian 2448579 | closed | 0 | 0 | 2024-03-15T03:27:48Z | 2024-03-15T04:06:05Z | 2024-03-15T04:06:03Z | MEMBER | 0 | pydata/xarray/pulls/8837 |
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8837/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
2184871888 | I_kwDOAMm_X86COn_Q | 8830 | failing tests, all envs | dcherian 2448579 | closed | 0 | 1 | 2024-03-13T20:56:34Z | 2024-03-15T04:06:04Z | 2024-03-15T04:06:04Z | MEMBER | What happened? All tests are failing because of an error in `create_test_data`:

```
AssertionError                            Traceback (most recent call last)
Cell In[3], line 2
      1 from xarray.tests import create_test_data
----> 2 create_test_data()

File ~/repos/xarray/xarray/tests/__init__.py:329, in create_test_data(seed, add_attrs, dim_sizes)
    327 obj.coords["numbers"] = ("dim3", numbers_values)
    328 obj.encoding = {"foo": "bar"}
--> 329 assert all(var.values.flags.writeable for var in obj.variables.values())
    330 return obj

AssertionError:
``` |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8830/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
2184606202 | PR_kwDOAMm_X85picsD | 8827 | Add `dask-expr` to environment-3.12.yml | dcherian 2448579 | closed | 0 | 0 | 2024-03-13T18:07:27Z | 2024-03-13T20:20:46Z | 2024-03-13T20:20:45Z | MEMBER | 0 | pydata/xarray/pulls/8827 | { "url": "https://api.github.com/repos/pydata/xarray/issues/8827/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | ||||||
1308371056 | I_kwDOAMm_X85N_Chw | 6806 | New alignment option: "exact" without broadcasting OR Turn off automatic broadcasting | dcherian 2448579 | closed | 0 | 9 | 2022-07-18T18:43:31Z | 2024-03-13T15:36:35Z | 2024-03-13T15:36:35Z | MEMBER | Is your feature request related to a problem? If we have two objects with disjoint dims, our alignment and broadcasting rules will happily broadcast them against each other. I'd like a stricter option that raises instead (see the example after this row).

Describe the solution you'd like
It'd be nice to have this as a built-in option so we can use it wherever alignment happens.

Describe alternatives you've considered
An alternative would be to allow control over automatic broadcasting through a global option.

Additional context
This turns up in staggered grid calculations with xgcm, where it is easy to mistakenly construct very high-dimensional arrays because of automatic broadcasting. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/6806/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
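A minimal illustration of the behaviour #6806 asks to restrict (the arrays and dims are made up): operations on objects with disjoint dimensions silently broadcast rather than erroring.

```python
# Disjoint dims broadcast automatically; #6806 asks for a mode where this errors.
import numpy as np
import xarray as xr

a = xr.DataArray(np.arange(3), dims="x")
b = xr.DataArray(np.arange(4), dims="y")

result = a + b          # silently broadcasts
print(result.dims)      # ('x', 'y') -- a 2D result, easy to create by mistake
```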
2148242023 | PR_kwDOAMm_X85nml9d | 8776 | Refactor Grouper objects | dcherian 2448579 | closed | 0 | 0 | 2024-02-22T05:38:09Z | 2024-03-07T21:50:07Z | 2024-03-07T21:50:04Z | MEMBER | 0 | pydata/xarray/pulls/8776 | Some refactoring towards the Grouper refactor described in #8510
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8776/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
2021858121 | PR_kwDOAMm_X85g81wJ | 8510 | Grouper object design doc | dcherian 2448579 | closed | 0 | 6 | 2023-12-02T04:56:54Z | 2024-03-06T02:27:07Z | 2024-03-06T02:27:04Z | MEMBER | 0 | pydata/xarray/pulls/8510 | xref #8509, #6610. Rendered version here. @pydata/xarray I've been poking at this on and off for a year now and finally figured out how to do it cleanly (#8509). I wrote up a design doc for custom Grouper objects; #8509 implements two custom Groupers for you to try out :)

```python
import xarray as xr
from xarray.core.groupers import SeasonGrouper, SeasonResampler

ds = xr.tutorial.open_dataset("air_temperature")

# custom seasons!
ds.air.groupby(time=SeasonGrouper(["JF", "MAM", "JJAS", "OND"])).mean()

ds.air.resample(time=SeasonResampler(["DJF", "MAM", "JJAS", "ON"])).count()
```

All comments are welcome,
1. there are a couple of specific API and design decisions to be made. I'll make some comments pointing these out.
2. I'm also curious about what custom Groupers people would want. cc @ilan-gold @ivirshup @aulemahal @tomvothecoder @jbusecke @katiedagon - it would be good to hear what "Groupers" would be useful for your work / projects. I bet you already have examples that fit this proposal |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8510/reactions", "total_count": 8, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 8, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
2135011804 | I_kwDOAMm_X85_QbHc | 8748 | release v2024.02.0 | dcherian 2448579 | closed | 0 | keewis 14808389 | 0 | 2024-02-14T19:08:38Z | 2024-02-18T22:52:15Z | 2024-02-18T22:52:15Z | MEMBER | What is your issue?Thanks to @keewis for volunteering at today's meeting :() |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8748/reactions", "total_count": 3, "+1": 0, "-1": 0, "laugh": 1, "hooray": 0, "confused": 0, "heart": 2, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | |||||
2102852029 | PR_kwDOAMm_X85lMXU0 | 8675 | Fix NetCDF4 C version detection | dcherian 2448579 | closed | 0 | 1 | 2024-01-26T20:23:54Z | 2024-01-27T01:28:51Z | 2024-01-27T01:28:49Z | MEMBER | 0 | pydata/xarray/pulls/8675 | This fixes the failure locally for me. cc @max-sixty |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8675/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
2098626592 | PR_kwDOAMm_X85k-Mnt | 8657 | groupby: Don't set `method` by default on flox>=0.9 | dcherian 2448579 | closed | 0 | 0 | 2024-01-24T16:20:57Z | 2024-01-26T16:54:25Z | 2024-01-26T16:54:23Z | MEMBER | 0 | pydata/xarray/pulls/8657 |
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8657/reactions", "total_count": 2, "+1": 2, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
2064313690 | I_kwDOAMm_X857Cu1a | 8580 | add py3.12 CI and update pyproject.toml | dcherian 2448579 | closed | 0 | 2 | 2024-01-03T16:26:47Z | 2024-01-17T21:54:13Z | 2024-01-17T21:54:13Z | MEMBER | What is your issue?We haven't done this yet! https://github.com/pydata/xarray/blob/d87ba61c957fc3af77251ca6db0f6bccca1acb82/pyproject.toml#L11-L15 |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8580/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
2086607437 | I_kwDOAMm_X858XxpN | 8616 | new release 2024.01.0 | dcherian 2448579 | closed | 0 | 0 | 2024-01-17T17:03:20Z | 2024-01-17T19:21:12Z | 2024-01-17T19:21:12Z | MEMBER | What is your issue?Thanks @TomNicholas for volunteering to drive this release! |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8616/reactions", "total_count": 1, "+1": 0, "-1": 0, "laugh": 1, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
2021763059 | PR_kwDOAMm_X85g8iNJ | 8507 | Deprecate `squeeze` in GroupBy. | dcherian 2448579 | closed | 0 | 2 | 2023-12-02T00:21:43Z | 2024-01-08T03:08:47Z | 2024-01-08T01:05:23Z | MEMBER | 0 | pydata/xarray/pulls/8507 |
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8507/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
2065788086 | PR_kwDOAMm_X85jOw74 | 8585 | Enable Zarr V3 tests in all CI runs. | dcherian 2448579 | closed | 0 | 0 | 2024-01-04T14:45:44Z | 2024-01-05T17:53:08Z | 2024-01-05T17:53:06Z | MEMBER | 0 | pydata/xarray/pulls/8585 | { "url": "https://api.github.com/repos/pydata/xarray/issues/8585/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | ||||||
2064420057 | I_kwDOAMm_X857DIzZ | 8581 | bump min versions | dcherian 2448579 | closed | 0 | 0 | 2024-01-03T17:45:10Z | 2024-01-05T16:13:16Z | 2024-01-05T16:13:15Z | MEMBER | What is your issue? Looks like we can bump a number of min versions:

```
Package            Required              Policy                Status
cartopy            0.20 (2021-09-17)     0.21 (2022-09-10)     <
dask-core          2022.7 (2022-07-08)   2022.12 (2022-12-02)  <
distributed        2022.7 (2022-07-08)   2022.12 (2022-12-02)  <
flox               0.5 (2022-05-03)      0.6 (2022-10-12)      <
iris               3.2 (2022-02-15)      3.4 (2022-12-01)      <
matplotlib-base    3.5 (2021-11-18)      3.6 (2022-09-16)      <
numba              0.55 (2022-01-14)     0.56 (2022-09-28)     <
numpy              1.22 (2022-01-03)     1.23 (2022-06-23)     <
packaging          21.3 (2021-11-18)     22.0 (2022-12-08)     <
pandas             1.4 (2022-01-22)      1.5 (2022-09-19)      <
scipy              1.8 (2022-02-06)      1.9 (2022-07-30)      <
seaborn            0.11 (2020-09-08)     0.12 (2022-09-06)     <
typing_extensions  4.3 (2022-07-01)      4.4 (2022-10-07)      <
zarr               2.12 (2022-06-23)     2.13 (2022-09-27)     <
``` |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8581/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
2065809896 | PR_kwDOAMm_X85jO1oX | 8586 | Bump min deps. | dcherian 2448579 | closed | 0 | 0 | 2024-01-04T14:59:05Z | 2024-01-05T16:13:16Z | 2024-01-05T16:13:14Z | MEMBER | 0 | pydata/xarray/pulls/8586 |
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8586/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
2066129022 | PR_kwDOAMm_X85jP678 | 8587 | Silence another warning in test_backends.py | dcherian 2448579 | closed | 0 | 1 | 2024-01-04T18:20:49Z | 2024-01-05T16:13:05Z | 2024-01-05T16:13:03Z | MEMBER | 0 | pydata/xarray/pulls/8587 | Using 255 as fillvalue for int8 arrays will not be allowed any more. Previously this overflowed to -1. Now specify that instead. On numpy 1.24.4 ```
array([-1], dtype=int8) ``` |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8587/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
2052694433 | PR_kwDOAMm_X85ilhQm | 8565 | Faster encoding functions. | dcherian 2448579 | closed | 0 | 1 | 2023-12-21T16:05:02Z | 2024-01-04T14:25:45Z | 2024-01-04T14:25:43Z | MEMBER | 0 | pydata/xarray/pulls/8565 | Spotted when profiling some write workloads. 1. Speeds up the check for multi-index 2. Speeds up one string encoder by not re-creating variables when not necessary. @benbovy is there a better way? |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8565/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1976752481 | PR_kwDOAMm_X85ekPdj | 8412 | Minimize duplication in `map_blocks` task graph | dcherian 2448579 | closed | 0 | 7 | 2023-11-03T18:30:02Z | 2024-01-03T04:10:17Z | 2024-01-03T04:10:15Z | MEMBER | 0 | pydata/xarray/pulls/8412 | Builds on #8560
cc @max-sixty

```
print(len(cloudpickle.dumps(da.chunk(lat=1, lon=1).map_blocks(lambda x: x))))
# 779354739 -> 47699827

print(len(cloudpickle.dumps(da.chunk(lat=1, lon=1).drop_vars(da.indexes).map_blocks(lambda x: x))))
# 15981508
```

This is a quick attempt. I think we can generalize this to minimize duplication. The downside is that the graphs are not totally embarrassingly parallel any more.
This PR:
vs main:
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8412/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
2052610515 | PR_kwDOAMm_X85ilOq9 | 8564 | Fix mypy type ignore | dcherian 2448579 | closed | 0 | 1 | 2023-12-21T15:15:26Z | 2023-12-21T15:41:13Z | 2023-12-21T15:24:52Z | MEMBER | 0 | pydata/xarray/pulls/8564 | { "url": "https://api.github.com/repos/pydata/xarray/issues/8564/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | ||||||
2047617215 | PR_kwDOAMm_X85iUJ7y | 8560 | Adapt map_blocks to use new Coordinates API | dcherian 2448579 | closed | 0 | 0 | 2023-12-18T23:11:55Z | 2023-12-20T17:11:18Z | 2023-12-20T17:11:16Z | MEMBER | 0 | pydata/xarray/pulls/8560 | Fixes roundtripping of string dtype indexes
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8560/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
2033054792 | PR_kwDOAMm_X85hi5U2 | 8532 | Whats-new for 2023.12.0 | dcherian 2448579 | closed | 0 | 0 | 2023-12-08T17:29:47Z | 2023-12-08T19:36:28Z | 2023-12-08T19:36:26Z | MEMBER | 0 | pydata/xarray/pulls/8532 |
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8532/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
2021754904 | PR_kwDOAMm_X85g8gnU | 8506 | Deprecate `squeeze` in GroupBy. | dcherian 2448579 | closed | 0 | 1 | 2023-12-02T00:08:50Z | 2023-12-02T00:13:36Z | 2023-12-02T00:13:36Z | MEMBER | 0 | pydata/xarray/pulls/8506 |
Could use a close-ish review. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8506/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1989588884 | I_kwDOAMm_X852lreU | 8448 | mypy 1.7.0 raising errors | dcherian 2448579 | closed | 0 | 0 | 2023-11-12T21:41:43Z | 2023-12-01T22:02:22Z | 2023-12-01T22:02:22Z | MEMBER | What happened?
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8448/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
2003229011 | PR_kwDOAMm_X85f9xPG | 8472 | Avoid duplicate Zarr array read | dcherian 2448579 | closed | 0 | 0 | 2023-11-21T00:16:34Z | 2023-12-01T02:58:22Z | 2023-12-01T02:47:03Z | MEMBER | 0 | pydata/xarray/pulls/8472 | We already get the underlying Zarr array in https://github.com/pydata/xarray/blob/bb8511e0894020e180d95d2edb29ed4036ac6447/xarray/backends/zarr.py#L529-L531 and then pass it to |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8472/reactions", "total_count": 2, "+1": 2, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
2015530249 | PR_kwDOAMm_X85gnO8L | 8489 | Minor to_zarr optimizations | dcherian 2448579 | closed | 0 | 0 | 2023-11-28T23:56:32Z | 2023-12-01T02:20:19Z | 2023-12-01T02:18:18Z | MEMBER | 0 | pydata/xarray/pulls/8489 | Avoid repeatedly pinging a remote store by requesting keys at one go. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8489/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1615596004 | I_kwDOAMm_X85gTAnk | 7596 | illustrate time offset arithmetic | dcherian 2448579 | closed | 0 | 2 | 2023-03-08T16:54:15Z | 2023-11-29T01:31:45Z | 2023-11-29T01:31:45Z | MEMBER | Is your feature request related to a problem?We should document changing the time vector using pandas date offsets here This is particularly useful for centering the time stamps after a resampling operation. Related:
- CFTime offsets: https://github.com/pydata/xarray/issues/5687
- Describe the solution you'd likeNo response Describe alternatives you've consideredNo response Additional contextNo response |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/7596/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
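A hypothetical illustration of the time-offset arithmetic #7596 wants documented (the data and the 12-hour shift are made up): shifting resampled labels to bin centers with a pandas offset.

```python
# Center daily-resampled timestamps by shifting the time coordinate with pandas.
import numpy as np
import pandas as pd
import xarray as xr

da = xr.DataArray(
    np.arange(48.0),
    coords={"time": pd.date_range("2000-01-01", periods=48, freq="h")},
    dims="time",
)

daily = da.resample(time="1D").mean()
# Labels start at midnight; add a 12 h offset to move them to the bin centers.
daily = daily.assign_coords(time=daily.time + pd.Timedelta("12h"))
```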
1997656427 | PR_kwDOAMm_X85frEdb | 8461 | 2023.11.0 Whats-new | dcherian 2448579 | closed | 0 | 0 | 2023-11-16T19:55:12Z | 2023-11-17T21:02:22Z | 2023-11-17T21:02:20Z | MEMBER | 0 | pydata/xarray/pulls/8461 | { "url": "https://api.github.com/repos/pydata/xarray/issues/8461/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | ||||||
1997136566 | PR_kwDOAMm_X85fpRL3 | 8458 | Pin mypy < 1.7 | dcherian 2448579 | closed | 0 | 0 | 2023-11-16T15:31:26Z | 2023-11-16T17:29:04Z | 2023-11-16T17:29:03Z | MEMBER | 0 | pydata/xarray/pulls/8458 | xref #8448 get back to green checks for now. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8458/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1995186323 | PR_kwDOAMm_X85finmE | 8452 | [skip-ci] Small updates to IO docs. | dcherian 2448579 | closed | 0 | 0 | 2023-11-15T17:05:47Z | 2023-11-16T15:19:59Z | 2023-11-16T15:19:57Z | MEMBER | 0 | pydata/xarray/pulls/8452 | Also fixes the RTD failure on main |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8452/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1989227042 | PR_kwDOAMm_X85fObtL | 8445 | Pin pint to >=0.22 | dcherian 2448579 | closed | 0 | 3 | 2023-11-12T03:58:40Z | 2023-11-13T19:39:54Z | 2023-11-13T19:39:53Z | MEMBER | 0 | pydata/xarray/pulls/8445 |
We were previously pinned to |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8445/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1989212292 | PR_kwDOAMm_X85fOYwT | 8444 | Remove keep_attrs from resample signature | dcherian 2448579 | closed | 0 | 1 | 2023-11-12T02:57:59Z | 2023-11-12T22:53:36Z | 2023-11-12T22:53:35Z | MEMBER | 0 | pydata/xarray/pulls/8444 |
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8444/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1635949876 | PR_kwDOAMm_X85MpxlL | 7659 | Redo whats-new for 2023.03.0 | dcherian 2448579 | closed | 0 | 0 | 2023-03-22T15:02:38Z | 2023-11-06T04:25:54Z | 2023-03-22T15:42:49Z | MEMBER | 0 | pydata/xarray/pulls/7659 | { "url": "https://api.github.com/repos/pydata/xarray/issues/7659/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | ||||||
1630533356 | PR_kwDOAMm_X85MXo4e | 7643 | Whats-new for release 2023.03.0 | dcherian 2448579 | closed | 0 | 0 | 2023-03-18T19:14:55Z | 2023-11-06T04:25:53Z | 2023-03-20T15:57:36Z | MEMBER | 0 | pydata/xarray/pulls/7643 | { "url": "https://api.github.com/repos/pydata/xarray/issues/7643/reactions", "total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 1, "eyes": 0 } |
xarray 13221727 | pull | ||||||
1471673992 | PR_kwDOAMm_X85EFDiU | 7343 | Fix mypy failures | dcherian 2448579 | closed | 0 | 1 | 2022-12-01T17:16:44Z | 2023-11-06T04:25:52Z | 2022-12-01T18:25:07Z | MEMBER | 0 | pydata/xarray/pulls/7343 | { "url": "https://api.github.com/repos/pydata/xarray/issues/7343/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | ||||||
1533942791 | PR_kwDOAMm_X85HahUq | 7440 | v2023.01.0 whats-new | dcherian 2448579 | closed | 0 | 0 | 2023-01-15T18:20:28Z | 2023-11-06T04:25:52Z | 2023-01-18T21:18:49Z | MEMBER | 0 | pydata/xarray/pulls/7440 | Should update the date and delete empty sections before merging |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/7440/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1942666419 | PR_kwDOAMm_X85cxojc | 8304 | Move Variable aggregations to NamedArray | dcherian 2448579 | closed | 0 | 6 | 2023-10-13T21:31:01Z | 2023-11-06T04:25:43Z | 2023-10-17T19:14:12Z | MEMBER | 0 | pydata/xarray/pulls/8304 |
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8304/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1689364566 | PR_kwDOAMm_X85PbeOv | 7796 | Speed up .dt accessor by preserving Index objects. | dcherian 2448579 | closed | 0 | 1 | 2023-04-29T04:22:10Z | 2023-11-06T04:25:42Z | 2023-05-16T17:55:48Z | MEMBER | 0 | pydata/xarray/pulls/7796 |
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/7796/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1973472421 | PR_kwDOAMm_X85eZF4x | 8400 | Better attr diff for `testing.assert_identical` | dcherian 2448579 | closed | 0 | 2 | 2023-11-02T04:15:09Z | 2023-11-04T20:25:37Z | 2023-11-04T20:25:36Z | MEMBER | 0 | pydata/xarray/pulls/8400 |
This gives us better reprs where only differing attributes are shown in the diff. On main:
With this PR:

```
Differing coordinates:
L * x        (x) <U1 'a' 'b'
    Differing variable attributes:
        foo: bar
R * x        (x) <U1 'a' 'c'
    Differing variable attributes:
        source: 0
        foo: baz
``` |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8400/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
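To make the diff above concrete, here is a small, hypothetical example that triggers an attribute difference of the same shape as the repr shown for #8400.

```python
# Two datasets whose "x" coordinates differ in values and attributes;
# assert_identical raises with a diff like the one shown above.
import xarray as xr

a = xr.Dataset(coords={"x": ("x", ["a", "b"], {"foo": "bar"})})
b = xr.Dataset(coords={"x": ("x", ["a", "c"], {"source": 0, "foo": "baz"})})

xr.testing.assert_identical(a, b)  # raises AssertionError showing the attr diff
```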
1962040911 | PR_kwDOAMm_X85dyZBT | 8373 | Use `opt_einsum` by default if installed. | dcherian 2448579 | closed | 0 | 2 | 2023-10-25T18:59:38Z | 2023-10-28T03:31:07Z | 2023-10-28T03:31:05Z | MEMBER | 0 | pydata/xarray/pulls/8373 |
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8373/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1672288892 | I_kwDOAMm_X85jrRp8 | 7764 | Support opt_einsum in xr.dot | dcherian 2448579 | closed | 0 | 7 | 2023-04-18T03:29:48Z | 2023-10-28T03:31:06Z | 2023-10-28T03:31:06Z | MEMBER | Is your feature request related to a problem?Shall we support opt_einsum as an optional backend for
Describe the solution you'd likeAdd a Describe alternatives you've consideredWe could create a new package but it seems a bit silly. Additional contextNo response |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/7764/reactions", "total_count": 3, "+1": 3, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
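For the opt_einsum request above (#7764, implemented in #8373), a small usage sketch of the `xr.dot` call it concerns; the arrays are made up, and opt_einsum is only used when it is installed.

```python
# xr.dot contracts over shared dims; with opt_einsum installed (see #8373),
# the underlying einsum is routed through opt_einsum.
import numpy as np
import xarray as xr

a = xr.DataArray(np.random.rand(3, 4), dims=("x", "y"))
b = xr.DataArray(np.random.rand(4, 5), dims=("y", "z"))

c = xr.dot(a, b)  # contracts over the shared "y" dimension -> dims ('x', 'z')
```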
1954535213 | PR_kwDOAMm_X85dZT47 | 8351 | [skip-ci] Add benchmarks for Dataset binary ops, chunk | dcherian 2448579 | closed | 0 | 1 | 2023-10-20T15:31:36Z | 2023-10-20T18:08:40Z | 2023-10-20T18:08:38Z | MEMBER | 0 | pydata/xarray/pulls/8351 | xref #8339 xref #8350 |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8351/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1954360112 | PR_kwDOAMm_X85dYtpz | 8349 | [skip-ci] dev whats-new | dcherian 2448579 | closed | 0 | 1 | 2023-10-20T14:02:07Z | 2023-10-20T17:28:19Z | 2023-10-20T14:54:30Z | MEMBER | 0 | pydata/xarray/pulls/8349 |
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8349/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1950480317 | PR_kwDOAMm_X85dLkAj | 8334 | Whats-new: 2023.10.0 | dcherian 2448579 | closed | 0 | 1 | 2023-10-18T19:22:06Z | 2023-10-19T16:00:00Z | 2023-10-19T15:59:58Z | MEMBER | 0 | pydata/xarray/pulls/8334 | { "url": "https://api.github.com/repos/pydata/xarray/issues/8334/reactions", "total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 1, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | ||||||
1944347086 | PR_kwDOAMm_X85c2nyz | 8316 | Enable numbagg for reductions | dcherian 2448579 | closed | 0 | 3 | 2023-10-16T04:46:10Z | 2023-10-18T14:54:48Z | 2023-10-18T10:39:30Z | MEMBER | 0 | pydata/xarray/pulls/8316 |
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8316/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1861954973 | PR_kwDOAMm_X85YhnBZ | 8100 | Remove np.asarray in formatting.py | dcherian 2448579 | closed | 0 | 2 | 2023-08-22T18:08:33Z | 2023-10-18T13:31:25Z | 2023-10-18T10:40:38Z | MEMBER | 0 | pydata/xarray/pulls/8100 | { "url": "https://api.github.com/repos/pydata/xarray/issues/8100/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | ||||||
1942673099 | PR_kwDOAMm_X85cxp-D | 8305 | Update labeler.yml to add NamedArray | dcherian 2448579 | closed | 0 | 0 | 2023-10-13T21:39:56Z | 2023-10-14T06:47:08Z | 2023-10-14T06:47:07Z | MEMBER | 0 | pydata/xarray/pulls/8305 |
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8305/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1908084109 | I_kwDOAMm_X85xuw2N | 8223 | release 2023.09.0 | dcherian 2448579 | closed | 0 | 6 | 2023-09-22T02:29:30Z | 2023-09-26T08:12:46Z | 2023-09-26T08:12:46Z | MEMBER | We've accumulated a nice number of changes. Can someone volunteer to do a release in the next few days? |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8223/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
1874773066 | PR_kwDOAMm_X85ZMtUP | 8126 | Allow creating DataArrays with nD coordinate variables | dcherian 2448579 | closed | 0 | 0 | 2023-08-31T04:40:37Z | 2023-09-22T12:48:38Z | 2023-09-22T12:48:34Z | MEMBER | 0 | pydata/xarray/pulls/8126 |
cc @blaylockbk |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8126/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1175093771 | I_kwDOAMm_X85GCoIL | 6391 | apply_ufunc and Datasets with variables without the core dimension | dcherian 2448579 | closed | 0 | 5 | 2022-03-21T09:13:02Z | 2023-09-17T08:20:15Z | 2023-09-17T08:20:14Z | MEMBER | Is your feature request related to a problem?Consider this example
This raises
because core dimension Describe the solution you'd likeAdd a new kwarg to Describe alternatives you've consideredNo response Additional contextNo response |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/6391/reactions", "total_count": 3, "+1": 3, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
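For #6391 above, a minimal, hypothetical workaround sketch for applying a function over a core dimension when some variables lack that dimension: split the dataset, apply, and merge back. The variable names and the reduction are made up.

```python
# Work around apply_ufunc erroring on variables that lack the core dim "time":
# split them out, apply, then merge the untouched variables back in.
import numpy as np
import xarray as xr

ds = xr.Dataset({"a": ("time", np.arange(4.0)), "b": ("x", np.arange(2.0))})

with_dim = ds[[name for name, var in ds.data_vars.items() if "time" in var.dims]]
without_dim = ds.drop_vars(list(with_dim.data_vars))

reduced = xr.apply_ufunc(
    np.mean, with_dim, input_core_dims=[["time"]], kwargs={"axis": -1}
)
result = xr.merge([reduced, without_dim])
```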
1874695065 | I_kwDOAMm_X85vvZOZ | 8125 | failing tests with pandas 2.1 | dcherian 2448579 | closed | 0 | 10 | 2023-08-31T02:42:32Z | 2023-09-15T13:12:02Z | 2023-09-15T13:12:02Z | MEMBER | What happened?See https://github.com/pydata/xarray/pull/8101
and this doctest
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8125/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
1812504689 | I_kwDOAMm_X85sCKBx | 8006 | Fix documentation about datetime_unit of xarray.DataArray.differentiate | dcherian 2448579 | closed | 0 | 0 | 2023-07-19T18:31:10Z | 2023-09-01T09:37:15Z | 2023-09-01T09:37:15Z | MEMBER | Should say that `datetime_unit` cannot be a non-linear unit such as `'M'` (months) or `'Y'` (years). Discussed in https://github.com/pydata/xarray/discussions/8000
<sup>Originally posted by **jesieleo** July 19, 2023</sup>
I have a piece of data that looks like this
```
<xarray.Dataset>
Dimensions: (time: 612, LEV: 15, latitude: 20, longitude: 357)
Coordinates:
* time (time) datetime64[ns] 1960-01-15 1960-02-15 ... 2010-12-15
* LEV (LEV) float64 5.01 15.07 25.28 35.76 ... 149.0 171.4 197.8 229.5
* latitude (latitude) float64 -4.75 -4.25 -3.75 -3.25 ... 3.75 4.25 4.75
* longitude (longitude) float64 114.2 114.8 115.2 115.8 ... 291.2 291.8 292.2
Data variables:
u (time, LEV, latitude, longitude) float32 ...
Attributes: (12/30)
cdm_data_type: Grid
Conventions: COARDS, CF-1.6, ACDD-1.3
creator_email: chepurin@umd.edu
creator_name: APDRC
creator_type: institution
creator_url: https://www.atmos.umd.edu/~ocean/
... ...
standard_name_vocabulary: CF Standard Name Table v29
summary: Simple Ocean Data Assimilation (SODA) soda po...
time_coverage_end: 2010-12-15T00:00:00Z
time_coverage_start: 1983-01-15T00:00:00Z
title: SODA soda pop2.2.4 [TIME][LEV][LAT][LON]
Westernmost_Easting: 118.25
```
When I try to use xarray.DataArray.differentiate
`data.u.differentiate('time', datetime_unit='M')`
the following error appears:
```
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "D:\Anaconda3\lib\site-packages\xarray\core\dataarray.py", line 3609, in differentiate
ds = self._to_temp_dataset().differentiate(coord, edge_order, datetime_unit)
File "D:\Anaconda3\lib\site-packages\xarray\core\dataset.py", line 6372, in differentiate
coord_var = coord_var._to_numeric(datetime_unit=datetime_unit)
File "D:\Anaconda3\lib\site-packages\xarray\core\variable.py", line 2428, in _to_numeric
numeric_array = duck_array_ops.datetime_to_numeric(
File "D:\Anaconda3\lib\site-packages\xarray\core\duck_array_ops.py", line 466, in datetime_to_numeric
array = array / np.timedelta64(1, datetime_unit)
TypeError: Cannot get a common metadata divisor for Numpy datatime metadata [ns] and [M] because they have incompatible nonlinear base time units.
```
Would you please tell me, is this a bug? |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8006/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
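A small, hypothetical sketch of the behaviour behind #8006 (the data is synthetic): fixed-width datetime units work with `differentiate`, while non-linear units such as `'M'` raise the error shown above.

```python
# differentiate() accepts fixed-width datetime units ("s", "h", "D", ...);
# non-linear units like "M" (months) raise the TypeError shown in the traceback.
import numpy as np
import pandas as pd
import xarray as xr

da = xr.DataArray(
    np.arange(10.0),
    coords={"time": pd.date_range("2000-01-01", periods=10, freq="D")},
    dims="time",
)

per_second = da.differentiate("time", datetime_unit="s")
per_day = da.differentiate("time", datetime_unit="D")
```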
1861692451 | PR_kwDOAMm_X85YgtYD | 8098 | [skip-ci] dev whats-new | dcherian 2448579 | closed | 0 | 0 | 2023-08-22T15:20:54Z | 2023-08-22T20:46:29Z | 2023-08-22T20:46:29Z | MEMBER | 0 | pydata/xarray/pulls/8098 | { "url": "https://api.github.com/repos/pydata/xarray/issues/8098/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | ||||||
1603957501 | I_kwDOAMm_X85fmnL9 | 7573 | Add optional min versions to conda-forge recipe (`run_constrained`) | dcherian 2448579 | closed | 0 | 4 | 2023-02-28T23:12:15Z | 2023-08-21T16:12:34Z | 2023-08-21T16:12:21Z | MEMBER | Is your feature request related to a problem?I opened this PR to add minimum versions for our optional dependencies: https://github.com/conda-forge/xarray-feedstock/pull/84/files to prevent issues like #7467 I think we'd need a policy to choose which ones to list. Here's the current list:
Some examples to think about:
1. Describe the solution you'd likeNo response Describe alternatives you've consideredNo response Additional contextNo response |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/7573/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
1855338426 | PR_kwDOAMm_X85YLRQH | 8081 | Add 2023.08.0 whats-new | dcherian 2448579 | closed | 0 | 0 | 2023-08-17T16:36:06Z | 2023-08-18T20:12:27Z | 2023-08-18T20:12:25Z | MEMBER | 0 | pydata/xarray/pulls/8081 | { "url": "https://api.github.com/repos/pydata/xarray/issues/8081/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | ||||||
1829952467 | PR_kwDOAMm_X85W1yq4 | 8033 | Reduce pre-commit update frequency to monthly from weekly. | dcherian 2448579 | closed | 0 | 0 | 2023-07-31T20:16:05Z | 2023-08-01T16:48:12Z | 2023-08-01T16:48:10Z | MEMBER | 0 | pydata/xarray/pulls/8033 | We could even go down to |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8033/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1642301775 | PR_kwDOAMm_X85M-3H- | 7684 | Automatically chunk `other` in GroupBy binary ops. | dcherian 2448579 | closed | 0 | 2 | 2023-03-27T15:15:22Z | 2023-07-28T03:12:20Z | 2023-07-27T16:41:33Z | MEMBER | 0 | pydata/xarray/pulls/7684 |
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/7684/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1642299599 | I_kwDOAMm_X85h44DP | 7683 | automatically chunk in groupby binary ops | dcherian 2448579 | closed | 0 | 0 | 2023-03-27T15:14:09Z | 2023-07-27T16:41:35Z | 2023-07-27T16:41:34Z | MEMBER | What happened? From https://discourse.pangeo.io/t/xarray-unable-to-allocate-memory-how-to-size-up-problem/3233/4. Consider:

```python
# ds is a dataset with big dask arrays
mean = ds.groupby("time.day").mean()
mean.to_netcdf()

mean = xr.open_dataset(...)
ds.groupby("time.day") - mean
```

In the binary op we will eagerly construct a broadcast version of `mean`.

What did you expect to happen? I think the only solution is to automatically chunk the other operand when the grouped object is backed by dask arrays.

Minimal Complete Verifiable Example: No response
MVCE confirmation
Relevant log output: No response
Anything else we need to know?: No response
Environment |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/7683/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
1789989152 | I_kwDOAMm_X85qsREg | 7962 | Better chunk manager error | dcherian 2448579 | closed | 0 | 4 | 2023-07-05T17:27:25Z | 2023-07-24T22:26:14Z | 2023-07-24T22:26:13Z | MEMBER | What happened?I just ran in to this error in an environment without dask.
I think we could easily recommend the user to install a package that provides |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/7962/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
1675073096 | PR_kwDOAMm_X85OrnNQ | 7769 | Fix groupby_bins when labels are specified | dcherian 2448579 | closed | 0 | 2 | 2023-04-19T14:49:23Z | 2023-07-22T01:01:34Z | 2023-04-20T17:17:16Z | MEMBER | 0 | pydata/xarray/pulls/7769 |
@gsieros can you try this out please? |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/7769/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1812646094 | PR_kwDOAMm_X85V7g7q | 8007 | Update copyright year in README | dcherian 2448579 | closed | 0 | 0 | 2023-07-19T20:00:50Z | 2023-07-20T21:13:27Z | 2023-07-20T21:13:26Z | MEMBER | 0 | pydata/xarray/pulls/8007 | { "url": "https://api.github.com/repos/pydata/xarray/issues/8007/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | ||||||
1806239984 | PR_kwDOAMm_X85Vl5Ch | 7989 | Allow opening datasets with nD dimenson coordinate variables. | dcherian 2448579 | closed | 0 | 5 | 2023-07-15T17:33:18Z | 2023-07-19T19:06:25Z | 2023-07-19T18:25:33Z | MEMBER | 0 | pydata/xarray/pulls/7989 |
Avoid automatically creating an Index variable when an nD variable shares its name with one of its dimensions. Closes #2233

```python
url = "http://www.smast.umassd.edu:8080/thredds/dodsC/FVCOM/NECOFS/Forecasts/NECOFS_GOM3_FORECAST.nc"
ds = xr.open_dataset(url, engine="netcdf4")
display(ds)
xr.testing._assert_internal_invariants(ds, check_default_indexes=False)  # ! no raise on #7368
```

(A self-contained local version of this example is sketched after this row.) ~The internal invariants assert fails on |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/7989/reactions", "total_count": 2, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 1, "rocket": 1, "eyes": 0 } |
xarray 13221727 | pull | |||||
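To try #7989 above without the remote THREDDS server, the sketch below writes a tiny local netCDF file (via netCDF4-python) containing a 2-D variable that shares its name with one of its dimensions, then opens it with xarray. The file and variable names are invented, and the no-raise behaviour assumes an xarray new enough to include this change.

```python
import netCDF4
import numpy as np
import xarray as xr

# A 2-D "lat" variable whose dims include "lat" itself, mimicking the
# unstructured-grid files from the linked issue (#2233).
with netCDF4.Dataset("twod_coord.nc", "w") as nc:
    nc.createDimension("lat", 3)
    nc.createDimension("lon", 4)
    var = nc.createVariable("lat", "f4", ("lat", "lon"))
    var[:] = np.arange(12.0).reshape(3, 4)

# Opening no longer raises; no 1-D index is built for the 2-D "lat" variable.
ds = xr.open_dataset("twod_coord.nc")
print(ds["lat"].dims)  # ('lat', 'lon')
print(ds.indexes)      # empty: "lat" has no default index
```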
1797636782 | I_kwDOAMm_X85rJcKu | 7976 | Explore updating colormap code | dcherian 2448579 | closed | 0 | 0 | 2023-07-10T21:51:30Z | 2023-07-11T13:49:54Z | 2023-07-11T13:49:53Z | MEMBER | What is your issue?See https://github.com/matplotlib/matplotlib/issues/16296 Looks like the MPL API may have advanced enough that we can delete some of our use of private attributes. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/7976/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
1736542260 | PR_kwDOAMm_X85R6fac | 7888 | Add cfgrib,ipywidgets to doc env | dcherian 2448579 | closed | 0 | 3 | 2023-06-01T15:11:10Z | 2023-06-16T14:14:01Z | 2023-06-16T14:13:59Z | MEMBER | 0 | pydata/xarray/pulls/7888 |
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/7888/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1600382587 | PR_kwDOAMm_X85Kyh9V | 7561 | Introduce Grouper objects internally | dcherian 2448579 | closed | 0 | 4 | 2023-02-27T03:11:36Z | 2023-06-14T21:18:26Z | 2023-05-04T02:35:57Z | MEMBER | 0 | pydata/xarray/pulls/7561 | Builds on the refactoring in #7206
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/7561/reactions", "total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 1, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
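PR #7561 above introduced Grouper objects internally; later releases grew this into a public API. A rough usage sketch follows, assuming a recent xarray that ships `xarray.groupers` (the data here is made up):

```python
import numpy as np
import pandas as pd
import xarray as xr
from xarray.groupers import BinGrouper, UniqueGrouper

da = xr.DataArray(
    np.arange(12.0),
    dims="time",
    coords={
        "time": pd.date_range("2000-01-01", periods=12, freq="MS"),
        "x": ("time", np.repeat([10, 20, 30], 4)),
    },
)

# Grouper objects make the grouping strategy an explicit, composable object.
print(da.groupby(x=UniqueGrouper()).mean())
print(da.groupby(x=BinGrouper(bins=[5, 15, 35])).sum())
```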
1701070898 | PR_kwDOAMm_X85QCzA1 | 7830 | Fix .groupby(multi index level) | dcherian 2448579 | closed | 0 | 0 | 2023-05-08T23:16:07Z | 2023-06-06T00:21:36Z | 2023-06-06T00:21:31Z | MEMBER | 0 | pydata/xarray/pulls/7830 |
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/7830/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1639732867 | PR_kwDOAMm_X85M2fjy | 7670 | Delete built-in cfgrib backend | dcherian 2448579 | closed | 0 | 5 | 2023-03-24T16:53:56Z | 2023-06-01T15:22:33Z | 2023-03-29T15:19:51Z | MEMBER | 0 | pydata/xarray/pulls/7670 |
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/7670/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1688781350 | PR_kwDOAMm_X85PZf3R | 7795 | [skip-ci] Add cftime groupby, resample benchmarks | dcherian 2448579 | closed | 0 | 8 | 2023-04-28T15:49:39Z | 2023-05-24T16:07:58Z | 2023-05-02T15:56:55Z | MEMBER | 0 | pydata/xarray/pulls/7795 | xref #7730 |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/7795/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
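PR #7795 above adds ASV benchmarks for groupby/resample on cftime coordinates. Below is a minimal sketch of what such a benchmark class can look like; the class and method names are illustrative rather than the ones added by the PR, and cftime must be installed.

```python
import numpy as np
import xarray as xr


class CFTimeResampling:
    """ASV-style benchmark for reductions along a cftime (noleap) time axis."""

    def setup(self):
        time = xr.cftime_range("2000-01-01", periods=5000, freq="D", calendar="noleap")
        self.da = xr.DataArray(
            np.random.rand(len(time)), dims="time", coords={"time": time}
        )

    def time_resample_mean(self):
        self.da.resample(time="MS").mean()

    def time_groupby_dayofyear_mean(self):
        self.da.groupby("time.dayofyear").mean()
```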
1692597701 | I_kwDOAMm_X85k4v3F | 7808 | Default behaviour of `min_count` wrong with flox | dcherian 2448579 | closed | 0 | 0 | 2023-05-02T15:04:11Z | 2023-05-10T02:39:45Z | 2023-05-10T02:39:45Z | MEMBER | What happened?

```python
with xr.set_options(display_style="text", use_flox=False):
    with xr.set_options(use_flox=False):
        display(
            xr.DataArray(
                data=np.array([np.nan, 1, 1, np.nan, 1, 1]),
                dims="x",
                coords={"labels": ("x", np.array([1, 2, 3, 1, 2, 3]))},
            )
            .groupby("labels")
            .sum()
        )
```

```
without flox

<xarray.DataArray (labels: 3)>
array([0., 2., 2.])
Coordinates:
  * labels   (labels) int64 1 2 3

with flox

<xarray.DataArray (labels: 3)>
array([nan, 2., 2.])
Coordinates:
  * labels   (labels) int64 1 2 3
```

What did you expect to happen? The same answer. We should set Minimal Complete Verifiable Example: No response. MVCE confirmation (a workaround sketch follows this row)
Relevant log output: No response. Anything else we need to know? No response. Environment |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/7808/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
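Until the fix in the follow-up PR landed, one way to get matching answers from the example in #7808 above was to pass `min_count` explicitly, so the flox and non-flox paths treat the all-NaN group identically. A short sketch (the comparison is only meaningful if flox is installed):

```python
import numpy as np
import xarray as xr

da = xr.DataArray(
    data=np.array([np.nan, 1, 1, np.nan, 1, 1]),
    dims="x",
    coords={"labels": ("x", np.array([1, 2, 3, 1, 2, 3]))},
)

# min_count=1 makes the all-NaN group return NaN on both code paths.
for use_flox in (False, True):
    with xr.set_options(use_flox=use_flox):
        print(use_flox, da.groupby("labels").sum(min_count=1).values)
```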
1692612622 | PR_kwDOAMm_X85PmMOy | 7809 | Fix `min_count` behaviour with flox. | dcherian 2448579 | closed | 0 | 0 | 2023-05-02T15:13:17Z | 2023-05-10T02:39:45Z | 2023-05-10T02:39:43Z | MEMBER | 0 | pydata/xarray/pulls/7809 |
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/7809/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1689773381 | PR_kwDOAMm_X85PctlP | 7798 | Fix groupby binary ops when grouped array is subset relative to other | dcherian 2448579 | closed | 0 | 3 | 2023-04-30T04:14:14Z | 2023-05-03T12:58:35Z | 2023-05-02T14:48:42Z | MEMBER | 0 | pydata/xarray/pulls/7798 |
cc @slevang |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/7798/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
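A rough reconstruction, with invented data, of the pattern that #7798 above fixes: the array being grouped spans only a subset of the groups present in `other` (for example, a stored full-year climatology).

```python
import numpy as np
import pandas as pd
import xarray as xr

time = pd.date_range("2000-01-01", periods=365)
da = xr.DataArray(np.random.rand(365), dims="time", coords={"time": time})

# Climatology over all 12 months...
clim = da.groupby("time.month").mean()

# ...but the grouped array covers only January through March, i.e. a subset
# of the groups in `clim`. This combination is what the PR fixes.
subset = da.sel(time=slice("2000-01-01", "2000-03-31"))
anom = subset.groupby("time.month") - clim
print(anom)
```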
1632422255 | PR_kwDOAMm_X85Md6iW | 7650 | Pin pandas < 2 | dcherian 2448579 | closed | 0 | 3 | 2023-03-20T16:03:58Z | 2023-04-25T13:42:48Z | 2023-03-22T14:53:53Z | MEMBER | 0 | pydata/xarray/pulls/7650 | Pandas is expecting to release v2 in two weeks (pandas-dev/pandas#46776 (comment)). But we are still incompatible with their main branch: - #7441 - #7420 This PR pins pandas to |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/7650/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1677167290 | PR_kwDOAMm_X85Oyokd | 7775 | [skip-ci] dev whats-new | dcherian 2448579 | closed | 0 | 0 | 2023-04-20T17:54:27Z | 2023-04-20T21:08:14Z | 2023-04-20T21:08:11Z | MEMBER | 0 | pydata/xarray/pulls/7775 | { "url": "https://api.github.com/repos/pydata/xarray/issues/7775/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | ||||||
1677161134 | PR_kwDOAMm_X85OynVg | 7774 | [skip-ci] Release 2023.04.2 | dcherian 2448579 | closed | 0 | 0 | 2023-04-20T17:49:46Z | 2023-04-20T18:26:39Z | 2023-04-20T18:26:37Z | MEMBER | 0 | pydata/xarray/pulls/7774 | { "url": "https://api.github.com/repos/pydata/xarray/issues/7774/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull |
CREATE TABLE [issues] (
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [number] INTEGER,
   [title] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [state] TEXT,
   [locked] INTEGER,
   [assignee] INTEGER REFERENCES [users]([id]),
   [milestone] INTEGER REFERENCES [milestones]([id]),
   [comments] INTEGER,
   [created_at] TEXT,
   [updated_at] TEXT,
   [closed_at] TEXT,
   [author_association] TEXT,
   [active_lock_reason] TEXT,
   [draft] INTEGER,
   [pull_request] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [state_reason] TEXT,
   [repo] INTEGER REFERENCES [repos]([id]),
   [type] TEXT
);
CREATE INDEX [idx_issues_repo] ON [issues] ([repo]);
CREATE INDEX [idx_issues_milestone] ON [issues] ([milestone]);
CREATE INDEX [idx_issues_assignee] ON [issues] ([assignee]);
CREATE INDEX [idx_issues_user] ON [issues] ([user]);