issues
495 rows where user = 2448579 sorted by updated_at descending
id | node_id | number | title | user | state | locked | assignee | milestone | comments | created_at | updated_at ▲ | closed_at | author_association | active_lock_reason | draft | pull_request | body | reactions | performed_via_github_app | state_reason | repo | type |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
2278499376 | PR_kwDOAMm_X85uhFke | 8997 | Zarr: Optimize `region="auto"` detection | dcherian 2448579 | open | 0 | 1 | 2024-05-03T22:13:18Z | 2024-05-04T21:47:39Z | MEMBER | 0 | pydata/xarray/pulls/8997 | { "url": "https://api.github.com/repos/pydata/xarray/issues/8997/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||||
2278510478 | PR_kwDOAMm_X85uhIGP | 8998 | Zarr: Optimize appending | dcherian 2448579 | open | 0 | 0 | 2024-05-03T22:21:44Z | 2024-05-03T22:23:34Z | MEMBER | 1 | pydata/xarray/pulls/8998 | Builds on #8997 |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8998/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | ||||||
1915997507 | I_kwDOAMm_X85yM81D | 8238 | NamedArray tracking issue | dcherian 2448579 | open | 0 | 12 | 2023-09-27T17:07:58Z | 2024-04-30T12:49:17Z | MEMBER | @andersy005 I think it would be good to keep a running list of NamedArray tasks. I'll start with a rough sketch, please update/edit as you like.
xref #3981 |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8238/reactions", "total_count": 3, "+1": 3, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | issue | ||||||||
2259316341 | I_kwDOAMm_X86Gqm51 | 8965 | Support concurrent loading of variables | dcherian 2448579 | open | 0 | 4 | 2024-04-23T16:41:24Z | 2024-04-29T22:21:51Z | MEMBER | Is your feature request related to a problem?

Today, if users want to concurrently load multiple variables in a DataArray or Dataset, they have to use dask. It struck me that it'd be pretty easy for |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8965/reactions", "total_count": 3, "+1": 3, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | issue | ||||||||
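The proposal above — loading multiple lazy variables concurrently without dask — can be sketched with a plain thread pool. This is an illustrative sketch, not xarray's implementation: the `load_all` helper and its zero-argument loader callables are hypothetical stand-ins for per-variable backend reads.

```python
from concurrent.futures import ThreadPoolExecutor

def load_all(lazy_vars, max_workers=4):
    """Load every variable concurrently and return {name: data}.

    `lazy_vars` maps a variable name to a zero-argument callable that
    performs the (I/O-bound) load -- a stand-in for a backend read.
    """
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        # Submit all loads first so they run concurrently...
        futures = {name: pool.submit(fn) for name, fn in lazy_vars.items()}
        # ...then gather results, preserving the name -> data mapping.
        return {name: fut.result() for name, fut in futures.items()}

# Example with two fake "variables" whose load just returns a list
loaded = load_all({"a": lambda: [1, 2], "b": lambda: [3, 4]})
```

Because variable loads are I/O-bound, threads (rather than processes) are enough to overlap the reads.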
2261855627 | PR_kwDOAMm_X85togwQ | 8969 | CI: python 3.12 by default. | dcherian 2448579 | closed | 0 | 2 | 2024-04-24T17:49:25Z | 2024-04-29T16:21:20Z | 2024-04-29T16:21:08Z | MEMBER | 0 | pydata/xarray/pulls/8969 |
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8969/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1574694462 | I_kwDOAMm_X85d2-4- | 7513 | intermittent failures with h5netcdf, h5py on macos | dcherian 2448579 | closed | 0 | 5 | 2023-02-07T16:58:43Z | 2024-04-28T23:35:21Z | 2024-04-28T23:35:21Z | MEMBER | What is your issue?

cc @hmaarrfk @kmuehlbauer

Passed: https://github.com/pydata/xarray/actions/runs/4115923717/jobs/7105298426
Failed: https://github.com/pydata/xarray/actions/runs/4115946392/jobs/7105345290

Versions:
```
=================================== FAILURES ===================================
_____________ test_open_mfdataset_manyfiles[h5netcdf-20-True-5-5] ______________
[gw1] darwin -- Python 3.10.9 /Users/runner/micromamba-root/envs/xarray-tests/bin/python

readengine = 'h5netcdf', nfiles = 20, parallel = True, chunks = 5
file_cache_maxsize = 5

/Users/runner/work/xarray/xarray/xarray/tests/test_backends.py:3267:
/Users/runner/work/xarray/xarray/xarray/backends/api.py:991: in open_mfdataset
    datasets, closers = dask.compute(datasets, closers)
/Users/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/dask/base.py:599: in compute
    results = schedule(dsk, keys, **kwargs)
/Users/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/dask/threaded.py:89: in get
    results = get_async(
/Users/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/dask/local.py:511: in get_async
    raise_exception(exc, tb)
/Users/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/dask/local.py:319: in reraise
    raise exc
/Users/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/dask/local.py:224: in execute_task
    result = _execute_task(task, data)
/Users/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/dask/core.py:119: in _execute_task
    return func(*(_execute_task(a, cache) for a in args))
/Users/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/dask/utils.py:72: in apply
    return func(*args, **kwargs)
/Users/runner/work/xarray/xarray/xarray/backends/api.py:526: in open_dataset
    backend_ds = backend.open_dataset(
/Users/runner/work/xarray/xarray/xarray/backends/h5netcdf_.py:417: in open_dataset
    ds = store_entrypoint.open_dataset(
/Users/runner/work/xarray/xarray/xarray/backends/store.py:32: in open_dataset
    vars, attrs = store.load()
/Users/runner/work/xarray/xarray/xarray/backends/common.py:129: in load
    (decode_variable_name(k), v) for k, v in self.get_variables().items()
/Users/runner/work/xarray/xarray/xarray/backends/h5netcdf_.py:220: in get_variables
    return FrozenDict(
/Users/runner/work/xarray/xarray/xarray/core/utils.py:471: in FrozenDict
    return Frozen(dict(*args, **kwargs))
/Users/runner/work/xarray/xarray/xarray/backends/h5netcdf_.py:221: in <genexpr>
    (k, self.open_store_variable(k, v)) for k, v in self.ds.variables.items()
/Users/runner/work/xarray/xarray/xarray/backends/h5netcdf_.py:200: in open_store_variable
    elif var.compression is not None:
/Users/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/h5netcdf/core.py:394: in compression
    return self._h5ds.compression

self = <[AttributeError("'NoneType' object has no attribute '_root'") raised in repr()] Variable object at 0x151378970>
``` |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/7513/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
2261844699 | PR_kwDOAMm_X85toeXT | 8968 | Bump dependencies incl `pandas>=2` | dcherian 2448579 | closed | 0 | 0 | 2024-04-24T17:42:19Z | 2024-04-27T14:17:16Z | 2024-04-27T14:17:16Z | MEMBER | 0 | pydata/xarray/pulls/8968 |
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8968/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
2261917442 | PR_kwDOAMm_X85touYl | 8971 | Delete pynio backend. | dcherian 2448579 | closed | 0 | 2 | 2024-04-24T18:25:26Z | 2024-04-25T14:38:23Z | 2024-04-25T14:23:59Z | MEMBER | 0 | pydata/xarray/pulls/8971 |
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8971/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
2187743087 | PR_kwDOAMm_X85ptH1f | 8840 | Grouper, Resampler as public api | dcherian 2448579 | open | 0 | 0 | 2024-03-15T05:16:05Z | 2024-04-21T16:21:34Z | MEMBER | 1 | pydata/xarray/pulls/8840 | Expose Grouper and Resampler as public API

TODO:
- [ ] Consider avoiding IndexVariable
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8840/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | ||||||
2248614324 | I_kwDOAMm_X86GByG0 | 8952 | `isel(multi_index_level_name = MultiIndex.level)` corrupts the MultiIndex | dcherian 2448579 | open | 0 | 1 | 2024-04-17T15:41:39Z | 2024-04-18T13:14:46Z | MEMBER | What happened?

From https://github.com/pydata/xarray/discussions/8951

cc @benbovy

What did you expect to happen?

No response

Minimal Complete Verifiable Example

```python
import pandas as pd
import xarray as xr
import numpy as np

xr.set_options(use_flox=True)

test = pd.DataFrame()
test["x"] = np.arange(100) % 10
test["y"] = np.arange(100)
test["z"] = np.arange(100)
test["v"] = np.arange(100)

d = xr.Dataset.from_dataframe(test)
d = d.set_index(index=["x", "y", "z"])
print(d)

m = d.groupby("x").mean()
print(m)

print(d.xindexes)
print(m.isel(x=d.x).xindexes)

xr.align(d, m.isel(x=d.x))

res = d.groupby("x") - m
print(res)
```
MVCE confirmation
Relevant log output

No response

Anything else we need to know?

No response

Environment |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8952/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | issue | ||||||||
2215762637 | PR_kwDOAMm_X85rMHpN | 8893 | Avoid extra read from disk when creating Pandas Index. | dcherian 2448579 | open | 0 | 1 | 2024-03-29T17:44:52Z | 2024-04-08T18:55:09Z | MEMBER | 0 | pydata/xarray/pulls/8893 | { "url": "https://api.github.com/repos/pydata/xarray/issues/8893/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||||
2228266052 | PR_kwDOAMm_X85r24hE | 8913 | Update hypothesis action to always save the cache | dcherian 2448579 | closed | 0 | 0 | 2024-04-05T15:09:35Z | 2024-04-05T16:51:05Z | 2024-04-05T16:51:03Z | MEMBER | 0 | pydata/xarray/pulls/8913 | Update the cache always. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8913/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
2228319306 | I_kwDOAMm_X86E0XRK | 8914 | swap_dims does not propagate indexes properly | dcherian 2448579 | open | 0 | 0 | 2024-04-05T15:36:26Z | 2024-04-05T15:36:27Z | MEMBER | What happened?

Found by hypothesis:

```python
import xarray as xr
import numpy as np

var = xr.Variable(
    dims="2",
    data=np.array(
        ['1970-01-01T00:00:00.000000000', '1970-01-01T00:00:00.000000002',
         '1970-01-01T00:00:00.000000001'],
        dtype='datetime64[ns]',
    ),
)
var1 = xr.Variable(data=np.array([0], dtype=np.uint32), dims=['1'], attrs={})

state = xr.Dataset()
state['2'] = var
state = state.stack({"0": ["2"]})
state['1'] = var1
state['1_'] = var1  # .copy(deep=True)
state = state.swap_dims({"1": "1_"})
xr.testing.assertions._assert_internal_invariants(state, False)
```

This swaps the simple pandas-indexed dims, but the multi-index that is in the dataset and not affected by the swap_dims op ends up broken.

cc @benbovy

What did you expect to happen?

No response

Minimal Complete Verifiable Example

No response

MVCE confirmation
Relevant log output

No response

Anything else we need to know?

No response

Environment |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8914/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | issue | ||||||||
2224297504 | PR_kwDOAMm_X85rpGUH | 8906 | Add invariant check for IndexVariable.name | dcherian 2448579 | open | 0 | 1 | 2024-04-04T02:13:33Z | 2024-04-05T07:12:54Z | MEMBER | 1 | pydata/xarray/pulls/8906 | @benbovy this seems to be the root cause of #8646, the variable name in

A good number of tests seem to fail though, so not sure if this is a good check.
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8906/reactions", "total_count": 2, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 2, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | ||||||
2136709010 | I_kwDOAMm_X85_W5eS | 8753 | Lazy Loading with `DataArray` vs. `Variable` | dcherian 2448579 | closed | 0 | 0 | 2024-02-15T14:42:24Z | 2024-04-04T16:46:54Z | 2024-04-04T16:46:54Z | MEMBER | Discussed in https://github.com/pydata/xarray/discussions/8751
<sup>Originally posted by **ilan-gold** February 15, 2024</sup>
My goal is to get a lazily loaded dataset from a [custom io-zarr backend](https://docs.xarray.dev/en/stable/internals/how-to-add-new-backend.html#how-to-support-lazy-loading). But when I declare a `DataArray` based on the `Variable` which uses `LazilyIndexedArray`, everything is read in. Is this expected? I specifically don't want to have to use dask if possible. I have seen https://github.com/aurghs/xarray-backend-tutorial/blob/main/2.Backend_with_Lazy_Loading.ipynb but it's a little bit different.
While I have a custom backend array inheriting from `ZarrArrayWrapper`, this example using `ZarrArrayWrapper` directly still highlights the same unexpected behavior of everything being read in.
```python
import zarr
import xarray as xr
from tempfile import mkdtemp
import numpy as np
from pathlib import Path
from collections import defaultdict
class AccessTrackingStore(zarr.DirectoryStore):
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self._access_count = {}
        self._accessed = defaultdict(set)

    def __getitem__(self, key):
        for tracked in self._access_count:
            if tracked in key:
                self._access_count[tracked] += 1
                self._accessed[tracked].add(key)
        return super().__getitem__(key)

    def get_access_count(self, key):
        return self._access_count[key]

    def set_key_trackers(self, keys_to_track):
        if isinstance(keys_to_track, str):
            keys_to_track = [keys_to_track]
        for k in keys_to_track:
            self._access_count[k] = 0

    def get_subkeys_accessed(self, key):
        return self._accessed[key]
orig_path = Path(mkdtemp())
z = zarr.group(orig_path / "foo.zarr")
z['array'] = np.random.randn(1000, 1000)
store = AccessTrackingStore(orig_path / "foo.zarr")
store.set_key_trackers(['array'])
z = zarr.group(store)
arr = xr.backends.zarr.ZarrArrayWrapper(z['array'])
lazy_arr = xr.core.indexing.LazilyIndexedArray(arr)
# just `.zarray`
var = xr.Variable(('x', 'y'), lazy_arr)
print('Variable read in ', store.get_subkeys_accessed('array'))
# now everything is read in
da = xr.DataArray(var)
print('DataArray read in ', store.get_subkeys_accessed('array'))
``` |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8753/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
2136724736 | PR_kwDOAMm_X85m_MtN | 8754 | Don't access data when creating DataArray from Variable. | dcherian 2448579 | closed | 0 | 2 | 2024-02-15T14:48:32Z | 2024-04-04T16:46:54Z | 2024-04-04T16:46:53Z | MEMBER | 0 | pydata/xarray/pulls/8754 |
This seems to have been around since 2016-ish, so presumably our backend code path is passing arrays around, not Variables. cc @ilan-gold |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8754/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
2224300175 | PR_kwDOAMm_X85rpG4S | 8907 | Trigger hypothesis stateful tests nightly | dcherian 2448579 | closed | 0 | 0 | 2024-04-04T02:16:59Z | 2024-04-04T02:17:49Z | 2024-04-04T02:17:47Z | MEMBER | 0 | pydata/xarray/pulls/8907 |
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8907/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
2098659175 | PR_kwDOAMm_X85k-T6b | 8658 | Stateful tests with Dataset | dcherian 2448579 | closed | 0 | 8 | 2024-01-24T16:34:59Z | 2024-04-03T21:29:38Z | 2024-04-03T21:29:36Z | MEMBER | 0 | pydata/xarray/pulls/8658 | I was curious to see if the hypothesis stateful testing would catch an inconsistent sequence of index manipulation operations like #8646. Turns out

PS: this blog post is amazing.
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8658/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
2000205407 | PR_kwDOAMm_X85fzupc | 8467 | [skip-ci] dev whats-new | dcherian 2448579 | closed | 0 | 0 | 2023-11-18T03:59:29Z | 2024-04-03T21:08:45Z | 2023-11-18T15:20:37Z | MEMBER | 0 | pydata/xarray/pulls/8467 |
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8467/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1989233637 | PR_kwDOAMm_X85fOdAk | 8446 | Remove PseudoNetCDF | dcherian 2448579 | closed | 0 | 0 | 2023-11-12T04:29:50Z | 2024-04-03T21:08:44Z | 2023-11-13T21:53:56Z | MEMBER | 0 | pydata/xarray/pulls/8446 | joining the party
- [x] User visible changes (including notable bug fixes) are documented in |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8446/reactions", "total_count": 1, "+1": 0, "-1": 0, "laugh": 1, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
2064698904 | PR_kwDOAMm_X85jLHsQ | 8584 | Silence a bunch of CachingFileManager warnings | dcherian 2448579 | closed | 0 | 1 | 2024-01-03T21:57:07Z | 2024-04-03T21:08:27Z | 2024-01-03T22:52:58Z | MEMBER | 0 | pydata/xarray/pulls/8584 | { "url": "https://api.github.com/repos/pydata/xarray/issues/8584/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | ||||||
2102850331 | PR_kwDOAMm_X85lMW8k | 8674 | Fix negative slicing of Zarr arrays | dcherian 2448579 | closed | 0 | 0 | 2024-01-26T20:22:21Z | 2024-04-03T21:08:26Z | 2024-02-10T02:57:32Z | MEMBER | 0 | pydata/xarray/pulls/8674 | Closes #8252 Closes #3921
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8674/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
2148245262 | PR_kwDOAMm_X85nmmqX | 8777 | Return a dataclass from Grouper.factorize | dcherian 2448579 | closed | 0 | 0 | 2024-02-22T05:41:29Z | 2024-04-03T21:08:25Z | 2024-03-15T04:47:30Z | MEMBER | 0 | pydata/xarray/pulls/8777 | Toward #8510, builds on #8776 |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8777/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
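The pattern in #8777 — returning a small dataclass instead of a bare tuple from `Grouper.factorize` — can be sketched as below. The names `EncodedGroups` and `factorize` here are illustrative only, not xarray's actual API:

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class EncodedGroups:
    """Bundle factorization outputs under named fields instead of a tuple."""
    codes: np.ndarray          # integer group id per element
    unique_values: np.ndarray  # sorted unique group labels

def factorize(values: np.ndarray) -> EncodedGroups:
    # np.unique with return_inverse gives both the labels and the codes
    uniques, codes = np.unique(values, return_inverse=True)
    return EncodedGroups(codes=codes, unique_values=uniques)

result = factorize(np.array(["b", "a", "b"]))
# result.unique_values -> ['a', 'b'], result.codes -> [1, 0, 1]
```

Named fields make downstream code self-documenting (`result.codes` vs `result[0]`) and let new fields be added without breaking tuple-unpacking callers.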
2148164557 | PR_kwDOAMm_X85nmU5w | 8775 | [skip-ci] NamedArray: Add lazy indexing array refactoring plan | dcherian 2448579 | closed | 0 | 0 | 2024-02-22T04:25:49Z | 2024-04-03T21:08:21Z | 2024-02-23T22:20:09Z | MEMBER | 0 | pydata/xarray/pulls/8775 | This adds a proposal for decoupling the lazy indexing array machinery, indexing adapter machinery, and Variable's setitem and getitem methods, so that the latter can be migrated to NamedArray. cc @andersy005 |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8775/reactions", "total_count": 2, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 2, "eyes": 0 } |
xarray 13221727 | pull | |||||
2198991054 | PR_kwDOAMm_X85qTNFP | 8861 | upstream-dev CI: Fix interp and cumtrapz | dcherian 2448579 | closed | 0 | 0 | 2024-03-21T02:49:40Z | 2024-04-03T21:08:17Z | 2024-03-21T04:16:45Z | MEMBER | 0 | pydata/xarray/pulls/8861 |
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8861/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1997636679 | PR_kwDOAMm_X85frAC_ | 8460 | Add initialize_zarr | dcherian 2448579 | open | 0 | 8 | 2023-11-16T19:45:05Z | 2024-04-02T15:08:01Z | MEMBER | 1 | pydata/xarray/pulls/8460 |
The intended pattern is: ```python
``` cc @slevang |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8460/reactions", "total_count": 5, "+1": 0, "-1": 0, "laugh": 0, "hooray": 3, "confused": 0, "heart": 0, "rocket": 0, "eyes": 2 } |
xarray 13221727 | pull | ||||||
2215539648 | PR_kwDOAMm_X85rLW_p | 8891 | 2024.03.0: Add whats-new | dcherian 2448579 | closed | 0 | 0 | 2024-03-29T15:01:35Z | 2024-03-29T17:07:19Z | 2024-03-29T17:07:17Z | MEMBER | 0 | pydata/xarray/pulls/8891 | { "url": "https://api.github.com/repos/pydata/xarray/issues/8891/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | ||||||
2206047573 | PR_kwDOAMm_X85qrHyn | 8875 | Optimize writes to existing Zarr stores. | dcherian 2448579 | closed | 0 | 0 | 2024-03-25T15:32:47Z | 2024-03-29T14:35:30Z | 2024-03-29T14:35:29Z | MEMBER | 0 | pydata/xarray/pulls/8875 | We need to read existing variables to make sure we append or write to a region with the right encoding. Currently we decode all arrays in a Zarr group. Instead only decode those arrays for which we require encoding information. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8875/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
2213636579 | I_kwDOAMm_X86D8Wnj | 8887 | resetting multiindex may be buggy | dcherian 2448579 | open | 0 | 1 | 2024-03-28T16:23:38Z | 2024-03-29T07:59:22Z | MEMBER | What happened?

Resetting a MultiIndex dim coordinate preserves the MultiIndex levels as IndexVariables. We should either reset the indexes for the MultiIndex level variables, or warn asking the users to do so.

This seems to be the root cause exposed by https://github.com/pydata/xarray/pull/8809

cc @benbovy

What did you expect to happen?

No response

Minimal Complete Verifiable Example

```python
import numpy as np
import xarray as xr

# ND DataArray that gets stacked along a multiindex
da = xr.DataArray(np.ones((3, 3)), coords={"dim1": [1, 2, 3], "dim2": [4, 5, 6]})
da = da.stack(feature=["dim1", "dim2"])

# Extract just the stacked coordinates for saving in a dataset
ds = xr.Dataset(data_vars={"feature": da.feature})

xr.testing.assertions._assert_internal_invariants(
    ds.reset_index(["feature", "dim1", "dim2"]), check_default_indexes=False
)  # succeeds
xr.testing.assertions._assert_internal_invariants(
    ds.reset_index(["feature"]), check_default_indexes=False
)  # fails, but no warning either
``` |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8887/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | issue | ||||||||
2066510805 | I_kwDOAMm_X857LHPV | 8589 | Don't overwrite indexes for region writes, always | dcherian 2448579 | closed | 0 | 2 | 2024-01-04T23:52:18Z | 2024-03-27T16:24:37Z | 2024-03-27T16:24:36Z | MEMBER | What happened?

Currently we don't overwrite indexes when

I propose we do this for all region writes and completely disallow modifying indexes with a region write. This would match the |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8589/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
2206385638 | PR_kwDOAMm_X85qsSKm | 8877 | Don't allow overwriting indexes with region writes | dcherian 2448579 | closed | 0 | 0 | 2024-03-25T18:13:19Z | 2024-03-27T16:24:37Z | 2024-03-27T16:24:35Z | MEMBER | 0 | pydata/xarray/pulls/8877 |
cc @slevang |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8877/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1471685307 | I_kwDOAMm_X85XuCK7 | 7344 | Disable bottleneck by default? | dcherian 2448579 | open | 0 | 11 | 2022-12-01T17:26:11Z | 2024-03-27T00:22:41Z | MEMBER | What is your issue?

Our choice to enable bottleneck by default results in quite a few issues about numerical stability and funny dtype behaviour: #7336, #7128, #2370, #1346 (and probably more).

Shall we disable it by default? |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/7344/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | issue | ||||||||
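The numerical-stability complaints referenced above typically involve single-pass float32 accumulation of the kind bottleneck uses for moving-window reductions. A minimal illustration of the underlying effect (plain numpy standing in for bottleneck itself) compares a naive running sum against numpy's pairwise summation:

```python
import numpy as np

x = np.full(100_000, 0.1, dtype=np.float32)

# Reference value accumulated in float64
exact = float(x.astype(np.float64).sum())

# Naive single-pass float32 accumulation: rounding error at every step
naive = np.float32(0.0)
for v in x:
    naive += v

# numpy's pairwise summation, also in float32, keeps error logarithmic
pairwise = float(x.sum())

# The single-pass sum drifts much further from the reference value
assert abs(float(naive) - exact) > abs(pairwise - exact)
```

This is the trade-off behind the issue: bottleneck's single-pass kernels are fast but accumulate rounding error linearly in the window length, which users see as "wrong" means and sums on float32 data.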
2123950388 | PR_kwDOAMm_X85mT6XD | 8720 | groupby: Dispatch quantile to flox. | dcherian 2448579 | closed | 0 | 7 | 2024-02-07T21:42:42Z | 2024-03-26T15:08:32Z | 2024-03-26T15:08:30Z | MEMBER | 0 | pydata/xarray/pulls/8720 |
@aulemahal would you be able to test against xclim's test suite? I imagine you're doing a bunch of grouped quantiles. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8720/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
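The grouped quantiles above reduce each group independently; a minimal pure-numpy sketch of the semantics that `groupby(...).quantile(...)` computes (flox optimizes this with a single factorize-and-reduce pass, but the result is the same):

```python
import numpy as np

values = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
groups = np.array([0, 0, 0, 1, 1, 1])  # group label per element

# Per-group 0.5 quantile (the median): one reduction per unique label
medians = {
    int(g): float(np.quantile(values[groups == g], 0.5))
    for g in np.unique(groups)
}
# medians == {0: 2.0, 1: 5.0}
```

The naive version above does one boolean mask per group; dispatching to flox replaces that O(n_groups × n) scan with a single pass over the data.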
2184830377 | PR_kwDOAMm_X85pjN8A | 8829 | Revert "Do not attempt to broadcast when global option ``arithmetic_b… | dcherian 2448579 | closed | 0 | 7 | 2024-03-13T20:27:12Z | 2024-03-20T15:30:12Z | 2024-03-15T03:59:07Z | MEMBER | 0 | pydata/xarray/pulls/8829 | …roadcast=False`` (#8784)"

This reverts commit 11f89ecdd41226cf93da8d1e720d2710849cd23e, reverting #8784. Sadly that PR broke a lot of tests by breaking `create_test_data`:

```
AssertionError                            Traceback (most recent call last)
Cell In[3], line 2
      1 from xarray.tests import create_test_data
----> 2 create_test_data()

File ~/repos/xarray/xarray/tests/__init__.py:329, in create_test_data(seed, add_attrs, dim_sizes)
    327 obj.coords["numbers"] = ("dim3", numbers_values)
    328 obj.encoding = {"foo": "bar"}
--> 329 assert all(var.values.flags.writeable for var in obj.variables.values())
    330 return obj

AssertionError:
```

Somehow that code changes whether

cc @etienneschalk |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8829/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
2189750364 | PR_kwDOAMm_X85p0Epw | 8847 | pandas 3 MultiIndex fixes | dcherian 2448579 | closed | 0 | 0 | 2024-03-16T03:51:06Z | 2024-03-20T15:00:20Z | 2024-03-20T15:00:18Z | MEMBER | 0 | pydata/xarray/pulls/8847 | xref #8844 Closes https://github.com/xarray-contrib/flox/issues/342 |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8847/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
2189738090 | PR_kwDOAMm_X85p0CKq | 8846 | Support pandas copy-on-write behaviour | dcherian 2448579 | closed | 0 | 2 | 2024-03-16T03:14:46Z | 2024-03-18T16:00:15Z | 2024-03-18T16:00:12Z | MEMBER | 0 | pydata/xarray/pulls/8846 |
```python
import numpy as np
import pandas as pd

pd.set_option("mode.copy_on_write", True)

from xarray.core.variable import _possibly_convert_objects

string_var = np.array(["a", "bc", "def"], dtype=object)
datetime_var = np.array(
    ["2019-01-01", "2019-01-02", "2019-01-03"], dtype="datetime64[ns]"
)
assert _possibly_convert_objects(string_var).flags.writeable
assert _possibly_convert_objects(datetime_var).flags.writeable
```

The core issue is that we now get read-only arrays back from pandas here: https://github.com/pydata/xarray/blob/fbcac7611bf9a16750678f93483d3dbe0e261a0a/xarray/core/variable.py#L197-L212

@phofl is this expected? |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8846/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
2188936276 | I_kwDOAMm_X86CeIRU | 8843 | Get ready for pandas 3 copy-on-write | dcherian 2448579 | closed | 0 | 2 | 2024-03-15T15:51:36Z | 2024-03-18T16:00:14Z | 2024-03-18T16:00:14Z | MEMBER | What is your issue?

This line fails with

We'll need to fix this before Pandas 3 is released in April: https://github.com/pydata/xarray/blob/c9d3084e98d38a7a9488380789a8d0acfde3256f/xarray/tests/__init__.py#L329

Here's a test:

```python
import numpy as np
import pandas as pd
from xarray import Dataset

def example():
    obj = Dataset()
    obj["dim2"] = ("dim2", 0.5 * np.arange(9))
    obj["time"] = ("time", pd.date_range("2000-01-01", periods=20))
    print({k: v.data.flags for k, v in obj.variables.items()})
    return obj

example()

pd.set_option("mode.copy_on_write", True)
example()
``` |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8843/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
2098659703 | I_kwDOAMm_X859FwF3 | 8659 | renaming index variables with `rename_vars` seems buggy | dcherian 2448579 | closed | 0 | 1 | 2024-01-24T16:35:18Z | 2024-03-15T19:21:51Z | 2024-03-15T19:21:51Z | MEMBER | What happened?

(xref #8658) I'm not sure what the expected behaviour is here:

```python
import xarray as xr
import numpy as np

ds = xr.Dataset()
ds.coords["1"] = ("1", np.array([1], dtype=np.uint32))
ds["1_"] = ("1", np.array([1], dtype=np.uint32))
ds = ds.rename_vars({"1": "0"})
ds
```

It looks like this sequence of operations creates a default index

But then

```python
from xarray.testing import _assert_internal_invariants
_assert_internal_invariants(ds, check_default_indexes=True)
```

fails with

```
AssertionError: ({'0'}, set())
``` |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8659/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
2187659148 | I_kwDOAMm_X86CZQeM | 8838 | remove xfail from `test_dataarray.test_to_dask_dataframe()` | dcherian 2448579 | open | 0 | 2 | 2024-03-15T03:43:02Z | 2024-03-15T15:33:31Z | MEMBER | What is your issue?

Remove the xfail when dask-expr is fixed. Added in https://github.com/pydata/xarray/pull/8837 |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8838/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | issue | ||||||||
2021856935 | PR_kwDOAMm_X85g81gb | 8509 | Proof of concept - public Grouper objects | dcherian 2448579 | open | 0 | 0 | 2023-12-02T04:52:27Z | 2024-03-15T05:18:18Z | MEMBER | 1 | pydata/xarray/pulls/8509 | Not for merging, just proof that it can be done nicely :)

Now builds on #8840 ~Builds on an older version of #8507~

Try it out!

```python
import xarray as xr
from xarray.core.groupers import SeasonGrouper, SeasonResampler

ds = xr.tutorial.open_dataset("air_temperature")

# custom seasons!
ds.air.groupby(time=SeasonGrouper(["JF", "MAM", "JJAS", "OND"])).mean()

ds.air.resample(time=SeasonResampler(["DJF", "MAM", "JJAS", "ON"])).count()
``` |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8509/reactions", "total_count": 2, "+1": 2, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | ||||||
2187682227 | PR_kwDOAMm_X85ps6tB | 8839 | [skip-ci] Fix upstream-dev env | dcherian 2448579 | closed | 0 | 0 | 2024-03-15T04:08:58Z | 2024-03-15T04:37:59Z | 2024-03-15T04:37:58Z | MEMBER | 0 | pydata/xarray/pulls/8839 | upstream-dev env is broken
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8839/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
2187646833 | PR_kwDOAMm_X85psy9g | 8837 | Add dask-expr for windows envs | dcherian 2448579 | closed | 0 | 0 | 2024-03-15T03:27:48Z | 2024-03-15T04:06:05Z | 2024-03-15T04:06:03Z | MEMBER | 0 | pydata/xarray/pulls/8837 |
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8837/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
2184871888 | I_kwDOAMm_X86COn_Q | 8830 | failing tests, all envs | dcherian 2448579 | closed | 0 | 1 | 2024-03-13T20:56:34Z | 2024-03-15T04:06:04Z | 2024-03-15T04:06:04Z | MEMBER | What happened?

All tests are failing because of an error in
```
AssertionError                 Traceback (most recent call last)
Cell In[3], line 2
      1 from xarray.tests import create_test_data
----> 2 create_test_data()

File ~/repos/xarray/xarray/tests/__init__.py:329, in create_test_data(seed, add_attrs, dim_sizes)
    327 obj.coords["numbers"] = ("dim3", numbers_values)
    328 obj.encoding = {"foo": "bar"}
--> 329 assert all(var.values.flags.writeable for var in obj.variables.values())
    330 return obj

AssertionError:
```
 |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8830/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
2184606202 | PR_kwDOAMm_X85picsD | 8827 | Add `dask-expr` to environment-3.12.yml | dcherian 2448579 | closed | 0 | 0 | 2024-03-13T18:07:27Z | 2024-03-13T20:20:46Z | 2024-03-13T20:20:45Z | MEMBER | 0 | pydata/xarray/pulls/8827 | { "url": "https://api.github.com/repos/pydata/xarray/issues/8827/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | ||||||
1308371056 | I_kwDOAMm_X85N_Chw | 6806 | New alignment option: "exact" without broadcasting OR Turn off automatic broadcasting | dcherian 2448579 | closed | 0 | 9 | 2022-07-18T18:43:31Z | 2024-03-13T15:36:35Z | 2024-03-13T15:36:35Z | MEMBER | Is your feature request related to a problem?If we have two objects with dims I'd like a stricter option ( Describe the solution you'd like
It'd be nice to have this as a built-in option so we can use
Describe alternatives you've consideredAn alternative would be to allow control over automatic broadcasting through the Additional contextThis turns up in staggered grid calculations with xgcm where it is easy to mistakenly construct very high-dimensional arrays because of automatic broadcasting. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/6806/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
2148242023 | PR_kwDOAMm_X85nml9d | 8776 | Refactor Grouper objects | dcherian 2448579 | closed | 0 | 0 | 2024-02-22T05:38:09Z | 2024-03-07T21:50:07Z | 2024-03-07T21:50:04Z | MEMBER | 0 | pydata/xarray/pulls/8776 | Some refactoring towards the Grouper refactor described in #8510
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8776/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
2021858121 | PR_kwDOAMm_X85g81wJ | 8510 | Grouper object design doc | dcherian 2448579 | closed | 0 | 6 | 2023-12-02T04:56:54Z | 2024-03-06T02:27:07Z | 2024-03-06T02:27:04Z | MEMBER | 0 | pydata/xarray/pulls/8510 | xref #8509, #6610 Rendered version here @pydata/xarray I've been poking at this on and off for a year now and finally figured out how to do it cleanly (#8509). I wrote up a design doc for 8509 implements two custom Groupers for you to try out :)
```python
import xarray as xr
from xarray.core.groupers import SeasonGrouper, SeasonResampler

ds = xr.tutorial.open_dataset("air_temperature")

# custom seasons!
ds.air.groupby(time=SeasonGrouper(["JF", "MAM", "JJAS", "OND"])).mean()

ds.air.resample(time=SeasonResampler(["DJF", "MAM", "JJAS", "ON"])).count()
```
All comments are welcome,
1. there are a couple of specific API and design decisions to be made. I'll make some comments pointing these out.
2. I'm also curious about what cc @ilan-gold @ivirshup @aulemahal @tomvothecoder @jbusecke @katiedagon - it would be good to hear what "Groupers" would be useful for your work / projects. I bet you already have examples that fit this proposal |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8510/reactions", "total_count": 8, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 8, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
2149485914 | I_kwDOAMm_X86AHo1a | 8778 | Stricter defaults for concat, combine, open_mfdataset | dcherian 2448579 | open | 0 | 2 | 2024-02-22T16:43:38Z | 2024-02-23T04:17:40Z | MEMBER | Is your feature request related to a problem?The defaults for
While "convenient" this really just makes the default experience quite bad with hard-to-understand slowdowns. Describe the solution you'd likeI propose we migrate to Unfortunately, this has a pretty big blast radius so we'd need a long deprecation cycle. Describe alternatives you've consideredNo response Additional contextxref https://github.com/pydata/xarray/issues/4824 xref https://github.com/pydata/xarray/issues/1385 xref https://github.com/pydata/xarray/issues/8231 xref https://github.com/pydata/xarray/issues/5381 xref https://github.com/pydata/xarray/issues/2064 xref https://github.com/pydata/xarray/issues/2217 |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8778/reactions", "total_count": 3, "+1": 3, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | issue | ||||||||
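The proposed stricter default amounts to refusing to silently align mismatched indexes. A minimal sketch of what a `join="exact"` check does (a hypothetical standalone helper, not xarray's implementation):

```python
def join_exact(*indexes):
    # sketch of join="exact": raise instead of silently taking an
    # outer/inner join of mismatched indexes
    first = indexes[0]
    for idx in indexes[1:]:
        if list(idx) != list(first):
            raise ValueError(
                "cannot align: indexes are not equal; reindex explicitly"
            )
    return first

join_exact([1, 2, 3], [1, 2, 3])    # equal indexes pass through
# join_exact([1, 2, 3], [1, 2])     # would raise ValueError
```

Failing loudly makes slowdowns from accidental reindexing impossible, at the cost of requiring an explicit `reindex`/`align` call where the mismatch is intentional.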
2135011804 | I_kwDOAMm_X85_QbHc | 8748 | release v2024.02.0 | dcherian 2448579 | closed | 0 | keewis 14808389 | 0 | 2024-02-14T19:08:38Z | 2024-02-18T22:52:15Z | 2024-02-18T22:52:15Z | MEMBER | What is your issue?Thanks to @keewis for volunteering at today's meeting :) |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8748/reactions", "total_count": 3, "+1": 0, "-1": 0, "laugh": 1, "hooray": 0, "confused": 0, "heart": 2, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | |||||
2102852029 | PR_kwDOAMm_X85lMXU0 | 8675 | Fix NetCDF4 C version detection | dcherian 2448579 | closed | 0 | 1 | 2024-01-26T20:23:54Z | 2024-01-27T01:28:51Z | 2024-01-27T01:28:49Z | MEMBER | 0 | pydata/xarray/pulls/8675 | This fixes the failure locally for me. cc @max-sixty |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8675/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
2098626592 | PR_kwDOAMm_X85k-Mnt | 8657 | groupby: Don't set `method` by default on flox>=0.9 | dcherian 2448579 | closed | 0 | 0 | 2024-01-24T16:20:57Z | 2024-01-26T16:54:25Z | 2024-01-26T16:54:23Z | MEMBER | 0 | pydata/xarray/pulls/8657 |
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8657/reactions", "total_count": 2, "+1": 2, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
2064313690 | I_kwDOAMm_X857Cu1a | 8580 | add py3.12 CI and update pyproject.toml | dcherian 2448579 | closed | 0 | 2 | 2024-01-03T16:26:47Z | 2024-01-17T21:54:13Z | 2024-01-17T21:54:13Z | MEMBER | What is your issue?We haven't done this yet! https://github.com/pydata/xarray/blob/d87ba61c957fc3af77251ca6db0f6bccca1acb82/pyproject.toml#L11-L15 |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8580/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
2086607437 | I_kwDOAMm_X858XxpN | 8616 | new release 2024.01.0 | dcherian 2448579 | closed | 0 | 0 | 2024-01-17T17:03:20Z | 2024-01-17T19:21:12Z | 2024-01-17T19:21:12Z | MEMBER | What is your issue?Thanks @TomNicholas for volunteering to drive this release! |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8616/reactions", "total_count": 1, "+1": 0, "-1": 0, "laugh": 1, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
638947370 | MDU6SXNzdWU2Mzg5NDczNzA= | 4156 | writing sparse to netCDF | dcherian 2448579 | open | 0 | 7 | 2020-06-15T15:33:23Z | 2024-01-09T10:14:00Z | MEMBER | I haven't looked at this too closely but it appears that this is a way to save MultiIndexed datasets to netCDF. So we may be able to do cc @fujiisoup |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4156/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | issue | ||||||||
2021763059 | PR_kwDOAMm_X85g8iNJ | 8507 | Deprecate `squeeze` in GroupBy. | dcherian 2448579 | closed | 0 | 2 | 2023-12-02T00:21:43Z | 2024-01-08T03:08:47Z | 2024-01-08T01:05:23Z | MEMBER | 0 | pydata/xarray/pulls/8507 |
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8507/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
2065788086 | PR_kwDOAMm_X85jOw74 | 8585 | Enable Zarr V3 tests in all CI runs. | dcherian 2448579 | closed | 0 | 0 | 2024-01-04T14:45:44Z | 2024-01-05T17:53:08Z | 2024-01-05T17:53:06Z | MEMBER | 0 | pydata/xarray/pulls/8585 | { "url": "https://api.github.com/repos/pydata/xarray/issues/8585/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | ||||||
2064420057 | I_kwDOAMm_X857DIzZ | 8581 | bump min versions | dcherian 2448579 | closed | 0 | 0 | 2024-01-03T17:45:10Z | 2024-01-05T16:13:16Z | 2024-01-05T16:13:15Z | MEMBER | What is your issue?Looks like we can bump a number of min versions:
```
Package            Required              Policy                Status
cartopy            0.20 (2021-09-17)     0.21 (2022-09-10)     <
dask-core          2022.7 (2022-07-08)   2022.12 (2022-12-02)  <
distributed        2022.7 (2022-07-08)   2022.12 (2022-12-02)  <
flox               0.5 (2022-05-03)      0.6 (2022-10-12)      <
iris               3.2 (2022-02-15)      3.4 (2022-12-01)      <
matplotlib-base    3.5 (2021-11-18)      3.6 (2022-09-16)      <
numba              0.55 (2022-01-14)     0.56 (2022-09-28)     <
numpy              1.22 (2022-01-03)     1.23 (2022-06-23)     <
packaging          21.3 (2021-11-18)     22.0 (2022-12-08)     <
pandas             1.4 (2022-01-22)      1.5 (2022-09-19)      <
scipy              1.8 (2022-02-06)      1.9 (2022-07-30)      <
seaborn            0.11 (2020-09-08)     0.12 (2022-09-06)     <
typing_extensions  4.3 (2022-07-01)      4.4 (2022-10-07)      <
zarr               2.12 (2022-06-23)     2.13 (2022-09-27)     <
```
 |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8581/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
2065809896 | PR_kwDOAMm_X85jO1oX | 8586 | Bump min deps. | dcherian 2448579 | closed | 0 | 0 | 2024-01-04T14:59:05Z | 2024-01-05T16:13:16Z | 2024-01-05T16:13:14Z | MEMBER | 0 | pydata/xarray/pulls/8586 |
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8586/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
2066129022 | PR_kwDOAMm_X85jP678 | 8587 | Silence another warning in test_backends.py | dcherian 2448579 | closed | 0 | 1 | 2024-01-04T18:20:49Z | 2024-01-05T16:13:05Z | 2024-01-05T16:13:03Z | MEMBER | 0 | pydata/xarray/pulls/8587 | Using 255 as fillvalue for int8 arrays will not be allowed any more. Previously this overflowed to -1. Now specify that instead. On numpy 1.24.4
```
array([-1], dtype=int8)
```
 |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8587/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
2052694433 | PR_kwDOAMm_X85ilhQm | 8565 | Faster encoding functions. | dcherian 2448579 | closed | 0 | 1 | 2023-12-21T16:05:02Z | 2024-01-04T14:25:45Z | 2024-01-04T14:25:43Z | MEMBER | 0 | pydata/xarray/pulls/8565 | Spotted when profiling some write workloads. 1. Speeds up the check for multi-index 2. Speeds up one string encoder by not re-creating variables when not necessary. @benbovy is there a better way? |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8565/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
2064480451 | I_kwDOAMm_X857DXjD | 8582 | Adopt SPEC 0 instead of NEP-29 | dcherian 2448579 | open | 0 | 1 | 2024-01-03T18:36:24Z | 2024-01-03T20:12:05Z | MEMBER | What is your issue?https://docs.xarray.dev/en/stable/getting-started-guide/installing.html#minimum-dependency-versions says that we follow NEP-29, and I think our min versions script also does that. I propose we follow https://scientific-python.org/specs/spec-0000/ In practice, I think this means we mostly drop Python versions earlier. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8582/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | issue | ||||||||
1976752481 | PR_kwDOAMm_X85ekPdj | 8412 | Minimize duplication in `map_blocks` task graph | dcherian 2448579 | closed | 0 | 7 | 2023-11-03T18:30:02Z | 2024-01-03T04:10:17Z | 2024-01-03T04:10:15Z | MEMBER | 0 | pydata/xarray/pulls/8412 | Builds on #8560
cc @max-sixty
```
print(len(cloudpickle.dumps(da.chunk(lat=1, lon=1).map_blocks(lambda x: x))))
# 779354739 -> 47699827
print(len(cloudpickle.dumps(da.chunk(lat=1, lon=1).drop_vars(da.indexes).map_blocks(lambda x: x))))
# 15981508
```
This is a quick attempt. I think we can generalize this to minimize duplication. The downside is that the graphs are not totally embarrassingly parallel any more.
This PR:
vs main:
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8412/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
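The duplication being minimized here is index data embedded into every per-block task. A toy sketch of the idea, using a plain dict as a stand-in for a task graph (not dask's actual graph representation):

```python
def build_graphs(n_blocks, index):
    # naive graph: every task carries its own copy of the index,
    # so serialized size grows as n_blocks * len(index)
    naive = {f"block-{i}": (index.copy(), i) for i in range(n_blocks)}

    # deduplicated graph: the index is stored once under a shared key
    # and each task references that key instead of a copy
    shared = {"index-x": index}
    for i in range(n_blocks):
        shared[f"block-{i}"] = ("index-x", i)
    return naive, shared

naive, shared = build_graphs(3, list(range(1000)))
```

The trade-off mentioned above follows directly: once tasks share a key, they all depend on one upstream node, so the graph is no longer embarrassingly parallel.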
2052952379 | I_kwDOAMm_X856XZE7 | 8568 | Raise when assigning attrs to virtual variables (default coordinate arrays) | dcherian 2448579 | open | 0 | 0 | 2023-12-21T19:24:11Z | 2023-12-21T19:24:19Z | MEMBER | Discussed in https://github.com/pydata/xarray/discussions/8567
<sup>Originally posted by **matthew-brett** December 21, 2023</sup>
Sorry for the introductory question, but we (@ivanov and I) ran into this behavior while experimenting:
```python
import numpy as np
import xarray as xr
data = np.zeros((3, 4, 5))
ds = xr.DataArray(data, dims=('i', 'j', 'k'))
print(ds['k'].attrs)
```
This shows `{}` as we might reasonably expect. But then:
```python
ds['k'].attrs['foo'] = 'bar'
print(ds['k'].attrs)
```
This also gives `{}`, which we found surprising. We worked out why that was, after a little experimentation (the default coordinate arrays seems to get created on the fly and garbage collected immediately). But it took us a little while. Is that as intended? Is there a way of making this less confusing?
Thanks for any help. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8568/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | issue | ||||||||
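The surprise comes from `ds['k']` materializing a fresh object on every access, so the attrs mutation lands on a temporary that is immediately discarded. A stdlib analogy (not xarray code) of the same access pattern:

```python
class Store:
    """Mimics a container that materializes items on the fly."""

    def __getitem__(self, key):
        # a brand-new dict is returned on every access, similar to
        # xarray's default (range-based) coordinate variables
        return {"attrs": {}}

s = Store()
s["k"]["attrs"]["foo"] = "bar"  # mutates a temporary object...
print(s["k"]["attrs"])          # ...so the next access shows {}
```

Raising on such assignments, as the issue title proposes, would turn the silent no-op into an explicit error.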
2052610515 | PR_kwDOAMm_X85ilOq9 | 8564 | Fix mypy type ignore | dcherian 2448579 | closed | 0 | 1 | 2023-12-21T15:15:26Z | 2023-12-21T15:41:13Z | 2023-12-21T15:24:52Z | MEMBER | 0 | pydata/xarray/pulls/8564 | { "url": "https://api.github.com/repos/pydata/xarray/issues/8564/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | ||||||
2047617215 | PR_kwDOAMm_X85iUJ7y | 8560 | Adapt map_blocks to use new Coordinates API | dcherian 2448579 | closed | 0 | 0 | 2023-12-18T23:11:55Z | 2023-12-20T17:11:18Z | 2023-12-20T17:11:16Z | MEMBER | 0 | pydata/xarray/pulls/8560 | Fixes roundtripping of string dtype indexes
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8560/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1954809370 | I_kwDOAMm_X850hAYa | 8353 | Update benchmark suite for asv 0.6.1 | dcherian 2448579 | open | 0 | 0 | 2023-10-20T18:13:22Z | 2023-12-19T05:53:21Z | MEMBER | The new asv version comes with decorators for parameterizing and skipping, and the ability to use https://github.com/airspeed-velocity/asv/releases https://asv.readthedocs.io/en/v0.6.1/writing_benchmarks.html#skipping-benchmarks This might help us reduce benchmark times a bit, or at least simplify the code some. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8353/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | issue | ||||||||
2027147099 | I_kwDOAMm_X854089b | 8523 | tree-reduce the combine for `open_mfdataset(..., parallel=True, combine="nested")` | dcherian 2448579 | open | 0 | 4 | 2023-12-05T21:24:51Z | 2023-12-18T19:32:39Z | MEMBER | Is your feature request related to a problem?When Instead we can tree-reduce the combine (example) by switching to
cc @TomNicholas |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8523/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | issue | ||||||||
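Tree-reducing the combine means merging datasets in parallel rounds of small batches instead of folding the whole list sequentially. A generic sketch of the pattern, independent of xarray:

```python
from functools import reduce

def tree_reduce(items, combine, width=2):
    """Reduce `items` in rounds of `width`-wise combines.

    Each round's combines are independent and could run in parallel,
    unlike a single left-to-right reduce over all items.
    """
    items = list(items)
    while len(items) > 1:
        items = [
            reduce(combine, items[i : i + width])
            for i in range(0, len(items), width)
        ]
    return items[0]

print(tree_reduce(range(10), lambda a, b: a + b))  # 45
```

For `open_mfdataset`, `combine` would be the nested-combine step and `items` the per-file datasets; the depth of the tree is logarithmic in the number of files.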
2033054792 | PR_kwDOAMm_X85hi5U2 | 8532 | Whats-new for 2023.12.0 | dcherian 2448579 | closed | 0 | 0 | 2023-12-08T17:29:47Z | 2023-12-08T19:36:28Z | 2023-12-08T19:36:26Z | MEMBER | 0 | pydata/xarray/pulls/8532 |
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8532/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
2021754904 | PR_kwDOAMm_X85g8gnU | 8506 | Deprecate `squeeze` in GroupBy. | dcherian 2448579 | closed | 0 | 1 | 2023-12-02T00:08:50Z | 2023-12-02T00:13:36Z | 2023-12-02T00:13:36Z | MEMBER | 0 | pydata/xarray/pulls/8506 |
Could use a close-ish review. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8506/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1989588884 | I_kwDOAMm_X852lreU | 8448 | mypy 1.7.0 raising errors | dcherian 2448579 | closed | 0 | 0 | 2023-11-12T21:41:43Z | 2023-12-01T22:02:22Z | 2023-12-01T22:02:22Z | MEMBER | What happened?
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8448/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
2003229011 | PR_kwDOAMm_X85f9xPG | 8472 | Avoid duplicate Zarr array read | dcherian 2448579 | closed | 0 | 0 | 2023-11-21T00:16:34Z | 2023-12-01T02:58:22Z | 2023-12-01T02:47:03Z | MEMBER | 0 | pydata/xarray/pulls/8472 | We already get the underlying Zarr array in https://github.com/pydata/xarray/blob/bb8511e0894020e180d95d2edb29ed4036ac6447/xarray/backends/zarr.py#L529-L531 and then pass it to |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8472/reactions", "total_count": 2, "+1": 2, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
2015530249 | PR_kwDOAMm_X85gnO8L | 8489 | Minor to_zarr optimizations | dcherian 2448579 | closed | 0 | 0 | 2023-11-28T23:56:32Z | 2023-12-01T02:20:19Z | 2023-12-01T02:18:18Z | MEMBER | 0 | pydata/xarray/pulls/8489 | Avoid repeatedly pinging a remote store by requesting keys at one go. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8489/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
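Requesting keys at one go replaces per-variable existence checks, each of which is a network round-trip against a remote store. A sketch with a dict standing in for the store (the `existing_arrays` helper is hypothetical):

```python
def existing_arrays(store, names):
    # one listing call instead of `name in store` per variable, which
    # would cost one round-trip each on a remote (e.g. cloud) store
    keys = set(store.keys())
    return [name for name in names if name in keys]

store = {"temp/.zarray": b"", "salt/.zarray": b""}
print(existing_arrays(store, ["temp/.zarray", "wind/.zarray"]))
```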
1615596004 | I_kwDOAMm_X85gTAnk | 7596 | illustrate time offset arithmetic | dcherian 2448579 | closed | 0 | 2 | 2023-03-08T16:54:15Z | 2023-11-29T01:31:45Z | 2023-11-29T01:31:45Z | MEMBER | Is your feature request related to a problem?We should document changing the time vector using pandas date offsets here This is particularly useful for centering the time stamps after a resampling operation. Related:
- CFTime offsets: https://github.com/pydata/xarray/issues/5687
- Describe the solution you'd likeNo response Describe alternatives you've consideredNo response Additional contextNo response |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/7596/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
1997656427 | PR_kwDOAMm_X85frEdb | 8461 | 2023.11.0 Whats-new | dcherian 2448579 | closed | 0 | 0 | 2023-11-16T19:55:12Z | 2023-11-17T21:02:22Z | 2023-11-17T21:02:20Z | MEMBER | 0 | pydata/xarray/pulls/8461 | { "url": "https://api.github.com/repos/pydata/xarray/issues/8461/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | ||||||
1997136566 | PR_kwDOAMm_X85fpRL3 | 8458 | Pin mypy < 1.7 | dcherian 2448579 | closed | 0 | 0 | 2023-11-16T15:31:26Z | 2023-11-16T17:29:04Z | 2023-11-16T17:29:03Z | MEMBER | 0 | pydata/xarray/pulls/8458 | xref #8448 get back to green checks for now. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8458/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1995186323 | PR_kwDOAMm_X85finmE | 8452 | [skip-ci] Small updates to IO docs. | dcherian 2448579 | closed | 0 | 0 | 2023-11-15T17:05:47Z | 2023-11-16T15:19:59Z | 2023-11-16T15:19:57Z | MEMBER | 0 | pydata/xarray/pulls/8452 | Also fixes the RTD failure on main |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8452/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1989227042 | PR_kwDOAMm_X85fObtL | 8445 | Pin pint to >=0.22 | dcherian 2448579 | closed | 0 | 3 | 2023-11-12T03:58:40Z | 2023-11-13T19:39:54Z | 2023-11-13T19:39:53Z | MEMBER | 0 | pydata/xarray/pulls/8445 |
We were previously pinned to |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8445/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1989212292 | PR_kwDOAMm_X85fOYwT | 8444 | Remove keep_attrs from resample signature | dcherian 2448579 | closed | 0 | 1 | 2023-11-12T02:57:59Z | 2023-11-12T22:53:36Z | 2023-11-12T22:53:35Z | MEMBER | 0 | pydata/xarray/pulls/8444 |
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8444/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1635949876 | PR_kwDOAMm_X85MpxlL | 7659 | Redo whats-new for 2023.03.0 | dcherian 2448579 | closed | 0 | 0 | 2023-03-22T15:02:38Z | 2023-11-06T04:25:54Z | 2023-03-22T15:42:49Z | MEMBER | 0 | pydata/xarray/pulls/7659 | { "url": "https://api.github.com/repos/pydata/xarray/issues/7659/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | ||||||
1630533356 | PR_kwDOAMm_X85MXo4e | 7643 | Whats-new for release 2023.03.0 | dcherian 2448579 | closed | 0 | 0 | 2023-03-18T19:14:55Z | 2023-11-06T04:25:53Z | 2023-03-20T15:57:36Z | MEMBER | 0 | pydata/xarray/pulls/7643 | { "url": "https://api.github.com/repos/pydata/xarray/issues/7643/reactions", "total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 1, "eyes": 0 } |
xarray 13221727 | pull | ||||||
1471673992 | PR_kwDOAMm_X85EFDiU | 7343 | Fix mypy failures | dcherian 2448579 | closed | 0 | 1 | 2022-12-01T17:16:44Z | 2023-11-06T04:25:52Z | 2022-12-01T18:25:07Z | MEMBER | 0 | pydata/xarray/pulls/7343 | { "url": "https://api.github.com/repos/pydata/xarray/issues/7343/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | ||||||
1533942791 | PR_kwDOAMm_X85HahUq | 7440 | v2023.01.0 whats-new | dcherian 2448579 | closed | 0 | 0 | 2023-01-15T18:20:28Z | 2023-11-06T04:25:52Z | 2023-01-18T21:18:49Z | MEMBER | 0 | pydata/xarray/pulls/7440 | Should update the date and delete empty sections before merging |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/7440/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1942666419 | PR_kwDOAMm_X85cxojc | 8304 | Move Variable aggregations to NamedArray | dcherian 2448579 | closed | 0 | 6 | 2023-10-13T21:31:01Z | 2023-11-06T04:25:43Z | 2023-10-17T19:14:12Z | MEMBER | 0 | pydata/xarray/pulls/8304 |
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8304/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1689364566 | PR_kwDOAMm_X85PbeOv | 7796 | Speed up .dt accessor by preserving Index objects. | dcherian 2448579 | closed | 0 | 1 | 2023-04-29T04:22:10Z | 2023-11-06T04:25:42Z | 2023-05-16T17:55:48Z | MEMBER | 0 | pydata/xarray/pulls/7796 |
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/7796/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1973472421 | PR_kwDOAMm_X85eZF4x | 8400 | Better attr diff for `testing.assert_identical` | dcherian 2448579 | closed | 0 | 2 | 2023-11-02T04:15:09Z | 2023-11-04T20:25:37Z | 2023-11-04T20:25:36Z | MEMBER | 0 | pydata/xarray/pulls/8400 |
This gives us better reprs where only differing attributes are shown in the diff. On main:
With this PR:
```
Differing coordinates:
L * x        (x) <U1 'a' 'b'
    Differing variable attributes:
        foo: bar
R * x        (x) <U1 'a' 'c'
    Differing variable attributes:
        source: 0
        foo: baz
```
 |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8400/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1975400777 | PR_kwDOAMm_X85efqSl | 8408 | Generalize explicit_indexing_adapter | dcherian 2448579 | open | 0 | 0 | 2023-11-03T03:29:40Z | 2023-11-03T03:53:25Z | MEMBER | 1 | pydata/xarray/pulls/8408 | Use |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8408/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | ||||||
1950211465 | I_kwDOAMm_X850Pd2J | 8333 | Should NamedArray be interchangeable with other array types? or Should we support the `axis` kwarg? | dcherian 2448579 | open | 0 | 17 | 2023-10-18T16:46:37Z | 2023-10-31T22:26:33Z | MEMBER | What is your issue?Raising @Illviljan's comment from https://github.com/pydata/xarray/pull/8304#discussion_r1363196597. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8333/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | issue | ||||||||
1962040911 | PR_kwDOAMm_X85dyZBT | 8373 | Use `opt_einsum` by default if installed. | dcherian 2448579 | closed | 0 | 2 | 2023-10-25T18:59:38Z | 2023-10-28T03:31:07Z | 2023-10-28T03:31:05Z | MEMBER | 0 | pydata/xarray/pulls/8373 |
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8373/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1672288892 | I_kwDOAMm_X85jrRp8 | 7764 | Support opt_einsum in xr.dot | dcherian 2448579 | closed | 0 | 7 | 2023-04-18T03:29:48Z | 2023-10-28T03:31:06Z | 2023-10-28T03:31:06Z | MEMBER | Is your feature request related to a problem?Shall we support opt_einsum as an optional backend for
Describe the solution you'd likeAdd a Describe alternatives you've consideredWe could create a new package but it seems a bit silly. Additional contextNo response |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/7764/reactions", "total_count": 3, "+1": 3, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
1952621896 | I_kwDOAMm_X850YqVI | 8337 | Support rolling with numbagg | dcherian 2448579 | open | 0 | 3 | 2023-10-19T16:11:40Z | 2023-10-23T15:46:36Z | MEMBER | Is your feature request related to a problem?We can do plain reductions, and groupby reductions with numbagg. Rolling is the last one left! I don't think coarsen will benefit since it's basically a reshape and reduce on that view, so it should already be accelerated. There may be small gains in handling the boundary conditions but that's probably it. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8337/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | issue | ||||||||
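For reference, the computation a numbagg-accelerated rolling reduction would speed up, written out in plain Python (ignoring NaN handling, `min_periods`, and the boundary padding mentioned above):

```python
def rolling_mean(values, window):
    # full windows only; xarray/numbagg additionally handle NaNs,
    # min_periods, and edge padding
    return [
        sum(values[i : i + window]) / window
        for i in range(len(values) - window + 1)
    ]

print(rolling_mean([1, 2, 3, 4], 2))  # [1.5, 2.5, 3.5]
```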
1954445639 | I_kwDOAMm_X850fnlH | 8350 | optimize align for scalars at least | dcherian 2448579 | open | 0 | 5 | 2023-10-20T14:48:25Z | 2023-10-20T19:17:39Z | MEMBER | What happened?Here's a simple rescaling calculation:
```python
import numpy as np
import xarray as xr

ds = xr.Dataset(
    {"a": (("x", "y"), np.ones((300, 400))), "b": (("x", "y"), np.ones((300, 400)))}
)
mean = ds.mean()  # scalar
std = ds.std()  # scalar
rescaled = (ds - mean) / std
```
The profile for the last line shows 30% (!!!) time spent in This is a small example inspired by an ML pipeline where this normalization is happening very many times in a tight loop. cc @benbovy What did you expect to happen?A fast path for when no reindexing needs to happen. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8350/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | issue | ||||||||
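The requested fast path boils down to noticing that a dimensionless (scalar) operand can never force a reindex, so alignment can be skipped outright. A toy sketch with a hypothetical `dims` attribute, not xarray's actual `align` machinery:

```python
def needs_alignment(objs):
    # scalars (no dims) can never introduce a reindex, so binary ops
    # against them could bypass the alignment machinery entirely
    dimmed = [o for o in objs if getattr(o, "dims", ())]
    return len(dimmed) > 1

class Obj:
    """Stand-in for an xarray object with a .dims tuple."""
    def __init__(self, dims=()):
        self.dims = dims

print(needs_alignment([Obj(("x", "y")), Obj()]))    # False: scalar operand
print(needs_alignment([Obj(("x",)), Obj(("x",))]))  # True: two dimmed operands
```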
1954535213 | PR_kwDOAMm_X85dZT47 | 8351 | [skip-ci] Add benchmarks for Dataset binary ops, chunk | dcherian 2448579 | closed | 0 | 1 | 2023-10-20T15:31:36Z | 2023-10-20T18:08:40Z | 2023-10-20T18:08:38Z | MEMBER | 0 | pydata/xarray/pulls/8351 | xref #8339 xref #8350 |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8351/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1954360112 | PR_kwDOAMm_X85dYtpz | 8349 | [skip-ci] dev whats-new | dcherian 2448579 | closed | 0 | 1 | 2023-10-20T14:02:07Z | 2023-10-20T17:28:19Z | 2023-10-20T14:54:30Z | MEMBER | 0 | pydata/xarray/pulls/8349 |
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8349/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1950480317 | PR_kwDOAMm_X85dLkAj | 8334 | Whats-new: 2023.10.0 | dcherian 2448579 | closed | 0 | 1 | 2023-10-18T19:22:06Z | 2023-10-19T16:00:00Z | 2023-10-19T15:59:58Z | MEMBER | 0 | pydata/xarray/pulls/8334 | { "url": "https://api.github.com/repos/pydata/xarray/issues/8334/reactions", "total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 1, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | ||||||
1944347086 | PR_kwDOAMm_X85c2nyz | 8316 | Enable numbagg for reductions | dcherian 2448579 | closed | 0 | 3 | 2023-10-16T04:46:10Z | 2023-10-18T14:54:48Z | 2023-10-18T10:39:30Z | MEMBER | 0 | pydata/xarray/pulls/8316 |
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8316/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1861954973 | PR_kwDOAMm_X85YhnBZ | 8100 | Remove np.asarray in formatting.py | dcherian 2448579 | closed | 0 | 2 | 2023-08-22T18:08:33Z | 2023-10-18T13:31:25Z | 2023-10-18T10:40:38Z | MEMBER | 0 | pydata/xarray/pulls/8100 | { "url": "https://api.github.com/repos/pydata/xarray/issues/8100/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | ||||||
1943543755 | I_kwDOAMm_X85z2B_L | 8310 | pydata/xarray as monorepo for Xarray and NamedArray | dcherian 2448579 | open | 0 | 1 | 2023-10-14T20:34:51Z | 2023-10-14T21:29:11Z | MEMBER | What is your issue?As we work through refactoring for NamedArray, it's pretty clear that Xarray will depend pretty closely on many files in I propose we use pydata/xarray as a monorepo that serves two packages: NamedArray and Xarray. - We can move as much as is needed to have NamedArray be independent of Xarray, but Xarray will depend quite closely on many utility functions in NamedArray. - We can release both at the same time similar to dask and distributed. - We can re-evaluate if and when NamedArray grows its own community. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8310/reactions", "total_count": 4, "+1": 4, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | issue | ||||||||
1942673099 | PR_kwDOAMm_X85cxp-D | 8305 | Update labeler.yml to add NamedArray | dcherian 2448579 | closed | 0 | 0 | 2023-10-13T21:39:56Z | 2023-10-14T06:47:08Z | 2023-10-14T06:47:07Z | MEMBER | 0 | pydata/xarray/pulls/8305 | |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8305/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1942893480 | I_kwDOAMm_X85zzjOo | 8306 | keep_attrs for NamedArray | dcherian 2448579 | open | 0 | 0 | 2023-10-14T02:29:54Z | 2023-10-14T02:31:35Z | MEMBER | What is your issue?Copying over @max-sixty's comment from https://github.com/pydata/xarray/pull/8304#discussion_r1358873522
@pydata/xarray Should we just delete the |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8306/reactions", "total_count": 4, "+1": 4, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | issue | ||||||||
1916012703 | I_kwDOAMm_X85yNAif | 8239 | Address repo-review suggestions | dcherian 2448579 | open | 0 | 7 | 2023-09-27T17:18:40Z | 2023-10-02T20:24:34Z | MEMBER | What is your issue?

Here's the output from the Scientific Python Repo Review tool. There's an online version here. On mac I run

A lot of these seem fairly easy to fix. I'll note that there's a large number of

General

Projects must have a PyProject: see https://github.com/pydata/xarray/issues/8239#issuecomment-1739363809

| ? | Name | Description |
| --- | --- | --- |
| ❌ | PP305 | Specifies xfail_strict |
| ❌ | PP308 | Specifies useful pytest summary |

Pre-commit

| ? | Name | Description |
| --- | --- | --- |
| ❌ | PC110 | Uses black |

MyPy

| ? | Name | Description |
| --- | --- | --- |
| ❌ | MY101 | MyPy strict mode |
| ❌ | MY103 | MyPy warn unreachable |
| ❌ | MY104 | MyPy enables ignore-without-code |
| ❌ | MY105 | MyPy enables redundant-expr |
| ❌ | MY106 | MyPy enables truthy-bool |

Ruff

| ? | Name | Description |
| --- | --- | --- |
| ❌ | RF101 | Bugbear must be selected (flake8-bugbear) |
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8239/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | issue |
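The pytest, MyPy, and Ruff checks in issue 8239 each map to a configuration setting. A minimal `pyproject.toml` sketch of what would satisfy them, assuming the conventional keys for these tools (this is illustrative, not xarray's actual config):

```toml
[tool.pytest.ini_options]
xfail_strict = true          # PP305
addopts = ["-ra"]            # PP308: show a summary of all non-passing tests

[tool.mypy]
strict = true                # MY101
warn_unreachable = true      # MY103
enable_error_code = [
  "ignore-without-code",     # MY104
  "redundant-expr",          # MY105
  "truthy-bool",             # MY106
]

[tool.ruff.lint]
extend-select = ["B"]        # RF101: enable the flake8-bugbear rules
```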

```sql
CREATE TABLE [issues] (
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [number] INTEGER,
   [title] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [state] TEXT,
   [locked] INTEGER,
   [assignee] INTEGER REFERENCES [users]([id]),
   [milestone] INTEGER REFERENCES [milestones]([id]),
   [comments] INTEGER,
   [created_at] TEXT,
   [updated_at] TEXT,
   [closed_at] TEXT,
   [author_association] TEXT,
   [active_lock_reason] TEXT,
   [draft] INTEGER,
   [pull_request] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [state_reason] TEXT,
   [repo] INTEGER REFERENCES [repos]([id]),
   [type] TEXT
);
CREATE INDEX [idx_issues_repo] ON [issues] ([repo]);
CREATE INDEX [idx_issues_milestone] ON [issues] ([milestone]);
CREATE INDEX [idx_issues_assignee] ON [issues] ([assignee]);
CREATE INDEX [idx_issues_user] ON [issues] ([user]);
```
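This listing is the result of a query like "rows where user = 2448579 sorted by updated_at descending" against the schema above. A minimal sketch of that query with Python's built-in `sqlite3`, using an abbreviated version of the table and two sample rows copied from this page:

```python
import sqlite3

# In-memory database with a trimmed-down version of the issues schema.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE issues ("
    " id INTEGER PRIMARY KEY, number INTEGER, title TEXT,"
    " user INTEGER, state TEXT, updated_at TEXT)"
)
# Two sample rows taken from the table above.
conn.executemany(
    "INSERT INTO issues VALUES (?, ?, ?, ?, ?, ?)",
    [
        (2278499376, 8997, 'Zarr: Optimize `region="auto"` detection',
         2448579, "open", "2024-05-04T21:47:39Z"),
        (1915997507, 8238, "NamedArray tracking issue",
         2448579, "open", "2024-04-30T12:49:17Z"),
    ],
)
# ISO-8601 timestamps sort correctly as strings, so ORDER BY works directly.
rows = conn.execute(
    "SELECT number, title FROM issues"
    " WHERE user = 2448579 ORDER BY updated_at DESC"
).fetchall()
print(rows)  # most recently updated issue first
```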