# `rename_vars` followed by `swap_dims` and `merge` causes swapped dim to reappear

pydata/xarray · issue #8646 · opened by brendan-m-murphy · state: open · 16 comments · created 2024-01-23T12:31:54Z · updated 2024-04-05T09:10:15Z

### What happened?

I wanted to rename a dimension coordinate for two datasets before merging. Swapping the order of the operations (doing the rename and dimension swap after the merge instead) causes issues later, as shown in the example below. My current solution is to only rename dimension coordinates before saving to netCDF.

### What did you expect to happen?

Merging two datasets with the same coordinates and dimensions (but different data variables) should result in a single dataset with all of the data variables from the two datasets and exactly the same coordinates and dimensions.

### Minimal Complete Verifiable Example

```python
import numpy as np
import xarray as xr
from xarray.core.utils import Frozen

A = np.arange(4).reshape((2, 2))
B = np.arange(4).reshape((2, 2)) + 4

ds1 = xr.Dataset(
    {"A": (["x", "y"], A), "B": (["x", "y"], B)},
    coords={"x": ("x", [1, 2]), "y": ("y", [1, 2])},
)
ds2 = xr.Dataset(
    {"C": (["x", "y"], A), "D": (["x", "y"], B)},
    coords={"x": ("x", [1, 2]), "y": ("y", [1, 2])},
)

assert ds1.dims == Frozen({"x": 2, "y": 2})
assert ds2.dims == Frozen({"x": 2, "y": 2})

ds1_swap = ds1.rename_vars(y="z").swap_dims(y="z")
ds2_swap = ds2.rename_vars(y="z").swap_dims(y="z")

assert ds1_swap.dims == Frozen({"x": 2, "z": 2})
assert ds2_swap.dims == Frozen({"x": 2, "z": 2})

# merging makes the dimension "y" reappear (I would expect this assertion to fail):
assert xr.merge([ds1_swap, ds2_swap]).dims == Frozen({"x": 2, "z": 2, "y": 2})

# renaming and swapping after the merge causes issues later:
ds12 = xr.merge([ds1, ds2]).rename_vars(y="z").swap_dims(y="z")
ds3 = xr.Dataset(
    {"E": (["x", "z"], A), "F": (["x", "z"], B)},
    coords={"x": ("x", [1, 2]), "z": ("z", [1, 2])},
)

# ds12 and ds3 have the same dimensions:
assert ds12.dims == Frozen({"x": 2, "z": 2})
assert ds3.dims == Frozen({"x": 2, "z": 2})

# but merging brings back "y":
ds123 = xr.merge([ds12, ds3])
assert ds123.dims == Frozen({"x": 2, "z": 2, "y": 2})

# as do other operations:
ds12_as = ds12.assign_coords(x=(ds12.x + 1))
assert ds12_as.sizes == Frozen({"x": 2, "z": 2, "y": 2})
```
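For comparison, `Dataset.rename` renames a dimension together with its coordinate variable in a single call; a minimal sketch of that alternative (whether it also avoids the stray `y` dimension after `merge` is not verified here):

```python
import numpy as np
import xarray as xr

A = np.arange(4).reshape((2, 2))

ds1 = xr.Dataset(
    {"A": (["x", "y"], A)},
    coords={"x": ("x", [1, 2]), "y": ("y", [1, 2])},
)

# rename() updates the dimension and its coordinate variable in one step,
# instead of rename_vars() followed by swap_dims()
ds1_renamed = ds1.rename(y="z")
assert dict(ds1_renamed.sizes) == {"x": 2, "z": 2}
assert "y" not in ds1_renamed.coords
```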
### MVCE confirmation

### Relevant log output

_No response_

### Anything else we need to know?

_No response_

### Environment

The MVCE works in all venvs I've tried, including:
INSTALLED VERSIONS
------------------
commit: None
python: 3.10.13 (main, Nov 10 2023, 15:02:19) [GCC 11.4.0]
python-bits: 64
OS: Linux
OS-release: 6.5.0-14-generic
machine: x86_64
processor: x86_64
byteorder: little
LC_ALL: None
LANG: en_GB.UTF-8
LOCALE: ('en_GB', 'UTF-8')
libhdf5: 1.12.2
libnetcdf: 4.9.3-development
xarray: 2023.11.0
pandas: 1.5.3
numpy: 1.26.2
scipy: 1.11.4
netCDF4: 1.6.5
pydap: None
h5netcdf: 1.3.0
h5py: 3.10.0
Nio: None
zarr: None
cftime: 1.6.3
nc_time_axis: 1.4.1
iris: None
bottleneck: None
dask: 2023.12.0
distributed: None
matplotlib: 3.8.2
cartopy: 0.22.0
seaborn: 0.13.0
numbagg: None
fsspec: 2023.12.1
cupy: None
pint: None
sparse: 0.15.1
flox: None
numpy_groupies: None
setuptools: 69.0.2
pip: 23.3.1
conda: None
pytest: 7.4.3
mypy: None
IPython: 8.18.1
sphinx: None
/home/brendan/Documents/inversions/.pymc_venv/lib/python3.10/site-packages/_distutils_hack/__init__.py:33: UserWarning: Setuptools is replacing distutils.
warnings.warn("Setuptools is replacing distutils.")
INSTALLED VERSIONS
------------------
commit: None
python: 3.9.7 (default, Sep 16 2021, 13:09:58) [GCC 7.5.0]
python-bits: 64
OS: Linux
OS-release: 3.10.0-1160.81.1.el7.x86_64
machine: x86_64
processor: x86_64
byteorder: little
LC_ALL: None
LANG: en_GB.UTF-8
LOCALE: ('en_GB', 'UTF-8')
libhdf5: None
libnetcdf: None
xarray: 2024.1.0
pandas: 2.2.0
numpy: 1.26.3
scipy: None
netCDF4: None
pydap: None
h5netcdf: None
h5py: None
Nio: None
zarr: None
cftime: None
nc_time_axis: None
iris: None
bottleneck: None
dask: None
distributed: None
matplotlib: None
cartopy: None
seaborn: None
numbagg: None
fsspec: None
cupy: None
pint: None
sparse: None
flox: None
numpy_groupies: None
setuptools: 69.0.3
pip: 23.3.2
conda: None
pytest: None
mypy: None
IPython: 8.18.1
sphinx: None