issues
1 row where user = 8998112 sorted by updated_at descending
| id | node_id | number | title | user | state | locked | comments | created_at | updated_at | author_association | repo | type |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 2136832627 | I_kwDOAMm_X85_XXpz | 8755 | to_zarr removes global attributes in destination dataset | pnorton-usgs 8998112 | open | 0 | 12 | 2024-02-15T15:32:10Z | 2024-04-29T19:22:41Z | NONE | xarray 13221727 | issue |

body:

What happened?

Adding new variables to a zarr dataset with `to_zarr()` always removes the existing global attributes. New global attributes in the source dataset are not always added to the destination dataset, depending on how `to_zarr()` is called.

What did you expect to happen?

I would expect existing global attributes to always be preserved. If there are new global attributes, I would expect them to be added to the existing global attributes rather than replacing them all.

Minimal Complete Verifiable Example

```Python
import xarray as xr
from pyproj import CRS

local_zarr = 'sample.zarr'

ds_sample = xr.tutorial.load_dataset("air_temperature")

# Make a local copy
ds_sample.to_zarr(local_zarr, mode='w')

ds_sample = xr.open_dataset(local_zarr, engine='zarr',
                            backend_kwargs={'consolidated': True},
                            chunks={}, decode_coords=True)

# Create CRS metadata
crs_meta = CRS.from_epsg(4326).to_cf()

ds_new = xr.Dataset(data_vars={"crs": ([], 1, crs_meta)})
ds_new.attrs['note'] = 'please add this'

# Add all variables from ds_new to the zarr
# NOTE: This adds the new global attribute but also removes
#       all existing global attributes
ds_new.to_zarr(local_zarr, mode='a')

# Add selected variable(s) to the zarr dataset
# NOTE: This does not copy new global attributes
#       and removes all existing global attributes
ds_new['crs'].to_zarr(local_zarr, mode='a')

# Re-open the local zarr store
ds_sample = xr.open_dataset(local_zarr, engine='zarr',
                            backend_kwargs={'consolidated': True},
                            chunks={}, decode_coords=True)
ds_sample
```

MVCE confirmation
Relevant log output

No response

Anything else we need to know?

No response

Environment

```
INSTALLED VERSIONS
------------------
commit: None
python: 3.11.0 | packaged by conda-forge | (main, Jan 14 2023, 12:26:40) [Clang 14.0.6 ]
python-bits: 64
OS: Darwin
OS-release: 22.6.0
machine: arm64
processor: arm
byteorder: little
LC_ALL: None
LANG: en_US.UTF-8
LOCALE: ('en_US', 'UTF-8')
libhdf5: 1.12.2
libnetcdf: 4.8.1
xarray: 2024.1.1
pandas: 2.2.0
numpy: 1.26.4
scipy: 1.12.0
netCDF4: 1.6.0
pydap: installed
h5netcdf: 1.3.0
h5py: 3.8.0
Nio: None
zarr: 2.17.0
cftime: 1.6.3
nc_time_axis: None
iris: None
bottleneck: 1.3.7
dask: 2024.2.0
distributed: 2024.2.0
matplotlib: 3.8.2
cartopy: 0.22.0
seaborn: None
numbagg: None
fsspec: 2023.12.2
cupy: None
pint: 0.23
sparse: None
flox: None
numpy_groupies: None
setuptools: 69.0.3
pip: 24.0
conda: None
pytest: 8.0.0
mypy: None
IPython: 8.21.0
sphinx: None
```

reactions:

```json
{
"url": "https://api.github.com/repos/pydata/xarray/issues/8755/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
```
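A possible workaround, not part of the original report: merge the store's existing global attributes into the new dataset before appending, so the append does not drop them. This is a minimal sketch that assumes the `local_zarr` path and `ds_new` dataset from the MVCE above.

```Python
import xarray as xr

# Minimal workaround sketch (assumes local_zarr and ds_new from the MVCE above):
# read back the attributes already stored in the zarr group and merge them into
# the dataset about to be appended, with the new attributes taking precedence.
existing = xr.open_zarr(local_zarr)
ds_new = ds_new.assign_attrs({**existing.attrs, **ds_new.attrs})
existing.close()

# The append now writes the merged attribute set rather than only ds_new's attrs.
ds_new.to_zarr(local_zarr, mode='a')
```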
```sql
CREATE TABLE [issues] (
[id] INTEGER PRIMARY KEY,
[node_id] TEXT,
[number] INTEGER,
[title] TEXT,
[user] INTEGER REFERENCES [users]([id]),
[state] TEXT,
[locked] INTEGER,
[assignee] INTEGER REFERENCES [users]([id]),
[milestone] INTEGER REFERENCES [milestones]([id]),
[comments] INTEGER,
[created_at] TEXT,
[updated_at] TEXT,
[closed_at] TEXT,
[author_association] TEXT,
[active_lock_reason] TEXT,
[draft] INTEGER,
[pull_request] TEXT,
[body] TEXT,
[reactions] TEXT,
[performed_via_github_app] TEXT,
[state_reason] TEXT,
[repo] INTEGER REFERENCES [repos]([id]),
[type] TEXT
);
CREATE INDEX [idx_issues_repo]
ON [issues] ([repo]);
CREATE INDEX [idx_issues_milestone]
ON [issues] ([milestone]);
CREATE INDEX [idx_issues_assignee]
ON [issues] ([assignee]);
CREATE INDEX [idx_issues_user]
ON [issues] ([user]);
```
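The row above corresponds to a simple filtered query on this table. As an illustration only, assuming the export lives in a SQLite file named `github.db` (the filename is not given on this page), the same selection could be made from Python:

```Python
import sqlite3

# Assumed filename for the exported SQLite database; not specified on this page.
conn = sqlite3.connect("github.db")

# "1 row where user = 8998112 sorted by updated_at descending"
rows = conn.execute(
    'SELECT id, number, title, state, updated_at '
    'FROM issues WHERE "user" = ? ORDER BY updated_at DESC',
    (8998112,),
).fetchall()

for issue_id, number, title, state, updated_at in rows:
    print(issue_id, number, title, state, updated_at)

conn.close()
```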