issues


1 row where user = 9576982 sorted by updated_at descending

id: 1077079208
node_id: I_kwDOAMm_X85AMuyo
number: 6069
title: to_zarr: region not recognised as dataset dimensions
user: Boorhin 9576982
state: open
locked: 0
assignee:
milestone:
comments: 25
created_at: 2021-12-10T17:34:42Z
updated_at: 2023-11-15T19:39:07Z
closed_at:
author_association: NONE
active_lock_reason:
draft:
pull_request:
performed_via_github_app:
state_reason:

What happened: I am trying to write a dataset to a zarr store one time step at a time. To do that, I prepared the dataset, filled the variable values with 0s, and wrote the store with mode='a'. I then perform the operations I need on each variable and try to write that time step of the dataset:

ds.isel(time=t).to_zarr(outfile, region={"time": t})

However, I received this error message:

ValueError: all keys in ``region`` are not in Dataset dimensions, got ['time'] and ['cell', 'face', 'layer', 'max_cell_node', 'max_face_nodes', 'node', 'siglay']

But:

```python
In:  ds.dims
Out: Frozen(SortedKeysDict({'time': 1465, 'node': 112015, 'layer': 6, 'face': 198364, 'max_face_nodes': 3, 'cell': 991820, 'max_cell_node': 6, 'siglay': 6}))
```

Checking in the API, it comes from:

```python
~/.local/lib/python3.8/site-packages/xarray/backends/api.py in _validate_append_dim_and_encoding(ds_to_append, store, append_dim, region, encoding, **open_kwargs)
   1330     for k, v in region.items():
   1331         if k not in ds_to_append.dims:
-> 1332             raise ValueError(
   1333                 f"all keys in region are not in Dataset dimensions, got "
   1334                 f"{list(region)} and {list(ds_to_append.dims)}"
```
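For reference, the check above compares the keys of region against ds_to_append.dims, and integer indexing along a dimension drops that dimension from a Dataset, while slice indexing keeps it. A minimal sketch of that difference, using a toy dataset that is not taken from this issue:

```python
import numpy as np
import xarray as xr

# Toy dataset with a single "time" dimension (names are illustrative only).
ds = xr.Dataset({"x": ("time", np.zeros(5))})

print(ds.isel(time=2).dims)            # integer indexing drops the "time" dimension
print(ds.isel(time=slice(2, 3)).dims)  # slice indexing keeps "time" as a length-1 dimension
```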

What you expected to happen: Incrementally append data to the zarr store.

Minimal Complete Verifiable Example:

```python
import xarray as xr
from datetime import datetime, timedelta
import numpy as np

dt = datetime.now()
times = np.arange(dt, dt + timedelta(days=6), timedelta(hours=1))
nodesx, nodesy, layers = np.arange(10, 50), np.arange(10, 50) + 15, np.arange(10)

ds = xr.Dataset()
ds.coords['time'] = ('time', times)
ds.coords['node_x'] = ('node', nodesx)
ds.coords['node_y'] = ('node', nodesy)
ds.coords['layer'] = ('layer', layers)

outfile = 'my_zarr'
varnames = ['potato', 'banana', 'apple']
for var in varnames:
    ds[var] = (('time', 'layer', 'node'), np.zeros((len(times), len(layers), len(nodesx))))
ds.to_zarr(outfile, mode='a')

for t in range(len(times)):
    for var in varnames:
        ds[var].isel(time=t).values += np.random.random((len(layers), len(nodesx)))
    ds.isel(time=t).to_zarr(outfile, region={"time": t})


ValueError                                Traceback (most recent call last)
<ipython-input-25-8baf03aa01a3> in <module>
      2     for var in varnames:
      3         ds[var].isel(time=t).values += np.random.random((len(layers),len(nodesx)))
----> 4     ds.isel(time=t).to_zarr(outfile, region={"time": t})
      5

~/.local/lib/python3.8/site-packages/xarray/core/dataset.py in to_zarr(self, store, chunk_store, mode, synchronizer, group, encoding, compute, consolidated, append_dim, region)
   1743             encoding = {}
   1744
-> 1745         return to_zarr(
   1746             self,
   1747             store=store,

~/.local/lib/python3.8/site-packages/xarray/backends/api.py in to_zarr(dataset, store, chunk_store, mode, synchronizer, group, encoding, compute, consolidated, append_dim, region)
   1457     if mode == "a":
   1458         _validate_datatypes_for_zarr_append(dataset)
-> 1459         _validate_append_dim_and_encoding(
   1460             dataset,
   1461             store,

~/.local/lib/python3.8/site-packages/xarray/backends/api.py in _validate_append_dim_and_encoding(ds_to_append, store, append_dim, region, encoding, **open_kwargs)
   1330     for k, v in region.items():
   1331         if k not in ds_to_append.dims:
-> 1332             raise ValueError(
   1333                 f"all keys in region are not in Dataset dimensions, got "
   1334                 f"{list(region)} and {list(ds_to_append.dims)}"

ValueError: all keys in region are not in Dataset dimensions, got ['time'] and ['layer', 'node']
```
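For comparison, xarray's documentation describes region writes with slice objects as the region values. The following is a hedged sketch of how the final loop above might look under that pattern, not a fix confirmed in this issue; it assumes the slice keeps time as a length-1 dimension and that coordinates lacking a time dimension are dropped before the region write.

```python
# Sketch only: slice-based region write over the dataset from the example above.
for t in range(len(times)):
    # Keep "time" as a length-1 dimension and drop coordinates without it.
    step = ds.isel(time=slice(t, t + 1)).drop_vars(["node_x", "node_y", "layer"])
    step.to_zarr(outfile, region={"time": slice(t, t + 1)})
```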

Anything else we need to know?:

Environment:

Output of xr.show_versions()

INSTALLED VERSIONS
------------------
commit: None
python: 3.8.10 (default, Sep 28 2021, 16:10:42) [GCC 9.3.0]
python-bits: 64
OS: Linux
OS-release: 5.4.0-91-generic
machine: x86_64
processor: x86_64
byteorder: little
LC_ALL: None
LANG: en_GB.UTF-8
LOCALE: en_GB.UTF-8
libhdf5: 1.10.4
libnetcdf: 4.7.3

xarray: 0.16.2
pandas: 1.2.2
numpy: 1.17.4
scipy: 1.6.2
netCDF4: 1.5.3
pydap: None
h5netcdf: None
h5py: None
Nio: None
zarr: 2.10.2
cftime: 1.1.0
nc_time_axis: 1.2.0
PseudoNetCDF: None
rasterio: None
cfgrib: None
iris: None
bottleneck: None
dask: 2021.11.2
distributed: 2021.11.2
matplotlib: 3.1.2
cartopy: 0.18.0
seaborn: None
numbagg: None
pint: None
setuptools: 45.2.0
pip: 20.0.2
conda: None
pytest: 6.2.1
IPython: 7.13.0
sphinx: None
reactions:
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/6069/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
repo: xarray 13221727
type: issue

CREATE TABLE [issues] (
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [number] INTEGER,
   [title] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [state] TEXT,
   [locked] INTEGER,
   [assignee] INTEGER REFERENCES [users]([id]),
   [milestone] INTEGER REFERENCES [milestones]([id]),
   [comments] INTEGER,
   [created_at] TEXT,
   [updated_at] TEXT,
   [closed_at] TEXT,
   [author_association] TEXT,
   [active_lock_reason] TEXT,
   [draft] INTEGER,
   [pull_request] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [state_reason] TEXT,
   [repo] INTEGER REFERENCES [repos]([id]),
   [type] TEXT
);
CREATE INDEX [idx_issues_repo]
    ON [issues] ([repo]);
CREATE INDEX [idx_issues_milestone]
    ON [issues] ([milestone]);
CREATE INDEX [idx_issues_assignee]
    ON [issues] ([assignee]);
CREATE INDEX [idx_issues_user]
    ON [issues] ([user]);
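The single row above corresponds to the filter shown in the page header (user = 9576982, sorted by updated_at descending). The following is a sketch of the equivalent query against this schema using Python's sqlite3 module; the github.db filename is an assumption, not part of this page.

```python
import sqlite3

# Filename is an assumption; point it at the actual Datasette database file.
conn = sqlite3.connect("github.db")
rows = conn.execute(
    "SELECT id, number, title, state, updated_at "
    "FROM issues WHERE user = ? ORDER BY updated_at DESC",
    (9576982,),
).fetchall()
for row in rows:
    print(row)
```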