
issues


3 rows where state = "closed", type = "issue" and user = 193170 sorted by updated_at descending

6383: `Dataset.to_zarr` compute=False should allow access to awaitable

id: 1174427142 · node_id: I_kwDOAMm_X85GAFYG · user: shaunc (193170) · state: closed · locked: 0 · comments: 5 · created_at: 2022-03-20T03:12:29Z · updated_at: 2022-03-21T00:47:45Z · closed_at: 2022-03-20T03:42:01Z · author_association: NONE

What happened?

I have xarray and zarr installed, but not dask, and am trying to call to_zarr in an async routine. I am looking for something I can await. The docs state that a Dask.Delayed is returned, and I understand that if I have a dask client open with asynchronous=True I can await the result.

However, I am not using dask. Is there some way to get an awaitable from this object without a dask client?

What did you expect to happen?

I should get back something that I can await in my async routine.

Minimal Complete Verifiable Example

```python
import xarray as xr
from asyncio import get_event_loop

ds = xr.Dataset(data_vars=dict(x=('x', [1, 2])))
deld = ds.to_zarr("bar.zarr", compute=False)
loop = get_event_loop()
loop.run_until_complete(deld. ...?)  # what should be awaited here?
```
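A possible workaround, sketched here as an assumption rather than anything proposed in the issue: when dask is installed, to_zarr(..., compute=False) returns a dask.delayed.Delayed, and its blocking .compute() call can be pushed onto a worker thread so that an async routine has something to await.

```python
import asyncio
import xarray as xr

async def write_zarr(ds: xr.Dataset, store: str) -> None:
    # Assumes dask is installed, so compute=False returns a dask.delayed.Delayed.
    delayed = ds.to_zarr(store, compute=False)
    loop = asyncio.get_running_loop()
    # Run the blocking compute() in the default thread pool so it can be awaited.
    await loop.run_in_executor(None, delayed.compute)

ds = xr.Dataset(data_vars=dict(x=("x", [1, 2])))
asyncio.run(write_zarr(ds, "bar.zarr"))
```

Without dask there is no delayed object at all, so this does not answer the original question directly; it only shows one way to obtain an awaitable when dask is available.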

Relevant log output

No response

Anything else we need to know?

No response

Environment

INSTALLED VERSIONS

commit: None
python: 3.8.5 (default, Sep 4 2020, 02:22:02) [Clang 10.0.0 ]
python-bits: 64
OS: Darwin
OS-release: 20.6.0
machine: x86_64
processor: i386
byteorder: little
LC_ALL: None
LANG: en_US.UTF-8
LOCALE: ('en_US', 'UTF-8')
libhdf5: 1.10.4
libnetcdf: 4.7.3

xarray: 2022.3.0
pandas: 1.4.1
numpy: 1.22.3
scipy: 1.6.2
netCDF4: 1.5.3
pydap: None
h5netcdf: None
h5py: None
Nio: None
zarr: 2.11.1
cftime: 1.5.1.1
nc_time_axis: None
PseudoNetCDF: None
rasterio: None
cfgrib: None
iris: None
bottleneck: 1.3.2
dask: 2021.10.0
distributed: 2021.10.0
matplotlib: 3.1.3
cartopy: None
seaborn: None
numbagg: None
fsspec: 2022.02.0
cupy: None
pint: None
sparse: None
setuptools: 60.7.1
pip: 22.0.3
conda: 4.11.0
pytest: 7.1.1
IPython: 7.31.1
sphinx: None

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/6383/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
state_reason: completed · repo: xarray (13221727) · type: issue
5192: Writing np.bool8 data array reads in as int8

id: 862110000 · node_id: MDU6SXNzdWU4NjIxMTAwMDA= · user: shaunc (193170) · state: closed · locked: 0 · comments: 2 · created_at: 2021-04-19T23:33:20Z · updated_at: 2021-04-20T05:19:44Z · closed_at: 2021-04-20T05:19:44Z · author_association: NONE

What happened:

I have a DataArray with dtype np.bool_. When I write it to netCDF (engine h5netcdf, or the default) and then read it back in, the copy has dtype int8.

What you expected to happen:

The loaded DataArray should have dtype bool.

Minimal Complete Verifiable Example:

I have had a hard time reducing this to a sample. The DataArray comes from a larger dataset that exhibits the same problem. I can copy the DataArray using copy() and it still exhibits the problem; however, if I build a new DataArray using the constructor, the new array doesn't exhibit the problem. As far as I can tell, though, the original and the rebuilt DataArray are otherwise identical.

```python
# in a pdb session
(Pdb) ci
<xarray.DataArray 'cut_inclusive' (cut: 15)>
array([False, False, False, False, False, False, False, False, False,
       False, False, False, False, False, False])
Dimensions without coordinates: cut
(Pdb) ci.to_netcdf('foo_ci.nca', engine="h5netcdf")
(Pdb) csi = xr.open_dataarray('foo_ci.nca', engine="h5netcdf"); csi
<xarray.DataArray 'cut_inclusive' (cut: 15)>
array([0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], dtype=int8)
Dimensions without coordinates: cut
(Pdb) ci2 = xr.DataArray(ci, dims=('cut', ))
(Pdb) ci2.equals(ci)
True
(Pdb) ci2.to_netcdf('foo_ci2.nca', engine="h5netcdf")
(Pdb) csi2 = xr.open_dataarray('foo_ci2.nca', engine="h5netcdf"); csi2
<xarray.DataArray 'cut_inclusive' (cut: 15)>
array([False, False, False, False, False, False, False, False, False,
       False, False, False, False, False, False])
Dimensions without coordinates: cut
(Pdb) ci3 = ci.copy()
(Pdb) ci3.to_netcdf('foo_ci3.nca', engine="h5netcdf")
(Pdb) csi3 = xr.open_dataarray('foo_ci3.nca', engine="h5netcdf"); csi3
<xarray.DataArray 'cut_inclusive' (cut: 15)>
array([0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], dtype=int8)
Dimensions without coordinates: cut
```

Anything else we need to know?:

I am at a loss as to how to investigate why ci and ci3 don't survive the round trip but ci2 does. Unfortunately, I have also been unable to produce a free-standing example: whenever I try, I get an object that survives the round trip intact. I suspect that xarray's internals are somehow keeping a cached reference to the original ci (presumably still linked to the overall dataset from which ci came), and that this is what is causing the problem, but I don't know where to look. (Suggestions welcome!)
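As a stopgap (my suggestion, not something from the issue), the boolean dtype can be restored by casting after loading; this only works around the symptom and does not explain why ci and ci3 lose their dtype.

```python
import numpy as np
import xarray as xr

# Load the file written in the pdb session above.
loaded = xr.open_dataarray("foo_ci.nca", engine="h5netcdf")
if loaded.dtype != np.bool_:
    # netCDF itself has no boolean type, so the data may come back as int8;
    # casting recovers the original dtype.
    loaded = loaded.astype(np.bool_)
```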

Environment:

Output of xr.show_versions():

INSTALLED VERSIONS
------------------
commit: None
python: 3.8.5 (default, Sep 4 2020, 07:30:14) [GCC 7.3.0]
python-bits: 64
OS: Linux
OS-release: 5.8.0-48-generic
machine: x86_64
processor: x86_64
byteorder: little
LC_ALL: None
LANG: en_US.UTF-8
LOCALE: en_US.UTF-8
libhdf5: 1.12.0
libnetcdf: None

xarray: 0.17.0
pandas: 1.2.4
numpy: 1.20.2
scipy: 1.6.2
netCDF4: None
pydap: None
h5netcdf: 0.10.0
h5py: 3.2.1
Nio: None
zarr: None
cftime: None
nc_time_axis: None
PseudoNetCDF: None
rasterio: None
cfgrib: None
iris: None
bottleneck: None
dask: 2021.04.0
distributed: 2021.04.0
matplotlib: 3.4.1
cartopy: None
seaborn: None
numbagg: None
pint: None
setuptools: 51.0.0
pip: 20.3.1
conda: None
pytest: 6.2.3
IPython: 7.22.0
sphinx: None
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/5192/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
state_reason: completed · repo: xarray (13221727) · type: issue
4658: drop_sel indices in dimension that doesn't have coordinates?

id: 758165023 · node_id: MDU6SXNzdWU3NTgxNjUwMjM= · user: shaunc (193170) · state: closed · locked: 0 · comments: 3 · created_at: 2020-12-07T05:17:36Z · updated_at: 2021-01-18T23:59:09Z · closed_at: 2021-01-18T23:59:09Z · author_association: NONE

Is your feature request related to a problem? Please describe.

I am trying to drop particular indices from a dimension that doesn't have coordinates.

Following the drop_sel() documentation, but leaving out the coordinate labels:

```python
import numpy as np
import xarray as xr

data = np.random.randn(2, 3)
ds = xr.Dataset({"A": (["x", "y"], data)})
ds.drop_sel(y=[1])
```

gives me an error.

Describe the solution you'd like

I would expect a drop_isel method to exist and to relate to drop_sel the way isel relates to sel.

Describe alternatives you've considered

As far as I know, I could either create coordinates specifically in order to drop, or rebuild a new dataset. Neither is congenial. (I'd be grateful to know if there is actually a straightforward way to do this that I've overlooked.)
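For what it's worth, here is a minimal sketch (my own workaround, not something proposed in the issue) of dropping positional indices along a coordinate-less dimension by selecting the complementary positions with isel:

```python
import numpy as np
import xarray as xr

data = np.random.randn(2, 3)
ds = xr.Dataset({"A": (["x", "y"], data)})

# Drop position 1 along y by keeping every other position;
# isel works purely positionally, so y needs no coordinates.
drop = {1}
keep = [i for i in range(ds.sizes["y"]) if i not in drop]
ds_smaller = ds.isel(y=keep)
```

If a sufficiently recent xarray is available, a drop_isel method may provide this directly.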

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4658/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
state_reason: completed · repo: xarray (13221727) · type: issue

CREATE TABLE [issues] (
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [number] INTEGER,
   [title] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [state] TEXT,
   [locked] INTEGER,
   [assignee] INTEGER REFERENCES [users]([id]),
   [milestone] INTEGER REFERENCES [milestones]([id]),
   [comments] INTEGER,
   [created_at] TEXT,
   [updated_at] TEXT,
   [closed_at] TEXT,
   [author_association] TEXT,
   [active_lock_reason] TEXT,
   [draft] INTEGER,
   [pull_request] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [state_reason] TEXT,
   [repo] INTEGER REFERENCES [repos]([id]),
   [type] TEXT
);
CREATE INDEX [idx_issues_repo]
    ON [issues] ([repo]);
CREATE INDEX [idx_issues_milestone]
    ON [issues] ([milestone]);
CREATE INDEX [idx_issues_assignee]
    ON [issues] ([assignee]);
CREATE INDEX [idx_issues_user]
    ON [issues] ([user]);
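As an illustration of how the filter at the top of this page maps onto the schema above, the same query can be run with Python's sqlite3 module (the database file name github.db is an assumption):

```python
import sqlite3

# "github.db" is a hypothetical SQLite file containing the issues table defined above.
conn = sqlite3.connect("github.db")
rows = conn.execute(
    """
    select id, number, title, state, updated_at
    from issues
    where state = 'closed' and type = 'issue' and user = 193170
    order by updated_at desc
    """
).fetchall()
for row in rows:
    print(row)
```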