issues

3 rows where repo = 13221727 and user = 8441217 sorted by updated_at descending

id node_id number title user state locked assignee milestone comments created_at updated_at ▲ closed_at author_association active_lock_reason draft pull_request body reactions performed_via_github_app state_reason repo type
859772411 MDU6SXNzdWU4NTk3NzI0MTE= 5170 to_netcdf is not idempotent when stacking rename and set_coords floriankrb 8441217 closed 0     3 2021-04-16T12:32:59Z 2023-09-25T11:30:35Z 2023-09-25T11:30:35Z CONTRIBUTOR      

After running the following:

```python
!wget https://storage.ecmwf.europeanweather.cloud/s2s-ai-challenge/dev/xarray-issue/source.nc

import xarray as xr

ds = xr.open_dataset('source.nc')
ds = ds.rename({'number': 'n'})
ds = ds.set_coords('valid_time')
print(ds)
ds.to_netcdf('out.nc')
ds = xr.open_dataset('out.nc')
print(ds)
```

`'valid_time'` is not a coordinate in the written netCDF file: the coordinate is turned back into a plain data variable.

```
<xarray.Dataset>
Dimensions:            (heightAboveGround: 1, latitude: 121, longitude: 240, n: 2, step: 3, time: 1)
Coordinates:
  * n                  (n) int64 0 1
  * time               (time) datetime64[ns] 2020-01-02
  * step               (step) timedelta64[ns] 1 days 2 days 3 days
  * heightAboveGround  (heightAboveGround) int64 2
  * latitude           (latitude) float64 90.0 88.5 87.0 ... -87.0 -88.5 -90.0
  * longitude          (longitude) float64 0.0 1.5 3.0 4.5 ... 355.5 357.0 358.5
    valid_time         (time, step) datetime64[ns] ...
Data variables:
    t2m                (n, time, step, heightAboveGround, latitude, longitude) float32 ...
Attributes:
    GRIB_edition:            2
    GRIB_centre:             ecmf
    GRIB_centreDescription:  European Centre for Medium-Range Weather Forecasts
    GRIB_subCentre:          0
    Conventions:             CF-1.7
    institution:             European Centre for Medium-Range Weather Forecasts
    history:                 2021-04-15T12:08:59 GRIB to CDM+CF via cfgrib-0....
```

```
<xarray.Dataset>
Dimensions:            (heightAboveGround: 1, latitude: 121, longitude: 240, n: 2, step: 3, time: 1)
Coordinates:
  * n                  (n) int64 0 1
  * time               (time) datetime64[ns] 2020-01-02
  * step               (step) timedelta64[ns] 1 days 2 days 3 days
  * heightAboveGround  (heightAboveGround) int64 2
  * latitude           (latitude) float64 90.0 88.5 87.0 ... -87.0 -88.5 -90.0
  * longitude          (longitude) float64 0.0 1.5 3.0 4.5 ... 355.5 357.0 358.5
Data variables:
    valid_time         (time, step) datetime64[ns] ...
    t2m                (n, time, step, heightAboveGround, latitude, longitude) float32 ...
Attributes:
    GRIB_edition:            2
    GRIB_centre:             ecmf
    GRIB_centreDescription:  European Centre for Medium-Range Weather Forecasts
    GRIB_subCentre:          0
    Conventions:             CF-1.7
    institution:             European Centre for Medium-Range Weather Forecasts
    history:                 2021-04-15T12:08:59 GRIB to CDM+CF via cfgrib-0....
```
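One way to check what actually landed on disk is to read the CF `coordinates` attribute of the data variable in the `out.nc` written above. This is a minimal sketch; the use of the netCDF4 library and the presence of a `coordinates` attribute on `t2m` are assumptions, not something stated in this issue:

```python
import netCDF4

# Inspect the raw attributes of 't2m' in the file written by to_netcdf().
# Under CF conventions, auxiliary coordinates such as 'valid_time' should be
# listed in the variable's 'coordinates' attribute.
with netCDF4.Dataset('out.nc') as nc:
    t2m = nc.variables['t2m']
    if 'coordinates' in t2m.ncattrs():
        print('coordinates attribute:', t2m.getncattr('coordinates'))
    else:
        print("no 'coordinates' attribute on t2m")
```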

What happened: the output of the MCVE is:

```
Coords written in copy.1.nc: ['number', 'time', 'step', 'heightAboveGround', 'latitude', 'longitude', 'valid_time']
Reread and check copy.1.nc: ['number', 'time', 'step', 'heightAboveGround', 'latitude', 'longitude', 'valid_time']

Coords written in copy.2.nc: ['n', 'time', 'step', 'heightAboveGround', 'latitude', 'longitude', 'valid_time']
Reread and check copy.2.nc: ['n', 'time', 'step', 'heightAboveGround', 'latitude', 'longitude']

Coords written in copy.4.nc: ['n', 'time', 'step', 'heightAboveGround', 'latitude', 'longitude', 'valid_time']
Reread and check copy.4.nc: ['n', 'time', 'step', 'heightAboveGround', 'latitude', 'longitude', 'valid_time']
```

What you expected to happen: the output of the MCVE should be:

```
Coords written in copy.1.nc: ['number', 'time', 'step', 'heightAboveGround', 'latitude', 'longitude', 'valid_time']
Reread and check copy.1.nc: ['number', 'time', 'step', 'heightAboveGround', 'latitude', 'longitude', 'valid_time']

Coords written in copy.2.nc: ['n', 'time', 'step', 'heightAboveGround', 'latitude', 'longitude', 'valid_time']
Reread and check copy.2.nc: ['n', 'time', 'step', 'heightAboveGround', 'latitude', 'longitude', 'valid_time']

Coords written in copy.4.nc: ['n', 'time', 'step', 'heightAboveGround', 'latitude', 'longitude', 'valid_time']
Reread and check copy.4.nc: ['n', 'time', 'step', 'heightAboveGround', 'latitude', 'longitude', 'valid_time']
```

Minimal Complete Verifiable Example:

```python
import xarray as xr
import numpy as np

xr.show_versions()
print(xr.__file__)

FILE1 = 'copy.1.nc'
FILE2 = 'copy.2.nc'
FILE3 = 'copy.3.nc'
FILE4 = 'copy.4.nc'

ds = xr.open_dataset('source.nc')

# Initial dataset is ok
print(f'Coords written in {FILE1}: {list(ds.coords)}')
ds.to_netcdf(FILE1)
print(f'Reread and check {FILE1}: {list(xr.open_dataset(FILE1).coords)}')
print(xr.open_dataset(FILE1))
print()

# No round trip
ds = ds.rename({'number': 'n'})
ds = ds.set_coords('valid_time')
print(f'Coords written in {FILE2}: {list(ds.coords)}')
ds.to_netcdf(FILE2)
print(f'Reread and check {FILE2}: {list(xr.open_dataset(FILE2).coords)}')
print(xr.open_dataset(FILE2))
print()

# Doing a round trip solves the issue:
ds = xr.open_dataset('source.nc')
ds = ds.rename({'number': 'n'})
ds.to_netcdf(FILE3)
ds = xr.open_dataset(FILE3)
ds = ds.set_coords('valid_time')
print(f'Coords written in {FILE4}: {list(ds.coords)}')
ds.to_netcdf(FILE4)
print(f'Reread and check {FILE4}: {list(xr.open_dataset(FILE4).coords)}')
print(xr.open_dataset(FILE4))
```
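A possible workaround, sketched under the assumption that the stale `coordinates` entry in each variable's `.encoding` (carried over by `open_dataset` from `source.nc`) is what shadows the new coordinate set; this issue does not confirm that diagnosis:

```python
# Hypothetical workaround (not verified here): drop any 'coordinates' entry
# that open_dataset() carried over from the source file, so that to_netcdf()
# rebuilds the CF 'coordinates' attribute from the dataset's current coords.
for var in ds.variables.values():
    var.encoding.pop('coordinates', None)
ds.to_netcdf('copy.workaround.nc')
```

If that assumption holds, `valid_time` should survive the round trip in `copy.workaround.nc`.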

Anything else we need to know?: The initial source.nc file can be downloaded with `wget https://storage.ecmwf.europeanweather.cloud/s2s-ai-challenge/dev/xarray-issue/source.nc`

Environment:

Output of `xr.show_versions()`:

```
INSTALLED VERSIONS
------------------
commit: None
python: 3.8.6 | packaged by conda-forge | (default, Jan 25 2021, 23:21:18) [GCC 9.3.0]
python-bits: 64
OS: Linux
OS-release: 3.10.0-1160.15.2.el7.x86_64
machine: x86_64
processor: x86_64
byteorder: little
LC_ALL: None
LANG: en_US.UTF-8
LOCALE: en_US.UTF-8
libhdf5: 1.12.0
libnetcdf: 4.7.4
xarray: 0.17.0
pandas: 1.2.2
numpy: 1.19.5
scipy: None
netCDF4: 1.5.6
pydap: None
h5netcdf: None
h5py: 2.10.0
Nio: None
zarr: 2.6.1
cftime: 1.4.1
nc_time_axis: None
PseudoNetCDF: None
rasterio: None
cfgrib: 0.9.8.5
iris: None
bottleneck: None
dask: 2021.02.0
distributed: 2021.02.0
matplotlib: 3.3.4
cartopy: None
seaborn: None
numbagg: None
pint: None
setuptools: 49.6.0.post20210108
pip: 21.0.1
conda: 4.9.2
pytest: 5.3.1
IPython: 7.20.0
sphinx: 3.4.3
```

It may be related to (or a duplicate of) #4512, and is linked to https://github.com/pydata/xarray/pull/4108

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/5170/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
297793359 MDExOlB1bGxSZXF1ZXN0MTY5NjMwMzg2 1918 `axis` keyword ignored when applying `np.squeeze` to `DataArray` (#1487) floriankrb 8441217 closed 0     2 2018-02-16T13:58:29Z 2018-02-20T06:28:12Z 2018-02-20T06:27:12Z CONTRIBUTOR   0 pydata/xarray/pulls/1918
  • [X] Closes #1487
  • [x] Tests added
  • [x] Tests passed
  • [x] Fully documented, including whats-new.rst for all changes and api.rst for new API
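A rough illustration of the behaviour this PR targets (a hypothetical sketch of the expected post-fix behaviour; the authoritative definition is in #1487 and the PR itself):

```python
import numpy as np
import xarray as xr

da = xr.DataArray(np.zeros((1, 3)), dims=('x', 'y'))

# With the fix, the axis keyword should be honoured rather than ignored:
print(np.squeeze(da, axis=0).dims)   # expected: ('y',) -- only axis 0 removed

# Squeezing an axis whose length is not 1 should raise, matching NumPy:
try:
    np.squeeze(da, axis=1)
except ValueError as err:
    print('ValueError:', err)
```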
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/1918/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
297770830 MDExOlB1bGxSZXF1ZXN0MTY5NjEzNTUz 1916 COMPAT: MultiIndex checking is fragile (#1833) floriankrb 8441217 closed 0     2 2018-02-16T12:33:14Z 2018-02-16T21:02:58Z 2018-02-16T21:02:58Z CONTRIBUTOR   0 pydata/xarray/pulls/1916
  • [X] Closes #1833
  • [ ] Tests added
  • [ ] Tests passed
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/1916/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull

CREATE TABLE [issues] (
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [number] INTEGER,
   [title] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [state] TEXT,
   [locked] INTEGER,
   [assignee] INTEGER REFERENCES [users]([id]),
   [milestone] INTEGER REFERENCES [milestones]([id]),
   [comments] INTEGER,
   [created_at] TEXT,
   [updated_at] TEXT,
   [closed_at] TEXT,
   [author_association] TEXT,
   [active_lock_reason] TEXT,
   [draft] INTEGER,
   [pull_request] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [state_reason] TEXT,
   [repo] INTEGER REFERENCES [repos]([id]),
   [type] TEXT
);
CREATE INDEX [idx_issues_repo]
    ON [issues] ([repo]);
CREATE INDEX [idx_issues_milestone]
    ON [issues] ([milestone]);
CREATE INDEX [idx_issues_assignee]
    ON [issues] ([assignee]);
CREATE INDEX [idx_issues_user]
    ON [issues] ([user]);
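The row selection shown at the top of this page ("3 rows where repo = 13221727 and user = 8441217 sorted by updated_at descending") maps onto this schema as a plain SQL query. A minimal sketch, assuming a local SQLite copy of the database saved as `github.db` (a hypothetical file name):

```python
import sqlite3

# Reproduce the page's filter: issues in repo 13221727 (pydata/xarray)
# opened by user 8441217 (floriankrb), newest activity first.
conn = sqlite3.connect("github.db")
rows = conn.execute(
    """
    SELECT number, type, state, title, updated_at
    FROM issues
    WHERE repo = 13221727 AND user = 8441217
    ORDER BY updated_at DESC
    """
).fetchall()
for number, kind, state, title, updated_at in rows:
    print(f"#{number} [{kind}/{state}] {title} (updated {updated_at})")
conn.close()
```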