issue_comments

2 rows where issue = 350899839 and user = 39450418 sorted by updated_at descending

id: 1320993705
html_url: https://github.com/pydata/xarray/issues/2368#issuecomment-1320993705
issue_url: https://api.github.com/repos/pydata/xarray/issues/2368
node_id: IC_kwDOAMm_X85OvMOp
user: maxaragon (39450418)
created_at: 2022-11-19T23:46:35Z
updated_at: 2022-11-20T13:31:45Z
author_association: NONE
body:

Found another example, from the ICON NWP model. The files open with the netCDF4 library but not with xarray.

import os

import pandas as pd
import requests
import wget
import xarray as xr

# List the ICON model files available for Hyytiälä on 2020-08-25 via the Cloudnet API
response = requests.get('https://cloudnet.fmi.fi/api/model-files?site=hyytiala&date=2020-08-25&model=icon-iglo-12-23')
data = response.json()
df = pd.DataFrame(data)

# Download each file into the current working directory
for url in df.downloadUrl:
    wget.download(url, os.getcwd())

ds = xr.open_dataset('20200825_hyytiala_icon-iglo-12-23.nc')

Error:

```
TypeError                                 Traceback (most recent call last)
Cell In [109], line 1
----> 1 ds = xr.open_dataset('20200825_hyytiala_icon-iglo-12-23.nc')

File ~/.virtualenvs/INAR/lib/python3.10/site-packages/xarray/backends/api.py:531, in open_dataset(filename_or_obj, engine, chunks, cache, decode_cf, mask_and_scale, decode_times, decode_timedelta, use_cftime, concat_characters, decode_coords, drop_variables, inline_array, backend_kwargs, **kwargs)
    519 decoders = _resolve_decoders_kwargs(
    520     decode_cf,
    521     open_backend_dataset_parameters=backend.open_dataset_parameters,
    (...)
    527     decode_coords=decode_coords,
    528 )
    530 overwrite_encoded_chunks = kwargs.pop("overwrite_encoded_chunks", None)
--> 531 backend_ds = backend.open_dataset(
    532     filename_or_obj,
    533     drop_variables=drop_variables,
    534     **decoders,
    535     **kwargs,
    536 )
    537 ds = _dataset_from_backend_dataset(
    538     backend_ds,
    539     filename_or_obj,
    (...)
    547     **kwargs,
    548 )
    549 return ds

File ~/.virtualenvs/INAR/lib/python3.10/site-packages/xarray/backends/netCDF4_.py:569, in NetCDF4BackendEntrypoint.open_dataset(self, filename_or_obj, mask_and_scale, decode_times, concat_characters, decode_coords, drop_variables, use_cftime, decode_timedelta, group, mode, format, clobber, diskless, persist, lock, autoclose)
    567 store_entrypoint = StoreBackendEntrypoint()
    568 with close_on_error(store):
--> 569     ds = store_entrypoint.open_dataset(
    570         store,
    571         mask_and_scale=mask_and_scale,
    572         decode_times=decode_times,
    573         concat_characters=concat_characters,
    574         decode_coords=decode_coords,
    575         drop_variables=drop_variables,
    576         use_cftime=use_cftime,
    577         decode_timedelta=decode_timedelta,
    578     )
    579 return ds

File ~/.virtualenvs/INAR/lib/python3.10/site-packages/xarray/backends/store.py:29, in StoreBackendEntrypoint.open_dataset(self, store, mask_and_scale, decode_times, concat_characters, decode_coords, drop_variables, use_cftime, decode_timedelta)
     26 vars, attrs = store.load()
     27 encoding = store.get_encoding()
---> 29 vars, attrs, coord_names = conventions.decode_cf_variables(
     30     vars,
     31     attrs,
     32     mask_and_scale=mask_and_scale,
     33     decode_times=decode_times,
     34     concat_characters=concat_characters,
     35     decode_coords=decode_coords,
     36     drop_variables=drop_variables,
     37     use_cftime=use_cftime,
     38     decode_timedelta=decode_timedelta,
     39 )
     41 ds = Dataset(vars, attrs=attrs)
     42 ds = ds.set_coords(coord_names.intersection(vars))

File ~/.virtualenvs/INAR/lib/python3.10/site-packages/xarray/conventions.py:509, in decode_cf_variables(variables, attributes, concat_characters, mask_and_scale, decode_times, decode_coords, drop_variables, use_cftime, decode_timedelta)
    507 # Time bounds coordinates might miss the decoding attributes
    508 if decode_times:
--> 509     _update_bounds_attributes(variables)
    511 new_vars = {}
    512 for k, v in variables.items():

File ~/.virtualenvs/INAR/lib/python3.10/site-packages/xarray/conventions.py:410, in _update_bounds_attributes(variables)
    408 for v in variables.values():
    409     attrs = v.attrs
--> 410     has_date_units = "units" in attrs and "since" in attrs["units"]
    411     if has_date_units and "bounds" in attrs:
    412         if attrs["bounds"] in variables:

TypeError: argument of type 'numpy.float32' is not iterable
```
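
The traceback ends in `_update_bounds_attributes`, which is reached only when `decode_times=True` and which assumes every `units` attribute is a string. A minimal sketch of one way to sidestep the failure and find the offending variable, assuming the file from the snippet above (the inspection loop is illustrative and not part of the original comment):

```
import netCDF4
import xarray as xr

path = '20200825_hyytiala_icon-iglo-12-23.nc'

# Skipping CF time decoding avoids the code path that raises the TypeError
ds = xr.open_dataset(path, decode_times=False)

# Report any variable whose "units" attribute is not a string
with netCDF4.Dataset(path) as nc:
    for name, var in nc.variables.items():
        units = getattr(var, 'units', None)
        if units is not None and not isinstance(units, str):
            print(name, type(units), units)
```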

reactions:
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
issue: Let's list all the netCDF files that xarray can't open (350899839)

id: 1320996459
html_url: https://github.com/pydata/xarray/issues/2368#issuecomment-1320996459
issue_url: https://api.github.com/repos/pydata/xarray/issues/2368
node_id: IC_kwDOAMm_X85OvM5r
user: maxaragon (39450418)
created_at: 2022-11-20T00:06:17Z
updated_at: 2022-11-20T00:06:17Z
author_association: NONE
body:

@andersy005 indeed, I have updated xarray and it works now; the previous version was:

```
INSTALLED VERSIONS

commit: None
python: 3.10.6 (main, Aug 30 2022, 04:58:14) [Clang 13.1.6 (clang-1316.0.21.2.5)]
python-bits: 64
OS: Darwin
OS-release: 21.6.0
machine: arm64
processor: i386
byteorder: little
LC_ALL: None
LANG: None
LOCALE: (None, 'UTF-8')
libhdf5: 1.12.2
libnetcdf: 4.9.0

xarray: 2022.6.0
pandas: 1.4.4
numpy: 1.23.2
scipy: 1.9.1
netCDF4: 1.6.0
pydap: None
h5netcdf: None
h5py: None
Nio: None
zarr: None
cftime: 1.6.1
nc_time_axis: None
PseudoNetCDF: None
rasterio: None
cfgrib: None
iris: None
bottleneck: 1.3.5
dask: None
distributed: None
matplotlib: 3.5.3
cartopy: None
seaborn: 0.12.1
numbagg: None
fsspec: None
cupy: None
pint: None
sparse: None
flox: None
numpy_groupies: None
setuptools: 63.4.3
pip: 22.2.2
conda: None
pytest: None
IPython: 8.5.0
sphinx: None
```
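
A report in this format is xarray's own environment dump; a minimal sketch of how to regenerate it after upgrading, assuming xarray imports cleanly in the current environment:

```
import xarray as xr

# Print the installed xarray version, then the full environment report
print(xr.__version__)
xr.show_versions()
```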

reactions:
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
issue: Let's list all the netCDF files that xarray can't open (350899839)

CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [issue] INTEGER REFERENCES [issues]([id])
);
CREATE INDEX [idx_issue_comments_issue]
    ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user]
    ON [issue_comments] ([user]);
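
The schema above is what the row selection described at the top of the page runs against. A minimal sqlite3 sketch of the equivalent query, assuming a local copy of the database (the filename "github.db" is an assumption):

```
import sqlite3

# Query a local copy of the Datasette database for the two comments shown above
conn = sqlite3.connect('github.db')
rows = conn.execute(
    """
    SELECT id, [user], created_at, updated_at, author_association
    FROM issue_comments
    WHERE issue = 350899839 AND [user] = 39450418
    ORDER BY updated_at DESC
    """
).fetchall()
for row in rows:
    print(row)
conn.close()
```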