
issue_comments


3 rows where author_association = "NONE" and issue = 514672231 sorted by updated_at descending

id: 1013673140 · user: kuchaale (6815953) · created_at: 2022-01-15T12:21:44Z · updated_at: 2022-01-17T10:37:59Z · author_association: NONE
html_url: https://github.com/pydata/xarray/issues/3466#issuecomment-1013673140
issue_url: https://api.github.com/repos/pydata/xarray/issues/3466 · node_id: IC_kwDOAMm_X848a2y0

Hi all, I encountered the same problem when trying to download NASA's GEOS-5 data (see below). It worked occasionally but I had to restart the script several times.

```python
import xarray as xr
import pandas as pd

URL = 'https://opendap.nccs.nasa.gov/dods/GEOS-5/fp/0.25_deg/assim/inst3_3d_asm_Np'
ds = xr.open_dataset(URL, engine='netcdf4')
var_ls = ['omega', 't', 'v', 'u']
lev_ls = [1000., 975., 950., 925., 900., 875., 850., 825., 800., 775., 750., 700.,
          650., 600., 550., 500., 450., 400., 350., 300., 275., 250., 225., 200.,
          175., 150., 125., 100., 70., 50., 30., 10., 5., 3, 2, 1]

time_range = pd.date_range('2021-01-02T12', '2021-01-07', freq='6H')

for sel_date in time_range:
    ds_sel = ds[var_ls].sel(time=sel_date, lev=lev_ls, method='nearest')
    ouf_date = sel_date.strftime('%Y%m%d%H')
    outfile = f'geos5_subset_{ouf_date}.nc'
    print(outfile)
    ds_sel.to_netcdf(outfile)
```

[EDIT] It may have helped to add `.load` before `to_netcdf`, though.
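A rough sketch of that [EDIT] suggestion, not part of the original comment: calling `.load()` forces the OPeNDAP transfer to finish in memory before `to_netcdf()` starts writing, and the remote read can be retried on failure instead of restarting the whole script. The retry count below is an arbitrary illustration; `ds`, `var_ls`, `lev_ls`, and `time_range` are as defined above.

```python
# Sketch only: load each selection fully into memory before writing,
# retrying the remote read if the OPeNDAP request fails mid-transfer.
for sel_date in time_range:
    for attempt in range(3):          # arbitrary retry count
        try:
            ds_sel = ds[var_ls].sel(time=sel_date, lev=lev_ls, method='nearest')
            ds_sel.load()             # complete the DAP download in memory first
            break
        except RuntimeError:          # "NetCDF: DAP failure" surfaces as RuntimeError
            if attempt == 2:
                raise
    ds_sel.to_netcdf(f"geos5_subset_{sel_date.strftime('%Y%m%d%H')}.nc")
```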

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
issue: RuntimeError: NetCDF: DAP failure (514672231)
id: 793502837 · user: HarlanAndrew (57942990) · created_at: 2021-03-09T07:52:55Z · updated_at: 2021-03-09T07:52:55Z · author_association: NONE
html_url: https://github.com/pydata/xarray/issues/3466#issuecomment-793502837
issue_url: https://api.github.com/repos/pydata/xarray/issues/3466 · node_id: MDEyOklzc3VlQ29tbWVudDc5MzUwMjgzNw==

Hi all,

I encountered the same problem with met.no's archived MEPS data at "https://thredds.met.no/thredds/dodsC/meps25epsarchive/" when trying to save the remote dataset as a local netCDF file.

I found that it depends on the size of the dataset I want to save; it works fine up to a file size of about 5 MB. My workaround is therefore to chunk the data, e.g. by selecting only one ensemble member at a time, save each chunk to a file to make sure everything is downloaded, then read the chunks back in and do a dataset.merge() to end up with one netCDF file (see the sketch after this comment).

It seems to me that the problem could be solved by writing to the hard disk in chunks.

Hope this helps to find the root of this issue.

Thanks for opening this issue @b-kode!
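A minimal sketch of the chunk-and-merge workaround described in this comment. The `ensemble_member` dimension name and the exact dataset path are assumptions for illustration, not taken from the comment:

```python
# Sketch only: fetch a large OPeNDAP dataset in small pieces so each request
# stays below the size that seems to trigger "NetCDF: DAP failure", then
# recombine the locally saved pieces into one file.
import xarray as xr

# Placeholder path under the MEPS archive; the real dataset path will differ.
URL = "https://thredds.met.no/thredds/dodsC/meps25epsarchive/..."
ds = xr.open_dataset(URL)

chunk_files = []
for i in range(ds.sizes["ensemble_member"]):      # assumed dimension name
    chunk = ds.isel(ensemble_member=[i])          # keep the dim so chunks concatenate cleanly
    fname = f"meps_member_{i:02d}.nc"
    chunk.to_netcdf(fname)                        # one small download per file
    chunk_files.append(fname)

# Read the pieces back in and combine them into a single local netCDF file.
merged = xr.concat([xr.open_dataset(f) for f in chunk_files], dim="ensemble_member")
merged.to_netcdf("meps_subset_full.nc")
```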

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
issue: RuntimeError: NetCDF: DAP failure (514672231)
id: 548030406 · user: b-kode (47066389) · created_at: 2019-10-30T17:44:19Z · updated_at: 2019-10-30T17:44:19Z · author_association: NONE
html_url: https://github.com/pydata/xarray/issues/3466#issuecomment-548030406
issue_url: https://api.github.com/repos/pydata/xarray/issues/3466 · node_id: MDEyOklzc3VlQ29tbWVudDU0ODAzMDQwNg==

Hi @max-sixty,

I have added the traceback, and apparently @dcherian has already edited the example into its proper format.

Thanks!

{
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
issue: RuntimeError: NetCDF: DAP failure (514672231)


CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [issue] INTEGER REFERENCES [issues]([id])
);
CREATE INDEX [idx_issue_comments_issue]
    ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user]
    ON [issue_comments] ([user]);
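For reference, the row selection shown at the top of this page (author_association = "NONE" and issue = 514672231, sorted by updated_at descending) maps onto this schema roughly as follows. This is a sketch only; the local SQLite filename `github.db` is an assumption, not something stated on the page.

```python
import sqlite3

# Assumed local database file produced by a github-to-sqlite style export.
conn = sqlite3.connect("github.db")
rows = conn.execute(
    """
    SELECT id, user, created_at, updated_at, body
    FROM issue_comments
    WHERE author_association = 'NONE' AND issue = ?
    ORDER BY updated_at DESC
    """,
    (514672231,),
).fetchall()

for comment_id, user_id, created, updated, body in rows:
    print(comment_id, user_id, created, updated)
```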