
issue_comments


1 row where author_association = "NONE", issue = 449706080 and user = 2067093 sorted by updated_at descending

id: 657798184
html_url: https://github.com/pydata/xarray/issues/2995#issuecomment-657798184
issue_url: https://api.github.com/repos/pydata/xarray/issues/2995
node_id: MDEyOklzc3VlQ29tbWVudDY1Nzc5ODE4NA==
user: NowanIlfideme (2067093)
created_at: 2020-07-13T21:17:06Z
updated_at: 2020-07-13T21:17:06Z
author_association: NONE
body:

I ran into this issue; here's a simple workaround that seems to work:

```python
import netCDF4
import xarray as xr
from xarray.backends import NetCDF4DataStore
from xarray.backends.api import dump_to_store

def dataset_to_bytes(ds: xr.Dataset, name: str = "my-dataset") -> bytes:
    """Convert a dataset to netCDF4 bytes via an in-memory (diskless) file."""
    nc4_ds = netCDF4.Dataset(name, mode="w", diskless=True, memory=ds.nbytes)
    nc4_store = NetCDF4DataStore(nc4_ds)
    dump_to_store(ds, nc4_store)
    # Closing a diskless dataset opened with `memory=` returns a memoryview.
    res_mem = nc4_ds.close()
    res_bytes = res_mem.tobytes()
    return res_bytes
```

I tested this using the following:

```python
from io import BytesIO

fname = "REDACTED.nc"
ds = xr.load_dataset(fname)
ds_bytes = dataset_to_bytes(ds)
ds2 = xr.load_dataset(BytesIO(ds_bytes))

assert ds2.equals(ds) and all(ds2.attrs[k] == ds.attrs[k] for k in set(ds2.attrs).union(ds.attrs))
```

The assertion holds, but the file size on disk differs from the in-memory result. It's possible the two were written with different netCDF4 versions; I haven't had time to test that.
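
One quick way to see the size difference is to compare the length of the in-memory bytes with the size of the original file. A minimal sketch, reusing `fname` and `ds_bytes` from the test snippet above:

```python
import os

# Compare the in-memory netCDF4 bytes with the original file on disk.
print("in-memory bytes:", len(ds_bytes))
print("on-disk bytes:  ", os.path.getsize(fname))
```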

I tried using just `ds.to_netcdf()` but got the following error:

`ValueError: NetCDF 3 does not support type |S32`

That's because, when no path is given, it falls back to the `'scipy'` engine. It would be nice to have a non-hacky way to write netCDF4 files to byte streams. :smiley:
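
Since the underlying issue is about writing NetCDF4 files to Amazon S3, here is a minimal sketch of how the bytes from `dataset_to_bytes` could be pushed to S3. The use of `s3fs` and the bucket/key name are my assumptions, not part of the workaround above:

```python
import s3fs

# Hypothetical bucket/key; adjust to your own. s3fs is assumed to be installed
# and configured with valid AWS credentials.
fs = s3fs.S3FileSystem()
with fs.open("s3://my-bucket/datasets/my-dataset.nc", "wb") as f:
    f.write(dataset_to_bytes(ds))
```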

reactions:
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
issue: Remote writing NETCDF4 files to Amazon S3 (449706080)

CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [issue] INTEGER REFERENCES [issues]([id])
);
CREATE INDEX [idx_issue_comments_issue]
    ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user]
    ON [issue_comments] ([user]);