issue_comments

4 rows where author_association = "MEMBER", issue = 286542795 and user = 1217238 sorted by updated_at descending

id: 389296294
html_url: https://github.com/pydata/xarray/pull/1811#issuecomment-389296294
issue_url: https://api.github.com/repos/pydata/xarray/issues/1811
node_id: MDEyOklzc3VlQ29tbWVudDM4OTI5NjI5NA==
user: shoyer (1217238)
created_at: 2018-05-15T20:06:58Z
updated_at: 2018-05-15T20:06:58Z
author_association: MEMBER
issue: WIP: Compute==False for to_zarr and to_netcdf (286542795)
reactions: {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0}
body:

(assuming tests pass)
id: 389296253
html_url: https://github.com/pydata/xarray/pull/1811#issuecomment-389296253
issue_url: https://api.github.com/repos/pydata/xarray/issues/1811
node_id: MDEyOklzc3VlQ29tbWVudDM4OTI5NjI1Mw==
user: shoyer (1217238)
created_at: 2018-05-15T20:06:49Z
updated_at: 2018-05-15T20:06:49Z
author_association: MEMBER
issue: WIP: Compute==False for to_zarr and to_netcdf (286542795)
reactions: {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0}
body:

> Yes, just tried again. I'm open to ideas but would also like to move this issue along first, if possible.

Sounds good, let's go ahead and merge this!
id: 373092099
html_url: https://github.com/pydata/xarray/pull/1811#issuecomment-373092099
issue_url: https://api.github.com/repos/pydata/xarray/issues/1811
node_id: MDEyOklzc3VlQ29tbWVudDM3MzA5MjA5OQ==
user: shoyer (1217238)
created_at: 2018-03-14T16:45:25Z
updated_at: 2018-03-14T16:45:25Z
author_association: MEMBER
issue: WIP: Compute==False for to_zarr and to_netcdf (286542795)
reactions: {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0}
body:

To elaborate a little bit on my last comment (which I submitted very quickly when my bus was arriving), the way to make dependent tasks with dask.delayed is to add dummy function arguments, e.g.,

```python
def finalize_store(store, write):
    del write  # unused
    store.sync()
    store.close()

write = dask.array.store(..., compute=False)
write_and_close = dask.delayed(finalize_store)(store, write)
write_and_close.compute()  # writes and syncs
```

Potentially some of this logic could get moved into ArrayWriter.sync()
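The snippet in the comment above is intentionally elided (the `dask.array.store(...)` call is left incomplete). As a minimal, self-contained illustration of the underlying trick, the toy sketch below, which is an assumed example rather than xarray code, shows that passing one delayed result as an unused argument to another delayed call is enough to make dask run the two in order:

```python
# Toy example (assumption, not xarray code) of the dummy-argument trick:
# the `write_result` parameter is never used, but passing the delayed `w`
# into `finalize` makes dask schedule `finalize` only after `write` has run.
import dask


def write():
    print("write")
    return "written"


def finalize(write_result):
    del write_result  # unused; exists only to create the dependency
    print("sync + close")


w = dask.delayed(write)()
done = dask.delayed(finalize)(w)
done.compute()  # prints "write" first, then "sync + close"
```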
id: 373072825
html_url: https://github.com/pydata/xarray/pull/1811#issuecomment-373072825
issue_url: https://api.github.com/repos/pydata/xarray/issues/1811
node_id: MDEyOklzc3VlQ29tbWVudDM3MzA3MjgyNQ==
user: shoyer (1217238)
created_at: 2018-03-14T15:54:43Z
updated_at: 2018-03-14T15:54:43Z
author_association: MEMBER
issue: WIP: Compute==False for to_zarr and to_netcdf (286542795)
reactions: {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0}
body:

One potential issue here is the lack of clean-up (which may be unnecessary if autoclose=True). You want to construct a single dask graph with a structure like the following:

- Tasks for writing all array data (i.e., from ArrayWriter).
- Tasks for calling sync() and close() on each datastore object. These should depend on the appropriate writing tasks.
- A single task that depends on writing all datastores. This is what the delayed object returned by save_mfdataset should return.
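For readers following the discussion, here is a minimal sketch of the graph structure the comment above describes. It is not xarray's actual save_mfdataset implementation; the `FakeStore` class and the in-memory NumPy targets are stand-ins introduced only to make the dependency shape runnable: per-store write tasks, a sync()/close() task per store that depends on those writes, and one final task that depends on all of them.

```python
# Sketch (assumption, not xarray code) of the three-tier graph described above.
import dask
import dask.array as da
import numpy as np


class FakeStore:
    """Hypothetical stand-in for a datastore with sync()/close() methods."""
    def __init__(self, name):
        self.name = name

    def sync(self):
        print(f"{self.name}: sync")

    def close(self):
        print(f"{self.name}: close")


def finalize_store(store, writes):
    del writes  # dummy argument: forces the dependency on the write tasks
    store.sync()
    store.close()


def finalize_all(finalizes):
    # single task depending on every per-store finalize task
    return None


stores = [FakeStore("store-0"), FakeStore("store-1")]
targets = [np.empty((4, 4)) for _ in stores]
sources = [da.ones((4, 4), chunks=(2, 2)) for _ in stores]

# tasks for writing all array data, one delayed write per datastore
writes = [da.store(src, tgt, compute=False) for src, tgt in zip(sources, targets)]

# tasks for calling sync() and close() on each datastore, depending on its writes
finalizes = [dask.delayed(finalize_store)(s, w) for s, w in zip(stores, writes)]

# a single task that depends on writing (and closing) all datastores;
# roughly the role of the delayed object returned by save_mfdataset(compute=False)
combined = dask.delayed(finalize_all)(finalizes)
combined.compute()
```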

CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [issue] INTEGER REFERENCES [issues]([id])
);
CREATE INDEX [idx_issue_comments_issue]
    ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user]
    ON [issue_comments] ([user]);
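As a hedged example of using this schema directly (the github.db filename is an assumption about where the underlying SQLite database lives), the filter shown in the description at the top of this page can be reproduced with Python's built-in sqlite3 module:

```python
# Assumed setup: a local SQLite file "github.db" containing the issue_comments
# table defined by the schema above.
import sqlite3

conn = sqlite3.connect("github.db")  # hypothetical path to the database
rows = conn.execute(
    """
    SELECT id, user, created_at, updated_at, author_association, body
    FROM issue_comments
    WHERE author_association = 'MEMBER' AND issue = 286542795 AND user = 1217238
    ORDER BY updated_at DESC
    """
).fetchall()
for row in rows:
    print(row)
conn.close()
```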