issue_comments

2 rows where issue = 355264812 and user = 306380 sorted by updated_at descending

Comment 417076999
  • html_url: https://github.com/pydata/xarray/issues/2389#issuecomment-417076999
  • issue_url: https://api.github.com/repos/pydata/xarray/issues/2389
  • node_id: MDEyOklzc3VlQ29tbWVudDQxNzA3Njk5OQ==
  • user: mrocklin (306380)
  • created_at: 2018-08-29T19:32:17Z
  • updated_at: 2018-08-29T19:32:17Z
  • author_association: MEMBER
  • issue: Large pickle overhead in ds.to_netcdf() involving dask.delayed functions (355264812)
  • reactions: {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0}
  • performed_via_github_app: (none)

Body:

I wouldn't expect this to sway things too much, but yes, there is a chance that that would happen.

Comment 417072024
  • html_url: https://github.com/pydata/xarray/issues/2389#issuecomment-417072024
  • issue_url: https://api.github.com/repos/pydata/xarray/issues/2389
  • node_id: MDEyOklzc3VlQ29tbWVudDQxNzA3MjAyNA==
  • user: mrocklin (306380)
  • created_at: 2018-08-29T19:15:10Z
  • updated_at: 2018-08-29T19:15:10Z
  • author_association: MEMBER
  • issue: Large pickle overhead in ds.to_netcdf() involving dask.delayed functions (355264812)
  • reactions: {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0}
  • performed_via_github_app: (none)

Body:

> It would be nice if dask had a way to consolidate the serialization of these objects, rather than separately serializing them in each task.

You can make it a separate task (often done by wrapping with dask.delayed) and then use that key within other objects. This does create a data dependency, though, which can make the graph somewhat more complex.

In normal use of Pickle these things are cached and reused. Unfortunately we can't do this because we're sending the tasks to different machines, each of which will need to deserialize independently.
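
The second comment describes consolidating serialization by wrapping a large, repeatedly used object in dask.delayed, so that it becomes its own task, is serialized once, and is referenced by key from the other tasks rather than being pickled into each of them. Below is a minimal sketch of that pattern; the names big_lookup, chunks, and process are hypothetical stand-ins, not code from the issue.

    from dask import delayed

    # Hypothetical large object and per-chunk work, standing in for the
    # objects discussed in the issue.
    big_lookup = {i: i ** 2 for i in range(100_000)}        # expensive to pickle
    chunks = [range(i, i + 10) for i in range(0, 100, 10)]

    def process(chunk, lookup):
        return sum(lookup[i] for i in chunk)

    # Naive graph: big_lookup is embedded in every task, so each task
    # carries (and ships) its own serialized copy of it.
    naive = [delayed(process)(chunk, big_lookup) for chunk in chunks]

    # Consolidated graph: the large object becomes its own task, is
    # serialized once, and the other tasks refer to it by key. This adds
    # a data dependency, which makes the graph somewhat more complex.
    shared = delayed(big_lookup)
    consolidated = [delayed(process)(chunk, shared) for chunk in chunks]

    print(delayed(sum)(naive).compute())
    print(delayed(sum)(consolidated).compute())

On a distributed cluster each worker still has to deserialize the shared object once, which is the limitation the last paragraph of the comment points out.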

CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [issue] INTEGER REFERENCES [issues]([id])
);
CREATE INDEX [idx_issue_comments_issue]
    ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user]
    ON [issue_comments] ([user]);
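
The 2-row view above is a filtered, sorted selection from this table. As a reference, here is a minimal sketch of the equivalent query using Python's sqlite3 module; the filename github.db is an assumption about where the Datasette data lives, not something stated on this page.

    import sqlite3

    # Assumed database file backing this Datasette instance.
    conn = sqlite3.connect("github.db")
    rows = conn.execute(
        """
        SELECT [id], [created_at], [updated_at], [author_association], [body]
        FROM [issue_comments]
        WHERE [issue] = 355264812 AND [user] = 306380
        ORDER BY [updated_at] DESC
        """
    ).fetchall()

    for comment_id, created, updated, association, body in rows:
        print(comment_id, created, association)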