issue_comments

2 rows where issue = 473000845 and user = 1217238 sorted by updated_at descending

Row 1 (comment id 542418950)
  html_url: https://github.com/pydata/xarray/issues/3161#issuecomment-542418950
  issue_url: https://api.github.com/repos/pydata/xarray/issues/3161
  node_id: MDEyOklzc3VlQ29tbWVudDU0MjQxODk1MA==
  user: shoyer (1217238)
  created_at: 2019-10-15T21:46:58Z
  updated_at: 2019-10-15T21:46:58Z
  author_association: MEMBER
  body:

I don't remember exactly why I added the allow_lazy argument. I think my original concern was backwards compatibility (when we were first adding dask!) with uses of reduce that expected to be applied to NumPy arrays, not dask arrays.

We do something similar in apply_ufunc with dask='forbidden', but rather than automatically coercing to NumPy arrays we raise an error if a dask array is encountered. This seems much more sensible.

At this point, I think we would probably just remove the argument and always default to allow_lazy=True. Or possibly allow_lazy=False should result in an error instead of automatic coercion, and we should expose/document it as a public argument.

  reactions:
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  performed_via_github_app: (none)
  issue: .reduce() on a DataArray with Dask distributed immediately executes the preceding portions of the computational graph (473000845)
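
The comment above contrasts apply_ufunc's dask="forbidden" mode with the silent NumPy coercion behind allow_lazy. A minimal sketch of that contrast, assuming xarray and dask are installed; the array and functions are illustrative and not taken from the issue thread:

import numpy as np
import xarray as xr

# A small dask-backed DataArray (requires dask to be installed).
da = xr.DataArray(np.arange(6.0).reshape(2, 3), dims=("x", "y")).chunk({"x": 1})

# apply_ufunc with dask="forbidden" (the default) raises instead of
# silently computing when it encounters a dask array.
try:
    xr.apply_ufunc(np.sqrt, da, dask="forbidden")
except ValueError as err:
    print("apply_ufunc refused the dask array:", err)

# By contrast, reduce() in the affected xarray versions coerced the dask
# array to NumPy (i.e. computed the graph) before applying the function;
# later releases keep this lazy by default.
mean_over_x = da.reduce(np.mean, dim="x")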

Row 2 (comment id 515258121)
  html_url: https://github.com/pydata/xarray/issues/3161#issuecomment-515258121
  issue_url: https://api.github.com/repos/pydata/xarray/issues/3161
  node_id: MDEyOklzc3VlQ29tbWVudDUxNTI1ODEyMQ==
  user: shoyer (1217238)
  created_at: 2019-07-26T00:05:35Z
  updated_at: 2019-07-26T00:05:35Z
  author_association: MEMBER
  body:

Yikes, this is pretty bad!

Thanks for the clear code to reproduce it.

  reactions:
{
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  performed_via_github_app: (none)
  issue: .reduce() on a DataArray with Dask distributed immediately executes the preceding portions of the computational graph (473000845)

CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [issue] INTEGER REFERENCES [issues]([id])
);
CREATE INDEX [idx_issue_comments_issue]
    ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user]
    ON [issue_comments] ([user]);
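
For reference, a sketch of how the query shown at the top of this page (issue = 473000845, user = 1217238, ordered by updated_at descending) could be run against this schema with Python's sqlite3 module; the database filename is an assumption, since it is not given on this page:

import sqlite3

# Filename is an assumption; substitute the actual database file.
conn = sqlite3.connect("github.db")
rows = conn.execute(
    """
    SELECT [id], [user], [created_at], [updated_at], [author_association], [body]
    FROM [issue_comments]
    WHERE [issue] = 473000845 AND [user] = 1217238
    ORDER BY [updated_at] DESC
    """
).fetchall()
for row in rows:
    print(row)
conn.close()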