issue_comments


4 comments where issue = 473000845 (".reduce() on a DataArray with Dask distributed immediately executes the preceding portions of the computational graph"), sorted by updated_at descending


dcherian (MEMBER) · 2019-10-16T03:57:15Z
https://github.com/pydata/xarray/issues/3161#issuecomment-542501513

> we would probably just remove the argument and always default to allow_lazy=True.

This seems best.
shoyer (MEMBER) · 2019-10-15T21:46:58Z
https://github.com/pydata/xarray/issues/3161#issuecomment-542418950

I don't remember exactly why I added the allow_lazy argument. I think my original concern was backwards compatibility (when we were first adding dask!) with uses of reduce that expected to be applied to NumPy arrays, not dask arrays.

We do something similar in apply_ufunc with dask='forbidden', but rather than automatically coercing to NumPy arrays we raise an error if a dask array is encountered. This seems much more sensible.

At this point, I think we would probably just remove the argument and always default to allow_lazy=True. Or possibly allow_lazy=False should result in an error instead of automatic coercion, and we should expose/document it as a public argument.
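The two behaviors contrasted above can be sketched with a toy model: silently coercing lazy data (which eagerly executes the graph, as reduce does today with allow_lazy=False) versus raising an error on lazy input (as apply_ufunc does with dask='forbidden'). The class and function names below are illustrative stand-ins, not xarray's actual implementation.

```python
class LazyArray:
    """Stand-in for a dask array: computing it executes the whole graph."""
    def __init__(self, data):
        self.data = data
        self.computed = False

    def compute(self):
        self.computed = True  # in dask, this would run the task graph
        return self.data


def reduce_coercing(arr, func, allow_lazy=False):
    """Model of today's reduce: allow_lazy=False silently computes lazy input."""
    if isinstance(arr, LazyArray) and not allow_lazy:
        arr = arr.compute()  # triggers immediate execution of the graph
    if isinstance(arr, LazyArray):
        arr = arr.data  # pretend func is dask-aware and can stay lazy
    return func(arr)


def reduce_forbidding(arr, func):
    """Model of dask='forbidden': raise instead of silently computing."""
    if isinstance(arr, LazyArray):
        raise ValueError(
            "lazy input encountered; computing it here would eagerly "
            "execute the preceding graph"
        )
    return func(arr)


lazy = LazyArray([1, 2, 3])
print(reduce_coercing(lazy, sum))  # 6
print(lazy.computed)               # True: the graph ran eagerly
```

With allow_lazy=True the same call leaves `lazy.computed` as False, which is why the comments above suggest making that the default (or raising, as in the forbidding variant).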

dcherian (MEMBER) · 2019-10-15T15:52:43Z
https://github.com/pydata/xarray/issues/3161#issuecomment-542281567

This ends up happening because allow_lazy defaults to False:

https://github.com/pydata/xarray/blob/3f9069ba376afa35c0ca83b09a6126dd24cb8127/xarray/core/variable.py#L1412-L1460

I don't see why we need this kwarg, or why it shouldn't be True by default. The kwarg is undocumented in dataset.py https://github.com/pydata/xarray/blob/3f9069ba376afa35c0ca83b09a6126dd24cb8127/xarray/core/dataset.py#L3997-L4029

and not visible at all in dataarray.py https://github.com/pydata/xarray/blob/3f9069ba376afa35c0ca83b09a6126dd24cb8127/xarray/core/dataarray.py#L2106-L2139

shoyer (MEMBER) · 2019-07-26T00:05:35Z
https://github.com/pydata/xarray/issues/3161#issuecomment-515258121

Yikes, this is pretty bad!

Thanks for the clear code to reproduce it.

Reactions: 👍 1

Advanced export

JSON shape: default, array, newline-delimited, object

CSV options:

CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [issue] INTEGER REFERENCES [issues]([id])
);
CREATE INDEX [idx_issue_comments_issue]
    ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user]
    ON [issue_comments] ([user]);
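As a sketch, the schema above can be built and queried with Python's built-in sqlite3 module; the sample row below reuses values from this page, and the REFERENCES clauses are dropped since the users and issues tables are not created here.

```python
import sqlite3

# In-memory database with the issue_comments schema shown above
# (foreign-key REFERENCES omitted because users/issues don't exist here).
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER,
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [issue] INTEGER
);
CREATE INDEX [idx_issue_comments_issue] ON [issue_comments] ([issue]);
""")
conn.execute(
    "INSERT INTO issue_comments (id, user, updated_at, body, issue) "
    "VALUES (?, ?, ?, ?, ?)",
    (542501513, 2448579, "2019-10-16T03:57:15Z", "This seems best.", 473000845),
)

# Reproduce this page's query: comments for one issue, newest update first.
rows = conn.execute(
    "SELECT id, body FROM issue_comments "
    "WHERE issue = ? ORDER BY updated_at DESC",
    (473000845,),
).fetchall()
print(rows)  # [(542501513, 'This seems best.')]
```

The index on [issue] is what makes the `WHERE issue = ?` filter cheap as the comments table grows.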
Powered by Datasette · About: xarray-datasette