issue_comments

2 rows where issue = 495869721 sorted by updated_at descending

id: 533327468
html_url: https://github.com/pydata/xarray/issues/3323#issuecomment-533327468
issue_url: https://api.github.com/repos/pydata/xarray/issues/3323
node_id: MDEyOklzc3VlQ29tbWVudDUzMzMyNzQ2OA==
user: dcherian (2448579)
created_at: 2019-09-19T22:08:45Z
updated_at: 2019-09-19T22:08:45Z
author_association: MEMBER
body:
    I agree with not enforcing matching chunk sizes.

    I've added an ugly version of Dataset.unify_chunks in #3276. Feedback welcome!
reactions: {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0}
performed_via_github_app:
issue: arithmetic resulting in inconsistent chunks (495869721)
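
A minimal sketch of the workflow the comment above describes, assuming an xarray recent enough to ship Dataset.unify_chunks(); variable names, sizes, and chunk layouts are illustrative only, not taken from the issue:

import dask.array as da
import xarray as xr

ds = xr.Dataset(
    {
        "a": ("x", da.ones(12, chunks=4)),  # chunks along x: (4, 4, 4)
        "b": ("x", da.ones(12, chunks=3)),  # chunks along x: (3, 3, 3, 3)
    }
)

# xarray does not enforce matching chunk sizes, so building the Dataset succeeds,
# but asking for a single chunk layout per dimension fails:
try:
    ds.chunks
except ValueError as err:
    print(err)  # complains about inconsistent chunks along dimension 'x'

# unify_chunks() rechunks the dask-backed variables to a common layout along 'x'.
print(ds.unify_chunks().chunks)
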
id: 533190578
html_url: https://github.com/pydata/xarray/issues/3323#issuecomment-533190578
issue_url: https://api.github.com/repos/pydata/xarray/issues/3323
node_id: MDEyOklzc3VlQ29tbWVudDUzMzE5MDU3OA==
user: shoyer (1217238)
created_at: 2019-09-19T15:42:40Z
updated_at: 2019-09-19T15:42:40Z
author_association: MEMBER
body:
    I think dask array has some utility functions for "unifying chunks" that we might be able to use inside our map_blocks() function.

    Potentially we could also make Dataset.chunks more robust, e.g., have it return None for dimensions with inconsistent chunk sizes rather than raising an error.

    Alternatively, we could enforce matching chunksizes on all dask arrays inside a Dataset, as part of xarray's model of a Dataset as a collection of aligned arrays. But this seems unnecessarily limiting, and I am reluctant to add extra complexity to xarray's data model.
reactions: {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0}
performed_via_github_app:
issue: arithmetic resulting in inconsistent chunks (495869721)
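
The dask utility presumably meant in the comment above is unify_chunks from dask.array.core, which rechunks its inputs so that dimensions sharing an index symbol end up with identical chunk layouts. A hedged sketch, with shapes and chunk sizes invented for illustration:

import dask.array as da
from dask.array.core import unify_chunks

x = da.ones(12, chunks=4)   # chunks: (4, 4, 4)
y = da.ones(12, chunks=3)   # chunks: (3, 3, 3, 3)

# Index symbols ("i") declare which dimensions should be unified, blockwise-style.
chunkss, (x2, y2) = unify_chunks(x, "i", y, "i")

print(chunkss)                 # the common refinement of both layouts along "i"
print(x2.chunks == y2.chunks)  # True
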

Table schema
CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [issue] INTEGER REFERENCES [issues]([id])
);
CREATE INDEX [idx_issue_comments_issue]
    ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user]
    ON [issue_comments] ([user]);
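
For reference, the view above ("2 rows where issue = 495869721 sorted by updated_at descending") corresponds to a straightforward query against this schema. A sketch using Python's sqlite3, where the database filename ("github.db") is an assumption, not taken from this page:

import sqlite3

conn = sqlite3.connect("github.db")
conn.row_factory = sqlite3.Row  # access columns by name

rows = conn.execute(
    """
    SELECT id, [user], created_at, updated_at, author_association, body
    FROM issue_comments
    WHERE issue = 495869721
    ORDER BY updated_at DESC
    """
).fetchall()

for row in rows:
    print(row["id"], row["updated_at"], row["body"][:60])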