issue_comments


where issue = 259935100 and user = 1217238 sorted by updated_at descending

id: 334954046
html_url: https://github.com/pydata/xarray/issues/1586#issuecomment-334954046
issue_url: https://api.github.com/repos/pydata/xarray/issues/1586
node_id: MDEyOklzc3VlQ29tbWVudDMzNDk1NDA0Ng==
user: shoyer (1217238)
created_at: 2017-10-07T17:52:22Z
updated_at: 2017-10-07T17:52:22Z
author_association: MEMBER
body:

I'm OK copying encoding, but we do still need to figure out general rules for propagating it.

reactions:

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
issue: Dataset.copy() drops encoding (259935100)
id: 331754521
html_url: https://github.com/pydata/xarray/issues/1586#issuecomment-331754521
issue_url: https://api.github.com/repos/pydata/xarray/issues/1586
node_id: MDEyOklzc3VlQ29tbWVudDMzMTc1NDUyMQ==
user: shoyer (1217238)
created_at: 2017-09-25T01:20:08Z
updated_at: 2017-09-25T01:20:08Z
author_association: MEMBER
body:

I think this was intentional at one point but, to be honest, we never carefully defined what the semantics for preserving encoding are.

My original thought (along the lines of some of my comments in https://github.com/pydata/xarray/issues/1297), was that we should not propagate encoding in cases where it might no longer be valid, so it should be dropped from most operations. Essentially, encoding should only stay with original files loaded from disk. Hence why it wasn't copied in .copy().

That said, I can see why this rule would be confusing and we haven't done a good job of enforcing it. Possibly a better policy would be "encoding is copied whenever attrs is copied".

reactions:

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
issue: Dataset.copy() drops encoding (259935100)
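The policy shoyer floats above ("encoding is copied whenever attrs is copied") can be sketched with a toy stand-in object. The `Dataset` class below is hypothetical, not xarray's real implementation; it only illustrates the proposed rule: `.copy()` preserves `encoding` alongside `attrs`, while value-changing operations drop it because on-disk settings may no longer apply.

```python
# Toy sketch of the "copy encoding whenever attrs is copied" policy
# discussed above. This Dataset class is a hypothetical stand-in,
# NOT xarray's actual implementation.

class Dataset:
    def __init__(self, data, attrs=None, encoding=None):
        self.data = dict(data)
        self.attrs = dict(attrs or {})
        self.encoding = dict(encoding or {})

    def copy(self):
        # Proposed rule: attrs survive .copy(), so encoding does too.
        return Dataset(self.data, attrs=self.attrs, encoding=self.encoding)

    def mean(self):
        # A value-changing operation keeps attrs but drops encoding,
        # since settings like scale_factor or an on-disk dtype may no
        # longer be valid for the derived result.
        result = sum(self.data.values()) / len(self.data)
        return Dataset({"mean": result}, attrs=self.attrs, encoding={})

ds = Dataset({"t": 1.0, "u": 3.0},
             attrs={"units": "K"},
             encoding={"dtype": "int16", "scale_factor": 0.01})
assert ds.copy().encoding == ds.encoding  # copy keeps encoding
assert ds.mean().encoding == {}           # derived result drops it
```

The `mean` behavior encodes the earlier rationale in the thread: encoding describes how the original file was stored, so it should not silently propagate to results it may no longer describe.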


CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [issue] INTEGER REFERENCES [issues]([id])
);
CREATE INDEX [idx_issue_comments_issue]
    ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user]
    ON [issue_comments] ([user]);
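As a sketch, the filter shown at the top of this page (issue = 259935100, user = 1217238, sorted by updated_at descending) can be reproduced against this schema with Python's stdlib sqlite3. The inserted rows are abbreviated stand-ins for the two comments above, and the foreign-key columns are omitted for brevity:

```python
import sqlite3

# Cut-down version of the issue_comments schema shown above.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE issue_comments (
        id INTEGER PRIMARY KEY,
        [user] INTEGER,
        updated_at TEXT,
        body TEXT,
        issue INTEGER
    )
""")
conn.executemany(
    "INSERT INTO issue_comments VALUES (?, ?, ?, ?, ?)",
    [
        (334954046, 1217238, "2017-10-07T17:52:22Z",
         "I'm OK copying encoding...", 259935100),
        (331754521, 1217238, "2017-09-25T01:20:08Z",
         "I think this was intentional at one point...", 259935100),
    ],
)

# The page's filter: issue = 259935100 AND user = 1217238,
# ordered by updated_at descending (ISO 8601 strings sort correctly).
rows = conn.execute(
    "SELECT id FROM issue_comments "
    "WHERE issue = ? AND [user] = ? "
    "ORDER BY updated_at DESC",
    (259935100, 1217238),
).fetchall()
print([r[0] for r in rows])  # most recently updated comment first
```

Because the timestamps are stored as ISO 8601 TEXT, lexicographic `ORDER BY` matches chronological order, which is why the schema above can get away without a date type.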
Powered by Datasette · Queries took 3186.673ms · About: xarray-datasette