issue_comments

6 rows where issue = 212177054 sorted by updated_at descending

user 4

  • gerritholl 2
  • shoyer 2
  • jhamman 1
  • stale[bot] 1

author_association 3

  • MEMBER 3
  • CONTRIBUTOR 2
  • NONE 1

issue 1

  • Encoding lost upon concatenation · 6
Columns: id, html_url, issue_url, node_id, user, created_at, updated_at (sort: descending), author_association, body, reactions, performed_via_github_app, issue

id: 461220070
html_url: https://github.com/pydata/xarray/issues/1297#issuecomment-461220070
issue_url: https://api.github.com/repos/pydata/xarray/issues/1297
node_id: MDEyOklzc3VlQ29tbWVudDQ2MTIyMDA3MA==
user: jhamman (2443309)
created_at: 2019-02-06T22:51:19Z
updated_at: 2019-02-06T22:51:19Z
author_association: MEMBER
body:
    closed via #1297
reactions: none (total_count 0; every individual reaction count 0)
issue: Encoding lost upon concatenation (212177054)

id: 461143792
html_url: https://github.com/pydata/xarray/issues/1297#issuecomment-461143792
issue_url: https://api.github.com/repos/pydata/xarray/issues/1297
node_id: MDEyOklzc3VlQ29tbWVudDQ2MTE0Mzc5Mg==
user: stale[bot] (26384082)
created_at: 2019-02-06T18:57:22Z
updated_at: 2019-02-06T18:57:22Z
author_association: NONE
body:
    In order to maintain a list of currently relevant issues, we mark issues as stale after a period of inactivity.

    If this issue remains relevant, please comment here or remove the stale label; otherwise it will be marked as closed automatically.
reactions: none (total_count 0; every individual reaction count 0)
issue: Encoding lost upon concatenation (212177054)

id: 285101536
html_url: https://github.com/pydata/xarray/issues/1297#issuecomment-285101536
issue_url: https://api.github.com/repos/pydata/xarray/issues/1297
node_id: MDEyOklzc3VlQ29tbWVudDI4NTEwMTUzNg==
user: shoyer (1217238)
created_at: 2017-03-08T17:03:31Z
updated_at: 2017-03-08T17:03:31Z
author_association: MEMBER
body:
    > Mine retains it always upon concatenation, but if you prefer we could add an argument keep_encoding in analogy with keep_attrs. In that case we'd want to add it wherever we have keep_attrs.

    No, I'm happy with your fix here. I'd rather keep encoding hidden away from the user-facing API as much as possible, because it's only relevant to a fraction of users (those reading and writing netCDF files).
reactions: none (total_count 0; every individual reaction count 0)
issue: Encoding lost upon concatenation (212177054)

id: 285087750
html_url: https://github.com/pydata/xarray/issues/1297#issuecomment-285087750
issue_url: https://api.github.com/repos/pydata/xarray/issues/1297
node_id: MDEyOklzc3VlQ29tbWVudDI4NTA4Nzc1MA==
user: gerritholl (500246)
created_at: 2017-03-08T16:19:10Z
updated_at: 2017-03-08T16:19:10Z
author_association: CONTRIBUTOR
body:
    Mine retains it always upon concatenation, but if you prefer we could add an argument keep_encoding in analogy with keep_attrs. In that case we'd want to add it wherever we have keep_attrs.
reactions: none (total_count 0; every individual reaction count 0)
issue: Encoding lost upon concatenation (212177054)

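As an illustration of the analogy drawn in the comment above: keep_attrs is an existing switch in xarray (per call on reductions, or globally via xr.set_options), whereas keep_encoding is only a proposal in this thread. A minimal sketch; the final line is purely hypothetical and not an actual xarray argument, and exact keep_attrs behavior varies across xarray versions.

import numpy as np
import xarray as xr

da = xr.DataArray(np.arange(3.0), dims="x", attrs={"units": "K"})

# Existing behavior: attrs survive a reduction only if asked for.
print(da.mean().attrs)                 # {} by default
print(da.mean(keep_attrs=True).attrs)  # {'units': 'K'}

# The same switch can be set globally.
with xr.set_options(keep_attrs=True):
    print(da.mean().attrs)             # {'units': 'K'}

# Hypothetical analogue proposed in the comment above (not an actual xarray API):
# xr.concat(objects, dim="x", keep_encoding=True)
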
id: 284911249
html_url: https://github.com/pydata/xarray/issues/1297#issuecomment-284911249
issue_url: https://api.github.com/repos/pydata/xarray/issues/1297
node_id: MDEyOklzc3VlQ29tbWVudDI4NDkxMTI0OQ==
user: shoyer (1217238)
created_at: 2017-03-08T00:56:23Z
updated_at: 2017-03-08T00:56:23Z
author_association: MEMBER
body:
    I guess I'm not opposed to this per se, but preserving encoding in operations is even harder than attrs because it's possible for encoding to no longer be valid. So currently most xarray operations simply delete encoding. That said, I can see some reasons for keeping it when concatenating (e.g., because concat is a fundamental part of open_mfdataset).
reactions: none (total_count 0; every individual reaction count 0)
issue: Encoding lost upon concatenation (212177054)

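A minimal sketch of why carried-over encoding can stop being valid, the concern raised in the comment above. The values and encoding keys are made up for illustration; the only behavior taken from xarray itself is that ordinary operations return results with empty encoding.

import numpy as np
import xarray as xr

temps = xr.DataArray(np.array([250.0, 300.0]), dims="x")
# On-disk packing chosen for absolute temperatures in roughly the 200-300 range.
temps.encoding = {"dtype": "int16", "scale_factor": 0.01, "add_offset": 200.0}

scaled = temps * 1000.0
# Reusing the old int16 packing here would overflow: (250000 - 200) / 0.01 does
# not fit in int16, so silently carrying the encoding forward would be unsafe.
print(scaled.encoding)  # {} -- the operation drops encoding instead
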
id: 284751866
html_url: https://github.com/pydata/xarray/issues/1297#issuecomment-284751866
issue_url: https://api.github.com/repos/pydata/xarray/issues/1297
node_id: MDEyOklzc3VlQ29tbWVudDI4NDc1MTg2Ng==
user: gerritholl (500246)
created_at: 2017-03-07T15:21:25Z
updated_at: 2017-03-07T15:21:25Z
author_association: CONTRIBUTOR
body:
    This is more serious when we are concatenating datasets, because then the encoding is lost for each contained data-array…
reactions: none (total_count 0; every individual reaction count 0)
issue: Encoding lost upon concatenation (212177054)

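A minimal sketch of the behavior reported in the comment above: encoding set on the inputs is not carried over by concatenation. Variable names and encoding keys are illustrative; whether concat drops or keeps encoding depends on the xarray version, and this issue tracks changing that behavior.

import numpy as np
import xarray as xr

a = xr.DataArray(np.arange(3.0), dims="x", name="t")
b = xr.DataArray(np.arange(3.0, 6.0), dims="x", name="t")
a.encoding = {"dtype": "int16", "scale_factor": 0.1, "_FillValue": -999}
b.encoding = dict(a.encoding)

combined = xr.concat([a, b], dim="x")
print(a.encoding)         # encoding is still attached to the inputs
print(combined.encoding)  # {} on versions where concat discards encoding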

CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [issue] INTEGER REFERENCES [issues]([id])
);
CREATE INDEX [idx_issue_comments_issue]
    ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user]
    ON [issue_comments] ([user]);
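For reference, the row selection described at the top of this page ("6 rows where issue = 212177054 sorted by updated_at descending") corresponds to a simple query against the schema above. A minimal sketch using Python's standard sqlite3 module; the database filename github.db is an assumption, not something stated on this page.

import sqlite3

conn = sqlite3.connect("github.db")  # assumed filename for the SQLite database
conn.row_factory = sqlite3.Row
rows = conn.execute(
    """
    SELECT id, user, author_association, updated_at, body
    FROM issue_comments
    WHERE issue = ?
    ORDER BY updated_at DESC
    """,
    (212177054,),
).fetchall()
for row in rows:
    print(row["id"], row["author_association"], row["updated_at"])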