
issue_comments


6 rows where author_association = "COLLABORATOR" and issue = 1392878100 ("New deep copy behavior in 2022.9.0 causes maximum recursion error"), sorted by updated_at descending

1266649409 · headtr1ck (43316012) · COLLABORATOR · 2022-10-04T09:18:25Z
https://github.com/pydata/xarray/issues/7111#issuecomment-1266649409

I think the behavior of deepcopy in #7112 is correct. If you really want to prevent the ancillary_variables attr from being deep-copied as well, you can try to add it to the memo dict in deepcopy, e.g.:

```python
from copy import deepcopy

memo = {id(da.attrs["ancillary_variables"]): da.attrs["ancillary_variables"]}
da_new = deepcopy(da, memo)
```

(untested!)
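As a minimal, self-contained illustration of that memo trick (the DataArray construction here is made up, and it assumes the memo dict is threaded through to the attrs copy, which is what #7112 addresses):

```python
import numpy as np
import xarray as xr
from copy import deepcopy

da = xr.DataArray(np.arange(3), dims="x", name="data")
anc = xr.DataArray(np.zeros(3), dims="x", name="data_qc")
da.attrs["ancillary_variables"] = anc

# Pre-seeding the memo with id(obj) -> obj makes deepcopy treat the
# ancillary variable as "already copied" and reuse the original object.
memo = {id(da.attrs["ancillary_variables"]): da.attrs["ancillary_variables"]}
da_new = deepcopy(da, memo)

assert da_new.attrs["ancillary_variables"] is anc  # shared, not copied
```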

1264398329 · headtr1ck (43316012) · COLLABORATOR · 2022-10-01T15:27:37Z
https://github.com/pydata/xarray/issues/7111#issuecomment-1264398329

I added a PR that fixes the broken reprs and deepcopies. The other issues are not addressed yet.

1264335676 · headtr1ck (43316012) · COLLABORATOR · 2022-10-01T11:28:53Z
https://github.com/pydata/xarray/issues/7111#issuecomment-1264335676

Ok, even xarray.testing.assert_identical fails with recursive definitions. Are we sure that it is a good idea to support this?
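A rough repro of the recursive definition in question (hypothetical construction; the failure is the one reported in this thread):

```python
import numpy as np
import xarray as xr

da = xr.DataArray(np.arange(3), dims="x", name="data")
da.attrs["ancillary_variables"] = da  # the array now references itself

# Anything that walks attrs without a shared memo dict (deep copy,
# repr, comparison) can recurse forever on this cycle:
xr.testing.assert_identical(da, da)  # RecursionError, as reported here
```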

1264335114 · headtr1ck (43316012) · COLLABORATOR · 2022-10-01T11:25:42Z
https://github.com/pydata/xarray/issues/7111#issuecomment-1264335114

I will set up a PR for that. Another issue has arisen: the repr is also broken for recursive data. With your example, shouldn't Python also raise a RecursionError when looking at this data?

1264272446 · headtr1ck (43316012) · COLLABORATOR · created 2022-10-01T07:08:53Z, updated 2022-10-01T08:35:24Z
https://github.com/pydata/xarray/issues/7111#issuecomment-1264272446

I think our implementations of copy(deep=True) and __deepcopy__ are inverted: the former should call the latter, not the other way around, so that the memo dict can be passed through.

This will lead to a bit of duplicated code between __copy__ and __deepcopy__, but it would be the correct approach.
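A minimal sketch of that delegation direction, using a hypothetical class rather than xarray's actual code: the public copy(deep=True) defers to the copy protocol, and __deepcopy__ registers itself in the memo before recursing, which is what breaks reference cycles:

```python
import copy

class DataWithAttrs:
    """Hypothetical stand-in for an xarray object carrying attrs."""

    def __init__(self, data, attrs=None):
        self.data = data
        self.attrs = {} if attrs is None else attrs

    def copy(self, deep=True):
        # The public method delegates to the copy protocol, not vice versa.
        return copy.deepcopy(self) if deep else copy.copy(self)

    def __copy__(self):
        return type(self)(self.data, dict(self.attrs))

    def __deepcopy__(self, memo):
        cls = type(self)
        new = cls.__new__(cls)
        memo[id(self)] = new  # register *before* recursing: this breaks cycles
        new.data = copy.deepcopy(self.data, memo)
        new.attrs = copy.deepcopy(self.attrs, memo)
        return new


obj = DataWithAttrs([1, 2, 3])
obj.attrs["self"] = obj          # self-referential attrs
dup = obj.copy(deep=True)
assert dup.attrs["self"] is dup  # cycle preserved, no RecursionError
```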

1263956728 · headtr1ck (43316012) · COLLABORATOR · 2022-09-30T19:45:57Z
https://github.com/pydata/xarray/issues/7111#issuecomment-1263956728

I basically copied the behavior of Dataset.copy, which should already show this problem. In principle we are doing a `new_attrs = copy.deepcopy(attrs)`.

I would claim that the new behavior is correct, but maybe other devs can confirm this.

Coming from netCDF, it does not really make sense to put complex objects in attrs, but I guess for in-memory-only use it works.
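To see why a plain `new_attrs = copy.deepcopy(attrs)` recurses on self-referential attrs, here is a hypothetical minimal model of the old structure, where each copy starts a fresh deepcopy and therefore a fresh memo:

```python
import copy

class Box:
    """Hypothetical object that deep-copies its attrs without sharing memo."""

    def __init__(self):
        self.attrs = {}

    def copy(self, deep=True):
        new = Box()
        # A fresh deepcopy call each time: no memo is shared across levels.
        new.attrs = copy.deepcopy(self.attrs) if deep else dict(self.attrs)
        return new

    def __deepcopy__(self, memo):
        # Delegating to copy() drops the memo; this is the inverted
        # direction criticized earlier in this thread.
        return self.copy(deep=True)


box = Box()
box.attrs["self"] = box
box.copy(deep=True)  # RecursionError: every level restarts with an empty memo
```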


Table schema:

CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [issue] INTEGER REFERENCES [issues]([id])
);
CREATE INDEX [idx_issue_comments_issue]
    ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user]
    ON [issue_comments] ([user]);