
issue_comments


2 rows where author_association = "MEMBER" and issue = 460254571 sorted by updated_at descending



id: 507055108
html_url: https://github.com/pydata/xarray/issues/3041#issuecomment-507055108
issue_url: https://api.github.com/repos/pydata/xarray/issues/3041
node_id: MDEyOklzc3VlQ29tbWVudDUwNzA1NTEwOA==
user: shoyer (1217238)
created_at: 2019-06-30T17:58:44Z
updated_at: 2019-06-30T17:58:44Z
author_association: MEMBER

I did a little bit of testing. It appears that in the case of the double call to temp.compute(), temp.__del__ is never called. Apparently in CPython this happens if you have a reference cycle. I'm not quite sure what the source of the reference cycle is here (or why it only appears when calling compute twice), but that seems to be the likely culprit.

reactions:
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
issue: nc file locked by xarray after (double) da.compute() call (460254571)
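The reference-cycle behaviour described in the comment above is easy to reproduce with plain CPython objects. A minimal sketch (hypothetical classes, not xarray's actual ones): an object caught in a cycle is not reclaimed when its last name is deleted, so its `__del__` only runs once the cyclic garbage collector gets to it.

```python
import gc

class Resource:
    """Stand-in for an object whose __del__ releases a file handle."""
    deleted = False

    def __del__(self):
        Resource.deleted = True

class Node:
    def __init__(self):
        self.resource = Resource()
        self.ref = self  # reference cycle: the object points at itself

gc.disable()  # make the demo deterministic: no automatic collections
node = Node()
del node
# The cycle keeps node (and its Resource) alive, so __del__ has not run:
print(Resource.deleted)  # False

gc.collect()  # the cyclic collector finally reclaims the cycle
print(Resource.deleted)  # True
```

This is consistent with the comment: if something in the double-`compute()` path creates a cycle involving the file object, `__del__` (and hence the file close) is deferred until a collection happens rather than running promptly.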
id: 505494480
html_url: https://github.com/pydata/xarray/issues/3041#issuecomment-505494480
issue_url: https://api.github.com/repos/pydata/xarray/issues/3041
node_id: MDEyOklzc3VlQ29tbWVudDUwNTQ5NDQ4MA==
user: shoyer (1217238)
created_at: 2019-06-25T15:23:39Z
updated_at: 2019-06-25T15:23:39Z
author_association: MEMBER

Xarray does its best to automatically close files when they are no longer necessary, but this is pretty challenging to do in general. You are generally best off either explicitly closing files or not rewriting the same filenames.

In this particular case, I suspect that the second call to temp.compute() means that the file manager opened by xr.open_dataset('test.nc') can't get garbage collected, for some reason. I'm not entirely sure what's going on, but see these lines in CachingFileManager if you want to go into the gory details: https://github.com/pydata/xarray/blob/76adf1307cb15d63521b40408a569258bacd3623/xarray/backends/file_manager.py#L190-L218

reactions:
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
issue: nc file locked by xarray after (double) da.compute() call (460254571)
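The advice above, close files explicitly rather than relying on garbage collection, can be sketched with plain file handles; xarray's `Dataset.close()` and `with xr.open_dataset(...)` follow the same pattern. This uses a stdlib scratch file rather than a real netCDF file:

```python
import os
import tempfile

# Create a scratch file to stand in for 'test.nc'.
fd, path = tempfile.mkstemp()
os.close(fd)

# Relying on garbage collection to close a handle is fragile (as the
# comment above notes); closing explicitly is deterministic.
f = open(path, "w")
f.write("data")
f.close()          # explicit close: the handle is released right here
assert f.closed

# The same guarantee, written as a context manager:
with open(path, "w") as f:
    f.write("data")
assert f.closed    # closed as soon as the with-block exits

os.remove(path)
```

With explicit closing, nothing depends on when (or whether) the interpreter decides to reclaim the object, which sidesteps the reference-cycle problem entirely.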


CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [issue] INTEGER REFERENCES [issues]([id])
);
CREATE INDEX [idx_issue_comments_issue]
    ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user]
    ON [issue_comments] ([user]);
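The schema above can be exercised directly with Python's built-in `sqlite3`. A sketch that rebuilds the table (foreign-key clauses omitted, since the referenced `users` and `issues` tables aren't shown here), inserts the two rows from this page with bodies elided, and runs the query the page describes:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER,
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [issue] INTEGER
);
CREATE INDEX [idx_issue_comments_issue] ON [issue_comments] ([issue]);
""")

# The two rows shown on this page (other columns elided):
rows = [
    (507055108, "2019-06-30T17:58:44Z", "MEMBER", 460254571),
    (505494480, "2019-06-25T15:23:39Z", "MEMBER", 460254571),
]
conn.executemany(
    "INSERT INTO issue_comments (id, updated_at, author_association, issue)"
    " VALUES (?, ?, ?, ?)",
    rows,
)

# "2 rows where author_association = 'MEMBER' and issue = 460254571
#  sorted by updated_at descending":
ids = [r[0] for r in conn.execute(
    "SELECT id FROM issue_comments"
    " WHERE author_association = ? AND issue = ?"
    " ORDER BY updated_at DESC",
    ("MEMBER", 460254571),
)]
print(ids)  # [507055108, 505494480]
```

ISO-8601 timestamps stored as TEXT sort correctly with plain string comparison, which is why `ORDER BY updated_at DESC` works here without any date parsing.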
Powered by Datasette · Queries took 16.465ms · About: xarray-datasette