issue_comments
1 row where issue = 100055216 and user = 1217238 sorted by updated_at descending
id: 228412831
html_url: https://github.com/pydata/xarray/pull/524#issuecomment-228412831
issue_url: https://api.github.com/repos/pydata/xarray/issues/524
node_id: MDEyOklzc3VlQ29tbWVudDIyODQxMjgzMQ==
user: shoyer (1217238)
created_at: 2016-06-24T17:45:07Z
updated_at: 2016-06-24T17:45:07Z
author_association: MEMBER
body: Something like this is still worth exploring (the LRU cache), but this isn't quite the right solution. So I'm closing this PR, just to clean up the list of outstanding PRs :).
reactions: { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
performed_via_github_app: (empty)
issue: Option for closing files with scipy backend (100055216)
CREATE TABLE [issue_comments] (
    [html_url] TEXT,
    [issue_url] TEXT,
    [id] INTEGER PRIMARY KEY,
    [node_id] TEXT,
    [user] INTEGER REFERENCES [users]([id]),
    [created_at] TEXT,
    [updated_at] TEXT,
    [author_association] TEXT,
    [body] TEXT,
    [reactions] TEXT,
    [performed_via_github_app] TEXT,
    [issue] INTEGER REFERENCES [issues]([id])
);
CREATE INDEX [idx_issue_comments_issue] ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user] ON [issue_comments] ([user]);
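The schema above can be exercised directly with Python's built-in sqlite3 module. A minimal sketch, assuming an in-memory database (the insert uses only a subset of columns, with values taken from the row shown on this page), that reproduces the page's query — issue = 100055216 and user = 1217238, sorted by updated_at descending:

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Create the table and indexes exactly as defined in the schema above.
# The REFERENCES clauses point at [users] and [issues] tables that are
# not created here; SQLite does not enforce them unless foreign_keys
# is switched on, so this is fine for a standalone sketch.
conn.executescript("""
CREATE TABLE [issue_comments] (
    [html_url] TEXT,
    [issue_url] TEXT,
    [id] INTEGER PRIMARY KEY,
    [node_id] TEXT,
    [user] INTEGER REFERENCES [users]([id]),
    [created_at] TEXT,
    [updated_at] TEXT,
    [author_association] TEXT,
    [body] TEXT,
    [reactions] TEXT,
    [performed_via_github_app] TEXT,
    [issue] INTEGER REFERENCES [issues]([id])
);
CREATE INDEX [idx_issue_comments_issue] ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user] ON [issue_comments] ([user]);
""")

# Insert the single row from this page (subset of columns for brevity).
conn.execute(
    "INSERT INTO issue_comments (id, [user], issue, updated_at, author_association)"
    " VALUES (?, ?, ?, ?, ?)",
    (228412831, 1217238, 100055216, "2016-06-24T17:45:07Z", "MEMBER"),
)

# The query this page represents:
row = conn.execute(
    "SELECT id FROM issue_comments"
    " WHERE issue = ? AND [user] = ?"
    " ORDER BY updated_at DESC",
    (100055216, 1217238),
).fetchone()
print(row[0])  # 228412831
```

The `idx_issue_comments_issue` and `idx_issue_comments_user` indexes exist precisely so that this kind of filter on `issue` and `user` avoids a full table scan.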