issue_comments
1 row where author_association = "MEMBER", issue = 1720191529 and user = 14808389 sorted by updated_at descending
id: 1557705518
html_url: https://github.com/pydata/xarray/issues/7863#issuecomment-1557705518
issue_url: https://api.github.com/repos/pydata/xarray/issues/7863
node_id: IC_kwDOAMm_X85c2LMu
user: keewis 14808389
created_at: 2023-05-22T18:33:15Z
updated_at: 2023-05-22T18:33:15Z
author_association: MEMBER
body:
I wonder if pure-python projects could get away with asking users to install from github? This already works today.
That said, I can see a central location being helpful, and we'd certainly be happy to add a scheduled github action to upload built packages. We'd need some help with setting that up, though, as I personally don't have any experience whatsoever with uploading to such a location. That would also allow us to stop publishing builds to TestPyPI (not sure if those count as nightlies), as it seems to have accumulated quite a few packages already that, due to the PyPI policy, stay there forever.
reactions: { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
issue: Produce nightly wheels 1720191529
CREATE TABLE [issue_comments] (
    [html_url] TEXT,
    [issue_url] TEXT,
    [id] INTEGER PRIMARY KEY,
    [node_id] TEXT,
    [user] INTEGER REFERENCES [users]([id]),
    [created_at] TEXT,
    [updated_at] TEXT,
    [author_association] TEXT,
    [body] TEXT,
    [reactions] TEXT,
    [performed_via_github_app] TEXT,
    [issue] INTEGER REFERENCES [issues]([id])
);
CREATE INDEX [idx_issue_comments_issue] ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user] ON [issue_comments] ([user]);
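For reference, the filtered view described at the top of this page (MEMBER comments on issue 1720191529 by user 14808389, sorted by updated_at descending) corresponds roughly to the following query against this schema. This is an illustrative sketch of the filter, not necessarily the exact SQL the site generates:

-- MEMBER comments on issue 1720191529 by user 14808389, newest update first
select [id], [user], [created_at], [updated_at], [author_association], [body]
from [issue_comments]
where [author_association] = 'MEMBER'
  and [issue] = 1720191529
  and [user] = 14808389
order by [updated_at] desc;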