issue_comments: 920194613
html_url | issue_url | id | node_id | user | created_at | updated_at | author_association | body | reactions | performed_via_github_app | issue |
---|---|---|---|---|---|---|---|---|---|---|---|
https://github.com/pydata/xarray/pull/5796#issuecomment-920194613 | https://api.github.com/repos/pydata/xarray/issues/5796 | 920194613 | IC_kwDOAMm_X8422Q41 | 2448579 | 2021-09-15T16:53:10Z | 2021-09-15T16:53:10Z | MEMBER | yeah this is a problem. Maybe we need to go through and mark the slow tests like in the README_ci.md document<br><br>Do you mean skipping those that require bottleneck rather than relying on asv to handle the crash? If so, sounds good<br><br>An alternative approach would be to use something similar to our | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |  | 996475523 |
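
The comment discusses skipping benchmarks that require bottleneck instead of relying on asv to handle the resulting crash. Below is a minimal, hypothetical sketch of one way to do that, assuming asv's documented convention that a `setup()` method raising `NotImplementedError` skips the benchmark; the class and benchmark names are illustrative and not taken from the PR.

```python
# Minimal sketch: skip an asv benchmark when bottleneck is unavailable.
# asv treats NotImplementedError raised in setup() as "skip this benchmark"
# rather than reporting a crashed run.
import numpy as np
import xarray as xr

try:
    import bottleneck  # noqa: F401

    has_bottleneck = True
except ImportError:
    has_bottleneck = False


class RollingWithBottleneck:
    """Hypothetical benchmark that is only meaningful with bottleneck installed."""

    def setup(self):
        if not has_bottleneck:
            # Skips the benchmark instead of letting it fail at call time.
            raise NotImplementedError("bottleneck is required for this benchmark")
        self.da = xr.DataArray(np.random.randn(1000, 1000), dims=("x", "y"))

    def time_rolling_mean(self):
        self.da.rolling(x=50).mean()
```

This mirrors the "skip rather than crash" approach mentioned in the comment; the alternative the author starts to describe at the end of the body is truncated in the record, so it is left as-is here.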