html_url,issue_url,id,node_id,user,created_at,updated_at,author_association,body,reactions,performed_via_github_app,issue
https://github.com/pydata/xarray/pull/5796#issuecomment-950296261,https://api.github.com/repos/pydata/xarray/issues/5796,950296261,IC_kwDOAMm_X844pF7F,2448579,2021-10-24T10:07:49Z,2021-10-24T10:07:49Z,MEMBER,Thanks @Illviljan this is great work!,"{""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,996475523
https://github.com/pydata/xarray/pull/5796#issuecomment-923263203,https://api.github.com/repos/pydata/xarray/issues/5796,923263203,IC_kwDOAMm_X843B-Dj,2448579,2021-09-20T20:16:13Z,2021-09-20T20:16:13Z,MEMBER,"> @dcherian did you have something specific in mind when you linked to https://github.com/jaimergp/scikit-image/blob/main/.github/workflows/benchmarks-cron.yml in #4648? I think I'll remove it otherwise and just focus on
Nothing specific. Just wanted to link their configuration.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,996475523
https://github.com/pydata/xarray/pull/5796#issuecomment-920203704,https://api.github.com/repos/pydata/xarray/issues/5796,920203704,IC_kwDOAMm_X8422TG4,2448579,2021-09-15T17:02:44Z,2021-09-15T17:02:44Z,MEMBER,"Ah, one major problem, IIUC, is that we run a lot of benchmarks with and without bottleneck even when bottleneck isn't involved in the operation being benchmarked.
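As a rough sketch of why that costs so much, and how a redundant combination could be dropped (illustrative class and parameter names only, not the actual xarray benchmark suite): asv runs a benchmark once per parameter value, so a `use_bottleneck` parameter doubles the runtime even for operations that never call bottleneck, and raising `NotImplementedError` in `setup()` makes asv skip that combination.

```python
import numpy as np
import xarray as xr


class Concat:
    # asv runs time_concat once for every value listed here
    params = [[True, False]]
    param_names = ["use_bottleneck"]

    def setup(self, use_bottleneck):
        if use_bottleneck:
            # concat never dispatches to bottleneck, so don't benchmark an
            # identical second run; asv treats NotImplementedError raised in
            # setup() as "skip this combination"
            raise NotImplementedError("bottleneck is not involved in concat")
        self.da = xr.DataArray(np.random.randn(500, 500), dims=("x", "y"))

    def time_concat(self, use_bottleneck):
        xr.concat([self.da, self.da], dim="x")
```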
---
Some of the following should be fixed
---
> ImportError: Pandas requires version '0.12.3' or newer of 'xarray' (version '0.0.0' currently installed).
---
> rolling.Rolling.time_rolling_construct
> xarray.core.merge.MergeError: conflicting values for variable 'x_coords' on objects to be combined. You can skip this check by specifying compat='override'.
---
> IOWriteNetCDFDaskDistributed.time_write
Looks like we're spinning up multiple dask clusters.
> UserWarning: Port 8787 is already in use.
Perhaps you already have a cluster running?
Hosting the HTTP server on port 33423 instead
warnings.warn(","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,996475523
https://github.com/pydata/xarray/pull/5796#issuecomment-920194613,https://api.github.com/repos/pydata/xarray/issues/5796,920194613,IC_kwDOAMm_X8422Q41,2448579,2021-09-15T16:53:10Z,2021-09-15T16:53:10Z,MEMBER,"> Long times, +240 minutes
Yeah, this is a problem.
Maybe we need to go through and mark the slow tests, as described in the README_ci.md document.
> In that vein, a new private function is defined at `benchmarks.__init__`: `_skip_slow`. This will check if the `ASV_SKIP_SLOW` environment variable has been defined. If set to `1`, it will raise `NotImplementedError` and skip the test. To implement this behavior in other tests, you can add the following attribute:
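For reference, a minimal sketch of that helper, assuming it behaves the way the quoted description says (not verbatim scikit-image or xarray code):

```python
import os


def _skip_slow():
    """Raise NotImplementedError, which asv reports as a skipped benchmark,
    whenever the ASV_SKIP_SLOW environment variable is set to "1"."""
    if os.environ.get("ASV_SKIP_SLOW", "0") == "1":
        raise NotImplementedError("Skipping slow benchmarks (ASV_SKIP_SLOW=1)")
```

A benchmark marked as slow would then call `_skip_slow()` at the top of its `setup()` (or assign it as the class's `setup` attribute), so a full cron run keeps it while a PR run that exports `ASV_SKIP_SLOW=1` skips it.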
> Add more required installs? bottleneck, for example, crashes tests because it isn't installed. Add all the non-required ones as well?
Do you mean skipping those that require bottleneck, rather than relying on asv to handle the crash? If so, sounds good.
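Something like the following could do that (hypothetical helper name, shown only to illustrate the idea): check for the optional import in `setup()` and raise `NotImplementedError`, so asv records the benchmark as skipped instead of crashing.

```python
import importlib.util


def _skip_if_missing(modname):
    """Call at the top of a benchmark's setup(); asv reports a benchmark whose
    setup() raises NotImplementedError as skipped rather than failed."""
    if importlib.util.find_spec(modname) is None:
        raise NotImplementedError(f"{modname} is not installed")
```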
> Can a normal user add and remove labels?
An alternative approach would be to use something similar to our `[skip-ci]` and `[test-upstream]` tags in the commit message. Though I think that tag needs to be on every commit you want benchmarked.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,996475523