issue_comments


9 rows where author_association = "MEMBER", issue = 761270240 and user = 10194086 sorted by updated_at descending

id html_url issue_url node_id user created_at updated_at ▲ author_association body reactions performed_via_github_app issue
745413055 https://github.com/pydata/xarray/pull/4672#issuecomment-745413055 https://api.github.com/repos/pydata/xarray/issues/4672 MDEyOklzc3VlQ29tbWVudDc0NTQxMzA1NQ== mathause 10194086 2020-12-15T16:40:28Z 2020-12-15T16:40:28Z MEMBER

Ok, let's get this in. matplotlib-base and nodefaults should be quite uncontroversial. If mamba causes problems it can be quickly removed.

I plan to add a whats new entry concerning the CI speed-up (#4672, #4685 & #4694) in #4694

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  CI setup: use mamba and matplotlib-base 761270240
744749458 https://github.com/pydata/xarray/pull/4672#issuecomment-744749458 https://api.github.com/repos/pydata/xarray/issues/4672 MDEyOklzc3VlQ29tbWVudDc0NDc0OTQ1OA== mathause 10194086 2020-12-14T22:25:47Z 2020-12-14T22:25:47Z MEMBER

See #4694

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  CI setup: use mamba and matplotlib-base 761270240
744710105 https://github.com/pydata/xarray/pull/4672#issuecomment-744710105 https://api.github.com/repos/pydata/xarray/issues/4672 MDEyOklzc3VlQ29tbWVudDc0NDcxMDEwNQ== mathause 10194086 2020-12-14T21:05:38Z 2020-12-14T21:05:38Z MEMBER

I thought that pytest-xdist was not an option as the plot tests are not thread-safe (as mpl isn't). But looking again I think that pytest-xdist actually uses multiprocessing and not multithreading, so this might actually be worth a try.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  CI setup: use mamba and matplotlib-base 761270240
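
The process-vs-thread distinction above is easy to check. A minimal sketch (hypothetical file name, not part of xarray's test suite; assumes pytest and pytest-xdist are installed):

```python
# test_xdist_pids.py -- hypothetical example, run with: pytest -n 4 -s test_xdist_pids.py
# pytest-xdist splits the test session across worker *processes*, so each worker
# reports its own PID; this is why thread-unsafe libraries like matplotlib are
# less of a concern than with thread-based parallelism.
import os


def test_report_worker_pid(worker_id):
    # `worker_id` is a fixture provided by pytest-xdist ("gw0", "gw1", ..., or
    # "master" when run without -n).
    print(f"worker {worker_id} runs in process {os.getpid()}")
    assert worker_id == "master" or worker_id.startswith("gw")
```
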
744445362 https://github.com/pydata/xarray/pull/4672#issuecomment-744445362 https://api.github.com/repos/pydata/xarray/issues/4672 MDEyOklzc3VlQ29tbWVudDc0NDQ0NTM2Mg== mathause 10194086 2020-12-14T13:37:36Z 2020-12-14T13:37:36Z MEMBER

I wasn't really able to get to the bottom of this. Still, using mamba and matplotlib-base should speed up the installation step by 2 to 5 minutes. If we are fine switching to the faster but maybe not-as-mature mamba, this can be merged on green.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  CI setup: use mamba and matplotlib-base 761270240
744040595 https://github.com/pydata/xarray/pull/4672#issuecomment-744040595 https://api.github.com/repos/pydata/xarray/issues/4672 MDEyOklzc3VlQ29tbWVudDc0NDA0MDU5NQ== mathause 10194086 2020-12-13T17:29:32Z 2020-12-13T17:29:32Z MEMBER

> Why do we see that much of a speed-up once we downgrade numba on azure pipelines?

Sometimes it also works fine with numba 0.52... So unfortunately I don't know. My suspicion is that we get different CPUs by chance. I added a new step to our CI: cat /proc/cpuinfo (worked with gitbash on windows). Maybe this reveals something.

On my dualboot machine the test suite takes 23 min on windows and 15 min on linux. Thus, already quite a difference, but not as large as on azure, where the windows tests seem to take about twice as long as the linux tests.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  CI setup: use mamba and matplotlib-base 761270240
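
The step mentioned above is literally cat /proc/cpuinfo. If one wanted to log the runner hardware from Python instead, a rough sketch (hypothetical helper, not part of the actual pipeline; standard library only) could be:

```python
# Hypothetical helper for logging CI runner hardware; the real CI step just runs
# `cat /proc/cpuinfo`.
import platform
import subprocess
import sys


def log_cpu_info():
    print("platform :", platform.platform())
    print("machine  :", platform.machine())
    print("processor:", platform.processor())
    if sys.platform.startswith("linux"):
        # /proc/cpuinfo lists the exact CPU model and flags on Linux runners,
        # which would reveal whether different jobs land on different hardware.
        info = subprocess.run(["cat", "/proc/cpuinfo"], capture_output=True, text=True)
        print(info.stdout)


if __name__ == "__main__":
    log_cpu_info()
```
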
743922542 https://github.com/pydata/xarray/pull/4672#issuecomment-743922542 https://api.github.com/repos/pydata/xarray/issues/4672 MDEyOklzc3VlQ29tbWVudDc0MzkyMjU0Mg== mathause 10194086 2020-12-12T23:59:54Z 2020-12-12T23:59:54Z MEMBER

Locally on windows I find no large difference between numba 0.51 and 0.52, so that does not seem to be the root cause...

@keewis there are about 750 xfailed tests in test_units.py. xfail is the correct category but they take much longer than skip. Locally the tests take about 6 min 30 s using xfail but only 45 s using skip. On azure the difference is probably even bigger. Would it be an option to use skip instead? Of course this has to be done carefully, e.g. checking the xpassing tests etc...

What is the difference between pytest.mark.xfail and pytest.xfail?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  CI setup: use mamba and matplotlib-base 761270240
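
The timing gap and the pytest.mark.xfail vs pytest.xfail question above both come down to whether the test body is executed. A minimal sketch (the tests and the expensive_computation helper are invented for illustration; they are not from test_units.py):

```python
import pytest


def expensive_computation():
    # stand-in for the slow work the real xfailed tests perform
    return 42


@pytest.mark.xfail(reason="known failure")
def test_marked_xfail():
    # The body still runs to completion and is only *reported* as XFAIL (or XPASS),
    # which is why ~750 xfailed tests cost nearly as much time as passing ones.
    assert expensive_computation() == 42


@pytest.mark.skip(reason="not run at all")
def test_marked_skip():
    # Never executed, so it costs essentially nothing.
    assert expensive_computation() == 42


def test_imperative_xfail():
    # pytest.xfail() aborts the test immediately at this line (similar to a skip),
    # whereas the mark lets the body run and only changes how the outcome is reported.
    pytest.xfail("known failure")
    assert expensive_computation() == 42  # never reached
```
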
743780782 https://github.com/pydata/xarray/pull/4672#issuecomment-743780782 https://api.github.com/repos/pydata/xarray/issues/4672 MDEyOklzc3VlQ29tbWVudDc0Mzc4MDc4Mg== mathause 10194086 2020-12-12T16:33:39Z 2020-12-12T16:33:39Z MEMBER

Thanks for figuring this out. Still, I think I have to test this locally - the time the CI takes is very inconsistent on azure.

Yes, I think this PR is helpful anyway and should bring down the CI time a bit.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  CI setup: use mamba and matplotlib-base 761270240
743445866 https://github.com/pydata/xarray/pull/4672#issuecomment-743445866 https://api.github.com/repos/pydata/xarray/issues/4672 MDEyOklzc3VlQ29tbWVudDc0MzQ0NTg2Ng== mathause 10194086 2020-12-11T21:52:05Z 2020-12-11T21:52:05Z MEMBER

Here is what I learned:

* The same tests are slow on windows and linux; those on windows just take about twice as long.
* The following tests seem to be slow:
  * xarray/tests/test_distributed.py
  * many tests with sparse, e.g. xarray/tests/test_dataset.py::TestDataset::test_unstack_sparse
  * plotting tests, especially with FacetGrid

I am not sure what takes long in xarray/tests/test_distributed.py: writing the files or creating the cluster. If it is the latter, it could be possible to only open it once in the module (but I don't know if that actually works or if it has to be closed every time).

https://github.com/pydata/xarray/blob/76d5c0c075628475b555997b82c55dd18a34936e/xarray/tests/test_distributed.py#L118-L119

Windows py37

```
9.52s call xarray/tests/test_distributed.py::test_dask_distributed_netcdf_roundtrip[netcdf4-NETCDF3_CLASSIC]
9.12s call xarray/tests/test_distributed.py::test_dask_distributed_zarr_integration_test[False-True]
8.66s call xarray/tests/test_distributed.py::test_dask_distributed_zarr_integration_test[False-False]
8.50s call xarray/tests/test_distributed.py::test_dask_distributed_netcdf_roundtrip[netcdf4-NETCDF4]
8.48s call xarray/tests/test_distributed.py::test_dask_distributed_read_netcdf_integration_test[scipy-NETCDF3_64BIT]
8.34s call xarray/tests/test_distributed.py::test_dask_distributed_netcdf_roundtrip[netcdf4-NETCDF4_CLASSIC]
8.34s call xarray/tests/test_distributed.py::test_dask_distributed_netcdf_roundtrip[h5netcdf-NETCDF4]
7.72s call xarray/tests/test_distributed.py::test_dask_distributed_read_netcdf_integration_test[netcdf4-NETCDF4]
7.72s call xarray/tests/test_distributed.py::test_dask_distributed_read_netcdf_integration_test[h5netcdf-NETCDF4]
7.72s call xarray/tests/test_distributed.py::test_dask_distributed_zarr_integration_test[True-False]
7.71s call xarray/tests/test_distributed.py::test_dask_distributed_zarr_integration_test[True-True]
7.42s call xarray/tests/test_distributed.py::test_dask_distributed_read_netcdf_integration_test[netcdf4-NETCDF4_CLASSIC]
7.35s call xarray/tests/test_distributed.py::test_dask_distributed_read_netcdf_integration_test[netcdf4-NETCDF3_CLASSIC]
6.45s call xarray/tests/test_distributed.py::test_dask_distributed_netcdf_roundtrip[scipy-NETCDF3_64BIT]
6.26s call xarray/tests/test_dataset.py::TestDataset::test_unstack_sparse
6.06s call xarray/tests/test_sparse.py::TestSparseDataArrayAndDataset::test_groupby_bins
5.46s call xarray/tests/test_plot.py::TestDatasetScatterPlots::test_facetgrid_hue_style
5.35s call xarray/tests/test_interp.py::test_interpolate_chunk_advanced[linear]
5.10s call xarray/tests/test_distributed.py::test_dask_distributed_rasterio_integration_test
5.00s call xarray/tests/test_plot.py::TestFacetedLinePlots::test_facetgrid_shape
4.40s call xarray/tests/test_interp.py::test_interpolate_chunk_advanced[nearest]
```

Linux py37

```
5.78s call xarray/tests/test_distributed.py::test_dask_distributed_zarr_integration_test[False-True]
5.62s call xarray/tests/test_distributed.py::test_dask_distributed_zarr_integration_test[False-False]
5.55s call xarray/tests/test_distributed.py::test_dask_distributed_netcdf_roundtrip[h5netcdf-NETCDF4]
5.34s call xarray/tests/test_distributed.py::test_dask_distributed_zarr_integration_test[True-False]
5.31s call xarray/tests/test_distributed.py::test_dask_distributed_netcdf_roundtrip[netcdf4-NETCDF4]
5.30s call xarray/tests/test_distributed.py::test_dask_distributed_netcdf_roundtrip[netcdf4-NETCDF4_CLASSIC]
5.15s call xarray/tests/test_distributed.py::test_dask_distributed_netcdf_roundtrip[netcdf4-NETCDF3_CLASSIC]
5.10s call xarray/tests/test_distributed.py::test_dask_distributed_rasterio_integration_test
4.98s call xarray/tests/test_distributed.py::test_dask_distributed_zarr_integration_test[True-True]
4.91s call xarray/tests/test_distributed.py::test_dask_distributed_cfgrib_integration_test
4.87s call xarray/tests/test_distributed.py::test_dask_distributed_read_netcdf_integration_test[netcdf4-NETCDF3_CLASSIC]
4.82s call xarray/tests/test_distributed.py::test_dask_distributed_read_netcdf_integration_test[netcdf4-NETCDF4_CLASSIC]
4.77s call xarray/tests/test_distributed.py::test_dask_distributed_read_netcdf_integration_test[scipy-NETCDF3_64BIT]
4.75s call xarray/tests/test_distributed.py::test_dask_distributed_read_netcdf_integration_test[h5netcdf-NETCDF4]
4.67s call xarray/tests/test_distributed.py::test_dask_distributed_read_netcdf_integration_test[netcdf4-NETCDF4]
4.32s call xarray/tests/test_distributed.py::test_dask_distributed_netcdf_roundtrip[scipy-NETCDF3_64BIT]
3.55s call xarray/tests/test_dataset.py::TestDataset::test_unstack_sparse
3.45s call properties/test_pandas_roundtrip.py::test_roundtrip_dataset
3.07s call xarray/tests/test_plot.py::TestFacetedLinePlots::test_facetgrid_shape
2.70s call xarray/tests/test_plot.py::TestDatasetScatterPlots::test_facetgrid_hue_style
2.67s call xarray/tests/test_sparse.py::TestSparseDataArrayAndDataset::test_groupby_bins
```
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  CI setup: use mamba and matplotlib-base 761270240
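
If creating the cluster (rather than writing the files) is the slow part, the idea of opening it once per module could look roughly like the following module-scoped fixture (a sketch, not the actual xarray fixture; assumes dask.distributed is installed):

```python
import pytest
from distributed import Client, LocalCluster


@pytest.fixture(scope="module")
def shared_cluster():
    # Created once for the whole module; every test reuses the same workers,
    # so the cluster start-up cost is paid only once.
    cluster = LocalCluster(n_workers=2, threads_per_worker=1)
    yield cluster
    cluster.close()


def test_uses_shared_cluster(shared_cluster):
    with Client(shared_cluster) as client:
        # the per-test work (e.g. a netCDF or zarr roundtrip) would go here;
        # only the I/O cost remains per test
        assert len(client.scheduler_info()["workers"]) == 2
```
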
742827949 https://github.com/pydata/xarray/pull/4672#issuecomment-742827949 https://api.github.com/repos/pydata/xarray/issues/4672 MDEyOklzc3VlQ29tbWVudDc0MjgyNzk0OQ== mathause 10194086 2020-12-10T22:02:34Z 2020-12-10T22:02:34Z MEMBER

No..., it failed at 99%. I don't entirely get it. The tests were well under way when I left. So I'd really be interested to get the timings of the tests to see what takes so long...

> I just tried running the windows CI via Github actions

Yes, that's of course another good alternative.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  CI setup: use mamba and matplotlib-base 761270240

CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [issue] INTEGER REFERENCES [issues]([id])
);
CREATE INDEX [idx_issue_comments_issue]
    ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user]
    ON [issue_comments] ([user]);
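
The filtered view at the top of this page corresponds to a simple query against this table. A sketch of running it against a local SQLite copy of the database (the file name github.db is hypothetical):

```python
# Reproduce the page's filter against a local SQLite copy of the issue_comments table.
import sqlite3

conn = sqlite3.connect("github.db")  # hypothetical local copy of the database
rows = conn.execute(
    """
    SELECT id, created_at, updated_at, body
    FROM issue_comments
    WHERE author_association = 'MEMBER'
      AND issue = 761270240
      AND [user] = 10194086
    ORDER BY updated_at DESC
    """
).fetchall()
print(len(rows), "comments")  # 9 for the view shown above
```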