issue_comments

9 rows where issue = 1635470616 sorted by updated_at descending


Facets:
  • user: keewis (5), dcherian (3), headtr1ck (1)
  • author_association: MEMBER (8), COLLABORATOR (1)
  • issue: add timeouts for tests (9)
1483101852 · dcherian · MEMBER · 2023-03-24T16:42:37Z (edited 2023-03-24T16:42:48Z)
https://github.com/pydata/xarray/pull/7657#issuecomment-1483101852

IIRC all distributed tests are in test_distributed.py.

> I've also seen a drastic reduction in performance with HDF5 1.12.2 (both netcdf4 and h5netcdf) on one of my colleague's datasets

The default scheduler is 'threads', and the new netcdf changed some things around thread locking.

Maybe this is related to https://github.com/pydata/xarray/issues/7079. I restarted #7488.
1483092471 · keewis · MEMBER · 2023-03-24T16:35:17Z (edited 2023-03-24T16:37:45Z)
https://github.com/pydata/xarray/pull/7657#issuecomment-1483092471

Depends on the test, I guess. Most of them are related to one of the netcdf backends (not sure which, since the tests don't specify that), test_dataarray_compute just checks _in_memory (no I/O involved), and the pydap tests use netcdf underneath. So I'd say the issue is with one of the netcdf backends (netcdf4, as that's first in the priority list).

I've also seen a drastic reduction in performance with HDF5 1.12.2 (both netcdf4 and h5netcdf) on one of my colleague's datasets, so maybe that's much more visible on a mac? That doesn't explain the slow test_dataarray_compute, though.

Do we isolate the dask scheduler in any way? I assume that makes use of the builtin (non-distributed) scheduler?
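One way to test the scheduler question above is to pin dask to its synchronous (single-threaded) scheduler, which removes thread-lock contention from the picture entirely. A minimal sketch, assuming dask is installed; this is illustrative and not something the PR itself does:

```python
# Hedged sketch: force dask's synchronous (single-threaded) scheduler to
# rule out interactions between the default threaded scheduler and the
# netcdf backends' thread locking. Illustrative only.
import dask
import dask.array as da

with dask.config.set(scheduler="synchronous"):
    # Inside this block, every .compute() runs in the main thread,
    # so HDF5 thread-lock contention cannot be the bottleneck.
    x = da.ones((1000,), chunks=100)
    total = x.sum().compute()

print(total)
```

If a stalling test becomes fast under the synchronous scheduler, the threaded scheduler (or the backend's locking under it) is the likely culprit.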
1483074436 · dcherian · MEMBER · 2023-03-24T16:21:23Z
https://github.com/pydata/xarray/pull/7657#issuecomment-1483074436

> FAILED xarray/tests/test_backends.py::TestDask

I'm thinking this failure is some bad interaction between an I/O backend and dask threads. But I'm having trouble figuring out which backend.
1483049816 · keewis · MEMBER · 2023-03-24T16:04:28Z
https://github.com/pydata/xarray/pull/7657#issuecomment-1483049816

> Is it for a particular backend?

Not sure I understand. Can you elaborate, please?
1483009187 · dcherian · MEMBER · 2023-03-24T15:37:34Z
https://github.com/pydata/xarray/pull/7657#issuecomment-1483009187

Is it for a particular backend?
1481344641 · keewis · MEMBER · 2023-03-23T14:56:17Z
https://github.com/pydata/xarray/pull/7657#issuecomment-1481344641

Since the macos 3.11 CI is not required, I'm tempted to merge this now and continue to debug elsewhere.
1480905116 · keewis · MEMBER · 2023-03-23T10:00:54Z (edited 2023-03-23T10:05:37Z)
https://github.com/pydata/xarray/pull/7657#issuecomment-1480905116

I guess that means the CPUs of the windows and macos runners are just slow, or there are other tasks that get prioritized, or something. All of this makes the tests pretty flaky, and I'm not sure what we can do about it. I don't have access to a mac, but maybe someone who does could try to reproduce?

In any case, I'll increase the timeout a bit again, since I think a timeout after ~1 hour is better than the CI job being cancelled after 6 hours.
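A global per-test timeout like the one discussed above can be configured in one place; a minimal sketch assuming the pytest-timeout plugin is installed (the thread does not show the exact mechanism the PR uses):

```ini
# pytest.ini — hedged sketch; assumes the pytest-timeout plugin.
# A per-test timeout fails a stalled test after ~1 hour instead of
# letting the whole CI job run until the 6-hour cancellation.
[pytest]
timeout = 3600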
1480136698 · headtr1ck · COLLABORATOR · 2023-03-22T19:26:18Z
https://github.com/pydata/xarray/pull/7657#issuecomment-1480136698

Does anyone know why those take so long only on macos (and, partially, windows)? Does that have to do with the runner hardware?

Some of these files are just 10 numbers, so the netcdf should be ~50 KB or so (mostly header overhead). No hardware should take >1 min to read and write that.
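The back-of-envelope estimate above can be made explicit; in this sketch the ~50 KB figure is taken from the comment, not measured:

```python
# Back-of-envelope check of the size estimate above: ten float64 values
# are only 80 bytes of payload, so a ~50 KB netcdf file is almost
# entirely format/header overhead. The 50 KB file size is the comment's
# assumption, not a measurement.
n_values = 10
payload_bytes = n_values * 8            # float64 -> 8 bytes each
assumed_file_bytes = 50 * 1024          # ~50 KB, per the comment
overhead_bytes = assumed_file_bytes - payload_bytes

print(payload_bytes, overhead_bytes)    # 80 51120
```

At these sizes the read/write cost is dominated by metadata and per-call overhead, not data volume, which is why a >1 min runtime points at something other than I/O throughput.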
1479671510 · keewis · MEMBER · 2023-03-22T14:29:20Z
https://github.com/pydata/xarray/pull/7657#issuecomment-1479671510

Apparently python 3.11 on windows is also pretty slow, which makes the timeouts appear there as well. But in any case, here are the affected tests:

    FAILED xarray/tests/test_backends.py::TestDask::test_dask_roundtrip - Failed: Timeout >90.0s
    FAILED xarray/tests/test_backends.py::TestDask::test_deterministic_names - Failed: Timeout >90.0s
    FAILED xarray/tests/test_backends.py::TestDask::test_dataarray_compute - Failed: Timeout >90.0s
    FAILED xarray/tests/test_backends.py::TestDask::test_load_dataset - Failed: Timeout >90.0s
    FAILED xarray/tests/test_backends.py::TestDask::test_load_dataarray - Failed: Timeout >90.0s
    FAILED xarray/tests/test_backends.py::TestDask::test_inline_array - Failed: Timeout >90.0s
    FAILED xarray/tests/test_backends.py::TestPydap::test_cmp_local_file - Failed: Timeout >90.0s
    FAILED xarray/tests/test_backends.py::TestPydap::test_compatible_to_netcdf - Failed: Timeout >90.0s
    FAILED xarray/tests/test_backends.py::TestPydap::test_dask - Failed: Timeout >90.0s

These are the exact same tests that also fail for >300s, so I'm guessing those are the ones that stall.

Does anyone know why those take so long only on macos (and, partially, windows)? Does that have to do with the runner hardware?

CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [issue] INTEGER REFERENCES [issues]([id])
);
CREATE INDEX [idx_issue_comments_issue]
    ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user]
    ON [issue_comments] ([user]);
Powered by Datasette · Queries took 565.979ms · About: xarray-datasette