issue_comments: 1499432372


html_url: https://github.com/pydata/xarray/pull/7019#issuecomment-1499432372
issue_url: https://api.github.com/repos/pydata/xarray/issues/7019
id: 1499432372
node_id: IC_kwDOAMm_X85ZX4W0
user: 35968931
created_at: 2023-04-06T18:03:48Z
updated_at: 2023-04-06T18:07:24Z
author_association: MEMBER

I'm having problems ensuring that the behaviour of the chunks='auto' option is consistent between .chunk and open_dataset. These problems appeared after vendoring dask.array.core.normalize_chunks. Right now the only failing tests use chunks='auto' (e.g. xarray/tests/test_backends.py::test_chunking_consintency[auto] - yes, the typo is in the test's actual name), and they fail because xarray decides on different sizes for the automatically chosen chunks.
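For reference, this is roughly how the "auto" resolution works (a minimal sketch, assuming the vendored function mirrors dask's normalize_chunks): when chunks='auto' is passed, the chunk sizes are derived from a target byte size, which defaults to the "array.chunk-size" dask config value.

```python
import dask
import dask.array.core

# "auto" is resolved against a target chunk size; with limit=None,
# normalize_chunks falls back to the "array.chunk-size" config value.
with dask.config.set({"array.chunk-size": "128MiB"}):
    chunks = dask.array.core.normalize_chunks(
        "auto",
        shape=(10000, 10000),
        dtype="float64",
    )

# Whatever sizes are chosen, each dimension's chunks must sum to the shape.
assert sum(chunks[0]) == 10000 and sum(chunks[1]) == 10000
```

If the vendored copy and .chunk resolve "auto" against different targets (or different previous_chunks), the two code paths will legitimately pick different chunk sizes for the same array.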

What's weird is that all tests pass for me locally, but these failures occur on only some of the CI jobs (and apparently not even the same jobs each time???). I have no idea why this would behave differently on only some of the CI jobs, especially after double-checking that the array.chunk-size value is correctly read from the dask config within normalize_chunks.

reactions: none (total_count: 0)
issue: 1368740629