
issue_comments: 1556198984


Comment: https://github.com/pydata/xarray/issues/7856#issuecomment-1556198984
Issue: https://api.github.com/repos/pydata/xarray/issues/7856
id: 1556198984 · node_id: IC_kwDOAMm_X85cwbZI · user: 14371165 · created_at: 2023-05-21T14:51:55Z · updated_at: 2023-05-21T14:51:55Z · author_association: MEMBER

Nope, I have not tried that. Considering the CI, I suspect things would just self-heal then, without us ever understanding the root cause.

Looking at the backends, we initialize a dict here: https://github.com/pydata/xarray/blob/d8ec3a3f6b02a8b941b484b3d254537af84b5fde/xarray/backends/common.py#L435

Each of our entrypoints is stored in it like this: https://github.com/pydata/xarray/blob/d8ec3a3f6b02a8b941b484b3d254537af84b5fde/xarray/backends/h5netcdf_.py#L438

Then we append the local and the external entrypoints together here: https://github.com/pydata/xarray/blob/d8ec3a3f6b02a8b941b484b3d254537af84b5fde/xarray/backends/plugins.py#L106-L116

But load_chunkmanagers doesn't appear to read from a local dict at all: https://github.com/pydata/xarray/blob/d8ec3a3f6b02a8b941b484b3d254537af84b5fde/xarray/core/parallelcompat.py#L48-L62

Why do the backends use the BACKEND_ENTRYPOINTS strategy? To avoid these cases? Or something else?
