issue_comments: 964084038

html_url: https://github.com/pydata/xarray/issues/5954#issuecomment-964084038
issue_url: https://api.github.com/repos/pydata/xarray/issues/5954
id: 964084038
node_id: IC_kwDOAMm_X845dsFG
user: 1197350
created_at: 2021-11-09T11:56:30Z
updated_at: 2021-11-09T11:56:30Z
author_association: MEMBER

body:

Thanks for the info @alexamici!

"2. but most backends serialise writes anyway, so the advantage is limited."

I'm not sure I understand this comment, specifically what is meant by "serialise writes". I often use Xarray to do distributed writes to Zarr stores using 100+ distributed dask workers. It works great. We would need the same thing from a TileDB backend.
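
(As a rough illustration of that workflow, not part of the original comment: the sketch below assumes a running dask.distributed cluster, and the scheduler address, input file, chunk sizes, and store path are all placeholders.)

import xarray as xr
from dask.distributed import Client

# Attach to an existing dask cluster; the scheduler address is a placeholder.
client = Client("tcp://scheduler-address:8786")

# Open the source data lazily as dask arrays (file name and chunking are illustrative).
ds = xr.open_dataset("input.nc", chunks={"time": 1000})

# Each worker writes its own chunks directly into the Zarr store in parallel;
# only the store metadata is coordinated centrally.
ds.to_zarr("output.zarr", mode="w")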

We are focusing on the user-facing API, but in the end, whether we call it .to, .to_dataset, or .store_dataset is not really a difficult or important question. It's clear we need some generic writing method. The much harder question is the backend API. As Alessandro says:

Adding support for a single save_dataset entry point to the backend API is trivial, but adding full support for possibly distributed writes looks like it is much more work.
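
(For illustration only, and not part of the original comment: a minimal sketch of what such a backend-side hook could look like, following the shape of xarray's existing BackendEntrypoint read API. The class name, the save_dataset signature, and the write_variable helper are assumptions, not an agreed interface, and a serial implementation like this would not by itself give the distributed writes discussed above.)

from xarray.backends import BackendEntrypoint

class TileDBBackendEntrypoint(BackendEntrypoint):
    # Read-side hook that the current backend API already defines.
    def open_dataset(self, filename_or_obj, *, drop_variables=None, **kwargs):
        ...

    # Hypothetical write-side hook: a single entry point that receives the
    # Dataset and the target store. Serialising the writes here is trivial;
    # supporting many workers pushing chunks concurrently is the hard part.
    def save_dataset(self, dataset, store, *, mode="w", **kwargs):
        for name, variable in dataset.variables.items():
            write_variable(store, name, variable)  # placeholder helper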

reactions:
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
performed_via_github_app:
issue: 1047608434