issues: 1694671281
id | node_id | number | title | user | state | locked | assignee | milestone | comments | created_at | updated_at | closed_at | author_association | active_lock_reason | draft | pull_request | body | reactions | performed_via_github_app | state_reason | repo | type |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
1694671281 | I_kwDOAMm_X85lAqGx | 7812 | Appending to existing zarr store writes mostly NaN from dask arrays, but not numpy arrays | 4753005 | open | 0 | | | 1 | 2023-05-03T19:30:13Z | 2023-11-15T18:56:09Z | | NONE | | | | What is your issue? I am using … Admittedly, the above code seems dangerous, since there is no guarantee that … Even if the chunksizes always do match, I am not sure what will happen when appending to an existing store. If the last chunk in the store before appending is not a full chunk, will it be "filled in" when new data are appended to the store? Presumably, but this seems like it could cause problems with parallel writing, since the source chunks from a dask array almost certainly won't line up with the new chunks in the zarr store, unless you've been careful to make it so. In any case, the following change seems to solve the issue, and the zarr store no longer contains … | { "url": "https://api.github.com/repos/pydata/xarray/issues/7812/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } | | | 13221727 | issue |