

issue_comments: 869196682


html_url: https://github.com/pydata/xarray/issues/5511#issuecomment-869196682
issue_url: https://api.github.com/repos/pydata/xarray/issues/5511
id: 869196682
node_id: MDEyOklzc3VlQ29tbWVudDg2OTE5NjY4Mg==
user: 25071375
created_at: 2021-06-27T17:15:20Z
updated_at: 2021-06-27T17:15:20Z
author_association: CONTRIBUTOR
issue: 927617256

Hi again, I checked the behavior of Zarr and Dask a bit more and found that the problem only occurs when the `lock` option of the `da.store` method is set to `None` or `False`; below you can find an example:

```py
import numpy as np
import zarr
import dask.array as da

# Writing a small zarr array with 42.2 as the value
z1 = zarr.open('data/example.zarr', mode='w', shape=(152), chunks=(30), dtype='f4')
z1[:] = 42.2

# Resizing the array
z2 = zarr.open('data/example.zarr', mode='a')
z2.resize(308)

# New data to append
append_data = da.from_array(np.array([50.3] * 156), chunks=(30))

# If you pass None or False to the lock parameter you will get a PermissionError
# or some 0s in the final result, so I think this is the problem when Xarray
# writes to Zarr with Dask (I saw in the code that it uses lock=None by default).
# If you set lock=True all the problems disappear.
da.store(append_data, z2, regions=[tuple([slice(152, 308)])], lock=None)

# The result can contain many 0s or throw an error
print(z2[:])
```
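
For reference, here is a minimal sketch of the working variant described above. It reuses the same hypothetical 'data/example.zarr' store; the original comment only mentions `lock=True`, and passing an explicit lock object such as `dask.utils.SerializableLock` is my assumption of an equivalent alternative:

```py
import numpy as np
import zarr
import dask.array as da
from dask.utils import SerializableLock

# Recreate the same store as above, then append with an explicit lock.
z1 = zarr.open('data/example.zarr', mode='w', shape=(152), chunks=(30), dtype='f4')
z1[:] = 42.2

z2 = zarr.open('data/example.zarr', mode='a')
z2.resize(308)

append_data = da.from_array(np.array([50.3] * 156), chunks=(30))

# lock=True (or an explicit lock object) serializes the chunk writes;
# lock=None / lock=False is what triggers the zeros / PermissionError.
da.store(append_data, z2, regions=[tuple([slice(152, 308)])], lock=SerializableLock())

print(z2[:])  # expected: 42.2 for the first 152 values, 50.3 for the appended 156
```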

Hope this helps to fix the bug.
