
Comment on pydata/xarray#6036: https://github.com/pydata/xarray/issues/6036#issuecomment-984014394
Comment id: 984014394 · Author association: NONE · Created: 2021-12-01T20:08:46Z · Updated: 2021-12-01T20:12:25Z

@dcherian thanks for your reply. I know xarray can't do anything about the Dask computations of the chunks. My question was whether it is possible to save the Dask chunk information in the Zarr metadata so that it is not necessary to recalculate it, i.e. to run Dask's `getem()` function, which takes too long and uses too much memory.

The following example runs out of memory on my computer (16 GB RAM).

```python
import dask.array  # `import dask` alone does not expose dask.array
import xarray as xr

# Write a single-chunk Dask array, but encode it with (1, 1, 1) Zarr chunks.
chunks = (1, 1, 1)
ds = xr.Dataset(
    data_vars={
        "foo": (
            ("x", "y", "z"),
            dask.array.empty((1000, 1000, 1000), chunks=(1000, 1000, 1000)),
        )
    }
)
ds.to_zarr(store="data", group="ds.zarr", compute=False,
           encoding={"foo": {"chunks": chunks}})

# Opening the store back creates one Dask chunk per Zarr chunk.
ds_loaded = xr.open_zarr(group="ds.zarr", store="data")
```
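To illustrate the scale of the problem (a back-of-the-envelope sketch, not part of the original report): on open, Dask materializes one task key per chunk, so the `(1, 1, 1)` encoding over a `(1000, 1000, 1000)` array implies a billion chunk keys, which is why graph construction exhausts memory.

```python
from math import ceil, prod

def n_chunks(shape, chunks):
    # One Dask task key per chunk along each axis; total is the product.
    return prod(ceil(s / c) for s, c in zip(shape, chunks))

shape = (1000, 1000, 1000)
print(n_chunks(shape, (1000, 1000, 1000)))  # 1 chunk: trivial graph
print(n_chunks(shape, (1, 1, 1)))           # 1_000_000_000 chunks
```

Even at a few hundred bytes per task, a graph with 10**9 keys is far beyond 16 GB of RAM before any data is read.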
