Issue #2300 (id 342531772): zarr and xarray chunking compatibility and `to_zarr` performance
pydata/xarray · state: closed · 15 comments · opened 2018-07-18T23:58:40Z · closed 2021-04-26T16:37:42Z

I have a situation where I build large zarr arrays based on chunks which correspond to how I am reading data off a filesystem, for best I/O performance. Then I set these as variables on an xarray dataset which I want to persist to zarr, but with different chunks more optimal for querying.

One problem I ran into is that manually selecting chunks of a dataset prior to `to_zarr` results in the error raised at https://github.com/pydata/xarray/blob/66be9c5db7d86ea385c3a4cd4295bfce67e3f25b/xarray/backends/zarr.py#L83
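As I understand it, the constraint behind that check is that, along each dimension, every dask chunk must have the same size except possibly the last one, which may only be smaller. A rough pure-Python sketch of that rule (my paraphrase for illustration, not xarray's actual code):

```python
def zarr_compatible(var_chunks):
    """Paraphrase of the per-dimension rule xarray's zarr backend enforces:
    all chunks equal, except the final chunk may be smaller."""
    for dim_chunks in var_chunks:
        first = dim_chunks[0]
        # every interior chunk must match the first
        if any(c != first for c in dim_chunks[:-1]):
            return False
        # the final chunk may only be smaller, never larger
        if dim_chunks[-1] > first:
            return False
    return True

print(zarr_compatible(((100, 100, 50),)))   # uniform with a smaller tail: OK
print(zarr_compatible(((100, 50, 100),)))   # irregular interior chunk: rejected
```

This is what makes hand-picking dataset chunks fiddly: a chunking that is perfectly reasonable for dask (e.g. an irregular chunk in the middle) is rejected by the zarr backend.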

It's difficult for me to understand exactly how to select chunks manually at the dataset level in a way that also satisfies this zarr "final chunk" constraint. I would have been happy to let zarr choose chunks for me instead, but I could not find a way to trigger that through the xarray API short of "unchunking" the dataset first, which would load entire variables into memory. I came up with the following hack to trigger zarr's automatic chunking despite having differently defined chunks on my xarray dataset:

```python
import functools
import types

import xarray as xr

# monkey patch to get zarr to ignore dask chunks and use its own heuristics
def copy_func(f):
    g = types.FunctionType(f.__code__, f.__globals__, name=f.__name__,
                           argdefs=f.__defaults__, closure=f.__closure__)
    g = functools.update_wrapper(g, f)
    g.__kwdefaults__ = f.__kwdefaults__
    return g

orig_determine_zarr_chunks = copy_func(xr.backends.zarr._determine_zarr_chunks)
# drop var_chunks so zarr falls back to its own chunking heuristics
xr.backends.zarr._determine_zarr_chunks = (
    lambda enc_chunks, var_chunks, ndim: orig_determine_zarr_chunks(
        enc_chunks, None, ndim
    )
)
```

The next problem to contend with is that `da.store` between zarr stores with differing chunks on source and destination is astronomically slow. The first thing to attempt would be to rechunk the dask arrays according to the destination zarr chunks, but as far as I can tell xarray's consistent-chunks constraint blocks this strategy. Once again I took the dirty-hack approach and inject a rechunking on a per-variable basis during the `to_zarr` operation, as follows:

```python
import xarray as xr

# monkey patch to make dask arrays writable with different chunks than zarr dest
# could do without this but would have to contend with 'inconsistent chunks' on dataset
def sync_using_zarr_copy(self, compute=True):
    if self.sources:
        import dask.array as da

        # rechunk each source to match its target's chunks before storing
        rechunked_sources = [
            source.rechunk(target.chunks)
            for source, target in zip(self.sources, self.targets)
        ]
        delayed_store = da.store(
            rechunked_sources, self.targets,
            lock=self.lock, compute=compute, flush=True,
        )
        self.sources = []
        self.targets = []
        return delayed_store

xr.backends.common.ArrayWriter.sync = sync_using_zarr_copy
```

I may have missed something in the API that would have made this easier, or a less hacky workaround, but in any case I'm wondering whether this scenario could be handled elegantly in xarray.
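To illustrate why mismatched chunks make the store step so slow: when source and destination chunks don't line up, each destination chunk is touched by several source tasks, each a read-modify-write under the store lock. A toy count of overlapping (source, destination) chunk pairs along one dimension, assuming simple regular chunking (illustrative only, not how dask schedules the writes):

```python
def chunks_touched(src_chunk_len, dst_chunk_len, dim_len):
    """Count overlapping (source chunk, destination chunk) pairs along one
    dimension -- a rough proxy for write amplification when storing dask
    chunks into a zarr array with differently sized chunks."""
    pairs = 0
    start = 0
    while start < dim_len:
        stop = min(start + src_chunk_len, dim_len)
        # destination chunks spanned by this source chunk
        first_dst = start // dst_chunk_len
        last_dst = (stop - 1) // dst_chunk_len
        pairs += last_dst - first_dst + 1
        start = stop
    return pairs

print(chunks_touched(100, 100, 1000))  # aligned: one write per chunk -> 10
print(chunks_touched(100, 150, 1000))  # misaligned: extra partial writes -> 13
```

Rechunking the sources to the destination chunks up front, as in the patch above, collapses this back to one whole-chunk write per destination chunk.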

I'm not sure if there is a plan going forward to make legal xarray chunks 100% compatible with zarr; if so, that would go a fair way toward alleviating the first problem. Alternatively, perhaps the xarray API could expose some way to adjust chunks to zarr's liking, as well as the option of deferring entirely to zarr's chunking heuristics.

As for the performance issue with differing chunks, I'm not sure whether my rechunking patch could be applied without side effects, or where the right place to solve this would be; perhaps it could be addressed more naturally within `da.store`.

