html_url,issue_url,id,node_id,user,created_at,updated_at,author_association,body,reactions,performed_via_github_app,issue
https://github.com/pydata/xarray/issues/5168#issuecomment-821655160,https://api.github.com/repos/pydata/xarray/issues/5168,821655160,MDEyOklzc3VlQ29tbWVudDgyMTY1NTE2MA==,1053153,2021-04-16T22:38:08Z,2021-04-16T22:38:08Z,CONTRIBUTOR,It may run even deeper -- there seem to be several checks on dimension sizes that would need special casing. Even simply doing a `variable[dim]` lookup fails!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,859577556
https://github.com/pydata/xarray/issues/5168#issuecomment-821285344,https://api.github.com/repos/pydata/xarray/issues/5168,821285344,MDEyOklzc3VlQ29tbWVudDgyMTI4NTM0NA==,1053153,2021-04-16T16:13:09Z,2021-04-16T16:13:09Z,CONTRIBUTOR,"There seems to be some support, but now you have me worried. I have used xarray mainly for labelling, not for much computation -- I'm dropping into dask because I need map_overlap. FWIW, calling `dask.compute(arr)` works with unknown chunk sizes, but now I see that `arr.compute()` does not. This fooled me into thinking I could use unknown chunk sizes. Now I see that writing to zarr does not work, either. This might torpedo my current design. I see the `compute_chunk_sizes` method, but that seems to trigger computation. I'm running on a dask cluster -- is there anything I can do to salvage the pattern `arr_with_nan_shape.to_dataset().to_zarr(compute=False)` (with or without xarray)?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,859577556
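
For context on the failures described in these comments, here is a minimal sketch (not from the thread; the array values and the `variable[dim]`-style lookup stand-in are hypothetical) of how unknown chunk sizes arise in dask and why shape-dependent operations then fail:

```python
import dask
import dask.array as da
import numpy as np

x = da.from_array(np.arange(10), chunks=5)

# Boolean indexing produces chunks of unknown (NaN) size: how many
# elements survive the mask in each chunk is only known at compute time.
y = x[x > 4]
print(y.chunks)  # ((nan, nan),)
print(y.shape)   # (nan,)

# The values themselves still compute fine via dask.compute, as the
# second comment observes...
print(dask.compute(y))  # (array([5, 6, 7, 8, 9]),)

# ...but anything that must consult sizes up front fails, e.g. a
# positional lookup:
try:
    y[0]
except ValueError as err:
    print("shape-dependent operation failed:", err)
```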
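And a hedged sketch of the `compute_chunk_sizes` trade-off the second comment asks about: as that comment notes, it does trigger computation up front, but afterwards the shape is concrete, so the deferred `to_zarr(compute=False)` pattern can go through. The store path, dimension name, and variable name below are hypothetical:

```python
import dask.array as da
import numpy as np
import xarray as xr

x = da.from_array(np.arange(10), chunks=5)
y = x[x > 4]  # chunks are ((nan, nan),)

# compute_chunk_sizes() evaluates the graph far enough to learn each
# chunk's length -- so it does trigger computation -- but afterwards
# the shape is concrete and downstream size checks pass.
y = y.compute_chunk_sizes()
print(y.shape, y.chunks)  # (5,) ((0, 5),)

# With a concrete shape, the deferred-write pattern from the comment
# works: build the dataset, get a delayed write, compute it later.
ds = xr.DataArray(y, dims=["points"], name="values").to_dataset()
delayed = ds.to_zarr("example_store.zarr", compute=False)
delayed.compute()
```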