html_url,issue_url,id,node_id,user,created_at,updated_at,author_association,body,reactions,performed_via_github_app,issue https://github.com/pydata/xarray/issues/5426#issuecomment-852684328,https://api.github.com/repos/pydata/xarray/issues/5426,852684328,MDEyOklzc3VlQ29tbWVudDg1MjY4NDMyOA==,1217238,2021-06-02T03:19:43Z,2021-06-02T03:19:43Z,MEMBER,"When I pickle the adapter object from this example with cloudpickle, it looks like it's 6536 bytes.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,908971901 https://github.com/pydata/xarray/issues/5426#issuecomment-852681951,https://api.github.com/repos/pydata/xarray/issues/5426,852681951,MDEyOklzc3VlQ29tbWVudDg1MjY4MTk1MQ==,1217238,2021-06-02T03:13:18Z,2021-06-02T03:13:18Z,MEMBER,"> Hrm, the root dependency does appear to be of type > > `xarray.core.indexing.ImplicitToExplicitIndexingAdapter` with size `48 B` > > I'm not sure what's going on with it Well, `sys.getsizeof()` is certainly an under-estimate here, but I suspect the true size (e.g., if you pickle it) is measured in a handful of KB. I would be surprised if Dask is reluctant to serialize such objects.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,908971901 https://github.com/pydata/xarray/issues/5426#issuecomment-852668929,https://api.github.com/repos/pydata/xarray/issues/5426,852668929,MDEyOklzc3VlQ29tbWVudDg1MjY2ODkyOQ==,1217238,2021-06-02T02:40:25Z,2021-06-02T03:09:13Z,MEMBER,"> The only thing that comes to mind is everything being assigned to one worker when the entire task graph has a single node at the base of the task graph. But then work stealing kicks in and things level out (that was a while ago though). 
Right, so it might help to pipe an option for `inline=True` into `Variable.chunk()` (which is indirectly called via `open_zarr` when chunks are provided).","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,908971901 https://github.com/pydata/xarray/issues/5426#issuecomment-852668060,https://api.github.com/repos/pydata/xarray/issues/5426,852668060,MDEyOklzc3VlQ29tbWVudDg1MjY2ODA2MA==,1217238,2021-06-02T02:38:20Z,2021-06-02T02:38:20Z,MEMBER,"> [dask/dask#6203](https://github.com/dask/dask/pull/6203) and [dask/dask#6773](https://github.com/dask/dask/pull/6773) are the maybe relevant issues. I actually don't know if that could have an effect here. I don't know (and a brief search couldn't confirm) whether xarray uses `dask.array.from_zarr`. Xarray uses `dask.array.from_array` but not `from_zarr`: https://github.com/pydata/xarray/blob/83eda1a8542a9dbd81bf0e08c8564c044df64c0a/xarray/core/variable.py#L1046-L1068","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,908971901 https://github.com/pydata/xarray/issues/5426#issuecomment-852666904,https://api.github.com/repos/pydata/xarray/issues/5426,852666904,MDEyOklzc3VlQ29tbWVudDg1MjY2NjkwNA==,1217238,2021-06-02T02:35:11Z,2021-06-02T02:35:54Z,MEMBER,"What is `sizeof` supposed to estimate? The size of the computed array or the size of the pickled lazy object? Typically this object would end up in Dask graphs when something is read from an xarray storage backend, e.g., netCDF or Zarr. If the underlying files are accessible everywhere (e.g., as is the case for Zarr backed by a cloud object store), then a small size for the serialized object would be appropriate.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,908971901