html_url,issue_url,id,node_id,user,created_at,updated_at,author_association,body,reactions,performed_via_github_app,issue
https://github.com/pydata/xarray/issues/5219#issuecomment-828072654,https://api.github.com/repos/pydata/xarray/issues/5219,828072654,MDEyOklzc3VlQ29tbWVudDgyODA3MjY1NA==,4801430,2021-04-28T01:31:17Z,2021-04-28T01:31:17Z,CONTRIBUTOR,"Yup, this all makes sense, thanks for the explanation @rabernat. It does seem like it would be good to drop `encoding[""chunks""]` at some point, but I can see how the timing is tricky. I'm assuming it's necessary metadata to keep around when the zarr has been ""opened"" but the data has not yet been read, because it is used by xarray to read the zarr?
Anyway, we'll continue with the manual deletion for now, but I'm inclined to keep this issue open, as I do think it would be helpful to eventually figure out how to do this automatically.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,868352536
https://github.com/pydata/xarray/issues/5219#issuecomment-828004004,https://api.github.com/repos/pydata/xarray/issues/5219,828004004,MDEyOklzc3VlQ29tbWVudDgyODAwNDAwNA==,4801430,2021-04-27T23:05:02Z,2021-04-27T23:05:28Z,CONTRIBUTOR,"Thanks for the pointer @mathause, that is super helpful. And thanks for #5065 @rabernat. If I'm understanding the PR correctly (it looks like it evolved a lot!), in most cases matching the example above we probably would NOT want to use `safe_chunks=False`, correct? Because if we're writing in parallel, that could lead to data corruption. Instead, we'd want to manually delete the `chunks` item from each variable's `encoding` attribute after loading/persisting the data into memory. That way, `to_zarr` would use the dask chunks as the zarr chunks, rather than relying on whatever chunks were used in the ""original"" zarr store (the source of the in-memory Dataset).
Does that sound right? If I'm reading the PR comments correctly, this was one of the controversial parts that didn't end up in the merged PR.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,868352536
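A minimal sketch of the manual-deletion workaround discussed in the two comments above, assuming xarray with the dask and zarr backends; the store paths, dimension name, and chunk size are placeholders:

```python
import xarray as xr

# Open the source store; each variable inherits encoding["chunks"]
# from the on-disk chunking of that store.
ds = xr.open_zarr("source.zarr")

# Rechunk with dask to whatever layout downstream processing needs.
ds = ds.chunk({"time": 100})

# Drop the inherited chunk encoding so that to_zarr derives the output
# chunks from the dask chunks instead of the original store's chunks.
for var in ds.variables.values():
    var.encoding.pop("chunks", None)

# Without the loop above, mismatched dask/encoding chunks either raise an
# error or require safe_chunks=False, which is unsafe for parallel writes.
ds.to_zarr("destination.zarr", mode="w")
```

Popping the key, rather than passing `safe_chunks=False`, keeps the alignment check that protects parallel writes.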
https://github.com/pydata/xarray/pull/3649#issuecomment-568142720,https://api.github.com/repos/pydata/xarray/issues/3649,568142720,MDEyOklzc3VlQ29tbWVudDU2ODE0MjcyMA==,4801430,2019-12-21T02:03:21Z,2019-12-21T02:03:21Z,CONTRIBUTOR,Done,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,540601428
https://github.com/pydata/xarray/pull/3649#issuecomment-568109804,https://api.github.com/repos/pydata/xarray/issues/3649,568109804,MDEyOklzc3VlQ29tbWVudDU2ODEwOTgwNA==,4801430,2019-12-20T22:23:56Z,2019-12-20T22:23:56Z,CONTRIBUTOR,Will do!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,540601428
https://github.com/pydata/xarray/pull/3649#issuecomment-568067947,https://api.github.com/repos/pydata/xarray/issues/3649,568067947,MDEyOklzc3VlQ29tbWVudDU2ODA2Nzk0Nw==,4801430,2019-12-20T19:56:19Z,2019-12-20T19:56:56Z,CONTRIBUTOR,"Yeah, that makes sense! So if `fill_value is None`, then we can add the additional check to make sure it's a complete hypercube. The next question, like you mention, is whether we change the default `fill_value` to `None`. That would make the most sense to me: the default would be to throw an error on incomplete hypercubes (the previous behavior), and incompleteness would only be allowed if you provide a value to fill with. But I can see your point about having the default be `dtypes.NA` for consistency with `merge` and `concat`. Thoughts?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,540601428
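For context, a small sketch of the behavior under discussion, based on my reading of how `fill_value` ended up working in `combine_by_coords` after the PR; the toy tiles are invented for illustration:

```python
import numpy as np
import xarray as xr

# Three 1x1 tiles of a 2x2 grid; the (y=1, x=1) corner is missing, so the
# tiles do not form a complete hypercube.
def tile(y, x):
    return xr.Dataset({"v": (("y", "x"), np.ones((1, 1)))}, coords={"y": [y], "x": [x]})

tiles = [tile(0, 0), tile(0, 1), tile(1, 0)]

# Default fill_value (dtypes.NA): the missing corner is padded with NaN.
combined = xr.combine_by_coords(tiles)

# fill_value=None: the incomplete hypercube is rejected with a ValueError
# instead of being silently filled.
try:
    xr.combine_by_coords(tiles, fill_value=None)
except ValueError as err:
    print(err)
```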