html_url,issue_url,id,node_id,user,created_at,updated_at,author_association,body,reactions,performed_via_github_app,issue
https://github.com/pydata/xarray/issues/2278#issuecomment-493431087,https://api.github.com/repos/pydata/xarray/issues/2278,493431087,MDEyOklzc3VlQ29tbWVudDQ5MzQzMTA4Nw==,46813815,2019-05-17T12:11:21Z,2019-05-17T14:03:38Z,NONE,"Hi,
The second test case, indicated by Apatlpo on 12 Jul 2018, breaks:
```python
nx, ny, nt = 32, 32, 64
ds = xr.Dataset({}, coords={'x':np.arange(nx),'y':np.arange(ny), 't': np.arange(nt)})
ds = ds.assign(v=ds.t*np.cos(np.pi/180./100*ds.x)*np.cos(np.pi/180./50*ds.y))
ds = ds.chunk({'t': 1, 'x': nx/2, 'y': ny/2})
ds.to_zarr('data.zarr', mode='w')
```
```python
ds = xr.open_zarr('data.zarr')
ds = ds.chunk({'t': nt, 'x': nx/4, 'y': ny/4})
ds.to_zarr('data_rechunked.zarr', mode='w')
```
The error message is the following:
```
ValueError: Final chunk of Zarr array must be the same size or smaller than the first. The specified Zarr chunk encoding is (1, 16, 16), but (64,) in variable Dask chunks ((64,), (8, 8, 8, 8), (8, 8, 8, 8)) is incompatible. Consider rechunking using `chunk()`
```
(If I add `del ds.v.encoding['chunks']` as follows, it does not break.)
```python
nx, ny, nt = 32, 32, 64
ds = xr.Dataset({}, coords={'x':np.arange(nx),'y':np.arange(ny), 't': np.arange(nt)})
ds = ds.assign(v=ds.t*np.cos(np.pi/180./100*ds.x)*np.cos(np.pi/180./50*ds.y))
ds = ds.chunk({'t': 1, 'x': nx/2, 'y': ny/2})
ds.to_zarr('data.zarr', mode='w')
ds = xr.open_zarr('data.zarr')
del ds.v.encoding['chunks']
ds = ds.chunk({'t': nt, 'x': nx/4, 'y': ny/4})
ds.to_zarr('data_rechunked.zarr', mode='w')
```","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,340192831
https://github.com/pydata/xarray/issues/2300#issuecomment-493408428,https://api.github.com/repos/pydata/xarray/issues/2300,493408428,MDEyOklzc3VlQ29tbWVudDQ5MzQwODQyOA==,46813815,2019-05-17T10:37:35Z,2019-05-17T10:37:35Z,NONE,"Hi, I'm new to xarray and zarr.
After reading a zarr file, I re-chunk the data with `xarray.Dataset.chunk`, then store the newly chunked data as a zarr file with `xarray.Dataset.to_zarr`. But I get this error message:
```
NotImplementedError: Specified zarr chunks (200, 100, 1) would overlap multiple dask chunks ((50, 50, 50, 50), (25, 25, 25, 25), (10000,)). This is not implemented in xarray yet. Consider rechunking the data using `chunk()` or specifying different chunks in encoding.
```
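If it helps, the overlap the error complains about can be checked by hand: for a write to be safe, every interior dask chunk boundary must fall on a zarr chunk boundary, otherwise one zarr chunk would receive data from two different dask chunks. A small sketch of that alignment check (the `overlaps` helper is illustrative, not xarray's actual implementation):

```python
from itertools import accumulate

def overlaps(zarr_chunk, dask_chunks):
    # Interior dask chunk edges are the cumulative sums, excluding the total.
    # If any edge is not a multiple of the zarr chunk size, a zarr chunk
    # straddles two dask chunks and the write would overlap.
    edges = list(accumulate(dask_chunks))[:-1]
    return any(edge % zarr_chunk != 0 for edge in edges)

# The shapes from the error message above:
print(overlaps(200, (50, 50, 50, 50)))  # zarr chunk 200 vs dask edges 50, 100, 150 -> True
print(overlaps(100, (25, 25, 25, 25)))  # edges 25, 50, 75 don't align with 100 -> True
print(overlaps(1, (10000,)))            # a single dask chunk has no interior edge -> False
```

So the first two dimensions both trigger the error here, which matches the message.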
My xarray version is 12.1, and my understanding from this post https://github.com/pydata/xarray/issues/2300 is that the issue was fixed, so is the fix included in 12.1?
If so, why do I get the `NotImplementedError`?
Do I have to run `del dsread.data.encoding['chunks']` each time before calling `Dataset.to_zarr` as a workaround? I am probably missing something; I hope someone can point it out.
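For reference, the workaround reduces to clearing the chunk layout that `open_zarr` records in the variable's `.encoding` dict, which `to_zarr` otherwise prefers over the new dask chunks. A minimal sketch (the in-memory dataset here just stands in for one returned by `open_zarr`; names are illustrative):

```python
import numpy as np
import xarray as xr

ds = xr.Dataset({"v": (("x",), np.arange(8))})
ds.v.encoding["chunks"] = (2,)   # simulate the encoding left behind by open_zarr
del ds.v.encoding["chunks"]      # the workaround: drop the remembered chunk layout
# ds can now be re-chunked and written with to_zarr without the stale encoding
```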
I made a notebook here to reproduce the problem:
https://github.com/tinaok/Pangeo-for-beginners/blob/master/3-1%20zarr%20and%20re-chunking%20bug%20report.ipynb
thanks for your help, regards Tina","{""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,342531772