html_url,issue_url,id,node_id,user,created_at,updated_at,author_association,body,reactions,performed_via_github_app,issue
https://github.com/pydata/xarray/pull/4035#issuecomment-721504192,https://api.github.com/repos/pydata/xarray/issues/4035,721504192,MDEyOklzc3VlQ29tbWVudDcyMTUwNDE5Mg==,7799184,2020-11-04T04:23:58Z,2020-11-04T04:23:58Z,CONTRIBUTOR,"@shoyer thanks for implementing this; it is going to be very useful. I am trying to write the dataset below:

dsregion:
```
<xarray.Dataset>
Dimensions:    (latitude: 2041, longitude: 4320, time: 31)
Coordinates:
  * latitude   (latitude) float32 -80.0 -79.916664 -79.833336 ... 89.916664 90.0
  * time       (time) datetime64[ns] 2008-10-01T12:00:00 ... 2008-10-31T12:00:00
  * longitude  (longitude) float32 -180.0 -179.91667 ... 179.83333 179.91667
Data variables:
    vo         (time, latitude, longitude) float32 dask.array<chunksize=(30, 510, 1080), meta=np.ndarray>
    uo         (time, latitude, longitude) float32 dask.array<chunksize=(30, 510, 1080), meta=np.ndarray>
    sst        (time, latitude, longitude) float32 dask.array<chunksize=(30, 510, 1080), meta=np.ndarray>
    ssh        (time, latitude, longitude) float32 dask.array<chunksize=(30, 510, 1080), meta=np.ndarray>

```

As a region of this other dataset:

dset:
```
<xarray.Dataset>
Dimensions:    (latitude: 2041, longitude: 4320, time: 9490)
Coordinates:
  * latitude   (latitude) float32 -80.0 -79.916664 -79.833336 ... 89.916664 90.0
  * longitude  (longitude) float32 -180.0 -179.91667 ... 179.83333 179.91667
  * time       (time) datetime64[ns] 1993-01-01T12:00:00 ... 2018-12-25T12:00:00
Data variables:
    ssh        (time, latitude, longitude) float64 dask.array<chunksize=(30, 510, 1080), meta=np.ndarray>
    sst        (time, latitude, longitude) float64 dask.array<chunksize=(30, 510, 1080), meta=np.ndarray>
    uo         (time, latitude, longitude) float64 dask.array<chunksize=(30, 510, 1080), meta=np.ndarray>
    vo         (time, latitude, longitude) float64 dask.array<chunksize=(30, 510, 1080), meta=np.ndarray>
```

Using the following call:

```
dsregion.to_zarr(dset_url, region={""time"": slice(5752, 5783)})
```
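
For context, the integer bounds in `region` are the positions of `dsregion`'s time steps within `dset`'s time axis. A sketch of one way to compute such bounds (assuming the two time axes match exactly over the written window; `start` and `stop` are just illustrative names):

```
import numpy as np

# Locate dsregion's time window inside dset's time coordinate
# (assumes both time axes are sorted and use the same calendar).
start = int(np.searchsorted(dset.time.values, dsregion.time.values[0]))
stop = int(np.searchsorted(dset.time.values, dsregion.time.values[-1])) + 1
region = {'time': slice(start, stop)}
```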

But the call fails on the conditional below in `xarray/backends/api.py` (the `ipdb` breakpoint is one I added to inspect the state):

```
   1347         non_matching_vars = [
   1348             k
   1349             for k, v in ds_to_append.variables.items()
   1350             if not set(region).intersection(v.dims)
   1351         ]
   1352         import ipdb; ipdb.set_trace()
-> 1353         if non_matching_vars:
   1354             raise ValueError(
   1355                 f""when setting `region` explicitly in to_zarr(), all ""
   1356                 f""variables in the dataset to write must have at least ""
   1357                 f""one dimension in common with the region's dimensions ""
   1358                 f""{list(region.keys())}, but that is not ""
   1359                 f""the case for some variables here. To drop these variables ""
   1360                 f""from this dataset before exporting to zarr, write: ""
   1361                 f"".drop({non_matching_vars!r})""
   1362             )
```

Apparently this is because `time` is not a dimension of the coordinate variables `latitude` and `longitude`:

```
ipdb> p non_matching_vars                                
['latitude', 'longitude']
ipdb> p set(region)                                      
{'time'}
```
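
For what it is worth, every data variable in `dsregion` does share the `time` dimension with the region, so only the 1-D coordinate variables are caught by the check. A quick way to confirm, based on the reprs above:

```
# Each data variable spans (time, latitude, longitude), so each one
# intersects the region's dimensions; only latitude and longitude do not.
{k: v.dims for k, v in dsregion.data_vars.items()}
```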

Should this check be performed on all variables, or only on the data variables?
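
For illustration, restricting the check to data variables would amount to something like this (a sketch against the snippet above, not a tested patch):

```
non_matching_vars = [
    k
    for k, v in ds_to_append.data_vars.items()
    if not set(region).intersection(v.dims)
]
```

In the meantime, a sketch of the workaround the error message itself suggests, dropping the coordinate variables that share no dimension with the region before writing (`drop_vars` is the current spelling of the suggested `.drop(...)`):

```
dsregion.drop_vars(['latitude', 'longitude']).to_zarr(
    dset_url, region={'time': slice(5752, 5783)}
)
```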
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,613012939