id,node_id,number,title,user,state,locked,assignee,milestone,comments,created_at,updated_at,closed_at,author_association,active_lock_reason,draft,pull_request,body,reactions,performed_via_github_app,state_reason,repo,type
1368186791,I_kwDOAMm_X85RjN-n,7015,ArrayNotFoundError when saving xarray Dataset as zarr,67194538,closed,0,,,2,2022-09-09T18:30:08Z,2023-05-10T17:28:12Z,2022-09-15T17:09:58Z,NONE,,,,"### What happened?

Good day! I'm trying to write an xarray Dataset to zarr, but this error appears:

```python
ArrayNotFoundError: array not found at path %r' ""array not found at path %r' 'latitude'""
```

However, when I double-check my xarray Dataset, `latitude` is present as a coordinate.

### What did you expect to happen?

_No response_

### Minimal Complete Verifiable Example

_No response_

### MVCE confirmation

- [ ] Minimal example — the example is as focused as reasonably possible to demonstrate the underlying issue in xarray.
- [ ] Complete example — the example is self-contained, including all data and the text of any traceback.
- [ ] Verifiable example — the example copy & pastes into an IPython prompt or [Binder notebook](https://mybinder.org/v2/gh/pydata/xarray/main?urlpath=lab/tree/doc/examples/blank_template.ipynb), returning the result.
- [ ] New issue — a search of GitHub Issues suggests this is not a duplicate.
### Relevant log output

```Python
---------------------------------------------------------------------------
ArrayNotFoundError                        Traceback (most recent call last)
Input In [30], in ()
----> 1 ds.to_zarr(temp_path)

File /srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/dataset.py:2036, in Dataset.to_zarr(self, store, chunk_store, mode, synchronizer, group, encoding, compute, consolidated, append_dim, region, safe_chunks, storage_options)
   2033 if encoding is None:
   2034     encoding = {}
-> 2036 return to_zarr(
   2037     self,
   2038     store=store,
   2039     chunk_store=chunk_store,
   2040     storage_options=storage_options,
   2041     mode=mode,
   2042     synchronizer=synchronizer,
   2043     group=group,
   2044     encoding=encoding,
   2045     compute=compute,
   2046     consolidated=consolidated,
   2047     append_dim=append_dim,
   2048     region=region,
   2049     safe_chunks=safe_chunks,
   2050 )

File /srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/backends/api.py:1432, in to_zarr(dataset, store, chunk_store, mode, synchronizer, group, encoding, compute, consolidated, append_dim, region, safe_chunks, storage_options)
   1430 # TODO: figure out how to properly handle unlimited_dims
   1431 dump_to_store(dataset, zstore, writer, encoding=encoding)
-> 1432 writes = writer.sync(compute=compute)
   1434 if compute:
   1435     _finalize_store(writes, zstore)

File /srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/backends/common.py:166, in ArrayWriter.sync(self, compute)
    160 import dask.array as da
    162 # TODO: consider wrapping targets with dask.delayed, if this makes
    163 # for any discernible difference in perforance, e.g.,
    164 # targets = [dask.delayed(t) for t in self.targets]
--> 166 delayed_store = da.store(
    167     self.sources,
    168     self.targets,
    169     lock=self.lock,
    170     compute=compute,
    171     flush=True,
    172     regions=self.regions,
    173 )
    174 self.sources = []
    175 self.targets = []

File /srv/conda/envs/notebook/lib/python3.9/site-packages/dask/array/core.py:1223, in store(***failed resolving arguments***)
   1221 elif compute:
   1222     store_dsk = HighLevelGraph(layers, dependencies)
-> 1223     compute_as_if_collection(Array, store_dsk, map_keys, **kwargs)
   1224     return None
   1226 else:

File /srv/conda/envs/notebook/lib/python3.9/site-packages/dask/base.py:343, in compute_as_if_collection(cls, dsk, keys, scheduler, get, **kwargs)
    340 # see https://github.com/dask/dask/issues/8991.
    341 # This merge should be removed once the underlying issue is fixed.
    342 dsk2 = HighLevelGraph.merge(dsk2)
--> 343 return schedule(dsk2, keys, **kwargs)

File /srv/conda/envs/notebook/lib/python3.9/site-packages/distributed/client.py:3014, in Client.get(self, dsk, keys, workers, allow_other_workers, resources, sync, asynchronous, direct, retries, priority, fifo_timeout, actors, **kwargs)
   3012     should_rejoin = False
   3013 try:
-> 3014     results = self.gather(packed, asynchronous=asynchronous, direct=direct)
   3015 finally:
   3016     for f in futures.values():

File /srv/conda/envs/notebook/lib/python3.9/site-packages/distributed/client.py:2188, in Client.gather(self, futures, errors, direct, asynchronous)
   2186 else:
   2187     local_worker = None
-> 2188 return self.sync(
   2189     self._gather,
   2190     futures,
   2191     errors=errors,
   2192     direct=direct,
   2193     local_worker=local_worker,
   2194     asynchronous=asynchronous,
   2195 )

File /srv/conda/envs/notebook/lib/python3.9/site-packages/distributed/utils.py:320, in SyncMethodMixin.sync(self, func, asynchronous, callback_timeout, *args, **kwargs)
    318     return future
    319 else:
--> 320     return sync(
    321         self.loop, func, *args, callback_timeout=callback_timeout, **kwargs
    322     )

File /srv/conda/envs/notebook/lib/python3.9/site-packages/distributed/utils.py:387, in sync(loop, func, callback_timeout, *args, **kwargs)
    385 if error:
    386     typ, exc, tb = error
--> 387     raise exc.with_traceback(tb)
    388 else:
    389     return result

File /srv/conda/envs/notebook/lib/python3.9/site-packages/distributed/utils.py:360, in sync.<locals>.f()
    358     future = asyncio.wait_for(future, callback_timeout)
    359 future = asyncio.ensure_future(future)
--> 360 result = yield future
    361 except Exception:
    362     error = sys.exc_info()

File /srv/conda/envs/notebook/lib/python3.9/site-packages/tornado/gen.py:762, in Runner.run(self)
    759     exc_info = None
    761 try:
--> 762     value = future.result()
    763 except Exception:
    764     exc_info = sys.exc_info()

File /srv/conda/envs/notebook/lib/python3.9/site-packages/distributed/client.py:2051, in Client._gather(self, futures, errors, direct, local_worker)
   2049     exc = CancelledError(key)
   2050 else:
-> 2051     raise exception.with_traceback(traceback)
   2052     raise exc
   2053 if errors == ""skip"":

File /srv/conda/envs/notebook/lib/python3.9/site-packages/distributed/protocol/pickle.py:66, in loads()
     64     return pickle.loads(x, buffers=buffers)
     65 else:
---> 66     return pickle.loads(x)
     67 except Exception:
     68     logger.info(""Failed to deserialize %s"", x[:10000], exc_info=True)

File /srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py:2285, in __setstate__()
   2284 def __setstate__(self, state):
-> 2285     self.__init__(*state)

File /srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py:180, in __init__()
    177 self._write_empty_chunks = write_empty_chunks
    179 # initialize metadata
--> 180 self._load_metadata()
    182 # initialize attributes
    183 akey = self._key_prefix + attrs_key

File /srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py:197, in _load_metadata()
    195 """"""(Re)load metadata from store.""""""
    196 if self._synchronizer is None:
--> 197     self._load_metadata_nosync()
    198 else:
    199     mkey = self._key_prefix + array_meta_key

File /srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py:208, in _load_metadata_nosync()
    206     meta_bytes = self._store[mkey]
    207 except KeyError:
--> 208     raise ArrayNotFoundError(self._path)
    209 else:
    210 
    211     # decode and store metadata as instance members
    212     meta = self._store._metadata_class.decode_array_metadata(meta_bytes)

ArrayNotFoundError: array not found at path %r' ""array not found at path %r' 'latitude'""
```

### Anything else we need to know?

_No response_

### Environment
xarray: 2022.3.0
pandas: 1.4.2
numpy: 1.22.4
scipy: 1.8.1
netCDF4: 1.5.8
pydap: installed
h5netcdf: 1.0.0
h5py: 3.6.0
Nio: None
zarr: 2.11.3
cftime: 1.6.0
nc_time_axis: 1.4.1
PseudoNetCDF: None
rasterio: 1.2.10
cfgrib: 0.9.10.1
iris: None
bottleneck: 1.3.4
dask: 2022.05.1
distributed: 2022.5.1
matplotlib: 3.5.2
cartopy: 0.20.2
seaborn: 0.11.2
numbagg: None
fsspec: 2022.5.0
cupy: None
pint: 0.19.2
sparse: 0.13.0
setuptools: 62.3.2
pip: 22.1.2
conda: None
pytest: 7.1.2
IPython: 8.4.0
sphinx: None
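For context on the traceback: the last frames show the error being raised while distributed's `pickle.loads` rebuilds the zarr array on a worker. zarr's `__setstate__` calls `__init__`, which re-reads array metadata from the store, so the `latitude` array must be reachable from wherever the object is unpickled, not just on the client. A toy, stdlib-only sketch of that pickling pattern (a hypothetical class, not zarr's actual code):

```python
import pickle

# A global stand-in for the filesystem the zarr store lives on.
DISK = {'latitude': 'metadata-bytes'}

class StoreBackedArray:
    # Toy stand-in for zarr.core.Array: only the path travels in the
    # pickle; metadata is re-read from 'disk' whenever the object is rebuilt.
    def __init__(self, path):
        self.path = path
        self._load_metadata()  # mirrors zarr's __init__ -> _load_metadata()

    def _load_metadata(self):
        try:
            self.meta = DISK[self.path]
        except KeyError:
            raise KeyError(f'array not found at path {self.path!r}')

    def __getstate__(self):
        return (self.path,)

    def __setstate__(self, state):
        # Like zarr's __setstate__: re-run __init__, which hits the store again.
        self.__init__(*state)

arr = StoreBackedArray('latitude')
payload = pickle.dumps(arr)

# Unpickling where the 'disk' holds the array works fine.
assert pickle.loads(payload).meta == 'metadata-bytes'

# Unpickling where it does not (e.g. a worker that cannot see the client's
# filesystem) fails inside _load_metadata, as in the traceback above.
del DISK['latitude']
try:
    pickle.loads(payload)
    raised = False
except KeyError:
    raised = True
```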
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7015/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue