html_url,issue_url,id,node_id,user,created_at,updated_at,author_association,body,reactions,performed_via_github_app,issue
https://github.com/pydata/xarray/issues/5600#issuecomment-880997366,https://api.github.com/repos/pydata/xarray/issues/5600,880997366,MDEyOklzc3VlQ29tbWVudDg4MDk5NzM2Ng==,14808389,2021-07-15T20:38:05Z,2021-07-15T20:53:14Z,MEMBER,"the CI is still running, but `test_backends.py` passes, so I'm going to close this. As the issue was also in a released version of `fsspec`, the normal CI will keep failing until the next release (which I guess should be soon).
Edit: thanks for helping with the debugging, @martindurant","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,943923579
https://github.com/pydata/xarray/issues/5600#issuecomment-880882850,https://api.github.com/repos/pydata/xarray/issues/5600,880882850,MDEyOklzc3VlQ29tbWVudDg4MDg4Mjg1MA==,14808389,2021-07-15T17:29:57Z,2021-07-15T17:29:57Z,MEMBER,"@martindurant, I think this is intake/filesystem_spec#707. Can you confirm?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,943923579
https://github.com/pydata/xarray/issues/5600#issuecomment-880718355,https://api.github.com/repos/pydata/xarray/issues/5600,880718355,MDEyOklzc3VlQ29tbWVudDg4MDcxODM1NQ==,14808389,2021-07-15T13:59:47Z,2021-07-15T13:59:47Z,MEMBER,"apparently something in `distributed` changed, too, causing the test collection phase to fail with an assertion error (something about `timeout` not being set appropriately in `gen_cluster`, see the [logs](https://github.com/pydata/xarray/runs/3077217144?check_suite_focus=true#step:8:20)). dask/distributed#5022, maybe? cc @crusaderky","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,943923579
https://github.com/pydata/xarray/issues/5600#issuecomment-880692699,https://api.github.com/repos/pydata/xarray/issues/5600,880692699,MDEyOklzc3VlQ29tbWVudDg4MDY5MjY5OQ==,14808389,2021-07-15T13:25:17Z,2021-07-15T13:25:17Z,MEMBER,"there are a few changes to the environment between the [last passing](https://github.com/pydata/xarray/runs/3051941150?check_suite_focus=true#step:6:76) and the [first failing](https://github.com/pydata/xarray/runs/3062081796?check_suite_focus=true#step:6:76) run, but they do include the `fsspec` update.
I also just noticed that we don't test the upstream version of `fsspec` in the upstream-dev CI: should we change that?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,943923579
https://github.com/pydata/xarray/issues/5600#issuecomment-880247548,https://api.github.com/repos/pydata/xarray/issues/5600,880247548,MDEyOklzc3VlQ29tbWVudDg4MDI0NzU0OA==,14808389,2021-07-14T22:19:53Z,2021-07-14T22:19:53Z,MEMBER,"does anyone know what is causing this? A change to either `zarr` or `fsspec`, maybe?
cc @martindurant
For reference, here's the full traceback:
```
_______________________________ test_open_fsspec _______________________________
    @requires_zarr
    @requires_fsspec
    @pytest.mark.filterwarnings(""ignore:deallocating CachingFileManager"")
    def test_open_fsspec():
        import fsspec
        import zarr
        if not hasattr(zarr.storage, ""FSStore"") or not hasattr(
            zarr.storage.FSStore, ""getitems""
        ):
            pytest.skip(""zarr too old"")
        ds = open_dataset(os.path.join(os.path.dirname(__file__), ""data"", ""example_1.nc""))
        m = fsspec.filesystem(""memory"")
        mm = m.get_mapper(""out1.zarr"")
        ds.to_zarr(mm)  # old interface
        ds0 = ds.copy()
        ds0[""time""] = ds.time + pd.to_timedelta(""1 day"")
        mm = m.get_mapper(""out2.zarr"")
        ds0.to_zarr(mm)  # old interface
        # single dataset
        url = ""memory://out2.zarr""
        ds2 = open_dataset(url, engine=""zarr"")
        assert ds0 == ds2
        # single dataset with caching
        url = ""simplecache::memory://out2.zarr""
>       ds2 = open_dataset(url, engine=""zarr"")
/home/runner/work/xarray/xarray/xarray/tests/test_backends.py:5150:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/home/runner/work/xarray/xarray/xarray/backends/api.py:497: in open_dataset
    backend_ds = backend.open_dataset(
/home/runner/work/xarray/xarray/xarray/backends/zarr.py:839: in open_dataset
    ds = store_entrypoint.open_dataset(
/home/runner/work/xarray/xarray/xarray/backends/store.py:27: in open_dataset
    vars, attrs, coord_names = conventions.decode_cf_variables(
/home/runner/work/xarray/xarray/xarray/conventions.py:512: in decode_cf_variables
    new_vars[k] = decode_cf_variable(
/home/runner/work/xarray/xarray/xarray/conventions.py:360: in decode_cf_variable
    var = times.CFDatetimeCoder(use_cftime=use_cftime).decode(var, name=name)
/home/runner/work/xarray/xarray/xarray/coding/times.py:527: in decode
    dtype = _decode_cf_datetime_dtype(data, units, calendar, self.use_cftime)
/home/runner/work/xarray/xarray/xarray/coding/times.py:145: in _decode_cf_datetime_dtype
    [first_n_items(values, 1) or [0], last_item(values) or [0]]
/home/runner/work/xarray/xarray/xarray/core/formatting.py:72: in first_n_items
    return np.asarray(array).flat[:n_desired]
/home/runner/work/xarray/xarray/xarray/core/indexing.py:354: in __array__
    return np.asarray(self.array, dtype=dtype)
/home/runner/work/xarray/xarray/xarray/core/indexing.py:419: in __array__
    return np.asarray(array[self.key], dtype=None)
/home/runner/work/xarray/xarray/xarray/backends/zarr.py:75: in __getitem__
    return array[key.tuple]
/usr/share/miniconda/envs/xarray-tests/lib/python3.9/site-packages/zarr/core.py:662: in __getitem__
    return self.get_basic_selection(selection, fields=fields)
/usr/share/miniconda/envs/xarray-tests/lib/python3.9/site-packages/zarr/core.py:787: in get_basic_selection
    return self._get_basic_selection_nd(selection=selection, out=out,
/usr/share/miniconda/envs/xarray-tests/lib/python3.9/site-packages/zarr/core.py:830: in _get_basic_selection_nd
    return self._get_selection(indexer=indexer, out=out, fields=fields)
/usr/share/miniconda/envs/xarray-tests/lib/python3.9/site-packages/zarr/core.py:1125: in _get_selection
    self._chunk_getitems(lchunk_coords, lchunk_selection, out, lout_selection,
/usr/share/miniconda/envs/xarray-tests/lib/python3.9/site-packages/zarr/core.py:1836: in _chunk_getitems
    cdatas = self.chunk_store.getitems(ckeys, on_error=""omit"")
/usr/share/miniconda/envs/xarray-tests/lib/python3.9/site-packages/zarr/storage.py:1085: in getitems
    results = self.map.getitems(keys_transformed, on_error=""omit"")
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = , keys = ['time/0']
on_error = 'omit'
    def getitems(self, keys, on_error=""raise""):
        """"""Fetch multiple items from the store
        If the backend is async-able, this might proceed concurrently
        Parameters
        ----------
        keys: list(str)
            They keys to be fetched
        on_error : ""raise"", ""omit"", ""return""
            If raise, an underlying exception will be raised (converted to KeyError
            if the type is in self.missing_exceptions); if omit, keys with exception
            will simply not be included in the output; if ""return"", all keys are
            included in the output, but the value will be bytes or an exception
            instance.
        Returns
        -------
        dict(key, bytes|exception)
        """"""
        keys2 = [self._key_to_str(k) for k in keys]
        oe = on_error if on_error == ""raise"" else ""return""
        try:
            out = self.fs.cat(keys2, on_error=oe)
        except self.missing_exceptions as e:
            raise KeyError from e
        out = {
            k: (KeyError() if isinstance(v, self.missing_exceptions) else v)
>           for k, v in out.items()
        }
E       AttributeError: 'bytes' object has no attribute 'items'
```
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,943923579