
issue_comments


5 rows where author_association = "MEMBER" and issue = 943923579 sorted by updated_at descending

Columns: id, html_url, issue_url, node_id, user, created_at, updated_at, author_association, body, reactions, performed_via_github_app, issue
id: 880997366
html_url: https://github.com/pydata/xarray/issues/5600#issuecomment-880997366
issue_url: https://api.github.com/repos/pydata/xarray/issues/5600
node_id: MDEyOklzc3VlQ29tbWVudDg4MDk5NzM2Ng==
user: keewis (14808389)
created_at: 2021-07-15T20:38:05Z
updated_at: 2021-07-15T20:53:14Z
author_association: MEMBER

the CI is still running, but test_backends.py passes, so I'm going to close this. Since the issue was also in a released version of fsspec, the normal CI will keep failing until the next release (which I guess should be soon).

Edit: thanks for helping with the debugging, @martindurant

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
issue: ⚠️ Nightly upstream-dev CI failed ⚠️ (943923579)
id: 880882850
html_url: https://github.com/pydata/xarray/issues/5600#issuecomment-880882850
issue_url: https://api.github.com/repos/pydata/xarray/issues/5600
node_id: MDEyOklzc3VlQ29tbWVudDg4MDg4Mjg1MA==
user: keewis (14808389)
created_at: 2021-07-15T17:29:57Z
updated_at: 2021-07-15T17:29:57Z
author_association: MEMBER

@martindurant, I think this is intake/filesystem_spec#707. Can you confirm?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
issue: ⚠️ Nightly upstream-dev CI failed ⚠️ (943923579)
id: 880718355
html_url: https://github.com/pydata/xarray/issues/5600#issuecomment-880718355
issue_url: https://api.github.com/repos/pydata/xarray/issues/5600
node_id: MDEyOklzc3VlQ29tbWVudDg4MDcxODM1NQ==
user: keewis (14808389)
created_at: 2021-07-15T13:59:47Z
updated_at: 2021-07-15T13:59:47Z
author_association: MEMBER

apparently something in distributed changed, too, causing the test collection phase to fail with an assertion error (something about the timeout not being set appropriately in gen_cluster; see the logs). dask/distributed#5022, maybe? cc @crusaderky

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
issue: ⚠️ Nightly upstream-dev CI failed ⚠️ (943923579)
id: 880692699
html_url: https://github.com/pydata/xarray/issues/5600#issuecomment-880692699
issue_url: https://api.github.com/repos/pydata/xarray/issues/5600
node_id: MDEyOklzc3VlQ29tbWVudDg4MDY5MjY5OQ==
user: keewis (14808389)
created_at: 2021-07-15T13:25:17Z
updated_at: 2021-07-15T13:25:17Z
author_association: MEMBER

there are a few changes to the environment between the last passing and the first failing run, but those do include the fsspec update.

I also just noticed that we don't test the upstream version of fsspec in the upstream-dev CI: should we change that?
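One way to do that (a sketch only, not the repository's actual CI configuration) would be to have the upstream-dev install step pull fsspec from its development branch instead of the released wheel, alongside the other upstream packages:

```shell
# Hypothetical addition to the upstream-dev install script:
# install fsspec from the intake/filesystem_spec development branch
# so nightly CI tests against unreleased changes.
python -m pip install --no-deps --upgrade \
    git+https://github.com/intake/filesystem_spec
```

`--no-deps` keeps the rest of the pinned upstream environment intact; the repository URL is taken from the intake/filesystem_spec issue referenced above.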

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
issue: ⚠️ Nightly upstream-dev CI failed ⚠️ (943923579)
id: 880247548
html_url: https://github.com/pydata/xarray/issues/5600#issuecomment-880247548
issue_url: https://api.github.com/repos/pydata/xarray/issues/5600
node_id: MDEyOklzc3VlQ29tbWVudDg4MDI0NzU0OA==
user: keewis (14808389)
created_at: 2021-07-14T22:19:53Z
updated_at: 2021-07-14T22:19:53Z
author_association: MEMBER

does anyone know what is causing this? A change to either zarr or fsspec, maybe?

cc @martindurant

For reference, here's the full traceback:

```
_______________________________ test_open_fsspec _______________________________

    @requires_zarr
    @requires_fsspec
    @pytest.mark.filterwarnings("ignore:deallocating CachingFileManager")
    def test_open_fsspec():
        import fsspec
        import zarr

        if not hasattr(zarr.storage, "FSStore") or not hasattr(
            zarr.storage.FSStore, "getitems"
        ):
            pytest.skip("zarr too old")

        ds = open_dataset(os.path.join(os.path.dirname(__file__), "data", "example_1.nc"))

        m = fsspec.filesystem("memory")
        mm = m.get_mapper("out1.zarr")
        ds.to_zarr(mm)  # old interface
        ds0 = ds.copy()
        ds0["time"] = ds.time + pd.to_timedelta("1 day")
        mm = m.get_mapper("out2.zarr")
        ds0.to_zarr(mm)  # old interface

        # single dataset
        url = "memory://out2.zarr"
        ds2 = open_dataset(url, engine="zarr")
        assert ds0 == ds2

        # single dataset with caching
        url = "simplecache::memory://out2.zarr"
>       ds2 = open_dataset(url, engine="zarr")

/home/runner/work/xarray/xarray/xarray/tests/test_backends.py:5150:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/home/runner/work/xarray/xarray/xarray/backends/api.py:497: in open_dataset
    backend_ds = backend.open_dataset(
/home/runner/work/xarray/xarray/xarray/backends/zarr.py:839: in open_dataset
    ds = store_entrypoint.open_dataset(
/home/runner/work/xarray/xarray/xarray/backends/store.py:27: in open_dataset
    vars, attrs, coord_names = conventions.decode_cf_variables(
/home/runner/work/xarray/xarray/xarray/conventions.py:512: in decode_cf_variables
    new_vars[k] = decode_cf_variable(
/home/runner/work/xarray/xarray/xarray/conventions.py:360: in decode_cf_variable
    var = times.CFDatetimeCoder(use_cftime=use_cftime).decode(var, name=name)
/home/runner/work/xarray/xarray/xarray/coding/times.py:527: in decode
    dtype = _decode_cf_datetime_dtype(data, units, calendar, self.use_cftime)
/home/runner/work/xarray/xarray/xarray/coding/times.py:145: in _decode_cf_datetime_dtype
    [first_n_items(values, 1) or [0], last_item(values) or [0]]
/home/runner/work/xarray/xarray/xarray/core/formatting.py:72: in first_n_items
    return np.asarray(array).flat[:n_desired]
/home/runner/work/xarray/xarray/xarray/core/indexing.py:354: in __array__
    return np.asarray(self.array, dtype=dtype)
/home/runner/work/xarray/xarray/xarray/core/indexing.py:419: in __array__
    return np.asarray(array[self.key], dtype=None)
/home/runner/work/xarray/xarray/xarray/backends/zarr.py:75: in __getitem__
    return array[key.tuple]
/usr/share/miniconda/envs/xarray-tests/lib/python3.9/site-packages/zarr/core.py:662: in __getitem__
    return self.get_basic_selection(selection, fields=fields)
/usr/share/miniconda/envs/xarray-tests/lib/python3.9/site-packages/zarr/core.py:787: in get_basic_selection
    return self._get_basic_selection_nd(selection=selection, out=out,
/usr/share/miniconda/envs/xarray-tests/lib/python3.9/site-packages/zarr/core.py:830: in _get_basic_selection_nd
    return self._get_selection(indexer=indexer, out=out, fields=fields)
/usr/share/miniconda/envs/xarray-tests/lib/python3.9/site-packages/zarr/core.py:1125: in _get_selection
    self._chunk_getitems(lchunk_coords, lchunk_selection, out, lout_selection,
/usr/share/miniconda/envs/xarray-tests/lib/python3.9/site-packages/zarr/core.py:1836: in _chunk_getitems
    cdatas = self.chunk_store.getitems(ckeys, on_error="omit")
/usr/share/miniconda/envs/xarray-tests/lib/python3.9/site-packages/zarr/storage.py:1085: in getitems
    results = self.map.getitems(keys_transformed, on_error="omit")
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <fsspec.mapping.FSMap object at 0x7f3172a8e9a0>, keys = ['time/0']
on_error = 'omit'

    def getitems(self, keys, on_error="raise"):
        """Fetch multiple items from the store

        If the backend is async-able, this might proceed concurrently

        Parameters
        ----------
        keys: list(str)
            They keys to be fetched
        on_error : "raise", "omit", "return"
            If raise, an underlying exception will be raised (converted to
            KeyError if the type is in self.missing_exceptions); if omit, keys
            with exception will simply not be included in the output; if
            "return", all keys are included in the output, but the value will
            be bytes or an exception instance.

        Returns
        -------
        dict(key, bytes|exception)
        """
        keys2 = [self._key_to_str(k) for k in keys]
        oe = on_error if on_error == "raise" else "return"
        try:
            out = self.fs.cat(keys2, on_error=oe)
        except self.missing_exceptions as e:
            raise KeyError from e
        out = {
            k: (KeyError() if isinstance(v, self.missing_exceptions) else v)
>           for k, v in out.items()
        }
E       AttributeError: 'bytes' object has no attribute 'items'
```
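The final AttributeError falls out of the return-type contract of fsspec's cat: for several paths it returns a dict mapping keys to bytes, but the regression handed back raw bytes when only a single key was requested, so the dict comprehension in FSMap.getitems has nothing to call .items() on. A minimal stdlib-only sketch of that failure mode (the helper names below are illustrative, not fsspec's real internals):

```python
def cat(paths):
    """Mimic the regressed behaviour: return a dict of {path: bytes} for
    several paths, but unwrap to raw bytes when only one path is requested."""
    data = {p: p.encode() for p in paths}
    if len(data) == 1:
        return next(iter(data.values()))  # the problematic unwrapping
    return data


def getitems(keys):
    out = cat(keys)
    # Like FSMap.getitems, this assumes `out` is a dict; with a single
    # key it receives bytes and raises AttributeError on .items().
    return {k: v for k, v in out.items()}


getitems(["time/0", "time/1"])  # fine: returns a dict, as the caller expects
# getitems(["time/0"])  # AttributeError: 'bytes' object has no attribute 'items'
```

This matches the traceback: zarr asked for the single chunk key `time/0`, triggering the single-key code path.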
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
issue: ⚠️ Nightly upstream-dev CI failed ⚠️ (943923579)

CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [issue] INTEGER REFERENCES [issues]([id])
);
CREATE INDEX [idx_issue_comments_issue]
    ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user]
    ON [issue_comments] ([user]);
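The filter shown at the top of this page is plain SQL over the schema above. A self-contained sqlite3 sketch (an in-memory database with a trimmed-down copy of the schema and one row built from the first comment's values) that reproduces the MEMBER filter:

```python
import sqlite3

# Rebuild a simplified issue_comments table in memory and run the same
# query as the page: author_association = 'MEMBER' and issue = 943923579,
# sorted by updated_at descending.
conn = sqlite3.connect(":memory:")
conn.execute(
    """
    CREATE TABLE issue_comments (
        html_url TEXT, issue_url TEXT, id INTEGER PRIMARY KEY, node_id TEXT,
        user INTEGER, created_at TEXT, updated_at TEXT,
        author_association TEXT, body TEXT, reactions TEXT,
        performed_via_github_app TEXT, issue INTEGER
    )
    """
)
conn.execute(
    "INSERT INTO issue_comments (id, user, created_at, updated_at,"
    " author_association, issue) VALUES (880997366, 14808389,"
    " '2021-07-15T20:38:05Z', '2021-07-15T20:53:14Z', 'MEMBER', 943923579)"
)
rows = conn.execute(
    "SELECT id FROM issue_comments"
    " WHERE author_association = 'MEMBER' AND issue = 943923579"
    " ORDER BY updated_at DESC"
).fetchall()
# rows == [(880997366,)]
```

The real database also carries the two indexes above, which cover exactly the `issue` and `user` lookups this kind of faceted page performs.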
Powered by Datasette · About: xarray-datasette