issue_comments

1 row where author_association = "MEMBER", issue = 417281904 and user = 5635139 sorted by updated_at descending

id: 469845973
html_url: https://github.com/pydata/xarray/pull/2800#issuecomment-469845973
issue_url: https://api.github.com/repos/pydata/xarray/issues/2800
node_id: MDEyOklzc3VlQ29tbWVudDQ2OTg0NTk3Mw==
user: max-sixty (5635139)
created_at: 2019-03-05T20:32:01Z
updated_at: 2019-03-05T20:32:01Z
author_association: MEMBER

body:

Thanks a lot @TimoRoth - this looks good!

I think the failures are unrelated. I'll merge shortly unless anyone knows something I don't.

Output pasted below for reference:

```
self = <xarray.tests.test_backends.TestValidateAttrs object at 0x000000B367D524E0>

    def test_validating_attrs(self):
        def new_dataset():
            return Dataset({'data': ('y', np.arange(10.0))},
                           {'y': np.arange(10)})

        def new_dataset_and_dataset_attrs():
            ds = new_dataset()
            return ds, ds.attrs

        def new_dataset_and_data_attrs():
            ds = new_dataset()
            return ds, ds.data.attrs

        def new_dataset_and_coord_attrs():
            ds = new_dataset()
            return ds, ds.coords['y'].attrs

        for new_dataset_and_attrs in [new_dataset_and_dataset_attrs,
                                      new_dataset_and_data_attrs,
                                      new_dataset_and_coord_attrs]:

            ds, attrs = new_dataset_and_attrs()
            attrs[123] = 'test'
            with raises_regex(TypeError, 'Invalid name for attr'):
                ds.to_netcdf('test.nc')

            ds, attrs = new_dataset_and_attrs()
            attrs[MiscObject()] = 'test'
            with raises_regex(TypeError, 'Invalid name for attr'):
                ds.to_netcdf('test.nc')

            ds, attrs = new_dataset_and_attrs()
            attrs[''] = 'test'
            with raises_regex(ValueError, 'Invalid name for attr'):
                ds.to_netcdf('test.nc')

            # This one should work
            ds, attrs = new_dataset_and_attrs()
            attrs['test'] = 'test'
            with create_tmp_file() as tmp_file:
                ds.to_netcdf(tmp_file)

            ds, attrs = new_dataset_and_attrs()
            attrs['test'] = {'a': 5}
            with raises_regex(TypeError, 'Invalid value for attr'):
                ds.to_netcdf('test.nc')

            ds, attrs = new_dataset_and_attrs()
            attrs['test'] = MiscObject()
            with raises_regex(TypeError, 'Invalid value for attr'):
                ds.to_netcdf('test.nc')

            ds, attrs = new_dataset_and_attrs()
            attrs['test'] = 5
            with create_tmp_file() as tmp_file:
                ds.to_netcdf(tmp_file)

            ds, attrs = new_dataset_and_attrs()
            attrs['test'] = 3.14
            with create_tmp_file() as tmp_file:
                ds.to_netcdf(tmp_file)

            ds, attrs = new_dataset_and_attrs()
            attrs['test'] = [1, 2, 3, 4]
            with create_tmp_file() as tmp_file:
                ds.to_netcdf(tmp_file)

            ds, attrs = new_dataset_and_attrs()
            attrs['test'] = (1.9, 2.5)
            with create_tmp_file() as tmp_file:
                ds.to_netcdf(tmp_file)

            ds, attrs = new_dataset_and_attrs()
            attrs['test'] = np.arange(5)
            with create_tmp_file() as tmp_file:
                ds.to_netcdf(tmp_file)

            ds, attrs = new_dataset_and_attrs()
            attrs['test'] = np.arange(12).reshape(3, 4)
            with create_tmp_file() as tmp_file:
>               ds.to_netcdf(tmp_file)

xarray\tests\test_backends.py:3450:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

xarray\core\dataset.py:1323: in to_netcdf
    compute=compute)
xarray\backends\api.py:767: in to_netcdf
    unlimited_dims=unlimited_dims)
xarray\backends\api.py:810: in dump_to_store
    unlimited_dims=unlimited_dims)
xarray\backends\common.py:262: in store
    self.set_attributes(attributes)
xarray\backends\common.py:278: in set_attributes
    self.set_attribute(k, v)
xarray\backends\netCDF4_.py:418: in set_attribute
    _set_nc_attribute(self.ds, key, value)
xarray\backends\netCDF4_.py:294: in _set_nc_attribute
    obj.setncattr(key, value)
netCDF4\_netCDF4.pyx:2781: in netCDF4._netCDF4.Dataset.setncattr
    ???
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>   ???
E   ValueError: multi-dimensional array attributes not supported

netCDF4\_netCDF4.pyx:1514: ValueError
```
reactions:

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
issue: 417281904 (Don't use deprecated np.asscalar())
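For reference, the failure pasted above boils down to the netCDF4 backend rejecting a multi-dimensional array as an attribute value. Below is a minimal sketch of that code path, assuming xarray with the netCDF4 backend is installed; the output file names are purely illustrative, and the exact exception type may vary between versions:

```
import numpy as np
import xarray as xr

# Same small dataset that test_validating_attrs builds.
ds = xr.Dataset({'data': ('y', np.arange(10.0))}, {'y': np.arange(10)})

# A one-dimensional array attribute serializes fine (as in the test).
ds.attrs['test'] = np.arange(5)
ds.to_netcdf('attrs_ok.nc')

# A two-dimensional array attribute is rejected, with netCDF4 raising
# "ValueError: multi-dimensional array attributes not supported",
# the error shown in the traceback above.
ds.attrs['test'] = np.arange(12).reshape(3, 4)
try:
    ds.to_netcdf('attrs_fails.nc')
except (ValueError, TypeError) as err:  # exception type may depend on versions
    print(err)
```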

CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [issue] INTEGER REFERENCES [issues]([id])
);
CREATE INDEX [idx_issue_comments_issue]
    ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user]
    ON [issue_comments] ([user]);
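
For completeness, the filtered view at the top of this page ("1 row where author_association = "MEMBER", issue = 417281904 and user = 5635139 sorted by updated_at descending") corresponds to a query along these lines against the schema above. This is a sketch using Python's sqlite3 module; the database file name github.db is an assumption, not something stated on this page:

```
import sqlite3

# "github.db" is a placeholder; point this at the actual Datasette database file.
conn = sqlite3.connect("github.db")

# Reproduce the page's filter: comments on issue 417281904 by user 5635139
# with author_association = 'MEMBER', newest update first.
rows = conn.execute(
    """
    SELECT id, [user], created_at, updated_at, author_association, body
    FROM issue_comments
    WHERE author_association = 'MEMBER'
      AND issue = 417281904
      AND [user] = 5635139
    ORDER BY updated_at DESC
    """
).fetchall()

for row in rows:
    print(row)
```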