
issue_comments


2 rows where issue = 494210818 and user = 10554254 sorted by updated_at descending


id: 532754800
html_url: https://github.com/pydata/xarray/pull/3312#issuecomment-532754800
issue_url: https://api.github.com/repos/pydata/xarray/issues/3312
node_id: MDEyOklzc3VlQ29tbWVudDUzMjc1NDgwMA==
user: friedrichknuth (10554254)
created_at: 2019-09-18T16:08:09Z
updated_at: 2019-09-18T16:08:09Z
author_association: NONE
body:

Opened https://github.com/pydata/xarray/issues/3315 regarding combine_nested() failing when being passed nested DataSets.

reactions: none
issue: convert DataArray to DataSet before combine (494210818)
id: 532419859
html_url: https://github.com/pydata/xarray/pull/3312#issuecomment-532419859
issue_url: https://api.github.com/repos/pydata/xarray/issues/3312
node_id: MDEyOklzc3VlQ29tbWVudDUzMjQxOTg1OQ==
user: friedrichknuth (10554254)
created_at: 2019-09-17T22:03:23Z
updated_at: 2019-09-17T23:51:13Z
author_association: NONE
body:

`pytest -q xarray/tests/test_combine.py` is telling me that

```
def test_concat_name_symmetry(self):
    """Inspired by the discussion on GH issue #2777"""

    da1 = DataArray(name="a", data=[[0]], dims=["x", "y"])
    da2 = DataArray(name="b", data=[[1]], dims=["x", "y"])
    da3 = DataArray(name="a", data=[[2]], dims=["x", "y"])
    da4 = DataArray(name="b", data=[[3]], dims=["x", "y"])

    x_first = combine_nested([[da1, da2], [da3, da4]], concat_dim=["x", "y"])
```

fails with:

```
KeyError                                  Traceback (most recent call last)
<ipython-input-13-bc4a941bd0c3> in <module>
      3 da3 = xr.DataArray(name="a", data=[[2]], dims=["x", "y"])
      4 da4 = xr.DataArray(name="b", data=[[3]], dims=["x", "y"])
----> 5 xr.combine_nested([[da1, da2], [da3, da4]], concat_dim=["x", "y"])

~/repos/contribute/xarray/xarray/core/combine.py in combine_nested(objects, concat_dim, compat, data_vars, coords, fill_value, join)
    468         ids=False,
    469         fill_value=fill_value,
--> 470         join=join,
    471     )
    472

~/repos/contribute/xarray/xarray/core/combine.py in _nested_combine(datasets, concat_dims, compat, data_vars, coords, ids, fill_value, join)
    305         coords=coords,
    306         fill_value=fill_value,
--> 307         join=join,
    308     )
    309     return combined

~/repos/contribute/xarray/xarray/core/combine.py in _combine_nd(combined_ids, concat_dims, data_vars, coords, compat, fill_value, join)
    196             compat=compat,
    197             fill_value=fill_value,
--> 198             join=join,
    199         )
    200     (combined_ds,) = combined_ids.values()

~/repos/contribute/xarray/xarray/core/combine.py in _combine_all_along_first_dim(combined_ids, dim, data_vars, coords, compat, fill_value, join)
    218         datasets = combined_ids.values()
    219         new_combined_ids[new_id] = _combine_1d(
--> 220             datasets, dim, compat, data_vars, coords, fill_value, join
    221         )
    222     return new_combined_ids

~/repos/contribute/xarray/xarray/core/combine.py in _combine_1d(datasets, concat_dim, compat, data_vars, coords, fill_value, join)
    246             compat=compat,
    247             fill_value=fill_value,
--> 248             join=join,
    249         )
    250     except ValueError as err:

~/repos/contribute/xarray/xarray/core/concat.py in concat(objs, dim, data_vars, coords, compat, positions, fill_value, join)
    131             "objects, got %s" % type(first_obj)
    132         )
--> 133     return f(objs, dim, data_vars, coords, compat, positions, fill_value, join)
    134
    135

~/repos/contribute/xarray/xarray/core/concat.py in _dataset_concat(datasets, dim, data_vars, coords, compat, positions, fill_value, join)
    363     for k in datasets[0].variables:
    364         if k in concat_over:
--> 365             vars = ensure_common_dims([ds.variables[k] for ds in datasets])
    366             combined = concat_vars(vars, dim, positions)
    367             assert isinstance(combined, Variable)

~/repos/contribute/xarray/xarray/core/concat.py in <listcomp>(.0)
    363     for k in datasets[0].variables:
    364         if k in concat_over:
--> 365             vars = ensure_common_dims([ds.variables[k] for ds in datasets])
    366             combined = concat_vars(vars, dim, positions)
    367             assert isinstance(combined, Variable)

~/repos/contribute/xarray/xarray/core/utils.py in __getitem__(self, key)
    383
    384     def __getitem__(self, key: K) -> V:
--> 385         return self.mapping[key]
    386
    387     def __iter__(self) -> Iterator[K]:

KeyError: 'a'
```

It looks like the existing combine_nested() routine actually wants a DataArray and fails if passed a DataSet.

The following should work with current master.

```
import xarray as xr

da1 = xr.DataArray(name="a", data=[[0]], dims=["x", "y"])
da2 = xr.DataArray(name="b", data=[[1]], dims=["x", "y"])
da3 = xr.DataArray(name="a", data=[[2]], dims=["x", "y"])
da4 = xr.DataArray(name="b", data=[[3]], dims=["x", "y"])
xr.combine_nested([[da1, da2], [da3, da4]], concat_dim=["x", "y"])
```

Converting to DataSet first, however, causes the same error raised by the test:

```
ds1 = da1.to_dataset()
ds2 = da2.to_dataset()
ds3 = da3.to_dataset()
ds4 = da4.to_dataset()
xr.combine_nested([[ds1, ds2], [ds3, ds4]], concat_dim=["x", "y"])
```
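
The traceback above hints at why the KeyError names `'a'`: ds2 and ds4 contain only the variable `"b"`, so `_dataset_concat` cannot look up `"a"` in every dataset. As a minimal sketch (not part of the original comment), Datasets whose variable names all match combine without error under the same call:

```
# Sketch only: when every Dataset in the nested list exposes the same
# variable name, combine_nested concatenates as expected.
import xarray as xr

ds_a1 = xr.DataArray(name="a", data=[[0]], dims=["x", "y"]).to_dataset()
ds_a2 = xr.DataArray(name="a", data=[[1]], dims=["x", "y"]).to_dataset()
ds_a3 = xr.DataArray(name="a", data=[[2]], dims=["x", "y"]).to_dataset()
ds_a4 = xr.DataArray(name="a", data=[[3]], dims=["x", "y"]).to_dataset()

# All four Datasets contain variable "a", so concatenation along ["x", "y"]
# finds "a" in every dataset and returns a single 2x2 Dataset.
xr.combine_nested([[ds_a1, ds_a2], [ds_a3, ds_a4]], concat_dim=["x", "y"])
```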

reactions: none
issue: convert DataArray to DataSet before combine (494210818)

issue_comments table schema:

```
CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [issue] INTEGER REFERENCES [issues]([id])
);
CREATE INDEX [idx_issue_comments_issue]
    ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user]
    ON [issue_comments] ([user]);
```
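
A minimal sketch of reproducing the filtered view above directly against the underlying SQLite database; the file name `github.db` is an assumption, and any github-to-sqlite export with this schema should work:

```
import sqlite3

# "github.db" is a placeholder path for the SQLite file behind this
# Datasette instance; point it at your own github-to-sqlite export.
conn = sqlite3.connect("github.db")
conn.row_factory = sqlite3.Row

# Same filter as this page: rows where issue = 494210818 and
# user = 10554254, sorted by updated_at descending.
rows = conn.execute(
    """
    SELECT id, created_at, updated_at, author_association, body
    FROM issue_comments
    WHERE issue = ? AND [user] = ?
    ORDER BY updated_at DESC
    """,
    (494210818, 10554254),
).fetchall()

for row in rows:
    print(row["id"], row["updated_at"], row["author_association"])
```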