issue_comments
5 rows where issue = 441341354 sorted by updated_at descending
Filtered by issue: xr.merge fails when passing dict (441341354) · 5 comments
Columns: id, html_url, issue_url, node_id, user, created_at, updated_at, author_association, body, reactions, performed_via_github_app, issue

id: 491092770
html_url: https://github.com/pydata/xarray/issues/2948#issuecomment-491092770
issue_url: https://api.github.com/repos/pydata/xarray/issues/2948
node_id: MDEyOklzc3VlQ29tbWVudDQ5MTA5Mjc3MA==
user: max-sixty 5635139
created_at: 2019-05-09T22:38:51Z
updated_at: 2019-05-09T22:38:51Z
author_association: MEMBER
reactions: { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
performed_via_github_app:
issue: xr.merge fails when passing dict 441341354
body:
I'm trying to think through if there would ever be ambiguity between packed and as specified; I think it's fine.

One note is that I'm not sure it's that helpful though. In the case listed above, it's just as good to put them in a Dataset:

```python
In [2]: xr.merge([objects])
Out[2]:
<xarray.Dataset>
Dimensions:  (bar: 1, foo: 2)
Dimensions without coordinates: bar, foo
Data variables:
    a        (foo) int64 1 2
    b        (bar) int64 3

In [3]: xr.Dataset(objects)
Out[3]:
<xarray.Dataset>
Dimensions:  (bar: 1, foo: 2)
Dimensions without coordinates: bar, foo
Data variables:
    a        (foo) int64 1 2
    b        (bar) int64 3
```
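The `objects` variable is not constructed anywhere in the excerpt above; a minimal sketch, assuming it is a dict mapping variable names to DataArrays consistent with the dimensions and values printed in the comment:

```python
import xarray as xr

# Assumed definition of `objects`, reconstructed from the reprs in the comment
# above; the exact construction is not taken from the original issue.
objects = {
    "a": xr.DataArray([1, 2], dims="foo"),
    "b": xr.DataArray([3], dims="bar"),
}

# With a dict like this, both calls shown in the comment produce the same Dataset.
print(xr.merge([objects]))
print(xr.Dataset(objects))
```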
id: 490434825
html_url: https://github.com/pydata/xarray/issues/2948#issuecomment-490434825
issue_url: https://api.github.com/repos/pydata/xarray/issues/2948
node_id: MDEyOklzc3VlQ29tbWVudDQ5MDQzNDgyNQ==
user: mathause 10194086
created_at: 2019-05-08T10:29:09Z
updated_at: 2019-05-08T10:29:09Z
author_association: MEMBER
reactions: { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
performed_via_github_app:
issue: xr.merge fails when passing dict 441341354
body:
Yes, you are right - I did not take the

Would it make sense to pack them?

Then we would need to check
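A minimal sketch of the "packing" idea floated in the comment above, assuming it means wrapping a bare dict (or a single Dataset/DataArray) in a list before merge iterates over it; `pack_objects` is a hypothetical helper, not part of xarray:

```python
from collections.abc import Mapping

import xarray as xr


def pack_objects(objects):
    """Hypothetical helper: wrap a single dict/Dataset/DataArray in a list."""
    if isinstance(objects, (Mapping, xr.Dataset, xr.DataArray)):
        # A bare dict is iterable (over its keys), so without packing merge
        # would try to interpret plain strings as objects to merge.
        return [objects]
    return list(objects)


# e.g. pack_objects({"a": xr.DataArray([1, 2], dims="foo")}) -> [{"a": ...}]
```

This is also where the packed-versus-as-specified ambiguity weighed in the first comment above would have to be resolved.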
id: 490217040
html_url: https://github.com/pydata/xarray/issues/2948#issuecomment-490217040
issue_url: https://api.github.com/repos/pydata/xarray/issues/2948
node_id: MDEyOklzc3VlQ29tbWVudDQ5MDIxNzA0MA==
user: shoyer 1217238
created_at: 2019-05-07T19:08:55Z
updated_at: 2019-05-07T19:08:55Z
author_association: MEMBER
reactions: { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
performed_via_github_app:
issue: xr.merge fails when passing dict 441341354
body:
Yes, absolutely!
id: 490191774
html_url: https://github.com/pydata/xarray/issues/2948#issuecomment-490191774
issue_url: https://api.github.com/repos/pydata/xarray/issues/2948
node_id: MDEyOklzc3VlQ29tbWVudDQ5MDE5MTc3NA==
user: max-sixty 5635139
created_at: 2019-05-07T18:17:29Z
updated_at: 2019-05-07T18:17:29Z
author_association: MEMBER
reactions: { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
performed_via_github_app:
issue: xr.merge fails when passing dict 441341354
body:
+1

Though I think for these we should have better error messages. Ideally I think we'd a) check it were an iterable, and b) check each item as processed for one of the valid types, raising with a clear message on failures.

Thoughts?
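A sketch of the two checks proposed in the comment above (an iterable check first, then a per-item type check with a clear message); this is an illustration only, not xarray's actual validation code:

```python
from collections.abc import Iterable, Mapping

import xarray as xr


def check_merge_objects(objects):
    """Hypothetical validation illustrating checks (a) and (b) from the comment above."""
    # (a) the top-level argument must be an iterable (but not a plain string)
    if isinstance(objects, str) or not isinstance(objects, Iterable):
        raise TypeError(
            f"expected an iterable of Dataset, DataArray or dict objects, "
            f"got {type(objects).__name__!r}"
        )
    # (b) each item must be one of the valid types
    for n, obj in enumerate(objects):
        if not isinstance(obj, (xr.Dataset, xr.DataArray, Mapping)):
            raise TypeError(
                f"object {n} has type {type(obj).__name__!r}; expected "
                "Dataset, DataArray or dict"
            )
```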
id: 490188370
html_url: https://github.com/pydata/xarray/issues/2948#issuecomment-490188370
issue_url: https://api.github.com/repos/pydata/xarray/issues/2948
node_id: MDEyOklzc3VlQ29tbWVudDQ5MDE4ODM3MA==
user: shoyer 1217238
created_at: 2019-05-07T18:07:43Z
updated_at: 2019-05-07T18:07:43Z
author_association: MEMBER
reactions: { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
performed_via_github_app:
issue: xr.merge fails when passing dict 441341354
body:
You can pass an iterable of dicts into merge, not a single dict, e.g.,

So I think this is working as expected.
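A plausible illustration of the point in the comment above (the specific snippet is an assumption, not the commenter's original example): wrapping the dict in a list works, while passing it bare does not.

```python
import xarray as xr

objects = {
    "a": xr.DataArray([1, 2], dims="foo"),
    "b": xr.DataArray([3], dims="bar"),
}

xr.merge([objects])  # works: an iterable containing one dict of DataArrays

# xr.merge(objects)  # fails: iterating the dict yields its string keys,
#                    # which merge cannot interpret as objects to merge
```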
CREATE TABLE [issue_comments] (
    [html_url] TEXT,
    [issue_url] TEXT,
    [id] INTEGER PRIMARY KEY,
    [node_id] TEXT,
    [user] INTEGER REFERENCES [users]([id]),
    [created_at] TEXT,
    [updated_at] TEXT,
    [author_association] TEXT,
    [body] TEXT,
    [reactions] TEXT,
    [performed_via_github_app] TEXT,
    [issue] INTEGER REFERENCES [issues]([id])
);
CREATE INDEX [idx_issue_comments_issue] ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user] ON [issue_comments] ([user]);
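For reference, a small Python sketch of the query this page runs against that schema; the database filename is an assumption, not the actual file behind this instance.

```python
import sqlite3

# "github.db" is a placeholder name for the SQLite file behind this listing.
conn = sqlite3.connect("github.db")
rows = conn.execute(
    """
    SELECT id, [user], created_at, updated_at, author_association, body
    FROM issue_comments
    WHERE issue = ?
    ORDER BY updated_at DESC
    """,
    (441341354,),
).fetchall()

for comment_id, user_id, created_at, updated_at, association, body in rows:
    print(comment_id, user_id, updated_at, association)
```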