issue_comments
7 rows where issue = 180516114 and user = 1217238 sorted by updated_at descending
id | html_url | issue_url | node_id | user | created_at | updated_at | author_association | body | reactions | performed_via_github_app | issue |
---|---|---|---|---|---|---|---|---|---|---|---|
391805626 | https://github.com/pydata/xarray/issues/1026#issuecomment-391805626 | https://api.github.com/repos/pydata/xarray/issues/1026 | MDEyOklzc3VlQ29tbWVudDM5MTgwNTYyNg== | shoyer 1217238 | 2018-05-24T17:59:31Z | 2018-05-24T17:59:31Z | MEMBER | Indeed, it looks like this works now. Extending the example from the first post:
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
multidim groupby on dask arrays: dask.array.reshape error 180516114 | |
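The multidimensional groupby that the comment above reports as working can be sketched as follows. This is a minimal illustration, not the example from the original post: the array shape, dimension names, chunking, and the `label` group variable are all assumptions made for the sketch.

```python
import numpy as np
import xarray as xr

# Illustrative data: a 2-D variable backed by a chunked dask array.
arr = xr.DataArray(
    np.arange(24.0).reshape(4, 6),
    dims=("x", "y"),
).chunk({"x": 2})

# A 2-D group variable with the same dims; grouping by it exercises the
# multidimensional-groupby code path this issue is about.
labels = xr.DataArray(
    np.tile([0, 1], (4, 3)), dims=("x", "y"), name="label"
)

# With a sufficiently recent dask this no longer raises the reshape error.
result = arr.groupby(labels).mean()
print(result.values)  # one mean per group label
```

Grouping by a multidimensional `DataArray` flattens the grouped dimensions internally, which is where `dask.array.reshape` used to raise.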
286181363 | https://github.com/pydata/xarray/issues/1026#issuecomment-286181363 | https://api.github.com/repos/pydata/xarray/issues/1026 | MDEyOklzc3VlQ29tbWVudDI4NjE4MTM2Mw== | shoyer 1217238 | 2017-03-13T17:28:40Z | 2017-03-13T17:28:40Z | MEMBER | This is what I was looking for:
So in this case (where the chunk size is already 1), dask.array.reshape could actually work fine and the error is unnecessary (we don't have the exploding task issue). So this could potentially be fixed upstream in dask. For now, the best work-around (because you don't have any memory concerns) is to "rechunk" into a single block along the last axis before reshaping, e.g., |
{ "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
multidim groupby on dask arrays: dask.array.reshape error 180516114 | |
286152275 | https://github.com/pydata/xarray/issues/1026#issuecomment-286152275 | https://api.github.com/repos/pydata/xarray/issues/1026 | MDEyOklzc3VlQ29tbWVudDI4NjE1MjI3NQ== | shoyer 1217238 | 2017-03-13T15:58:29Z | 2017-03-13T15:58:29Z | MEMBER | @byersiiasa What matters for dask's |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
multidim groupby on dask arrays: dask.array.reshape error 180516114 | |
286123584 | https://github.com/pydata/xarray/issues/1026#issuecomment-286123584 | https://api.github.com/repos/pydata/xarray/issues/1026 | MDEyOklzc3VlQ29tbWVudDI4NjEyMzU4NA== | shoyer 1217238 | 2017-03-13T14:29:12Z | 2017-03-13T14:29:12Z | MEMBER | That array is loaded in numpy already - can you share the dask version?
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
multidim groupby on dask arrays: dask.array.reshape error 180516114 | |
285893380 | https://github.com/pydata/xarray/issues/1026#issuecomment-285893380 | https://api.github.com/repos/pydata/xarray/issues/1026 | MDEyOklzc3VlQ29tbWVudDI4NTg5MzM4MA== | shoyer 1217238 | 2017-03-11T19:23:55Z | 2017-03-11T19:23:55Z | MEMBER | @byersiiasa can you share what |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
multidim groupby on dask arrays: dask.array.reshape error 180516114 | |
250997873 | https://github.com/pydata/xarray/issues/1026#issuecomment-250997873 | https://api.github.com/repos/pydata/xarray/issues/1026 | MDEyOklzc3VlQ29tbWVudDI1MDk5Nzg3Mw== | shoyer 1217238 | 2016-10-02T21:38:30Z | 2016-10-02T21:38:30Z | MEMBER | It would look something like this:
1. Verify that |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
multidim groupby on dask arrays: dask.array.reshape error 180516114 | |
250986266 | https://github.com/pydata/xarray/issues/1026#issuecomment-250986266 | https://api.github.com/repos/pydata/xarray/issues/1026 | MDEyOklzc3VlQ29tbWVudDI1MDk4NjI2Ng== | shoyer 1217238 | 2016-10-02T18:20:36Z | 2016-10-02T18:20:36Z | MEMBER | This was an intentional change -- see https://github.com/dask/dask/pull/1469 Previously, we created lots of teeny tasks, which tended to negate any out-of-core benefits. The problem is that reshape promises an order to the elements it reshapes, which tends to split across existing chunks of dask arrays. We could work around this in xarray by adding custom logic to |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
multidim groupby on dask arrays: dask.array.reshape error 180516114 |
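The work-around described in comment 286181363 above -- rechunking into a single block along the axes being merged before reshaping -- can be sketched like this. The array shape and chunk sizes are illustrative assumptions, not taken from the issue.

```python
import dask.array as da

# An array whose trailing axes are split into several chunks; reshaping
# across them is what used to trigger the dask error discussed here.
x = da.ones((4, 6, 8), chunks=(2, 3, 4))

# Work-around: rechunk so each trailing axis is a single block, then
# reshape -- each input chunk now maps cleanly to one output chunk.
y = x.rechunk((2, 6, 8)).reshape((4, 48))

print(y.shape)         # (4, 48)
print(float(y.sum()))  # 192.0
```

This trades memory for task-graph simplicity, which is why the comment notes it is only appropriate when memory is not a concern.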
CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [issue] INTEGER REFERENCES [issues]([id])
);
CREATE INDEX [idx_issue_comments_issue] ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user] ON [issue_comments] ([user]);