issue_comments
8 rows where author_association = "MEMBER", issue = 129150619 and user = 306380, sorted by updated_at descending
id | html_url | issue_url | node_id | user | created_at | updated_at ▲ | author_association | body | reactions | performed_via_github_app | issue |
---|---|---|---|---|---|---|---|---|---|---|---|
184351600 | https://github.com/pydata/xarray/issues/729#issuecomment-184351600 | https://api.github.com/repos/pydata/xarray/issues/729 | MDEyOklzc3VlQ29tbWVudDE4NDM1MTYwMA== | mrocklin 306380 | 2016-02-15T19:16:26Z | 2016-02-15T19:16:26Z | MEMBER | Looking at the task graph my first guess is that @shoyer is correct, and that we've found another case that the scheduler should be able to handle well, but doesn't. This hasn't happened in a while, but it always leads to improvements whenever we find such a problem. For a case this complex I think we either need to reduce it to a particular graph motif on which we schedule poorly or we first need to develop a better way to visualize traces of the scheduler's behavior. I've started a separate dask issue: https://github.com/dask/dask/issues/994 For the near future I don't have a solution to @Scheibs's research problem (sorry!) This will probably require tweaking dask scheduler internals, which probably won't happen by me in the next couple of weeks. I'm very happy that people brought this to my attention though. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Cannot write dask Dataset to NetCDF file 129150619 | |
184344834 | https://github.com/pydata/xarray/issues/729#issuecomment-184344834 | https://api.github.com/repos/pydata/xarray/issues/729 | MDEyOklzc3VlQ29tbWVudDE4NDM0NDgzNA== | mrocklin 306380 | 2016-02-15T18:51:13Z | 2016-02-15T18:51:13Z | MEMBER | Slowly taking a look at this now. Large PDF for the full computation, if anyone is interested: |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Cannot write dask Dataset to NetCDF file 129150619 | |
184283995 | https://github.com/pydata/xarray/issues/729#issuecomment-184283995 | https://api.github.com/repos/pydata/xarray/issues/729 | MDEyOklzc3VlQ29tbWVudDE4NDI4Mzk5NQ== | mrocklin 306380 | 2016-02-15T16:29:42Z | 2016-02-15T16:29:42Z | MEMBER | Downloaded |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Cannot write dask Dataset to NetCDF file 129150619 | |
182663029 | https://github.com/pydata/xarray/issues/729#issuecomment-182663029 | https://api.github.com/repos/pydata/xarray/issues/729 | MDEyOklzc3VlQ29tbWVudDE4MjY2MzAyOQ== | mrocklin 306380 | 2016-02-11T01:00:45Z | 2016-02-11T01:00:45Z | MEMBER | My apologies for the slow response (very busy week, lots of exciting stuff, sadly results in poor user response). I can access that page easily but the download seems to halt after 11MB |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Cannot write dask Dataset to NetCDF file 129150619 | |
181401562 | https://github.com/pydata/xarray/issues/729#issuecomment-181401562 | https://api.github.com/repos/pydata/xarray/issues/729 | MDEyOklzc3VlQ29tbWVudDE4MTQwMTU2Mg== | mrocklin 306380 | 2016-02-08T14:42:49Z | 2016-02-08T14:42:49Z | MEMBER | mrocklin continuum io On Mon, Feb 8, 2016 at 1:10 AM, Scheibs notifications@github.com wrote: |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Cannot write dask Dataset to NetCDF file 129150619 | |
180808528 | https://github.com/pydata/xarray/issues/729#issuecomment-180808528 | https://api.github.com/repos/pydata/xarray/issues/729 | MDEyOklzc3VlQ29tbWVudDE4MDgwODUyOA== | mrocklin 306380 | 2016-02-06T16:47:35Z | 2016-02-06T16:47:35Z | MEMBER | I would generally send such a large file by hosting it at a web-accessible location. Perhaps you are at an institution where you have access to host files online? |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Cannot write dask Dataset to NetCDF file 129150619 | |
177527321 | https://github.com/pydata/xarray/issues/729#issuecomment-177527321 | https://api.github.com/repos/pydata/xarray/issues/729 | MDEyOklzc3VlQ29tbWVudDE3NzUyNzMyMQ== | mrocklin 306380 | 2016-01-31T15:32:44Z | 2016-01-31T15:33:01Z | MEMBER | Sorry for the delay in response. Nothing here seems dangerous to me. @shoyer does the writeup above raise any questions for you? If convenient, it would be interesting to see the output of a few of the dask profilers:
```python
import cachey
from dask.diagnostics import CacheProfiler, ResourceProfiler, Profiler, visualize

with Profiler() as prof, CacheProfiler(metric=cachey.nbytes) as cprof, ResourceProfiler() as rprof:
    ...  # call the final dataset.to_netcdf() function

visualize([prof, cprof, rprof], file_path='profile.html')
```
And then upload that file somewhere, perhaps to a gist. In order to make this run to completion you might have to operate on a subset of the dataset. Alternatively, is there a way for me to recreate a version of this dataset on my local machine? @shoyer is there a way to capture the metadata of netcdf files and reinstantiate empty copies of them on another machine? |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Cannot write dask Dataset to NetCDF file 129150619 | |
176385048 | https://github.com/pydata/xarray/issues/729#issuecomment-176385048 | https://api.github.com/repos/pydata/xarray/issues/729 | MDEyOklzc3VlQ29tbWVudDE3NjM4NTA0OA== | mrocklin 306380 | 2016-01-28T20:15:33Z | 2016-01-28T20:15:33Z | MEMBER | @Scheibs can you try calling these lines to remove multi-threading and see if the problem persists?
I agree with @shoyer that it would be very useful to see what you're doing that causes this problem. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Cannot write dask Dataset to NetCDF file 129150619 |
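The specific lines @mrocklin asked @Scheibs to call in the last comment are not preserved in this export. In current dask, the equivalent way to remove multi-threading (an assumed modernization, not the original 2016 snippet) is to select the synchronous scheduler:

```python
import dask
import dask.array as da

# Toy computation standing in for the user's xarray workload.
x = da.arange(100, chunks=10)

# Force everything to run single-threaded in the main thread,
# taking the threaded scheduler out of the picture entirely.
with dask.config.set(scheduler="synchronous"):
    total = int(x.sum().compute())

print(total)  # 4950
```

If a hang or failure disappears under the synchronous scheduler, that points at a threading or scheduling issue rather than at the computation itself.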
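The profiling recipe quoted in the table above targets the reporter's dataset.to_netcdf() call and needs cachey and (for ResourceProfiler) psutil. A trimmed-down sketch that exercises the same dask.diagnostics machinery on a toy computation, with those optional dependencies left out, looks like:

```python
import dask.array as da
from dask.diagnostics import CacheProfiler, Profiler

# Stand-in for the real workload; the original advice profiled dataset.to_netcdf().
x = da.random.random((1000, 1000), chunks=(250, 250))

with Profiler() as prof, CacheProfiler() as cprof:
    result = float(x.sum().compute())

# prof.results holds one timing record per executed task.
print(f"{len(prof.results)} tasks profiled, sum={result:.1f}")

# With bokeh installed, the traces can be written out for sharing:
# from dask.diagnostics import visualize
# visualize([prof, cprof], file_path='profile.html')
```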
```sql
CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [issue] INTEGER REFERENCES [issues]([id])
);
CREATE INDEX [idx_issue_comments_issue] ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user] ON [issue_comments] ([user]);
```