issue_comments
8 rows where issue = 550335922 ("documentation build issues on RTD"), sorted by updated_at descending
782938240 · keewis (MEMBER) · created 2021-02-21T22:21:55Z · updated 2021-02-25T13:52:03Z
https://github.com/pydata/xarray/issues/3697#issuecomment-782938240

we didn't see this for quite some time, so I assume we can close this.

Reactions: none

590925187 · fmaussion (MEMBER) · created 2020-02-25T15:30:39Z
https://github.com/pydata/xarray/issues/3697#issuecomment-590925187

Wouldn't we lose the possibility to explore older versions of the docs? Or is doctr also providing this service? It seems so silly to have to reinvent readthedocs just because of their CI...

Reactions: none
579397908 · keewis (MEMBER) · created 2020-01-28T18:49:17Z · updated 2020-01-28T22:42:50Z
https://github.com/pydata/xarray/issues/3697#issuecomment-579397908

it may be that the timeouts are not caused by RTD: I have tried building the documentation several times, and it sometimes pauses while trying to read / build (?)
```pytb
KeyboardInterrupt Traceback (most recent call last)
<ipython-input-4-2ef53683336b> in <module>
----> 1 ds.to_netcdf('manipulated-example-data.nc')
.../xarray/core/dataset.py in to_netcdf(self, path, mode, format, group, engine, encoding, unlimited_dims, compute, invalid_netcdf)
1543 unlimited_dims=unlimited_dims,
1544 compute=compute,
-> 1545 invalid_netcdf=invalid_netcdf,
1546 )
1547
.../xarray/backends/api.py in to_netcdf(dataset, path_or_file, mode, format, group, engine, encoding, unlimited_dims, compute, multifile, invalid_netcdf)
1095 return writer, store
1096
-> 1097 writes = writer.sync(compute=compute)
1098
1099 if path_or_file is None:
.../xarray/backends/common.py in sync(self, compute)
202 compute=compute,
203 flush=True,
--> 204 regions=self.regions,
205 )
206 self.sources = []
.../lib/python3.7/site-packages/dask/array/core.py in store(sources, targets, lock, regions, compute, return_stored, **kwargs)
921
922 if compute:
--> 923 result.compute(**kwargs)
924 return None
925 else:
.../lib/python3.7/site-packages/dask/base.py in compute(self, **kwargs)
163 dask.base.compute
164 """
--> 165 (result,) = compute(self, traverse=False, **kwargs)
166 return result
167
.../lib/python3.7/site-packages/dask/base.py in compute(*args, **kwargs)
434 keys = [x.__dask_keys__() for x in collections]
435 postcomputes = [x.__dask_postcompute__() for x in collections]
--> 436 results = schedule(dsk, keys, **kwargs)
437 return repack([f(r, *a) for r, (f, a) in zip(results, postcomputes)])
438
.../lib/python3.7/site-packages/dask/threaded.py in get(dsk, result, cache, num_workers, pool, **kwargs)
79 get_id=_thread_get_id,
80 pack_exception=pack_exception,
---> 81 **kwargs
82 )
83
.../lib/python3.7/site-packages/dask/local.py in get_async(apply_async, num_workers, dsk, result, cache, get_id, rerun_exceptions_locally, pack_exception, raise_exception, callbacks, dumps, loads, **kwargs)
473 # Main loop, wait on tasks to finish, insert new ones
474 while state["waiting"] or state["ready"] or state["running"]:
--> 475 key, res_info, failed = queue_get(queue)
476 if failed:
477 exc, tb = loads(res_info)
.../lib/python3.7/site-packages/dask/local.py in queue_get(q)
131
132 def queue_get(q):
--> 133 return q.get()
134
135
.../lib/python3.7/queue.py in get(self, block, timeout)
168 elif timeout is None:
169 while not self._qsize():
--> 170 self.not_empty.wait()
171 elif timeout < 0:
172 raise ValueError("'timeout' must be a non-negative number")
.../lib/python3.7/threading.py in wait(self, timeout)
294 try: # restore state no matter what (e.g., KeyboardInterrupt)
295 if timeout is None:
--> 296 waiter.acquire()
297 gotit = True
298 else:
KeyboardInterrupt:
```
Is there something that could cause a deadlock?

Reactions: none
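The traceback above ends inside `queue.get()` in dask's threaded scheduler, which is where the main thread waits for worker results. A minimal sketch (not from the thread itself; it assumes dask is installed and uses a toy array as a stand-in for the dataset being written) of one common diagnostic: rerun the computation on dask's synchronous scheduler, which executes every task in the calling thread. If the hang disappears there, the threaded scheduler (or something holding one of its locks) is implicated; a real error or genuinely slow task would still show up.

```python
# Hedged sketch: rule out the threaded scheduler as the source of a hang
# by forcing dask's synchronous (single-threaded) scheduler.
import dask
import dask.array as da

x = da.arange(10, chunks=5)  # stand-in for the dataset being written

# Everything runs in the calling thread: no worker pool, no queue.get()
# wait, so a deadlock in the threaded scheduler cannot occur here.
with dask.config.set(scheduler="synchronous"):
    result = x.sum().compute()

print(int(result))  # 45
```

The same `dask.config.set(...)` context manager works around a `ds.to_netcdf(...)` call, since xarray defers the write to dask's `store`.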
579420906 · dcherian (MEMBER) · created 2020-01-28T19:42:43Z
https://github.com/pydata/xarray/issues/3697#issuecomment-579420906

Yeah, I run into this occasionally.

Reactions: none
577197732 · crusaderky (MEMBER) · created 2020-01-22T14:08:20Z
https://github.com/pydata/xarray/issues/3697#issuecomment-577197732

The obvious downside is that anybody with a link to one of the internal pages of our documentation will have the link broken. Also, I'm unsure how straightforward it will be to rebuild all of our historical versions.

Reactions: none
577197322 · crusaderky (MEMBER) · created 2020-01-22T14:07:25Z
https://github.com/pydata/xarray/issues/3697#issuecomment-577197322

Very glad to upvote anything that rids us of the RTD CI!

Reactions: none
574847265 · keewis (MEMBER) · created 2020-01-15T20:42:11Z
https://github.com/pydata/xarray/issues/3697#issuecomment-574847265

not sure about downsides, but I think we could use this to provide a documentation preview for PRs?

Reactions: none
574837235 · rabernat (MEMBER) · created 2020-01-15T20:15:17Z
https://github.com/pydata/xarray/issues/3697#issuecomment-574837235

Many projects have moved away from RTD for this reason. We can easily build the docs in Travis and then use doctr to deploy them. Is there a downside to this?

Reactions: +1 × 1
Table schema for this export:

CREATE TABLE [issue_comments] (
    [html_url] TEXT,
    [issue_url] TEXT,
    [id] INTEGER PRIMARY KEY,
    [node_id] TEXT,
    [user] INTEGER REFERENCES [users]([id]),
    [created_at] TEXT,
    [updated_at] TEXT,
    [author_association] TEXT,
    [body] TEXT,
    [reactions] TEXT,
    [performed_via_github_app] TEXT,
    [issue] INTEGER REFERENCES [issues]([id])
);
CREATE INDEX [idx_issue_comments_issue] ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user] ON [issue_comments] ([user]);