
issue_comments


3 rows where issue = 550335922 and user = 14808389 sorted by updated_at descending




Facet summary:
  • user: keewis (3)
  • issue: documentation build issues on RTD (3)
  • author_association: MEMBER (3)
Comment 782938240
  • html_url: https://github.com/pydata/xarray/issues/3697#issuecomment-782938240
  • issue_url: https://api.github.com/repos/pydata/xarray/issues/3697
  • node_id: MDEyOklzc3VlQ29tbWVudDc4MjkzODI0MA==
  • user: keewis (14808389)
  • created_at: 2021-02-21T22:21:55Z
  • updated_at: 2021-02-25T13:52:03Z
  • author_association: MEMBER

we haven't seen this in quite some time, so I assume we can close this.

reactions:

```json
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
```
issue: documentation build issues on RTD (550335922)
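Each row's reactions cell, like the one above, is stored as a JSON string in the `reactions` TEXT column (see the CREATE TABLE at the bottom of the page). A minimal stdlib sketch of decoding one of these cells:

```python
import json

# The raw TEXT value of the `reactions` column for this comment,
# copied from the row above.
raw = (
    '{"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, '
    '"confused": 0, "heart": 0, "rocket": 0, "eyes": 0}'
)

# json.loads turns the stored string into a dict of per-emoji counts.
reactions = json.loads(raw)
print(reactions["total_count"])  # -> 0
```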
Comment 579397908
  • html_url: https://github.com/pydata/xarray/issues/3697#issuecomment-579397908
  • issue_url: https://api.github.com/repos/pydata/xarray/issues/3697
  • node_id: MDEyOklzc3VlQ29tbWVudDU3OTM5NzkwOA==
  • user: keewis (14808389)
  • created_at: 2020-01-28T18:49:17Z
  • updated_at: 2020-01-28T22:42:50Z
  • author_association: MEMBER

it may be that the timeouts are not caused by RTD: I have tried building the documentation several times, and it sometimes pauses while trying to read / build (?) dask.rst. This is the traceback if I cause a KeyboardInterrupt:

```pytb
KeyboardInterrupt                         Traceback (most recent call last)
<ipython-input-4-2ef53683336b> in <module>
----> 1 ds.to_netcdf('manipulated-example-data.nc')

.../xarray/core/dataset.py in to_netcdf(self, path, mode, format, group, engine, encoding, unlimited_dims, compute, invalid_netcdf)
   1543             unlimited_dims=unlimited_dims,
   1544             compute=compute,
-> 1545             invalid_netcdf=invalid_netcdf,
   1546         )
   1547

.../xarray/backends/api.py in to_netcdf(dataset, path_or_file, mode, format, group, engine, encoding, unlimited_dims, compute, multifile, invalid_netcdf)
   1095         return writer, store
   1096
-> 1097     writes = writer.sync(compute=compute)
   1098
   1099     if path_or_file is None:

.../xarray/backends/common.py in sync(self, compute)
    202             compute=compute,
    203             flush=True,
--> 204             regions=self.regions,
    205         )
    206         self.sources = []

.../lib/python3.7/site-packages/dask/array/core.py in store(sources, targets, lock, regions, compute, return_stored, **kwargs)
    921
    922     if compute:
--> 923         result.compute(**kwargs)
    924         return None
    925     else:

.../lib/python3.7/site-packages/dask/base.py in compute(self, **kwargs)
    163         dask.base.compute
    164         """
--> 165         (result,) = compute(self, traverse=False, **kwargs)
    166         return result
    167

.../lib/python3.7/site-packages/dask/base.py in compute(*args, **kwargs)
    434     keys = [x.__dask_keys__() for x in collections]
    435     postcomputes = [x.__dask_postcompute__() for x in collections]
--> 436     results = schedule(dsk, keys, **kwargs)
    437     return repack([f(r, *a) for r, (f, a) in zip(results, postcomputes)])
    438

.../lib/python3.7/site-packages/dask/threaded.py in get(dsk, result, cache, num_workers, pool, **kwargs)
     79         get_id=_thread_get_id,
     80         pack_exception=pack_exception,
---> 81         **kwargs
     82     )
     83

.../lib/python3.7/site-packages/dask/local.py in get_async(apply_async, num_workers, dsk, result, cache, get_id, rerun_exceptions_locally, pack_exception, raise_exception, callbacks, dumps, loads, **kwargs)
    473     # Main loop, wait on tasks to finish, insert new ones
    474     while state["waiting"] or state["ready"] or state["running"]:
--> 475         key, res_info, failed = queue_get(queue)
    476         if failed:
    477             exc, tb = loads(res_info)

.../lib/python3.7/site-packages/dask/local.py in queue_get(q)
    131
    132 def queue_get(q):
--> 133     return q.get()
    134
    135

.../lib/python3.7/queue.py in get(self, block, timeout)
    168         elif timeout is None:
    169             while not self._qsize():
--> 170                 self.not_empty.wait()
    171         elif timeout < 0:
    172             raise ValueError("'timeout' must be a non-negative number")

.../lib/python3.7/threading.py in wait(self, timeout)
    294         try:    # restore state no matter what (e.g., KeyboardInterrupt)
    295             if timeout is None:
--> 296                 waiter.acquire()
    297                 gotit = True
    298             else:

KeyboardInterrupt:
```

Is there something that could cause a deadlock?
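The traceback bottoms out in `queue.Queue.get()` waiting on `Condition.wait()` with no timeout, which is why the build hangs silently instead of failing. A minimal stdlib sketch (not from the issue, just an illustration of the blocking call at the bottom of that traceback) of how a consumer hangs when no producer ever delivers a result, and how a timeout makes the condition diagnosable:

```python
import queue

# An empty queue stands in for dask's result queue when a worker thread
# has died without ever putting a result.
q = queue.Queue()

# With the default block=True, timeout=None (what dask's queue_get uses),
# q.get() would wait forever here. Passing a timeout turns the silent
# hang into an exception we can act on.
try:
    q.get(timeout=0.1)
    got_result = True
except queue.Empty:
    got_result = False  # nothing arrived: likely a dead or stuck producer

print(got_result)  # -> False
```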

reactions:

```json
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
```
issue: documentation build issues on RTD (550335922)
Comment 574847265
  • html_url: https://github.com/pydata/xarray/issues/3697#issuecomment-574847265
  • issue_url: https://api.github.com/repos/pydata/xarray/issues/3697
  • node_id: MDEyOklzc3VlQ29tbWVudDU3NDg0NzI2NQ==
  • user: keewis (14808389)
  • created_at: 2020-01-15T20:42:11Z
  • updated_at: 2020-01-15T20:42:11Z
  • author_association: MEMBER

not sure about the downsides, but I think we could use this to provide documentation previews for PRs?

reactions:

```json
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
```
issue: documentation build issues on RTD (550335922)


```sql
CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [issue] INTEGER REFERENCES [issues]([id])
);
CREATE INDEX [idx_issue_comments_issue]
    ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user]
    ON [issue_comments] ([user]);
```
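The query this page represents (issue = 550335922 and user = 14808389, sorted by updated_at descending) can be exercised against the schema above with Python's built-in sqlite3 module. This is an illustrative sketch, not the real database: the rows inserted below are a hand-copied subset (id, user, issue, updated_at) of the three comments shown on this page.

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# A pared-down copy of the issue_comments schema above.
conn.execute(
    """
    CREATE TABLE issue_comments (
        html_url TEXT, issue_url TEXT, id INTEGER PRIMARY KEY,
        node_id TEXT, user INTEGER, created_at TEXT, updated_at TEXT,
        author_association TEXT, body TEXT, reactions TEXT,
        performed_via_github_app TEXT, issue INTEGER
    )
    """
)

# (id, user, issue, updated_at) for the three comments on this page.
rows = [
    (782938240, 14808389, 550335922, "2021-02-25T13:52:03Z"),
    (579397908, 14808389, 550335922, "2020-01-28T22:42:50Z"),
    (574847265, 14808389, 550335922, "2020-01-15T20:42:11Z"),
]
conn.executemany(
    "INSERT INTO issue_comments (id, user, issue, updated_at)"
    " VALUES (?, ?, ?, ?)",
    rows,
)

# ISO-8601 timestamps sort correctly as plain strings, so ORDER BY on
# the TEXT column gives newest-updated first.
ids = [
    r[0]
    for r in conn.execute(
        "SELECT id FROM issue_comments"
        " WHERE issue = ? AND user = ? ORDER BY updated_at DESC",
        (550335922, 14808389),
    )
]
print(ids)  # -> [782938240, 579397908, 574847265]
```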
Powered by Datasette · Queries took 2956.755ms · About: xarray-datasette