
issue_comments


8 rows where issue = 197083082 sorted by updated_at descending


Facets:

user (3 values)
  • shoyer 4
  • mrocklin 3
  • rabernat 1

issue (1 value)
  • Switch to shared Lock (SerializableLock if possible) for reading/writing 8

author_association (1 value)
  • MEMBER 8
id html_url issue_url node_id user created_at updated_at author_association body reactions performed_via_github_app issue
270427441 https://github.com/pydata/xarray/pull/1179#issuecomment-270427441 https://api.github.com/repos/pydata/xarray/issues/1179 MDEyOklzc3VlQ29tbWVudDI3MDQyNzQ0MQ== shoyer 1217238 2017-01-04T17:12:58Z 2017-01-04T17:12:58Z MEMBER

In it goes. We're using conda-forge for all Travis-CI builds now.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Switch to shared Lock (SerializableLock if possible) for reading/writing 197083082
270289632 https://github.com/pydata/xarray/pull/1179#issuecomment-270289632 https://api.github.com/repos/pydata/xarray/issues/1179 MDEyOklzc3VlQ29tbWVudDI3MDI4OTYzMg== mrocklin 306380 2017-01-04T03:53:58Z 2017-01-04T03:53:58Z MEMBER

It's up now on conda-forge if you're interested in switching over.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Switch to shared Lock (SerializableLock if possible) for reading/writing 197083082
270289244 https://github.com/pydata/xarray/pull/1179#issuecomment-270289244 https://api.github.com/repos/pydata/xarray/issues/1179 MDEyOklzc3VlQ29tbWVudDI3MDI4OTI0NA== shoyer 1217238 2017-01-04T03:49:17Z 2017-01-04T03:49:17Z MEMBER

Hmm. It looks like we need dask 0.13 in conda to make the distributed integration pass.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Switch to shared Lock (SerializableLock if possible) for reading/writing 197083082
270194818 https://github.com/pydata/xarray/pull/1179#issuecomment-270194818 https://api.github.com/repos/pydata/xarray/issues/1179 MDEyOklzc3VlQ29tbWVudDI3MDE5NDgxOA== shoyer 1217238 2017-01-03T19:03:22Z 2017-01-03T19:03:22Z MEMBER

I will update the xarray/dask-distributed integration and submit this later today. @rabernat it should solve your issues with to_netcdf.

{
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Switch to shared Lock (SerializableLock if possible) for reading/writing 197083082
270194206 https://github.com/pydata/xarray/pull/1179#issuecomment-270194206 https://api.github.com/repos/pydata/xarray/issues/1179 MDEyOklzc3VlQ29tbWVudDI3MDE5NDIwNg== mrocklin 306380 2017-01-03T19:00:53Z 2017-01-03T19:00:53Z MEMBER

Dask 0.13.0 has been released

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Switch to shared Lock (SerializableLock if possible) for reading/writing 197083082
270193755 https://github.com/pydata/xarray/pull/1179#issuecomment-270193755 https://api.github.com/repos/pydata/xarray/issues/1179 MDEyOklzc3VlQ29tbWVudDI3MDE5Mzc1NQ== rabernat 1197350 2017-01-03T18:59:10Z 2017-01-03T18:59:10Z MEMBER

> Is there a clear fail case we can use as a test to demonstrate the value here?

I have found a fail case related to distributed: attempting to use `to_netcdf()` with a dask.distributed client fails because the `threading.Lock()` can't be serialized. A `SerializableLock` would overcome this problem.

Consider this example:

```python
import dask.array as da
from distributed import Client
import xarray as xr

def create_and_store_dataset():
    shape = (10000, 1000)
    chunks = (1000, 1000)
    data = da.zeros(shape, chunks=chunks)
    ds = xr.DataArray(data).to_dataset()
    ds.to_netcdf('test_dataset.nc')
    print("Success!")

create_and_store_dataset()
client = Client()
create_and_store_dataset()
```

The first call succeeds, while the second fails with `TypeError: can't pickle thread.lock objects`.

When using the distributed client, I can successfully call `.store` on the underlying dask array if I pass `lock=SerializableLock()`.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Switch to shared Lock (SerializableLock if possible) for reading/writing 197083082
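The `lock=SerializableLock()` workaround in the comment above works because dask's `SerializableLock` pickles only a token rather than the lock itself; a per-process registry maps tokens back to real `threading.Lock` objects after deserialization. A minimal illustrative sketch of that idea (not dask's actual implementation; `TokenLock` and its registry are hypothetical names):

```python
import pickle
import threading
import uuid

# A plain threading.Lock cannot be pickled at all:
try:
    pickle.dumps(threading.Lock())
    lock_is_picklable = True
except TypeError:
    lock_is_picklable = False


class TokenLock:
    """Sketch of a serializable lock: pickle only a token and look up
    the real lock in a per-process registry keyed by that token."""

    _registry = {}

    def __init__(self, token=None):
        self.token = token or uuid.uuid4().hex
        # Reuse the existing lock for this token, or create one.
        self._registry.setdefault(self.token, threading.Lock())

    @property
    def lock(self):
        return self._registry[self.token]

    def acquire(self, *args, **kwargs):
        return self.lock.acquire(*args, **kwargs)

    def release(self):
        return self.lock.release()

    def __enter__(self):
        self.lock.acquire()
        return self

    def __exit__(self, *exc):
        self.lock.release()

    def __reduce__(self):
        # Serialize just the token, never the underlying lock object.
        return (TokenLock, (self.token,))


original = TokenLock()
clone = pickle.loads(pickle.dumps(original))
# Within one process, copies carrying the same token share one real lock,
# so workers/threads that round-trip the lock still coordinate correctly.
assert clone.lock is original.lock
```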
268940513 https://github.com/pydata/xarray/pull/1179#issuecomment-268940513 https://api.github.com/repos/pydata/xarray/issues/1179 MDEyOklzc3VlQ29tbWVudDI2ODk0MDUxMw== shoyer 1217238 2016-12-23T04:51:52Z 2016-12-23T04:51:52Z MEMBER

@mrocklin We could update our dask-distributed integration tests to avoid `lock=False`, but that will need to wait until the next dask release for tests to pass on CI.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Switch to shared Lock (SerializableLock if possible) for reading/writing 197083082
268710453 https://github.com/pydata/xarray/pull/1179#issuecomment-268710453 https://api.github.com/repos/pydata/xarray/issues/1179 MDEyOklzc3VlQ29tbWVudDI2ODcxMDQ1Mw== mrocklin 306380 2016-12-22T03:35:56Z 2016-12-22T03:35:56Z MEMBER

Is there a clear fail case we can use as a test to demonstrate the value here?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Switch to shared Lock (SerializableLock if possible) for reading/writing 197083082


CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [issue] INTEGER REFERENCES [issues]([id])
);
CREATE INDEX [idx_issue_comments_issue]
    ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user]
    ON [issue_comments] ([user]);
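The schema above can be exercised directly with Python's built-in `sqlite3` module. A small sketch, using values taken from the first row shown above (SQLite does not enforce the `REFERENCES` clauses unless `PRAGMA foreign_keys = ON`, so the `[users]` and `[issues]` tables are not required here):

```python
import sqlite3

SCHEMA = """
CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [issue] INTEGER REFERENCES [issues]([id])
);
CREATE INDEX [idx_issue_comments_issue] ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user] ON [issue_comments] ([user]);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(SCHEMA)

conn.execute(
    "INSERT INTO issue_comments (id, [user], issue, updated_at, body) "
    "VALUES (?, ?, ?, ?, ?)",
    (270427441, 1217238, 197083082, "2017-01-04T17:12:58Z", "In it goes."),
)

# The query behind this page: comments on one issue, newest first.
rows = conn.execute(
    "SELECT id, body FROM issue_comments "
    "WHERE issue = ? ORDER BY updated_at DESC",
    (197083082,),
).fetchall()
```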
Powered by Datasette · Queries took 15.248ms · About: xarray-datasette