issue_comments

5 rows where author_association = "MEMBER", issue = 252543868 and user = 6213168, sorted by updated_at descending


All 5 rows are comments by crusaderky (MEMBER) on the issue "Dataset.__repr__ computes dask variables".
Comment 325424275 · crusaderky (6213168) · MEMBER · created 2017-08-28T17:44:32Z · updated 2017-08-28T17:44:32Z
https://github.com/pydata/xarray/issues/1522#issuecomment-325424275

Travis is failing in a few environments, but I just tested that I get the exact same errors in the master branch

Comment 325415311 · crusaderky (6213168) · MEMBER · created 2017-08-28T17:12:51Z · updated 2017-08-28T17:17:09Z
https://github.com/pydata/xarray/issues/1522#issuecomment-325415311

Given this:

```
def kernel():
    print("Kernel invoked!")
    return numpy.array([100, 200])

data = dask.array.Array(name='foo', dask={('foo', 0): (kernel, )}, chunks=((2,),), dtype=float)
```

This correctly computes the coord once:

```
ds = xarray.Dataset(coords={'z': ('z', data)})
```

While this computes it twice:

```
ds = xarray.Dataset()
ds.coords['z'] = ('z', data)
```
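The "computes it twice" claim can be checked without dask at all. In this sketch, `LazyArray` and its `compute()` are hypothetical stand-ins for the dask graph above: counting kernel invocations makes a redundant computation show up as a call count greater than one.

```python
# Hypothetical stand-in for the dask array above: LazyArray counts how
# many times its kernel actually runs, so a redundant computation shows
# up as call_count > 1.
class LazyArray:
    def __init__(self, kernel):
        self.kernel = kernel
        self.call_count = 0

    def compute(self):
        self.call_count += 1
        return self.kernel()

lazy = LazyArray(lambda: [100, 200])
lazy.compute()  # first consumer, e.g. the Dataset(coords=...) path
lazy.compute()  # redundant second consumer, e.g. the setitem path
print(lazy.call_count)  # 2
```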

Comment 325373215 · crusaderky (6213168) · MEMBER · created 2017-08-28T14:42:33Z · updated 2017-08-28T14:48:12Z
https://github.com/pydata/xarray/issues/1522#issuecomment-325373215

This is in Jupyter:

```
def kernel():
    print("Kernel invoked!")
    return numpy.array([100, 200])

data = dask.array.Array(name='foo', dask={('foo', 0): (kernel, )}, chunks=((2,),), dtype=float)

ds = xarray.Dataset(
    data_vars={
        'foo': xarray.DataArray(data, dims=['x'], coords={'x': [10, 20]}),
        'bar': xarray.DataArray([1, 2], dims=['x'], coords={'x': [10, 20]}),
    })
ds.coords['y'] = ('y', [3, 4])
ds.coords['x2'] = ('x', data)
ds
```

Output:

```
Kernel invoked!
... (printed 32 times in total)
Out[2]:
<xarray.Dataset>
Dimensions:  (x: 2, y: 2)
Coordinates:
  * x        (x) int64 10 20
  * y        (y) int64 3 4
    x2       (x) float64 100 200
Data variables:
    foo      (x) float64 100 200
    bar      (x) int64 1 2
```

YIKES! Here you have two separate issues: one is with repr(), as expected. The other is that Jupyter invokes getattr() on a plethora of _ipython_* attributes that xarray doesn't define, and all non-index coords were blindly being converted to IndexVariable (which loads the data) every. single. time.

Comment 325374125 · crusaderky (6213168) · MEMBER · created 2017-08-28T14:45:45Z · updated 2017-08-28T14:45:45Z
https://github.com/pydata/xarray/issues/1522#issuecomment-325374125

There is a separate problem where index coords are computed twice. I didn't fix it yet, and I am afraid of a domino effect. The problem is in merge.py:merge_coords():

```
_assert_compat_valid(compat)
coerced = coerce_pandas_values(objs)
aligned = deep_align(coerced, join=join, copy=False, indexes=indexes)
expanded = expand_variable_dicts(aligned)
priority_vars = _get_priority_vars(aligned, priority_arg, compat=compat)
variables = merge_variables(expanded, priority_vars, compat=compat)
assert_unique_multiindex_level_names(variables)
```

Here, both expand_variable_dicts() and _get_priority_vars() compute the dask array.
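A stripped-down illustration of that double computation (all names here are hypothetical; the two functions stand in for expand_variable_dicts() and _get_priority_vars(), each materializing the same lazy input independently instead of sharing one computed result):

```python
# Hypothetical reduction of the merge_coords() problem: two independent
# passes over the same inputs, each materializing the lazy value itself.
calls = []

def kernel():
    calls.append(1)
    return [100, 200]

def expand_pass(objs):    # stand-in for expand_variable_dicts()
    return [obj() for obj in objs]

def priority_pass(objs):  # stand-in for _get_priority_vars()
    return [obj() for obj in objs]

expand_pass([kernel])
priority_pass([kernel])
print(len(calls))  # 2: the kernel ran once per pass
```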

Comment 325289130 · crusaderky (6213168) · MEMBER · created 2017-08-28T08:10:02Z · updated 2017-08-28T08:10:02Z
https://github.com/pydata/xarray/issues/1522#issuecomment-325289130

working on this now


Table schema:

```
CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [issue] INTEGER REFERENCES [issues]([id])
);
CREATE INDEX [idx_issue_comments_issue]
    ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user]
    ON [issue_comments] ([user]);
```
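The filter shown at the top of the page can be reproduced as plain SQL against this schema. A minimal sqlite3 sketch (in-memory database; the REFERENCES clauses are dropped so the snippet is self-contained, and it is seeded with the one short comment row from the table above):

```python
import sqlite3

# In-memory copy of the issue_comments schema, minus foreign keys,
# seeded with one row from the table above.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE issue_comments (
   html_url TEXT, issue_url TEXT, id INTEGER PRIMARY KEY, node_id TEXT,
   user INTEGER, created_at TEXT, updated_at TEXT, author_association TEXT,
   body TEXT, reactions TEXT, performed_via_github_app TEXT, issue INTEGER
);
CREATE INDEX idx_issue_comments_issue ON issue_comments (issue);
CREATE INDEX idx_issue_comments_user ON issue_comments (user);
""")
conn.execute(
    "INSERT INTO issue_comments (id, user, issue, author_association, body,"
    " created_at, updated_at) VALUES (?, ?, ?, ?, ?, ?, ?)",
    (325289130, 6213168, 252543868, "MEMBER", "working on this now",
     "2017-08-28T08:10:02Z", "2017-08-28T08:10:02Z"),
)

# The page's query: MEMBER comments by this user on this issue,
# newest update first.
rows = conn.execute(
    "SELECT id, body FROM issue_comments"
    " WHERE author_association = 'MEMBER' AND issue = ? AND user = ?"
    " ORDER BY updated_at DESC",
    (252543868, 6213168),
).fetchall()
print(rows)  # [(325289130, 'working on this now')]
```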
Powered by Datasette · About: xarray-datasette