issue_comments

2 rows where issue = 144957100 sorted by updated_at descending

Comment 204161956
  html_url: https://github.com/pydata/xarray/issues/813#issuecomment-204161956
  issue_url: https://api.github.com/repos/pydata/xarray/issues/813
  node_id: MDEyOklzc3VlQ29tbWVudDIwNDE2MTk1Ng==
  user: shoyer (1217238)
  created_at: 2016-03-31T22:46:57Z
  updated_at: 2016-03-31T22:47:31Z
  author_association: MEMBER

body:

This looks like another dask.array bug (because it only turns up before data is loaded). CC @mrocklin @jcrist

xarray's squeeze just uses array indexing under the covers (e.g., x.squeeze() -> x.data[..., 0, ...] where you should replace ... by an appropriate number of slices :). So the usual strategy would be to verify that squeezing one of these variables in dask gives the same error message (e.g., ds['something'].data[..., 0, ...].compute()), and then construct a failing test case for dask that doesn't rely on xarray by making an array with synthetic data with the same dtype and chunks.

reactions:
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
issue: Load fails following squeeze (144957100)
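
The strategy described in the comment above can be sketched in a few lines. This is only an illustration under assumptions not stated in the issue: the file name, the chunking, the variable picked, and which dimension has length 1 are all placeholders.

import xarray as xr

# Open the dataset with dask-backed variables (path and chunking are placeholders).
ds = xr.open_dataset("some_output.nc", chunks={"Time": 1})

# xarray's squeeze of a length-1 leading dimension boils down to plain dask
# indexing on the underlying array, roughly x.data[0, ...].
var = ds["lat"]              # any dask-backed variable suspected of triggering the bug
squeezed = var.data[0, ...]  # pure dask.array indexing, no xarray involved

# If the bug lives in dask.array, computing this directly should raise the
# same error as ds.squeeze().load().
squeezed.compute()
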
Comment 204021074
  html_url: https://github.com/pydata/xarray/issues/813#issuecomment-204021074
  issue_url: https://api.github.com/repos/pydata/xarray/issues/813
  node_id: MDEyOklzc3VlQ29tbWVudDIwNDAyMTA3NA==
  user: pwolfram (4295853)
  created_at: 2016-03-31T17:00:00Z
  updated_at: 2016-03-31T17:00:00Z
  author_association: CONTRIBUTOR

body:

For reference, the data in acase.isel(Nb=layernum).sel(Np=np.where(idx)[1]) is

<xarray.Dataset>
Dimensions:        (Np: 1449, Nr: 1, Nt-1: 27, Time: 28)
Coordinates:
    yearoffset     |S4 '1700'
    Nb             float64 1.029e+03
    rlzn           (Nr) int64 0
    time           (Time) datetime64[ns] 1724-01-01 1724-01-02 1724-01-03 ...
  * Np             (Np) int64 18295 18296 18297 18298 18299 18300 18301 ...
  * Nr             (Nr) int64 0
  * Nt-1           (Nt-1) int64 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 ...
  * Time           (Time) int64 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 ...
Data variables:
    lat            (Nr, Time, Np) float64 4.027e+05 4.027e+05 4.027e+05 ...
    notoutcropped  (Nr, Time, Np) int64 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 ...
    lon            (Nr, Time, Np) float64 4.775e+05 4.825e+05 4.875e+05 ...
    dtdays         (Nr, Nt-1) float64 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 ...

reactions:
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
issue: Load fails following squeeze (144957100)
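
The second half of that strategy, building a dask-only reproduction from synthetic data, might look like the following. The shape and dtype mirror lat/lon in the repr above; the chunk sizes are a guess, since the issue does not say how the on-disk data was chunked.

import dask.array as da

# Synthetic stand-in for lat/lon: shape (Nr=1, Time=28, Np=1449), float64.
# Chunk sizes here are assumptions, not taken from the issue.
x = da.zeros((1, 28, 1449), dtype="float64", chunks=(1, 1, 1449))

# Squeeze out the length-1 Nr dimension the way xarray would, then force computation.
y = x[0, ...]
print(y.compute().shape)   # expected (28, 1449); a dask.array bug would surface here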

CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [issue] INTEGER REFERENCES [issues]([id])
);
CREATE INDEX [idx_issue_comments_issue]
    ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user]
    ON [issue_comments] ([user]);
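
For reference, the query behind this page ("2 rows where issue = 144957100 sorted by updated_at descending") can be run directly against the underlying SQLite database. The file name github.db is an assumption about how this Datasette instance is backed, not something shown on the page.

import sqlite3

# Connect to the SQLite file backing the Datasette instance (name is assumed).
conn = sqlite3.connect("github.db")

rows = conn.execute(
    """
    select id, user, created_at, updated_at, author_association, body
    from issue_comments
    where issue = 144957100
    order by updated_at desc
    """
).fetchall()

for comment_id, user, created_at, updated_at, association, body in rows:
    print(comment_id, updated_at, association)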