
issue_comments


7 rows where author_association = "CONTRIBUTOR" and user = 6875882 sorted by updated_at descending


issue 4

  • Use deepcopy recursively on numpy arrays 4
  • Surprising deepcopy semantics with dtype='object' 1
  • Extracting `formatting_html` as a standalone library? 1
  • Coord name not set when `concat`ing along a DataArray 1

user 1

  • darikg · 7

author_association 1

  • CONTRIBUTOR · 7
id html_url issue_url node_id user created_at updated_at ▲ author_association body reactions performed_via_github_app issue
830685713 https://github.com/pydata/xarray/issues/5240#issuecomment-830685713 https://api.github.com/repos/pydata/xarray/issues/5240 MDEyOklzc3VlQ29tbWVudDgzMDY4NTcxMw== darikg 6875882 2021-05-01T19:55:01Z 2021-05-01T19:55:01Z CONTRIBUTOR

Thanks @keewis! I was confusing dimension names and variable names. I would support either raising or falling back to a reasonably sane default -- the reason I stumbled on this was that a None coord name was breaking the `_repr_html_` in Jupyter and causing a much more confusing error message.
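A minimal sketch of the behavior being discussed (a hedged illustration, not the issue's exact reproducer; the array values and the `y` dimension name are made up, and recent xarray semantics are assumed):

```python
import xarray as xr

a = xr.DataArray([1, 2], dims="x")
b = xr.DataArray([3, 4], dims="x")

# When the DataArray passed as `dim` has a name, that name becomes the
# new dimension and its values become the coordinate along it.
dim = xr.DataArray([10, 20], dims="y", name="y")
combined = xr.concat([a, b], dim=dim)
# combined has dims ("y", "x") and a "y" coordinate of [10, 20]
```

The issue arose when the `dim` DataArray had no name: the resulting coordinate's name ended up as None, which is what later broke the HTML repr in Jupyter.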

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Coord name not set when `concat`ing along a DataArray 873713013
797099807 https://github.com/pydata/xarray/issues/5022#issuecomment-797099807 https://api.github.com/repos/pydata/xarray/issues/5022 MDEyOklzc3VlQ29tbWVudDc5NzA5OTgwNw== darikg 6875882 2021-03-11T22:45:13Z 2021-03-11T22:45:13Z CONTRIBUTOR

Thank you both! I'll report back if I get anywhere with it

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Extracting `formatting_html` as a standalone library? 828805728
682038763 https://github.com/pydata/xarray/pull/4379#issuecomment-682038763 https://api.github.com/repos/pydata/xarray/issues/4379 MDEyOklzc3VlQ29tbWVudDY4MjAzODc2Mw== darikg 6875882 2020-08-27T15:58:49Z 2020-08-27T15:58:49Z CONTRIBUTOR

Thanks for everybody's help!

Also just wanted to say, you guys have an incredible test suite.

{
    "total_count": 1,
    "+1": 0,
    "-1": 0,
    "laugh": 1,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Use deepcopy recursively on numpy arrays 686495257
681999585 https://github.com/pydata/xarray/pull/4379#issuecomment-681999585 https://api.github.com/repos/pydata/xarray/issues/4379 MDEyOklzc3VlQ29tbWVudDY4MTk5OTU4NQ== darikg 6875882 2020-08-27T14:52:27Z 2020-08-27T14:52:27Z CONTRIBUTOR

I went with @keewis's last suggestion because I think it makes the most sense for deepcopy to behave as expected even with older versions of numpy

{
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Use deepcopy recursively on numpy arrays 686495257
681097210 https://github.com/pydata/xarray/pull/4379#issuecomment-681097210 https://api.github.com/repos/pydata/xarray/issues/4379 MDEyOklzc3VlQ29tbWVudDY4MTA5NzIxMA== darikg 6875882 2020-08-26T20:08:43Z 2020-08-26T20:08:43Z CONTRIBUTOR

Oh, sorry, I actually can reproduce it locally.

Turns out arrays in numpy 1.15 don't have `__array_function__`. So this:

```python
if deep:
    if hasattr(data, "__array_function__") or isinstance(
        data, dask_array_type
    ):
        data = copy.deepcopy(data)
    elif not isinstance(data, PandasIndexAdapter):
        # pandas.Index is immutable
        data = np.array(data)
```

could just be

```python
if deep and not isinstance(data, PandasIndexAdapter):
    data = copy.deepcopy(data)
```

But I'm not familiar with `__array_function__` or why it's being used here -- any thoughts?
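For context, a small self-contained illustration of why recursing with `copy.deepcopy` matters for object-dtype arrays, where `ndarray.copy` only copies the references (the `Point` class is hypothetical, chosen just to have a mutable element type):

```python
import copy
import numpy as np

class Point:  # hypothetical mutable element type
    def __init__(self, x):
        self.x = x

arr = np.array([Point(1), Point(2)], dtype=object)

shallow = arr.copy()       # new array, but elements are shared references
deep = copy.deepcopy(arr)  # recurses into the elements as well

assert shallow[0] is arr[0]   # shallow copy shares the Point objects
assert deep[0] is not arr[0]  # deep copy made independent Point objects
assert deep[0].x == arr[0].x  # with equal contents
```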

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Use deepcopy recursively on numpy arrays 686495257
681041552 https://github.com/pydata/xarray/pull/4379#issuecomment-681041552 https://api.github.com/repos/pydata/xarray/issues/4379 MDEyOklzc3VlQ29tbWVudDY4MTA0MTU1Mg== darikg 6875882 2020-08-26T18:13:32Z 2020-08-26T18:13:32Z CONTRIBUTOR

Weird, I can't reproduce that locally, even after reverting to numpy 1.15. Trying with a non-object class

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Use deepcopy recursively on numpy arrays 686495257
680363901 https://github.com/pydata/xarray/issues/4362#issuecomment-680363901 https://api.github.com/repos/pydata/xarray/issues/4362 MDEyOklzc3VlQ29tbWVudDY4MDM2MzkwMQ== darikg 6875882 2020-08-26T00:35:22Z 2020-08-26T00:35:22Z CONTRIBUTOR

Sure! I'll be back in a couple days when I get a chance

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Surprising deepcopy semantics with dtype='object' 683649612


CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [issue] INTEGER REFERENCES [issues]([id])
);
CREATE INDEX [idx_issue_comments_issue]
    ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user]
    ON [issue_comments] ([user]);
Powered by Datasette · About: xarray-datasette