issue_comments

9 rows where issue = 521887948 and user = 5635139 sorted by updated_at descending
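
The SQL behind this view is roughly the following; this is a sketch against the issue_comments schema reproduced at the end of this page, so the exact statement Datasette generates may differ slightly:

```sql
select *
from issue_comments
where issue = 521887948
  and "user" = 5635139
order by updated_at desc;
```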

Columns: id, html_url, issue_url, node_id, user, created_at, updated_at (sort key, descending), author_association, body, reactions, performed_via_github_app, issue
555802108 https://github.com/pydata/xarray/pull/3516#issuecomment-555802108 https://api.github.com/repos/pydata/xarray/issues/3516 MDEyOklzc3VlQ29tbWVudDU1NTgwMjEwOA== max-sixty 5635139 2019-11-20T02:07:29Z 2019-11-20T02:07:29Z MEMBER

Done! https://twitter.com/xarray_dev/status/1196963951340392448?s=20

(And in case others see this: we're going to be calling out a few others who have made big contributions recently too; we're really excited and appreciative of those who've recently made such prolific contributions)

{
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Silence sphinx warnings 521887948
555776225 https://github.com/pydata/xarray/pull/3516#issuecomment-555776225 https://api.github.com/repos/pydata/xarray/issues/3516 MDEyOklzc3VlQ29tbWVudDU1NTc3NjIyNQ== max-sixty 5635139 2019-11-20T00:17:06Z 2019-11-20T00:17:06Z MEMBER

@keewis do you have a twitter username and first name? We're going to start calling out big new contributors on our twitter and release notes. (@keewis is also OK if you prefer!)

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Silence sphinx warnings 521887948
555336127 https://github.com/pydata/xarray/pull/3516#issuecomment-555336127 https://api.github.com/repos/pydata/xarray/issues/3516 MDEyOklzc3VlQ29tbWVudDU1NTMzNjEyNw== max-sixty 5635139 2019-11-19T05:10:38Z 2019-11-19T05:10:38Z MEMBER

It would be awesome to be able to enable that @keewis, so we don't regress in future. We can merge this for now and come back to that?

I may have some time later this week to look more directly.

Great work, again!

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Silence sphinx warnings 521887948
555074077 https://github.com/pydata/xarray/pull/3516#issuecomment-555074077 https://api.github.com/repos/pydata/xarray/issues/3516 MDEyOklzc3VlQ29tbWVudDU1NTA3NDA3Nw== max-sixty 5635139 2019-11-18T15:45:52Z 2019-11-18T15:45:52Z MEMBER

Great! That's cool we can start checking warnings.

I think wrapping the numpy docstrings seems pretty reasonable; what are your concerns? Any thoughts @dcherian?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Silence sphinx warnings 521887948
554790762 https://github.com/pydata/xarray/pull/3516#issuecomment-554790762 https://api.github.com/repos/pydata/xarray/issues/3516 MDEyOklzc3VlQ29tbWVudDU1NDc5MDc2Mg== max-sixty 5635139 2019-11-17T21:44:50Z 2019-11-17T21:44:50Z MEMBER

> would it be reasonable to always require a format of Name <https://github.com/user>? The warnings regarding these were due to inconsistencies with the link location (mostly a trailing slash or using http instead of https).

For sure. How's your regex-fu for find-replacing? I can try and have a go otherwise.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Silence sphinx warnings 521887948
554642708 https://github.com/pydata/xarray/pull/3516#issuecomment-554642708 https://api.github.com/repos/pydata/xarray/issues/3516 MDEyOklzc3VlQ29tbWVudDU1NDY0MjcwOA== max-sixty 5635139 2019-11-16T14:33:13Z 2019-11-16T14:33:13Z MEMBER

Yes good idea @keewis

Where do you see the excessive memory consumption? When building locally?

I see the tests failing from:

```
Sphinx parallel build error:
RuntimeError: Non Expected warning in `/home/vsts/work/1/s/doc/plotting.rst` line 570

[error]The operation was canceled.
```

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Silence sphinx warnings 521887948
554594058 https://github.com/pydata/xarray/pull/3516#issuecomment-554594058 https://api.github.com/repos/pydata/xarray/issues/3516 MDEyOklzc3VlQ29tbWVudDU1NDU5NDA1OA== max-sixty 5635139 2019-11-16T02:12:55Z 2019-11-16T02:12:55Z MEMBER

> There are also a lot of those that reference xray but apart from that would work fine.

I think it'd be good to do a find/replace on those, if that solves our issues?

Or are there lots that remain despite that?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Silence sphinx warnings 521887948
553621149 https://github.com/pydata/xarray/pull/3516#issuecomment-553621149 https://api.github.com/repos/pydata/xarray/issues/3516 MDEyOklzc3VlQ29tbWVudDU1MzYyMTE0OQ== max-sixty 5635139 2019-11-13T21:52:52Z 2019-11-13T21:52:52Z MEMBER

> I took a shot at using :orphan:, but doc/README.rst seems like it should be excluded from the documentation?

Is doc/README.rst some special page we have to have? It seems to have no content value at the moment? Could we remove?

> despite the documentation claiming otherwise (and indeed, DataArrayGroupBy explicitly implements quantile while DatasetGroupBy doesn't). Does this deserve its own issue?

Definitely, good spot

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Silence sphinx warnings 521887948
553222315 https://github.com/pydata/xarray/pull/3516#issuecomment-553222315 https://api.github.com/repos/pydata/xarray/issues/3516 MDEyOklzc3VlQ29tbWVudDU1MzIyMjMxNQ== max-sixty 5635139 2019-11-13T03:28:14Z 2019-11-13T03:28:14Z MEMBER

Awesome, thanks @keewis!!

I think the changes in https://github.com/pydata/xarray/commit/5d9d263e40e1f67910cbefaf96d46a91c560b8b5 are good (and having one standard is good for long-term readability).

For old whatsnew entries, I think it's fine that the references have decayed if we can silence the warnings (maybe the easiest way is your suggestion to just remove the links, though).

Should we put an issue in for the numpy docstrings?

As a follow-up, is there a way to add a check in CI so these don't stack up again as people make changes? Or maybe difficult without stamping them all out?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Silence sphinx warnings 521887948


CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [issue] INTEGER REFERENCES [issues]([id])
);
CREATE INDEX [idx_issue_comments_issue]
    ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user]
    ON [issue_comments] ([user]);
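
The reactions column stores a JSON string in a TEXT field, so it can be queried in place with SQLite's JSON functions. A hypothetical example, assuming the JSON1 extension is available in the SQLite build (it usually is for Datasette):

```sql
select
    id,
    updated_at,
    json_extract(reactions, '$.total_count') as total_reactions
from issue_comments
where issue = 521887948
order by total_reactions desc;
```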