issue_comments
11 rows where author_association = "MEMBER" and issue = 618828102, sorted by updated_at descending
| id | html_url | issue_url | node_id | user | created_at | updated_at | author_association | body | reactions | performed_via_github_app | issue |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 633691103 | https://github.com/pydata/xarray/pull/4064#issuecomment-633691103 | https://api.github.com/repos/pydata/xarray/issues/4064 | MDEyOklzc3VlQ29tbWVudDYzMzY5MTEwMw== | dcherian 2448579 | 2020-05-25T19:20:52Z | 2020-05-25T19:20:52Z | MEMBER | Thanks @AndrewWilliams3142 | {"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 1, "rocket": 0, "eyes": 0} | | Auto chunk 618828102 |
| 632093104 | https://github.com/pydata/xarray/pull/4064#issuecomment-632093104 | https://api.github.com/repos/pydata/xarray/issues/4064 | MDEyOklzc3VlQ29tbWVudDYzMjA5MzEwNA== | keewis 14808389 | 2020-05-21T13:42:08Z | 2020-05-21T13:42:08Z | MEMBER | | {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | | Auto chunk 618828102 |
| 631604681 | https://github.com/pydata/xarray/pull/4064#issuecomment-631604681 | https://api.github.com/repos/pydata/xarray/issues/4064 | MDEyOklzc3VlQ29tbWVudDYzMTYwNDY4MQ== | dcherian 2448579 | 2020-05-20T17:05:41Z | 2020-05-20T17:05:41Z | MEMBER | Thanks @AndrewWilliams3142 . We should add a test for | {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | | Auto chunk 618828102 |
| 629355884 | https://github.com/pydata/xarray/pull/4064#issuecomment-629355884 | https://api.github.com/repos/pydata/xarray/issues/4064 | MDEyOklzc3VlQ29tbWVudDYyOTM1NTg4NA== | dcherian 2448579 | 2020-05-15T16:28:14Z | 2020-05-15T16:28:14Z | MEMBER | ah right. this is now rechunking chunked objects to a single chunk when | {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | | Auto chunk 618828102 |
| 629199017 | https://github.com/pydata/xarray/pull/4064#issuecomment-629199017 | https://api.github.com/repos/pydata/xarray/issues/4064 | MDEyOklzc3VlQ29tbWVudDYyOTE5OTAxNw== | keewis 14808389 | 2020-05-15T12:09:18Z | 2020-05-15T12:12:05Z | MEMBER | then you'll have to run For future reference, github does not work terribly well with rebases (or merges without a merge commit) so it would be good to avoid those. Edit: ignore the | {"total_count": 2, "+1": 2, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | | Auto chunk 618828102 |
| 629192230 | https://github.com/pydata/xarray/pull/4064#issuecomment-629192230 | https://api.github.com/repos/pydata/xarray/issues/4064 | MDEyOklzc3VlQ29tbWVudDYyOTE5MjIzMA== | keewis 14808389 | 2020-05-15T11:52:27Z | 2020-05-15T11:52:27Z | MEMBER | if you have any unpushed commits: could you push them now? While they would not be lost, it's way easier to handle them now than after the force-push | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | | Auto chunk 618828102 |
| 629189848 | https://github.com/pydata/xarray/pull/4064#issuecomment-629189848 | https://api.github.com/repos/pydata/xarray/issues/4064 | MDEyOklzc3VlQ29tbWVudDYyOTE4OTg0OA== | keewis 14808389 | 2020-05-15T11:46:09Z | 2020-05-15T11:46:09Z | MEMBER | there's something wrong with the merge. Are you able to resolve that by yourself, or should I fix it for you? | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | | Auto chunk 618828102 |
| 629171237 | https://github.com/pydata/xarray/pull/4064#issuecomment-629171237 | https://api.github.com/repos/pydata/xarray/issues/4064 | MDEyOklzc3VlQ29tbWVudDYyOTE3MTIzNw== | keewis 14808389 | 2020-05-15T10:57:29Z | 2020-05-15T11:01:53Z | MEMBER | no, the If you need more explanations on Edit: the book even has a section on Github | {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | | Auto chunk 618828102 |
| 629162166 | https://github.com/pydata/xarray/pull/4064#issuecomment-629162166 | https://api.github.com/repos/pydata/xarray/issues/4064 | MDEyOklzc3VlQ29tbWVudDYyOTE2MjE2Ng== | keewis 14808389 | 2020-05-15T10:33:56Z | 2020-05-15T10:33:56Z | MEMBER | to merge | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | | Auto chunk 618828102 |
| 629149680 | https://github.com/pydata/xarray/pull/4064#issuecomment-629149680 | https://api.github.com/repos/pydata/xarray/issues/4064 | MDEyOklzc3VlQ29tbWVudDYyOTE0OTY4MA== | keewis 14808389 | 2020-05-15T10:04:58Z | 2020-05-15T10:09:13Z | MEMBER | no need to rebase, simply merge Edit: the failing | {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | | Auto chunk 618828102 |
| 629149164 | https://github.com/pydata/xarray/pull/4064#issuecomment-629149164 | https://api.github.com/repos/pydata/xarray/issues/4064 | MDEyOklzc3VlQ29tbWVudDYyOTE0OTE2NA== | kmuehlbauer 5821660 | 2020-05-15T10:03:42Z | 2020-05-15T10:03:42Z | MEMBER | @AndrewWilliams3142 This is due to flake8 applying new changes from pycodestyle. xarray-devs already dived into this, see https://github.com/pydata/xarray/pull/4057. You might just need to rebase with current master, to make the error go away. | {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | | Auto chunk 618828102 |
Table schema:
CREATE TABLE [issue_comments] (
[html_url] TEXT,
[issue_url] TEXT,
[id] INTEGER PRIMARY KEY,
[node_id] TEXT,
[user] INTEGER REFERENCES [users]([id]),
[created_at] TEXT,
[updated_at] TEXT,
[author_association] TEXT,
[body] TEXT,
[reactions] TEXT,
[performed_via_github_app] TEXT,
[issue] INTEGER REFERENCES [issues]([id])
);
CREATE INDEX [idx_issue_comments_issue]
ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user]
ON [issue_comments] ([user]);
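For reference, the listing above corresponds to a query along these lines against the schema shown. This is a sketch inferred from the page description rather than the exact query the page runs, and the json_extract call assumes SQLite's JSON1 functions are available for reading the reactions column, which is stored as JSON text:

-- Member comments on the "Auto chunk" issue, newest update first
SELECT
    id,
    [user],
    created_at,
    updated_at,
    body,
    json_extract(reactions, '$.total_count') AS reaction_count
FROM issue_comments
WHERE author_association = 'MEMBER'
  AND issue = 618828102   -- equality filter served by idx_issue_comments_issue
ORDER BY updated_at DESC;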