issues
5 rows where type = "pull" and user = 3404817, sorted by updated_at descending
| id | node_id | number | title | user | state | locked | assignee | milestone | comments | created_at | updated_at | closed_at | author_association | active_lock_reason | draft | pull_request | body | reactions | performed_via_github_app | state_reason | repo | type |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 302447879 | MDExOlB1bGxSZXF1ZXN0MTcyOTc1OTY4 | 1965 | avoid integer overflow when decoding large time numbers | j08lue 3404817 | closed | 0 | | | 6 | 2018-03-05T20:21:20Z | 2018-05-01T12:41:28Z | 2018-05-01T12:41:28Z | CONTRIBUTOR | | 0 | pydata/xarray/pulls/1965 | The issue: This is in way the back side of #1859: By ensuring that e.g. '2001-01-01' in | {"url": "https://api.github.com/repos/pydata/xarray/issues/1965/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | | | xarray 13221727 | pull |
| 319132629 | MDExOlB1bGxSZXF1ZXN0MTg1MTMxMjg0 | 2096 | avoid integer overflow when decoding large time numbers | j08lue 3404817 | closed | 0 | | | 3 | 2018-05-01T07:02:24Z | 2018-05-01T12:41:13Z | 2018-05-01T12:41:08Z | CONTRIBUTOR | | 0 | pydata/xarray/pulls/2096 | | {"url": "https://api.github.com/repos/pydata/xarray/issues/2096/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | | | xarray 13221727 | pull |
| 292231408 | MDExOlB1bGxSZXF1ZXN0MTY1NTgzMTIw | 1863 | test decoding num_dates in float types | j08lue 3404817 | closed | 0 | | | 4 | 2018-01-28T19:34:52Z | 2018-02-10T12:16:26Z | 2018-02-02T02:01:47Z | CONTRIBUTOR | | 0 | pydata/xarray/pulls/1863 | | {"url": "https://api.github.com/repos/pydata/xarray/issues/1863/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | | | xarray 13221727 | pull |
| 197514417 | MDExOlB1bGxSZXF1ZXN0OTkzMjkwNjc= | 1184 | Add test for issue 1140 | j08lue 3404817 | closed | 0 | | | 2 | 2016-12-25T20:37:16Z | 2017-03-30T23:08:40Z | 2017-03-30T23:08:40Z | CONTRIBUTOR | | 0 | pydata/xarray/pulls/1184 | 1140 | {"url": "https://api.github.com/repos/pydata/xarray/issues/1184/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | | | xarray 13221727 | pull |
| 190690822 | MDExOlB1bGxSZXF1ZXN0OTQ1OTgxMzI= | 1133 | use safe_cast_to_index to sanitize DataArrays for groupby | j08lue 3404817 | closed | 0 | | | 2 | 2016-11-21T11:33:08Z | 2016-12-19T17:12:31Z | 2016-12-19T17:11:57Z | CONTRIBUTOR | | 0 | pydata/xarray/pulls/1133 | Fixes https://github.com/pydata/xarray/issues/1132 Let me know whether this is a valid bug fix or I am misunderstanding something. | {"url": "https://api.github.com/repos/pydata/xarray/issues/1133/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | | | xarray 13221727 | pull |
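The reactions column stores each issue's GitHub reactions rollup as a JSON string. A minimal sketch of decoding one of these cells with Python's standard library, using a sample value copied from the rows above:

```python
import json

# A reactions cell as stored in the [reactions] TEXT column
reactions_json = (
    '{"url": "https://api.github.com/repos/pydata/xarray/issues/1965/reactions", '
    '"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, '
    '"confused": 0, "heart": 0, "rocket": 0, "eyes": 0}'
)

reactions = json.loads(reactions_json)
print(reactions["total_count"])  # → 0, total reactions on the issue
print(reactions["+1"])           # → 0, thumbs-up count
```

Keys like `+1` and `-1` are not valid Python identifiers, so dictionary indexing (rather than attribute access on a parsed object) is the natural way to read them.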
CREATE TABLE [issues] (
[id] INTEGER PRIMARY KEY,
[node_id] TEXT,
[number] INTEGER,
[title] TEXT,
[user] INTEGER REFERENCES [users]([id]),
[state] TEXT,
[locked] INTEGER,
[assignee] INTEGER REFERENCES [users]([id]),
[milestone] INTEGER REFERENCES [milestones]([id]),
[comments] INTEGER,
[created_at] TEXT,
[updated_at] TEXT,
[closed_at] TEXT,
[author_association] TEXT,
[active_lock_reason] TEXT,
[draft] INTEGER,
[pull_request] TEXT,
[body] TEXT,
[reactions] TEXT,
[performed_via_github_app] TEXT,
[state_reason] TEXT,
[repo] INTEGER REFERENCES [repos]([id]),
[type] TEXT
);
CREATE INDEX [idx_issues_repo]
ON [issues] ([repo]);
CREATE INDEX [idx_issues_milestone]
ON [issues] ([milestone]);
CREATE INDEX [idx_issues_assignee]
ON [issues] ([assignee]);
CREATE INDEX [idx_issues_user]
ON [issues] ([user]);
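The listing at the top of the page (5 rows where type = "pull" and user = 3404817, sorted by updated_at descending) corresponds to a plain SELECT against this schema. A minimal sqlite3 sketch, using an in-memory database, a trimmed version of the schema above, and one sample row from the table:

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Trimmed-down version of the [issues] schema: only the columns
# the query below touches
conn.execute(
    """
    CREATE TABLE issues (
        [id] INTEGER PRIMARY KEY,
        [number] INTEGER,
        [title] TEXT,
        [user] INTEGER,
        [updated_at] TEXT,
        [type] TEXT
    )
    """
)

# One sample row taken from the listing above
conn.execute(
    "INSERT INTO issues VALUES (?, ?, ?, ?, ?, ?)",
    (
        302447879,
        1965,
        "avoid integer overflow when decoding large time numbers",
        3404817,
        "2018-05-01T12:41:28Z",
        "pull",
    ),
)

# The query behind the listing: pull requests by user 3404817,
# newest update first, capped at five rows
rows = conn.execute(
    "SELECT [id], [number], [title] FROM issues "
    "WHERE [type] = 'pull' AND [user] = 3404817 "
    "ORDER BY [updated_at] DESC LIMIT 5"
).fetchall()
print(rows)
```

ISO 8601 timestamps such as `2018-05-01T12:41:28Z` sort chronologically as plain strings, which is why `ORDER BY [updated_at] DESC` works here even though the column is declared TEXT.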