pull_requests
5 rows where user = 3404817
| id | node_id | number | state | locked | title | user | body | created_at | updated_at | closed_at | merged_at | merge_commit_sha | assignee | milestone | draft | head | base | author_association | auto_merge | repo | url | merged_by |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 94598132 | MDExOlB1bGxSZXF1ZXN0OTQ1OTgxMzI= | 1133 | closed | 0 | use safe_cast_to_index to sanitize DataArrays for groupby | j08lue 3404817 | Fixes https://github.com/pydata/xarray/issues/1132 Let me know whether this is a valid bug fix or I am misunderstanding something. | 2016-11-21T11:33:08Z | 2016-12-19T17:12:31Z | 2016-12-19T17:11:57Z | 2016-12-19T17:11:57Z | 34fd2b6cb94dfb824c5371c37b6eb5e70a88260f | | | 0 | 1270b7657594daf6e87ff53a792d26e98f7eb08a | 260674d19a2c04a6b5f2c1d7369eb2eef26926b0 | CONTRIBUTOR | | xarray 13221727 | https://github.com/pydata/xarray/pull/1133 | |
| 99329067 | MDExOlB1bGxSZXF1ZXN0OTkzMjkwNjc= | 1184 | closed | 0 | Add test for issue 1140 | j08lue 3404817 | #1140 | 2016-12-25T20:37:16Z | 2017-03-30T23:08:40Z | 2017-03-30T23:08:40Z | | 346779b42a4807d34a5314eb4bf362faac0627a9 | | | 0 | ebc7702edaf4c4e4212d763021d181483c286953 | 88cc396f5117c09c76e15d6383aeca32b4d4a8bd | CONTRIBUTOR | | xarray 13221727 | https://github.com/pydata/xarray/pull/1184 | |
| 165583120 | MDExOlB1bGxSZXF1ZXN0MTY1NTgzMTIw | 1863 | closed | 0 | test decoding num_dates in float types | j08lue 3404817 | - [x] Closes #1859 - [x] Tests added (for all bug fixes or enhancements) - [x] Tests passed (for all non-documentation changes) - [x] try to find the origin of the difference in behaviour between `v0.10.0` and current base | 2018-01-28T19:34:52Z | 2018-02-10T12:16:26Z | 2018-02-02T02:01:47Z | 2018-02-02T02:01:47Z | becd77c44d9436a142db4a98de2b30d388ee6d2b | | | 0 | d1659b66d4b93ea4cf67281c761fb1dd7cf315b5 | 015daca45bd7be32377bdf429c02117d5955452c | CONTRIBUTOR | | xarray 13221727 | https://github.com/pydata/xarray/pull/1863 | |
| 172975968 | MDExOlB1bGxSZXF1ZXN0MTcyOTc1OTY4 | 1965 | closed | 0 | avoid integer overflow when decoding large time numbers | j08lue 3404817 | The issue: `int32` time data in seconds or so leads to an overflow in time decoding. This is in way the back side of #1859: By ensuring that `_NS_PER_TIME_DELTA` is integer, we got rid of round-off errors that were due to casting to float but now we are getting `int` overflow in this line: https://github.com/pydata/xarray/blob/0e73e240107caee3ffd1a1149f0150c390d43251/xarray/coding/times.py#L169-L170 e.g. '2001-01-01' in `seconds since 1970-01-01` means `np.array([978307200]) * int(1e9)` which gives `288686080` that gets decoded to '1970-01-01T00:00:00.288686080' -- note also the trailing digits. Something is very wrong here. - [x] Tests added (for all bug fixes or enhancements) - [x] Tests passed (for all non-documentation changes) - [x] Fully documented, including `whats-new.rst` for all changes | 2018-03-05T20:21:20Z | 2018-05-01T12:41:28Z | 2018-05-01T12:41:28Z | | f63ee22022040a325e2965d0fd704454c7f44c78 | | | 0 | 9f852780219088f2d3efdecf4ec787e00fcf334f | 33095885e6a4d7b2504ced5d9d4d34f1d6e872e2 | CONTRIBUTOR | | xarray 13221727 | https://github.com/pydata/xarray/pull/1965 | |
| 185131284 | MDExOlB1bGxSZXF1ZXN0MTg1MTMxMjg0 | 2096 | closed | 0 | avoid integer overflow when decoding large time numbers | j08lue 3404817 | - [x] Closes #1965 - [x] Tests added (for all bug fixes or enhancements) - [ ] Tests passed (for all non-documentation changes) - [x] Fully documented, including `whats-new.rst` for all changes and `api.rst` for new API | 2018-05-01T07:02:24Z | 2018-05-01T12:41:13Z | 2018-05-01T12:41:08Z | | 1017f0396a086218e030ce2348827ce62c25c10a | | | 0 | 2bfd60ca95a759dc38d4cc7288f21b9a978976ee | d1e1440dc5d0bc9c341da20fde85b56f2a3c1b5b | CONTRIBUTOR | | xarray 13221727 | https://github.com/pydata/xarray/pull/2096 | |
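The filtered view above ("5 rows where user = 3404817") corresponds to a plain equality filter on the table defined below. A minimal sketch of such a query, assuming the standard SQLite database behind this page:

-- select this user's pull requests, in the order shown above
select *
from pull_requests
where [user] = 3404817
order by id;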
CREATE TABLE [pull_requests] (
[id] INTEGER PRIMARY KEY,
[node_id] TEXT,
[number] INTEGER,
[state] TEXT,
[locked] INTEGER,
[title] TEXT,
[user] INTEGER REFERENCES [users]([id]),
[body] TEXT,
[created_at] TEXT,
[updated_at] TEXT,
[closed_at] TEXT,
[merged_at] TEXT,
[merge_commit_sha] TEXT,
[assignee] INTEGER REFERENCES [users]([id]),
[milestone] INTEGER REFERENCES [milestones]([id]),
[draft] INTEGER,
[head] TEXT,
[base] TEXT,
[author_association] TEXT,
[auto_merge] TEXT,
[repo] INTEGER REFERENCES [repos]([id]),
[url] TEXT,
[merged_by] INTEGER REFERENCES [users]([id])
);
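The user, assignee, milestone, repo, and merged_by columns are integer foreign keys rather than denormalized values, so human-readable details come from joins. A sketch resolving the author id, assuming the referenced users table carries a login column (the column name is an assumption; only users.id is confirmed by the schema above):

-- join out to users to show the author's login next to each PR
-- (users.login is assumed here, not shown in this schema excerpt)
select pull_requests.number, pull_requests.title, users.login
from pull_requests
join users on pull_requests.[user] = users.id
where pull_requests.[user] = 3404817;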
CREATE INDEX [idx_pull_requests_merged_by]
ON [pull_requests] ([merged_by]);
CREATE INDEX [idx_pull_requests_repo]
ON [pull_requests] ([repo]);
CREATE INDEX [idx_pull_requests_milestone]
ON [pull_requests] ([milestone]);
CREATE INDEX [idx_pull_requests_assignee]
ON [pull_requests] ([assignee]);
CREATE INDEX [idx_pull_requests_user]
ON [pull_requests] ([user]);
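Each foreign-key column has its own single-column index, so equality filters like the one behind this page can be answered with an index search instead of a full table scan. One way to check, assuming a plain SQLite connection to this database:

-- with idx_pull_requests_user present, SQLite should report a SEARCH
-- using that index rather than a SCAN of pull_requests
explain query plan
select * from pull_requests where [user] = 3404817;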