pull_requests
8 rows where user = 1312546
id | node_id | number | state | locked | title | user | body | created_at | updated_at | closed_at | merged_at | merge_commit_sha | assignee | milestone | draft | head | base | author_association | auto_merge | repo | url | merged_by |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
136845216 | MDExOlB1bGxSZXF1ZXN0MTM2ODQ1MjE2 | 1515 | closed | 0 | Added show_commit_url to asv.conf | TomAugspurger 1312546 | This should set up the proper links from the published output to the commit on GitHub. FYI the benchmarks should be running stably now, and posted to http://pandas.pydata.org/speed/xarray. http://pandas.pydata.org/speed/xarray/regressions.xml has an RSS feed to the regressions. | 2017-08-21T21:17:10Z | 2017-08-23T16:01:50Z | 2017-08-23T16:01:50Z | 2017-08-23T16:01:50Z | 8e541deca2e20efe080aa1bca566d9966ea2f244 | 0 | d95d8c49bc06d448123916f33895a74910c0cd6b | f9464fd74d49b2d89bf973a810b7e78720304989 | MEMBER | xarray 13221727 | https://github.com/pydata/xarray/pull/1515 | ||||
246048071 | MDExOlB1bGxSZXF1ZXN0MjQ2MDQ4MDcx | 2693 | closed | 0 | Update asv.conf.json | TomAugspurger 1312546 | Is xarray 3.5+ now? Congrats, I didn't realize that. This started failing the benchmark machine, which I was tending to last night. | 2019-01-19T13:45:51Z | 2019-01-19T19:42:48Z | 2019-01-19T17:45:20Z | 2019-01-19T17:45:20Z | ec255eba7cce749c25e1d7b6f0a7fc537ff61841 | 0 | 067bbc7c7e92dba61dbde00220ffc75a2d63fbde | 385b36cdd34431b4f6f14aad1f222f989e7e2de2 | MEMBER | xarray 13221727 | https://github.com/pydata/xarray/pull/2693 | ||||
349609437 | MDExOlB1bGxSZXF1ZXN0MzQ5NjA5NDM3 | 3598 | closed | 0 | Fix map_blocks HLG layering | TomAugspurger 1312546 | [x] closes #3599 This fixes an issue with the HighLevelGraph noted in https://github.com/pydata/xarray/pull/3584, and exposed by a recent change in Dask to do more HLG fusion. cc @dcherian. | 2019-12-05T19:41:23Z | 2019-12-07T04:30:19Z | 2019-12-07T04:30:19Z | 2019-12-07T04:30:19Z | cafcaeea897894e3a2f44a38bd33c50a48c86215 | 0 | 0ea4ff84efecee4e788a7cb188dc461c3aba9f91 | 87a25b64898c94ea1e2a2e7a06d31ef602b116bf | MEMBER | xarray 13221727 | https://github.com/pydata/xarray/pull/3598 | ||||
462276410 | MDExOlB1bGxSZXF1ZXN0NDYyMjc2NDEw | 4303 | closed | 0 | Update map_blocks and map_overlap docstrings | TomAugspurger 1312546 | This references an `obj` argument that only exists in parallel. The object being referenced is actually `self`. | 2020-08-03T16:27:45Z | 2020-08-03T18:35:43Z | 2020-08-03T18:06:10Z | 2020-08-03T18:06:10Z | 5200a182f324be21423fd2f8214b8ef04b5845ce | 0 | d519422d601a32cfaee15341396cda944412df63 | f99c6cca2df959df3db3c57592db97287fd28f15 | MEMBER | xarray 13221727 | https://github.com/pydata/xarray/pull/4303 | ||||
462346718 | MDExOlB1bGxSZXF1ZXN0NDYyMzQ2NzE4 | 4305 | closed | 0 | Fix map_blocks examples | TomAugspurger 1312546 | The examples on master raised with ```pytb ValueError: Result from applying user function has unexpected coordinate variables {'month'}. ``` This PR updates the example to include the `month` coordinate. `pytest --doctest-modules` passes on these three now. | 2020-08-03T19:06:58Z | 2020-08-04T07:27:08Z | 2020-08-04T03:38:51Z | 2020-08-04T03:38:51Z | e1dafe676812409834ccac3418ecf47600b00615 | 0 | 127883d539553b773f73bdd7471be1a2987ef7d3 | 5200a182f324be21423fd2f8214b8ef04b5845ce | MEMBER | xarray 13221727 | https://github.com/pydata/xarray/pull/4305 | ||||
488908129 | MDExOlB1bGxSZXF1ZXN0NDg4OTA4MTI5 | 4432 | closed | 0 | Fix optimize for chunked DataArray | TomAugspurger 1312546 | Previously we generated an invalid Dask task graph, because the lines removed here dropped keys that were referenced elsewhere in the task graph. The original implementation had a comment indicating that this was to cull: https://github.com/pydata/xarray/blob/502a988ad5b87b9f3aeec3033bf55c71272e1053/xarray/core/variable.py#L384 Just spot-checking things, I think we're OK here though. Something like `dask.visualize(arr[[0]], optimize_graph=True)` indicates that we're OK. <!-- Feel free to remove check-list items that aren't relevant to your change --> - [x] Closes #3698 - [x] Tests added - [x] Passes `isort . && black . && mypy . && flake8` - [x] User visible changes (including notable bug fixes) are documented in `whats-new.rst` - [x] New functions/methods are listed in `api.rst` | 2020-09-17T20:16:08Z | 2020-09-18T13:20:45Z | 2020-09-17T23:19:23Z | 2020-09-17T23:19:23Z | 9a8a62ba551e737dc87e39aded2f7cc788ff118d | 0 | 381aaf8cc37502907506011f8cb9f4149e229d2d | b0d8d93665dbb6d28e33dfd28ad27036c20c60bf | MEMBER | xarray 13221727 | https://github.com/pydata/xarray/pull/4432 | ||||
489549321 | MDExOlB1bGxSZXF1ZXN0NDg5NTQ5MzIx | 4438 | closed | 0 | Fixed dask.optimize on datasets | TomAugspurger 1312546 | Another attempt to fix #3698. The issue with my fix is that we hit `Variable._dask_finalize` in both `dask.optimize` and `dask.persist`. We want to do the culling of unnecessary tasks (`test_persist_Dataset`) but only in the persist case, not optimize (`test_optimize`). <!-- Feel free to remove check-list items that aren't relevant to your change --> - [x] Closes #3698 - [x] Tests added - [x] Passes `isort . && black . && mypy . && flake8` - [x] User visible changes (including notable bug fixes) are documented in `whats-new.rst` | 2020-09-18T21:30:17Z | 2020-09-20T05:21:58Z | 2020-09-20T05:21:58Z | 2020-09-20T05:21:57Z | 13c09dc28ec8ff791c6d87e2d8e80c362c65ffd4 | 0 | 8c501dfb560635465f769ccef79cfff9db1fd9d7 | 0c26211566d620b2f81dd79c15f8afcc37faacbc | MEMBER | xarray 13221727 | https://github.com/pydata/xarray/pull/4438 | ||||
768360740 | PR_kwDOAMm_X84tzEEk | 5906 | closed | 0 | Avoid accessing slow .data in unstack | TomAugspurger 1312546 | - [x] Closes https://github.com/pydata/xarray/issues/5902 - [x] Passes `pre-commit run --all-files` - [x] User visible changes (including notable bug fixes) are documented in `whats-new.rst` - [x] New functions/methods are listed in `api.rst` | 2021-10-28T13:39:36Z | 2021-10-29T15:29:39Z | 2021-10-29T15:14:43Z | 2021-10-29T15:14:43Z | b2ed62e95e452894dfd0a0aa156c3c7b0236c257 | 0 | 6363a761c62605bcf3552cacce1c48d5b543eeca | c210f8b9e3356590ee0d4e25dbb21b93cf7a5309 | MEMBER | xarray 13221727 | https://github.com/pydata/xarray/pull/5906 |
CREATE TABLE [pull_requests] (
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [number] INTEGER,
   [state] TEXT,
   [locked] INTEGER,
   [title] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [body] TEXT,
   [created_at] TEXT,
   [updated_at] TEXT,
   [closed_at] TEXT,
   [merged_at] TEXT,
   [merge_commit_sha] TEXT,
   [assignee] INTEGER REFERENCES [users]([id]),
   [milestone] INTEGER REFERENCES [milestones]([id]),
   [draft] INTEGER,
   [head] TEXT,
   [base] TEXT,
   [author_association] TEXT,
   [auto_merge] TEXT,
   [repo] INTEGER REFERENCES [repos]([id]),
   [url] TEXT,
   [merged_by] INTEGER REFERENCES [users]([id])
);
CREATE INDEX [idx_pull_requests_merged_by] ON [pull_requests] ([merged_by]);
CREATE INDEX [idx_pull_requests_repo] ON [pull_requests] ([repo]);
CREATE INDEX [idx_pull_requests_milestone] ON [pull_requests] ([milestone]);
CREATE INDEX [idx_pull_requests_assignee] ON [pull_requests] ([assignee]);
CREATE INDEX [idx_pull_requests_user] ON [pull_requests] ([user]);
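The schema above can be exercised directly with Python's built-in `sqlite3` module. A minimal sketch, assuming an in-memory database and a single illustrative row taken from the table above (the `REFERENCES` clauses are omitted here since the referenced `users`, `milestones`, and `repos` tables aren't created in this sketch); the `user = 1312546` filter is the same one that produced this page:

```python
import sqlite3

# Simplified version of the pull_requests schema, without foreign keys.
DDL = """
CREATE TABLE [pull_requests] (
    [id] INTEGER PRIMARY KEY, [node_id] TEXT, [number] INTEGER,
    [state] TEXT, [locked] INTEGER, [title] TEXT, [user] INTEGER,
    [body] TEXT, [created_at] TEXT, [updated_at] TEXT, [closed_at] TEXT,
    [merged_at] TEXT, [merge_commit_sha] TEXT, [assignee] INTEGER,
    [milestone] INTEGER, [draft] INTEGER, [head] TEXT, [base] TEXT,
    [author_association] TEXT, [auto_merge] TEXT, [repo] INTEGER,
    [url] TEXT, [merged_by] INTEGER
)
"""

conn = sqlite3.connect(":memory:")
conn.execute(DDL)

# One row from the table above (remaining columns left NULL for brevity).
conn.execute(
    "INSERT INTO pull_requests (id, number, state, title, user) "
    "VALUES (?, ?, ?, ?, ?)",
    (136845216, 1515, "closed", "Added show_commit_url to asv.conf", 1312546),
)

# The same filter as this page: rows where user = 1312546.
rows = conn.execute(
    "SELECT number, title FROM pull_requests WHERE user = ? ORDER BY id",
    (1312546,),
).fetchall()
print(rows)  # [(1515, 'Added show_commit_url to asv.conf')]
```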