
issues

8 rows where type = "pull" and user = 1312546 sorted by updated_at descending
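For reference, this view corresponds to a simple SELECT over the issues table (schema shown at the bottom of the page). A minimal sketch using Python's sqlite3 module, assuming the underlying database file is named github.db (the filename is an assumption):

import sqlite3

# Reproduce the filter and sort applied by this view: pull requests opened by
# user id 1312546, newest activity first.
conn = sqlite3.connect("github.db")
rows = conn.execute(
    """
    SELECT number, title, state, comments, created_at, updated_at, closed_at
    FROM issues
    WHERE type = 'pull' AND user = 1312546
    ORDER BY updated_at DESC
    """
).fetchall()
for row in rows:
    print(row)
conn.close()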

Suggested facets: comments, created_at (date), updated_at (date), closed_at (date)

Applied facets: type = pull (8) · state = closed (8) · repo = xarray (8)
id node_id number title user state locked assignee milestone comments created_at updated_at closed_at author_association active_lock_reason draft pull_request body reactions performed_via_github_app state_reason repo type
1038531231 PR_kwDOAMm_X84tzEEk 5906 Avoid accessing slow .data in unstack TomAugspurger 1312546 closed 0     4 2021-10-28T13:39:36Z 2021-10-29T15:29:39Z 2021-10-29T15:14:43Z MEMBER   0 pydata/xarray/pulls/5906
  • [x] Closes https://github.com/pydata/xarray/issues/5902
  • [x] Passes pre-commit run --all-files
  • [x] User visible changes (including notable bug fixes) are documented in whats-new.rst
  • [x] New functions/methods are listed in api.rst
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/5906/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
704668670 MDExOlB1bGxSZXF1ZXN0NDg5NTQ5MzIx 4438 Fixed dask.optimize on datasets TomAugspurger 1312546 closed 0     3 2020-09-18T21:30:17Z 2020-09-20T05:21:58Z 2020-09-20T05:21:58Z MEMBER   0 pydata/xarray/pulls/4438

Another attempt to fix #3698. The issue with my earlier fix is that we hit Variable._dask_finalize in both dask.optimize and dask.persist. We want to do the culling of unnecessary tasks (test_persist_Dataset), but only in the persist case, not in optimize (test_optimize). A small illustrative sketch of the two paths follows the checklist below.

  • [x] Closes #3698
  • [x] Tests added
  • [x] Passes isort . && black . && mypy . && flake8
  • [x] User visible changes (including notable bug fixes) are documented in whats-new.rst
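As a rough illustration of the two code paths described above (the Dataset below is made up; it is not the fixture used by test_optimize or test_persist_Dataset):

import dask
import numpy as np
import xarray as xr

# An illustrative chunked Dataset.
ds = xr.Dataset({"a": ("x", np.arange(10))}).chunk({"x": 5})

# dask.optimize rewrites the task graph but keeps the collection lazy;
# no culling of tasks is wanted here.
(optimized,) = dask.optimize(ds)

# dask.persist materializes the chunks while keeping the dask interface;
# this is the path where culling unnecessary tasks makes sense.
(persisted,) = dask.persist(ds)

print(type(optimized["a"].data), type(persisted["a"].data))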
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4438/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
703881154 MDExOlB1bGxSZXF1ZXN0NDg4OTA4MTI5 4432 Fix optimize for chunked DataArray TomAugspurger 1312546 closed 0     8 2020-09-17T20:16:08Z 2020-09-18T13:20:45Z 2020-09-17T23:19:23Z MEMBER   0 pydata/xarray/pulls/4432

Previously we generated an invalid Dask task graph, because the lines removed here dropped keys that were referenced elsewhere in the task graph. The original implementation had a comment indicating that this was done to cull: https://github.com/pydata/xarray/blob/502a988ad5b87b9f3aeec3033bf55c71272e1053/xarray/core/variable.py#L384

Just spot-checking things, though, I think we're OK here. Something like dask.visualize(arr[[0]], optimize_graph=True) indicates that we're OK; a small sketch of this check follows the checklist below.

  • [x] Closes #3698
  • [x] Tests added
  • [x] Passes isort . && black . && mypy . && flake8
  • [x] User visible changes (including notable bug fixes) are documented in whats-new.rst
  • [x] New functions/methods are listed in api.rst
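A hedged version of that spot check (the array contents and chunking below are illustrative): after fancy-indexing a chunked DataArray, optimizing the graph should leave no references to dropped keys, so the result still computes.

import dask
import numpy as np
import xarray as xr

arr = xr.DataArray(np.arange(12), dims="x").chunk({"x": 4})

subset = arr[[0]]                     # outer indexing keeps a dask-backed result
(optimized,) = dask.optimize(subset)  # graph rewriting must not drop keys still in use
print(optimized.compute())            # a graph with dangling references would fail here

# dask.visualize(subset, optimize_graph=True) renders the optimized graph
# (requires graphviz) and is the check mentioned above.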
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4432/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
672281867 MDExOlB1bGxSZXF1ZXN0NDYyMzQ2NzE4 4305 Fix map_blocks examples TomAugspurger 1312546 closed 0     5 2020-08-03T19:06:58Z 2020-08-04T07:27:08Z 2020-08-04T03:38:51Z MEMBER   0 pydata/xarray/pulls/4305

The examples on master raised with

ValueError: Result from applying user function has unexpected coordinate variables {'month'}.

This PR updates the examples to include the month coordinate; pytest --doctest-modules now passes on these three.
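A minimal sketch of the point being fixed (the data and anomaly function below are illustrative, not the docstring's exact example): the template handed to map_blocks has to carry the month coordinate that the user function adds.

import numpy as np
import pandas as pd
import xarray as xr

time = pd.date_range("2000-01-01", periods=24, freq="MS")
da = xr.DataArray(
    np.random.randn(24), dims="time", coords={"time": time}
).chunk({"time": 24})

def anomaly(block):
    # Grouping by month attaches a "month" coordinate to the result.
    gb = block.groupby("time.month")
    return gb - gb.mean()

# Computing once eagerly yields a template that already includes "month";
# without it, compute() raises the "unexpected coordinate variables" error.
template = anomaly(da.compute())
result = da.map_blocks(anomaly, template=template)
print(result.compute())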

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4305/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
672195744 MDExOlB1bGxSZXF1ZXN0NDYyMjc2NDEw 4303 Update map_blocks and map_overlap docstrings TomAugspurger 1312546 closed 0     1 2020-08-03T16:27:45Z 2020-08-03T18:35:43Z 2020-08-03T18:06:10Z MEMBER   0 pydata/xarray/pulls/4303

These docstrings reference an obj argument that only exists in parallel. The object being referenced is actually self.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4303/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
533555794 MDExOlB1bGxSZXF1ZXN0MzQ5NjA5NDM3 3598 Fix map_blocks HLG layering TomAugspurger 1312546 closed 0     2 2019-12-05T19:41:23Z 2019-12-07T04:30:19Z 2019-12-07T04:30:19Z MEMBER   0 pydata/xarray/pulls/3598

  • [x] Closes #3599

This fixes an issue with the HighLevelGraph noted in https://github.com/pydata/xarray/pull/3584, and exposed by a recent change in Dask to do more HLG fusion.

cc @dcherian.
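A small, hedged way to look at the HighLevelGraph that map_blocks builds (the array below is illustrative): the layering issue meant the graph's layers and their dependencies were no longer consistent once Dask began fusing HLG layers more aggressively.

import numpy as np
import xarray as xr

da = xr.DataArray(np.arange(8), dims="x").chunk({"x": 4})
mapped = da.map_blocks(lambda block: block + 1)

hlg = mapped.__dask_graph__()   # a dask HighLevelGraph in current dask/xarray versions
print(list(hlg.layers))         # layer names, including the map_blocks layer
print(dict(hlg.dependencies))   # which layers each layer depends on
print(mapped.compute())         # an inconsistent graph would fail here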

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3598/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
400997415 MDExOlB1bGxSZXF1ZXN0MjQ2MDQ4MDcx 2693 Update asv.conf.json TomAugspurger 1312546 closed 0     1 2019-01-19T13:45:51Z 2019-01-19T19:42:48Z 2019-01-19T17:45:20Z MEMBER   0 pydata/xarray/pulls/2693

Is xarray 3.5+ now? Congrats, I didn't realize that.

This started failing the benchmark machine, which I was tending to last night.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2693/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
251773472 MDExOlB1bGxSZXF1ZXN0MTM2ODQ1MjE2 1515 Added show_commit_url to asv.conf TomAugspurger 1312546 closed 0     0 2017-08-21T21:17:10Z 2017-08-23T16:01:50Z 2017-08-23T16:01:50Z MEMBER   0 pydata/xarray/pulls/1515

This should set up the proper links from the published output to the commit on GitHub.

FYI, the benchmarks should be running stably now and are posted to http://pandas.pydata.org/speed/xarray. http://pandas.pydata.org/speed/xarray/regressions.xml has an RSS feed of the regressions.
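For context, the relevant asv.conf.json entry looks roughly like the snippet below; asv appends the commit hash to this base URL when linking published results back to GitHub (the exact value used in the PR is an assumption):

{
    "show_commit_url": "https://github.com/pydata/xarray/commit/"
}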

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/1515/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull

CREATE TABLE [issues] (
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [number] INTEGER,
   [title] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [state] TEXT,
   [locked] INTEGER,
   [assignee] INTEGER REFERENCES [users]([id]),
   [milestone] INTEGER REFERENCES [milestones]([id]),
   [comments] INTEGER,
   [created_at] TEXT,
   [updated_at] TEXT,
   [closed_at] TEXT,
   [author_association] TEXT,
   [active_lock_reason] TEXT,
   [draft] INTEGER,
   [pull_request] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [state_reason] TEXT,
   [repo] INTEGER REFERENCES [repos]([id]),
   [type] TEXT
);
CREATE INDEX [idx_issues_repo]
    ON [issues] ([repo]);
CREATE INDEX [idx_issues_milestone]
    ON [issues] ([milestone]);
CREATE INDEX [idx_issues_assignee]
    ON [issues] ([assignee]);
CREATE INDEX [idx_issues_user]
    ON [issues] ([user]);
Powered by Datasette · About: xarray-datasette