issues


4 rows where repo = 13221727, type = "pull" and user = 10720577 sorted by updated_at descending


Columns: id, node_id, number, title, user, state, locked, assignee, milestone, comments, created_at, updated_at (sorted descending), closed_at, author_association, active_lock_reason, draft, pull_request, body, reactions, performed_via_github_app, state_reason, repo, type. The 4 matching rows follow as labeled records (empty columns omitted).

id: 438537597 · node_id: MDExOlB1bGxSZXF1ZXN0Mjc0NTQ3OTcz
number: 2930 · title: Bugfix/coords not deep copy
user: pletchm (10720577) · state: closed · locked: 0 · comments: 3
created_at: 2019-04-29T22:43:10Z · updated_at: 2023-01-20T10:58:37Z · closed_at: 2019-05-02T22:46:36Z
author_association: CONTRIBUTOR · draft: 0 · pull_request: pydata/xarray/pulls/2930

body:

This pull request fixes a bug that prevented making a complete deep copy of a DataArray or Dataset, because the coords weren't being deep copied. It took a small fix in the IndexVariable.copy method. This method now allows both deep and shallow copies of coords to be made.

This pull request corresponds to this issue https://github.com/pydata/xarray/issues/1463.

  • [x] Tests added
  • [x] Fully documented, including whats-new.rst for all changes and api.rst for new API
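
As a rough illustration of the behaviour this fixes (a sketch added here for context, not part of the PR body; the memory-sharing check is an assumption about how xarray stores index coordinates):

```
import numpy as np
import xarray as xr

# Sketch: with the IndexVariable.copy fix, deep=True copies coordinate data
# too, so a deep copy should no longer share memory with the original,
# while a shallow copy still does.
da = xr.DataArray(np.zeros(3), dims="x", coords={"x": [10, 20, 30]})

shallow = da.copy(deep=False)
deep = da.copy(deep=True)

print(np.shares_memory(da["x"].values, shallow["x"].values))  # expected: True
print(np.shares_memory(da["x"].values, deep["x"].values))     # expected: False once the fix is in
```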
reactions:
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2930/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
repo: xarray (13221727) · type: pull

id: 439823329 · node_id: MDExOlB1bGxSZXF1ZXN0Mjc1NTQ3ODcx
number: 2936 · title: BUGFIX: deep-copy wasn't copying coords, bug fixed within IndexVariable
user: pletchm (10720577) · state: closed · locked: 0 · comments: 13
created_at: 2019-05-02T22:58:40Z · updated_at: 2019-05-09T22:31:02Z · closed_at: 2019-05-08T14:44:25Z
author_association: CONTRIBUTOR · draft: 0 · pull_request: pydata/xarray/pulls/2936

body:

This pull request fixes a bug that prevented making a complete deep copy of a DataArray or Dataset, because the coords weren't being deep copied. It took a small fix in the IndexVariable.copy method. This method now allows both deep and shallow copies of coords to be made.

This pull request corresponds to this issue https://github.com/pydata/xarray/issues/1463.

  • [x] Tests added
  • [x] Fully documented, including whats-new.rst for all changes and api.rst for new API

reactions:
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2936/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
repo: xarray (13221727) · type: pull

id: 442394463 · node_id: MDExOlB1bGxSZXF1ZXN0Mjc3NTE3NTQ2
number: 2953 · title: Mark test for copying coords of dataarray and dataset with xfail
user: pletchm (10720577) · state: closed · locked: 0 · comments: 1
created_at: 2019-05-09T19:25:40Z · updated_at: 2019-05-09T22:17:56Z · closed_at: 2019-05-09T22:17:53Z
author_association: CONTRIBUTOR · draft: 0 · pull_request: pydata/xarray/pulls/2953

body:

Mark the tests for copying coords of a DataArray and Dataset with xfail. It looks like the test fails for the shallow copy, and apparently only on Windows. On Windows, coords seem to be immutable unless the DataArray is a deep copy of another (which is why only the deep=False test fails). So I decided to just mark the tests as xfail for now (but I'd be happy to create an issue and look into it further in the future).
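
For reference, marking such a test as an expected failure with pytest looks roughly like the sketch below; the test name, platform condition, and reason string are illustrative assumptions, not the exact decorator used in the PR:

```
import sys
import pytest

@pytest.mark.xfail(
    sys.platform == "win32",
    reason="coords of a shallow copy appear immutable on Windows",
)
def test_copy_coords_shallow():
    # Placeholder body: the real test makes a deep=False copy, mutates the
    # copy's coords, and checks whether the original is affected.
    ...
```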

reactions:
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2953/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
repo: xarray (13221727) · type: pull

id: 408340215 · node_id: MDExOlB1bGxSZXF1ZXN0MjUxNjExMDM5
number: 2757 · title: Allow expand_dims() method to support inserting/broadcasting dimensions with size>1
user: pletchm (10720577) · state: closed · locked: 0 · comments: 4
created_at: 2019-02-08T21:59:36Z · updated_at: 2019-03-26T02:42:11Z · closed_at: 2019-03-26T02:41:48Z
author_association: CONTRIBUTOR · draft: 0 · pull_request: pydata/xarray/pulls/2757

body:

This pull request enhances the expand_dims method for both Dataset and DataArray objects to support inserting/broadcasting dimensions with size > 1. It corresponds to this issue https://github.com/pydata/xarray/issues/2710.

Changes:

  1. The Dataset.expand_dims() method now takes a dict-like object whose values give either the length of each new dimension or its coordinates.
  2. The DataArray.expand_dims() method now takes a dict-like object whose values give either the length of each new dimension or its coordinates.
  3. As an alternative to passing a dict to the dim argument (which is now an optional kwarg), each new dimension can be passed as its own keyword argument.
  4. Add the expand_dims enhancement from issue 2710 to whats-new.rst.

Included:

  • [ ] Tests added
  • [ ] Fully documented, including whats-new.rst for all changes and api.rst for new API

What's new:

All of the old functionality is still there, so it shouldn't break anyone's existing code that uses it.

You can now pass a dim as a dict, where the keys are the new dimensions and the values are either integers (giving the length of the new dimensions) or iterables (giving the coordinates of the new dimensions).

```
import numpy as np
import xarray as xr

original = xr.Dataset({'x': ('a', np.random.randn(3)),
                       'y': (['b', 'a'], np.random.randn(4, 3))},
                      coords={'a': np.linspace(0, 1, 3),
                              'b': np.linspace(0, 1, 4),
                              'c': np.linspace(0, 1, 5)},
                      attrs={'key': 'entry'})

original
<xarray.Dataset>
Dimensions:  (a: 3, b: 4, c: 5)
Coordinates:
  * a        (a) float64 0.0 0.5 1.0
  * b        (b) float64 0.0 0.3333 0.6667 1.0
  * c        (c) float64 0.0 0.25 0.5 0.75 1.0
Data variables:
    x        (a) float64 -1.556 0.2178 0.6319
    y        (b, a) float64 0.5273 0.6652 0.3418 1.858 ... -0.3519 0.8088 0.8753
Attributes:
    key:     entry

original.expand_dims({"d": 4, "e": ["l", "m", "n"]})
<xarray.Dataset>
Dimensions:  (a: 3, b: 4, c: 5, d: 4, e: 3)
Coordinates:
  * e        (e) <U1 'l' 'm' 'n'
  * a        (a) float64 0.0 0.5 1.0
  * b        (b) float64 0.0 0.3333 0.6667 1.0
  * c        (c) float64 0.0 0.25 0.5 0.75 1.0
Dimensions without coordinates: d
Data variables:
    x        (d, e, a) float64 -1.556 0.2178 0.6319 ... -1.556 0.2178 0.6319
    y        (d, e, b, a) float64 0.5273 0.6652 0.3418 ... -0.3519 0.8088 0.8753
Attributes:
    key:     entry
```

Or, equivalently, you can pass the new dimensions as kwargs instead of a dictionary:

```
original.expand_dims(d=4, e=["l", "m", "n"])
```

reactions:
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2757/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
repo: xarray (13221727) · type: pull


CREATE TABLE [issues] (
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [number] INTEGER,
   [title] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [state] TEXT,
   [locked] INTEGER,
   [assignee] INTEGER REFERENCES [users]([id]),
   [milestone] INTEGER REFERENCES [milestones]([id]),
   [comments] INTEGER,
   [created_at] TEXT,
   [updated_at] TEXT,
   [closed_at] TEXT,
   [author_association] TEXT,
   [active_lock_reason] TEXT,
   [draft] INTEGER,
   [pull_request] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [state_reason] TEXT,
   [repo] INTEGER REFERENCES [repos]([id]),
   [type] TEXT
);
CREATE INDEX [idx_issues_repo]
    ON [issues] ([repo]);
CREATE INDEX [idx_issues_milestone]
    ON [issues] ([milestone]);
CREATE INDEX [idx_issues_assignee]
    ON [issues] ([assignee]);
CREATE INDEX [idx_issues_user]
    ON [issues] ([user]);
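
The filter described at the top of this page maps onto this schema roughly as in the sketch below, which uses Python's sqlite3 module; the github.db filename is an assumption about a local copy of this database:

```
import sqlite3

conn = sqlite3.connect("github.db")  # assumed local copy of this Datasette database
rows = conn.execute(
    """
    SELECT number, title, state, created_at, closed_at
    FROM issues
    WHERE repo = 13221727      -- pydata/xarray
      AND type = 'pull'
      AND user = 10720577      -- pletchm
    ORDER BY updated_at DESC
    """
).fetchall()

for number, title, state, created_at, closed_at in rows:
    print(f"#{number} [{state}] {title}  closed {closed_at}")
```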