

issues


7 rows where repo = 13221727 and user = 23738400 sorted by updated_at descending




type

  • issue 4
  • pull 3

state

  • closed 5
  • open 2

repo

  • xarray 7
id node_id number title user state locked assignee milestone comments created_at updated_at ▲ closed_at author_association active_lock_reason draft pull_request body reactions performed_via_github_app state_reason repo type
1555829604 PR_kwDOAMm_X85IeHqq 7475 Update error message when saving multiindex OriolAbril 23738400 closed 0     0 2023-01-24T23:44:14Z 2023-02-25T11:57:59Z 2023-02-24T20:16:43Z CONTRIBUTOR   0 pydata/xarray/pulls/7475

I have updated the error message that gets printed when trying to save a multiindex, following some discussion in an ArviZ repo with @dcherian.

  • [x] Related to #1077 and https://github.com/arviz-devs/arviz/issues/2165
  • [ ] Tests added: not yet, but I can add them if necessary using pytest.raises(..., match=...)
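Such a test could look like the sketch below; the stand-in function and the error message text are assumptions for illustration, not xarray's actual code or wording:

```python
import pytest

def save_multiindex():
    # Hypothetical stand-in for serializing a Dataset with a MultiIndex;
    # the message below is an assumption, not xarray's real error text.
    raise NotImplementedError(
        "multi-index coordinates cannot be serialized; "
        "use reset_index() to convert them first"
    )

def test_multiindex_save_message():
    # match= is a regex that pytest searches against str() of the
    # raised exception, so the test fails if the wording drifts.
    with pytest.raises(NotImplementedError, match="reset_index"):
        save_multiindex()
```

With `match`, the test fails even when the exception type is right but the message no longer mentions the expected hint.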
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/7475/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1240234432 I_kwDOAMm_X85J7HnA 6620 Using the html repr in the documentation OriolAbril 23738400 open 0     1 2022-05-18T16:53:46Z 2022-05-18T17:46:26Z   CONTRIBUTOR      

What is your issue?

Most (if not all) of xarray's documentation is written as rst files using the ipython directive. Because of this, the html repr is not used in the documentation. I find the html repr much more informative and intuitive, especially for beginners, and I think it would be great to use it in the documentation. There are multiple ways to do this (not necessarily incompatible with one another):

  • Use jupyter-sphinx instead of ipython to run and embed code cells from rst files. I use this in the documentation of xarray-einstats, for example.
  • Use jupyter notebooks instead of rst. We use this in the arviz and xarray-einstats docs. However, to keep using all the sphinx roles and directives currently in use, the sphinx configuration would need to be modified to use myst-nb instead of nbsphinx.
  • Use myst notebooks instead of rst. Also used in ArviZ; also needs myst-nb instead of nbsphinx.

Afaik, nbsphinx can be replaced with myst-nb without any changes to the documentation; rst files could then be progressively converted to ipynb, myst, or any other format supported by jupytext. rst, markdown, and notebook sources can all be used at the same time to generate the documentation, linking to one another with sphinx roles and cross-references.

Is this something that sounds interesting? I could update the infrastructure at some point whenever I have time, update a page (to any or multiple of the options above) as an example, and then let other people take over.
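On the sphinx side, swapping nbsphinx for myst-nb is mostly a one-line change in conf.py. A minimal sketch, assuming a simplified extension list rather than xarray's actual configuration:

```python
# conf.py (sketch; surrounding extensions are illustrative assumptions)
extensions = [
    "sphinx.ext.autodoc",
    "sphinx.ext.napoleon",
    # "nbsphinx",  # dropped in favour of myst-nb ...
    "myst_nb",     # ... which executes .ipynb and MyST markdown sources
]
```

With `myst_nb` enabled, rst pages can coexist with notebook-based pages while the conversion proceeds incrementally.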

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/6620/reactions",
    "total_count": 3,
    "+1": 3,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 issue
1198717318 PR_kwDOAMm_X8418Vqw 6464 Add xarray-einstats to ecosystem page OriolAbril 23738400 closed 0     1 2022-04-09T17:12:37Z 2022-04-09T19:40:39Z 2022-04-09T19:37:01Z CONTRIBUTOR   0 pydata/xarray/pulls/6464

Adds a mention of xarray-einstats to the ecosystem page. Related to #3322.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/6464/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
626215981 MDExOlB1bGxSZXF1ZXN0NDI0MjQ5MzI4 4103 keep attrs in reset_index OriolAbril 23738400 closed 0     7 2020-05-28T05:00:50Z 2020-06-05T19:42:54Z 2020-06-05T19:39:10Z CONTRIBUTOR   0 pydata/xarray/pulls/4103

Modifies the code in reset_index to keep attributes when converting an indexing coordinate to a non-indexing coordinate. I have added tests for a single index for both DataArray and Dataset; I am not sure both are needed, as they end up calling the same base function.

Regarding multiindex, I think it is not possible to keep the metadata, as it is removed when creating the multiindex/stacking.

  • [x] Closes #4101
  • [x] Tests added
  • [x] Passes isort -rc . && black . && mypy . && flake8
  • [x] Fully documented, including whats-new.rst for all changes and api.rst for new API
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4103/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
626063031 MDU6SXNzdWU2MjYwNjMwMzE= 4101 reset_index does not keep attributes OriolAbril 23738400 closed 0     1 2020-05-27T22:01:09Z 2020-06-05T19:39:10Z 2020-06-05T19:39:10Z CONTRIBUTOR      

If an indexing coordinate with attributes is converted to a non-indexing coordinate with reset_index, the attributes are lost. I am not sure whether this is a bug, but I think the attributes should be kept; reset_coords, for comparison, does keep the attributes of reset coordinates.

MCVE Code Sample

```python
import numpy as np
import pandas as pd
import xarray as xr

temp = 15 + 8 * np.random.randn(2, 2, 3)
time = xr.DataArray(
    pd.date_range("2014-09-06", periods=3), dims=["time"]
).assign_attrs({"attr": 23})
coord = xr.DataArray(
    [[-99.83, -99.32], [-99.79, -99.23]], dims=["x", "y"]
).assign_attrs({"coord": True})

ds = xr.Dataset(
    {"temperature": (["x", "y", "time"], temp)},
    coords={"coord_0": coord, "time": time},
)
ds                          # both coord_0 and time have attributes

ds.reset_index("time")      # coordinate time_ does not have attributes anymore

ds.reset_coords("coord_0")  # data variable coord_0 still has attributes
```

Expected Output

I would expect attributes to be kept.

Possible solution

I was wondering if changing this line and this other line to

```python
vars_to_create[str(d) + "_"] = Variable(d, index, variables[d].attrs)
```

could solve this. If so, I'll send a PR whenever I can.
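The effect of forwarding the attrs as a third constructor argument can be illustrated with a minimal stand-in class; this is a hypothetical simplification of xarray.Variable, not the real implementation:

```python
class Variable:
    """Hypothetical, trimmed-down stand-in for xarray.Variable."""
    def __init__(self, dims, data, attrs=None):
        self.dims = dims
        self.data = data
        # attrs default to an empty dict when not forwarded
        self.attrs = {} if attrs is None else attrs

# The original indexing coordinate carries attributes
variables = {"time": Variable("time", [0, 1, 2], {"attr": 23})}
d, index = "time", [0, 1, 2]
vars_to_create = {}

# Before the fix the attrs were dropped:
#   vars_to_create[str(d) + "_"] = Variable(d, index)
# The proposed change forwards them:
vars_to_create[str(d) + "_"] = Variable(d, index, variables[d].attrs)
print(vars_to_create["time_"].attrs)  # {'attr': 23}
```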

Versions

Output of `xr.show_versions()`

```
INSTALLED VERSIONS
------------------
commit: None
python: 3.6.9 (default, Apr 18 2020, 01:56:04) [GCC 8.4.0]
python-bits: 64
OS: Linux
OS-release: 4.15.0-101-generic
machine: x86_64
processor: x86_64
byteorder: little
LC_ALL: None
LANG: en_US.UTF-8
LOCALE: en_US.UTF-8
libhdf5: 1.10.4
libnetcdf: 4.6.3

xarray: 0.15.1
pandas: 1.0.3
numpy: 1.18.4
scipy: 1.4.1
netCDF4: 1.5.3
pydap: None
h5netcdf: None
h5py: 2.10.0
Nio: None
zarr: None
cftime: 1.0.4.2
nc_time_axis: None
PseudoNetCDF: None
rasterio: None
cfgrib: None
iris: None
bottleneck: None
dask: None
distributed: None
matplotlib: 3.2.1
cartopy: None
seaborn: 0.10.1
numbagg: None
setuptools: 42.0.2
pip: 20.1.1
conda: None
pytest: 4.6.2
IPython: 7.14.0
sphinx: 2.0.0
```
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4101/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
462122623 MDU6SXNzdWU0NjIxMjI2MjM= 3056 Argument and its type joined in docs OriolAbril 23738400 closed 0     2 2019-06-28T16:48:27Z 2019-08-02T21:17:43Z 2019-08-02T21:17:43Z CONTRIBUTOR      

This issue has nearly no effect on users, but (at least for me) it is quite bothersome aesthetically. It also looks like something simple to solve, even though I am not a sphinx expert, so I am not sure.

In the docstring, there is a colon between the name of an argument and its type; however, in the online documentation, the name and the type are smashed together, as in the case of xr.align.

I think it would be more readable and appealing if there were some kind of separation between the two.
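For reference, the numpydoc convention is a space-padded colon between name and type in the Parameters section; whether the fused rendering here is a docstring or a theme issue is not established. A hypothetical sketch, not the actual xr.align docstring:

```python
def align(*objects, join="inner"):
    """Sketch of a numpydoc-style signature.

    Parameters
    ----------
    objects : DataArray or Dataset
        Objects to align; note the spaces around the colon, which
        numpydoc expects between the parameter name and its type.
    join : str, optional
        How to combine indexes ("inner", "outer", ...).
    """
    return objects
```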

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3056/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
457716471 MDU6SXNzdWU0NTc3MTY0NzE= 3032 apply_ufunc should preemptively broadcast OriolAbril 23738400 open 0     11 2019-06-18T22:02:36Z 2019-06-19T22:27:27Z   CONTRIBUTOR      

Code Sample

I am having some trouble understanding apply_ufunc's broadcasting rules. As I had some trouble understanding the docs, I am not 100% sure this is a bug, but I am quite sure it is. I will try to explain why with the following really simple example.

```python
import xarray as xr
import numpy as np

a = xr.DataArray(data=np.random.normal(size=(7, 3)), dims=["dim1", "dim2"])
c = xr.DataArray(data=np.random.normal(size=(5, 6)), dims=["dim3", "dim4"])

def func(x, y):
    print(x.shape)
    print(y.shape)
    return
```

As defined, func always makes apply_ufunc raise an error, but this is intended: the shapes have already been printed by then, and it keeps the example as simple as possible.

Problem description

```python
xr.apply_ufunc(func, a, c)
# Out:
# (7, 3, 1, 1)
# (5, 6)
```

Here, a has been partially broadcast, but I would expect the shapes of a and c to be the same as when calling xr.broadcast: since there are no input core dims, all dimensions should be broadcast. However:

```python
print([ary.shape for ary in xr.broadcast(a, c)])
# [(7, 3, 5, 6), (7, 3, 5, 6)]
```
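The shapes above follow numpy's broadcasting rule once the size-1 dimensions have been appended. That rule can be sketched in pure Python; this is a hypothetical helper for illustration, not xarray code:

```python
def broadcast_shape(*shapes):
    """Compute the broadcast result shape under numpy's rule:
    align shapes on the right, then each dimension pair must be
    equal or one of them must be 1."""
    ndim = max(len(s) for s in shapes)
    # left-pad every shape with 1s to the common rank
    padded = [(1,) * (ndim - len(s)) + tuple(s) for s in shapes]
    result = []
    for dims in zip(*padded):
        sizes = {d for d in dims if d != 1}
        if len(sizes) > 1:
            raise ValueError(f"incompatible dimensions: {dims}")
        result.append(sizes.pop() if sizes else 1)
    return tuple(result)

# The padded (7, 3, 1, 1) against (5, 6) broadcasts to the full shape
print(broadcast_shape((7, 3, 1, 1), (5, 6)))  # (7, 3, 5, 6)
```

Under this rule, the padded shapes are compatible with the (7, 3, 5, 6) result that xr.broadcast produces; the point of this issue is that apply_ufunc appears to stop after the padding step.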

Using different input core dims does not get rid of the problem; instead, I believe it reveals some more issues:

```python
xr.apply_ufunc(func, a, c, input_core_dims=[["dim1"], []])
# (3, 1, 1, 7), expected (3, 5, 6, 7)
# (5, 6), expected (3, 5, 6)

xr.apply_ufunc(func, a, c, input_core_dims=[[], ["dim3"]])
# (7, 3, 1), expected (7, 3, 6)
# (6, 5), expected (7, 3, 6, 5)

xr.apply_ufunc(func, a, c, input_core_dims=[["dim1"], ["dim3"]])
# (3, 1, 7), expected (3, 6, 7)
# (6, 5), expected (3, 6, 5)
```

Is this current behaviour what should be expected?

Output of xr.show_versions()

```
INSTALLED VERSIONS
------------------
commit: None
python: 3.6.8 (default, Jan 14 2019, 11:02:34) [GCC 8.0.1 20180414 (experimental) [trunk revision 259383]]
python-bits: 64
OS: Linux
OS-release: 4.15.0-52-generic
machine: x86_64
processor: x86_64
byteorder: little
LC_ALL: None
LANG: en_US.UTF-8
LOCALE: en_US.UTF-8
libhdf5: 1.10.2
libnetcdf: 4.6.3

xarray: 0.12.1
pandas: 0.24.2
numpy: 1.16.4
scipy: 1.3.0
netCDF4: 1.5.1.2
pydap: None
h5netcdf: None
h5py: 2.9.0
Nio: None
zarr: None
cftime: 1.0.3.4
nc_time_axis: None
PseudonetCDF: None
rasterio: None
cfgrib: None
iris: None
bottleneck: None
dask: None
distributed: None
matplotlib: 3.1.0
cartopy: None
seaborn: None
setuptools: 41.0.0
pip: 19.1.1
conda: None
pytest: 4.5.0
IPython: 7.5.0
sphinx: 2.0.1
```
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3032/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 issue


CREATE TABLE [issues] (
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [number] INTEGER,
   [title] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [state] TEXT,
   [locked] INTEGER,
   [assignee] INTEGER REFERENCES [users]([id]),
   [milestone] INTEGER REFERENCES [milestones]([id]),
   [comments] INTEGER,
   [created_at] TEXT,
   [updated_at] TEXT,
   [closed_at] TEXT,
   [author_association] TEXT,
   [active_lock_reason] TEXT,
   [draft] INTEGER,
   [pull_request] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [state_reason] TEXT,
   [repo] INTEGER REFERENCES [repos]([id]),
   [type] TEXT
);
CREATE INDEX [idx_issues_repo]
    ON [issues] ([repo]);
CREATE INDEX [idx_issues_milestone]
    ON [issues] ([milestone]);
CREATE INDEX [idx_issues_assignee]
    ON [issues] ([assignee]);
CREATE INDEX [idx_issues_user]
    ON [issues] ([user]);
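The query behind this page ("7 rows where repo = 13221727 and user = 23738400 sorted by updated_at descending") can be reproduced against a copy of the schema above with Python's sqlite3; the rows here are a toy subset for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Trimmed-down copy of the [issues] schema above
conn.execute(
    """CREATE TABLE issues (
           id INTEGER PRIMARY KEY,
           title TEXT,
           user INTEGER,
           repo INTEGER,
           updated_at TEXT
       )"""
)
rows = [
    (4101, "reset_index does not keep attributes",
     23738400, 13221727, "2020-06-05T19:39:10Z"),
    (7475, "Update error message when saving multiindex",
     23738400, 13221727, "2023-02-25T11:57:59Z"),
    (9999, "unrelated issue by another user",
     42, 13221727, "2024-01-01T00:00:00Z"),
]
conn.executemany("INSERT INTO issues VALUES (?, ?, ?, ?, ?)", rows)

# The filter and ordering used by this page; ISO-8601 timestamps
# sort correctly as plain strings.
result = conn.execute(
    """SELECT id FROM issues
       WHERE repo = 13221727 AND user = 23738400
       ORDER BY updated_at DESC"""
).fetchall()
print(result)  # [(7475,), (4101,)]
```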
Powered by Datasette · Queries took 76.194ms · About: xarray-datasette