issue_comments: 1263952252

html_url: https://github.com/pydata/xarray/issues/7111#issuecomment-1263952252
issue_url: https://api.github.com/repos/pydata/xarray/issues/7111
id: 1263952252
node_id: IC_kwDOAMm_X85LVmF8
user: 1828519
created_at: 2022-09-30T19:41:30Z
updated_at: 2022-09-30T19:41:30Z
author_association: CONTRIBUTOR

I get a similar error for different structures, e.g. when I do something like `data_arr.where(data_arr > 5, drop=True)`. In this case I have dask-array-based DataArrays, and dask ends up trying to hash the object: it gets into a loop where dask asks xarray to hash the DataArray, and xarray in turn tries to hash the DataArrays stored inside `.attrs`.

```python
In [8]: import xarray as xr

In [9]: import dask.array as da

In [15]: a = xr.DataArray(da.zeros(5), attrs={}, dims=("a_dim",))

In [16]: b = xr.DataArray(da.zeros(8), attrs={}, dims=("b_dim",))

In [20]: a.attrs["other"] = b

In [24]: lons = xr.DataArray(da.random.random(8), attrs={"ancillary_variables": [b]})

In [25]: lats = xr.DataArray(da.random.random(8), attrs={"ancillary_variables": [b]})

In [26]: b.attrs["some_attr"] = [lons, lats]

In [27]: cond = a > 5

In [28]: c = a.where(cond, drop=True)
...
File ~/miniconda3/envs/satpy_py310/lib/python3.10/site-packages/dask/utils.py:1982, in _HashIdWrapper.__hash__(self)
   1981 def __hash__(self):
-> 1982     return id(self.wrapped)

RecursionError: maximum recursion depth exceeded while calling a Python object
```
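To illustrate why this recurses, here is a minimal sketch in plain Python (no dask or xarray). The hypothetical `naive_token` function stands in for a tokenizer that walks nested attribute values; because `b`'s attrs contain `lons`/`lats`, whose attrs contain `b` again, the walk never terminates:

```python
# Hypothetical stand-in for a tokenizer that recursively hashes
# every nested value it finds inside attrs-like containers.
def naive_token(obj):
    if isinstance(obj, dict):
        return hash(tuple(naive_token(v) for v in obj.values()))
    if isinstance(obj, list):
        return hash(tuple(naive_token(v) for v in obj))
    return hash(obj)

# Build the same reference cycle as in the session above:
# b's attrs contain objects whose attrs contain b again.
b = {"attrs": {}}
lons = {"attrs": {"ancillary_variables": [b]}}
lats = {"attrs": {"ancillary_variables": [b]}}
b["attrs"]["some_attr"] = [lons, lats]

try:
    naive_token(b)
except RecursionError as err:
    print("RecursionError:", err)
```

Any hashing scheme that descends into `.attrs` without tracking already-visited objects will hit the same wall on a cycle like this.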

reactions: none
issue: 1392878100