html_url,issue_url,id,node_id,user,created_at,updated_at,author_association,body,reactions,performed_via_github_app,issue
https://github.com/pydata/xarray/pull/7229#issuecomment-1320983738,https://api.github.com/repos/pydata/xarray/issues/7229,1320983738,IC_kwDOAMm_X85OvJy6,39069044,2022-11-19T22:32:51Z,2022-11-19T22:32:51Z,CONTRIBUTOR,"> the change to explicitly constructing the `attrs` instead of working around quirks of `apply_ufunc` sounds good to me: when discussing this in the last meeting we did get the feeling that in the long run it would be better to think about redesigning that part of `apply_ufunc`.
Yeah I think this would be worth doing eventually. Trying to index a list of attributes of unpredictable length doesn't feel very xarray-like.
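For context, the approach in this PR boils down to something like the following (a simplified, hypothetical sketch, not the exact code):
```python
import numpy as np
import xarray as xr

def where_keeping_x_attrs(cond, x, y):
    # Simplified sketch: compute the result with attrs dropped, then attach
    # x's attrs explicitly, rather than indexing into the list of attrs that
    # apply_ufunc collects internally.
    result = xr.apply_ufunc(np.where, cond, x, y, keep_attrs=False)
    if isinstance(x, xr.DataArray):
        result.attrs = dict(x.attrs)
    return result
```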
Any further refinements to the current approach of reconstructing attributes after `apply_ufunc` here, or is this good to go?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,1424732975
https://github.com/pydata/xarray/pull/7229#issuecomment-1306498356,https://api.github.com/repos/pydata/xarray/issues/7229,1306498356,IC_kwDOAMm_X85N35U0,39069044,2022-11-08T01:45:59Z,2022-11-08T01:45:59Z,CONTRIBUTOR,"The latest commit should do what we want, consistently taking attrs of `x`. Casting to `DataArray` along with the explicit `broadcast` operation gets us empty attrs on both the variable and the coords.
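To illustrate the intended behavior (a hypothetical example, not taken from the test suite):
```python
>>> import xarray as xr
>>> x = xr.DataArray([1, 2], coords={'x': [0, 1]}, attrs={'foo': 'x attrs'})
>>> y = xr.DataArray([3, 4], coords={'x': [0, 1]}, attrs={'foo': 'y attrs'})
>>> cond = xr.DataArray([True, False], coords={'x': [0, 1]})
>>> xr.where(cond, x, y, keep_attrs=True).attrs
{'foo': 'x attrs'}
```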
The only way it deviates from this (spelled out in the tests) is to pull coord attrs from `x`, then `y`, then `cond` if any of these are scalars. I think this makes sense because if I pass `[DataArray, scalar, scalar]`, for example, I wouldn't expect it to drop the coordinate attrs.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,1424732975
https://github.com/pydata/xarray/pull/7229#issuecomment-1304244938,https://api.github.com/repos/pydata/xarray/issues/7229,1304244938,IC_kwDOAMm_X85NvTLK,39069044,2022-11-04T20:55:02Z,2022-11-05T03:37:35Z,CONTRIBUTOR,"I considered the `[scalar, data_array, whatever]` possibility but this seems like a serious edge case. Passing a scalar condition is the same as just taking either `x` or `y` so what's the point?
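For example (hypothetical snippet), a scalar `cond` just reduces to one of the inputs:
```python
>>> import xarray as xr
>>> x = xr.DataArray([1, 2], coords={'x': [0, 1]})
>>> xr.where(True, x, 0).equals(x)
True
```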
As for passing bare arrays: despite what the docstrings say, it seems like you can actually do this with `DataArray.where`, etc.:
```python
>>> import numpy as np
>>> import xarray as xr
>>> x = xr.DataArray([1, 1], coords={""x"": [0, 1]})
>>> cond = xr.DataArray([True, False], coords={""x"": [0, 1]})
>>> x.where(cond, other=np.array([0, 2]))
<xarray.DataArray (x: 2)>
array([1, 2])
Coordinates:
  * x        (x) int64 0 1
```
I don't think that makes sense, but it's mostly a separate issue. You do get a broadcast error if `y` is a bare array with a different size.
After poking around, I agree that this isn't easy to fix completely. I started to go down the route of option `1.`, but it looked quite complicated. Option `2.` of wrapping non-xarray objects with `xr.Variable` only in `xr.where` might help us out a little here; I can try it (rough sketch below). But I think the current solution in this PR gets us back to the pre-#6461 behavior while still allowing for scalars and giving predictable behavior for everything except `cond=scalar`, which I don't think is very important.
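For reference, roughly what I have in mind for option `2.` (a hypothetical helper, not something implemented in this PR):
```python
import xarray as xr

def _wrap_non_xarray(obj):
    # Hypothetical: coerce plain scalars to 0-d Variables inside xr.where so
    # that the attr handling only ever sees xarray objects. Bare numpy arrays
    # would additionally need dimension names, which is part of what makes
    # this tricky.
    if isinstance(obj, (xr.DataArray, xr.Dataset, xr.Variable)):
        return obj
    return xr.Variable((), obj)
```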
I'm just keen to get this merged in some form because the regression from #6461 is pretty bad. For example:
```python
import xarray as xr

ds = xr.tutorial.load_dataset('air_temperature')
xr.where(ds.air > 10, ds.air, 10, keep_attrs=True).to_netcdf('foo.nc')
# completely fails because the time attrs have been overwritten by ds.air attrs
ValueError: failed to prevent overwriting existing key units in attrs on variable 'time'. This is probably an encoding field used by xarray to describe how a variable is serialized. To proceed, remove this key from the variable's attributes manually.
```
I hit exactly this issue in some existing scripts, so it's preventing me from upgrading beyond `xarray=2022.03.0`.
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,1424732975