html_url,issue_url,id,node_id,user,created_at,updated_at,author_association,body,reactions,performed_via_github_app,issue
https://github.com/pydata/xarray/issues/7463#issuecomment-1427503304,https://api.github.com/repos/pydata/xarray/issues/7463,1427503304,IC_kwDOAMm_X85VFfjI,118520620,2023-02-13T08:02:55Z,2023-02-13T08:02:55Z,NONE,"> Yes I think we should, but I might have missed the rationale behind allowing it if this is intentional.
> 
> EDIT: perhaps better to issue a warning first to avoid some breaking change. We could also try to fix it (make a deep copy) at the same time as deprecating it, but that might be tricky without again introducing performance regressions.

I would assume that `deepcopy` is going to completely copy the object, so if I change internal parts (like the coordinates here), the first object should not change. It certainly affects performance, but otherwise `deepcopy` is not `deep` anymore","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,1550792876
https://github.com/pydata/xarray/pull/2930#issuecomment-1398221681,https://api.github.com/repos/pydata/xarray/issues/2930,1398221681,IC_kwDOAMm_X85TVytx,118520620,2023-01-20T10:58:22Z,2023-01-20T10:58:37Z,NONE,"seems like the same issue again:
```python
import numpy as np
import xarray as xr

xarr1 = xr.DataArray(
    np.zeros([2]),
    coords=dict(x=[0.0, 1.0]),  # important to use 'float' here! with 'int' it works fine
    dims=""x"",
)
print(xarr1.x.data[0]) # 0.0

xarr2 = xarr1.copy(deep=True)
xarr2.x.data[0] = 45
print(xarr1.x.data[0]) # gives 45
```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,438537597