issues: 512205079
field | value
---|---
id | 512205079
node_id | MDU6SXNzdWU1MTIyMDUwNzk=
number | 3445
title | Merge fails when sparse Dataset has overlapping dimension values
user | 4605410
state | open
locked | 0
assignee | 
milestone | 
comments | 3
created_at | 2019-10-24T22:08:12Z
updated_at | 2021-07-08T17:43:57Z
closed_at | 
author_association | NONE
active_lock_reason | 
draft | 
pull_request | 
reactions | { "url": "https://api.github.com/repos/pydata/xarray/issues/3445/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
performed_via_github_app | 
state_reason | 
repo | 13221727
type | issue

body:

Sparse numpy arrays used in a merge operation seem to fail under certain coordinate settings. For example, this works perfectly:

```python
import xarray as xr
import numpy as np

# dense input array and time coordinate
data = np.random.uniform(-1, 1, (1, 1, 100))
time = np.linspace(0, 1, num=100)

data_array1 = xr.DataArray(data, name='default',
                           dims=['source', 'receiver', 'time'],
                           coords={'source': ['X.1'], 'receiver': ['X.2'], 'time': time}).to_dataset()
data_array2 = xr.DataArray(data, name='default',
                           dims=['source', 'receiver', 'time'],
                           coords={'source': ['X.2'], 'receiver': ['X.1'], 'time': time}).to_dataset()
dataset1 = xr.merge([data_array1, data_array2])
```

But this raises an error:

```python
import xarray as xr
import numpy as np
import sparse

# same data, but backed by a sparse COO array
data = sparse.COO.from_numpy(np.random.uniform(-1, 1, (1, 1, 100)))
time = np.linspace(0, 1, num=100)

data_array1 = xr.DataArray(data, name='default',
                           dims=['source', 'receiver', 'time'],
                           coords={'source': ['X.1'], 'receiver': ['X.2'], 'time': time}).to_dataset()
data_array2 = xr.DataArray(data, name='default',
                           dims=['source', 'receiver', 'time'],
                           coords={'source': ['X.2'], 'receiver': ['X.1'], 'time': time}).to_dataset()
dataset1 = xr.merge([data_array1, data_array2])
```

I have noticed this occurs when the merge would have to add coordinate combinations filled with NaN values.
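For context, the merge here is an outer join: the combined dataset covers every (source, receiver) pair, and pairs present in neither input must be padded with NaN. The sketch below is not part of the original report; it reuses the shapes and names from the snippets above to show that padding with dense numpy data, where the merge succeeds:

```python
import numpy as np
import xarray as xr

# Same construction as the issue's dense example.
time = np.linspace(0, 1, num=100)
data = np.random.uniform(-1, 1, (1, 1, 100))

ds1 = xr.DataArray(data, name='default',
                   dims=['source', 'receiver', 'time'],
                   coords={'source': ['X.1'], 'receiver': ['X.2'], 'time': time}).to_dataset()
ds2 = xr.DataArray(data, name='default',
                   dims=['source', 'receiver', 'time'],
                   coords={'source': ['X.2'], 'receiver': ['X.1'], 'time': time}).to_dataset()

merged = xr.merge([ds1, ds2])

# The outer join expands both coordinates to ['X.1', 'X.2'].
print(merged['default'].shape)  # (2, 2, 100)

# A (source, receiver) pair absent from both inputs is filled entirely with NaN.
print(np.isnan(merged['default'].sel(source='X.1', receiver='X.1')).all().item())  # True
```

This NaN fill step is the point where the sparse-backed version fails, consistent with the reporter's observation that the error appears whenever the merge would need to add NaN-filled values.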