id,node_id,number,title,user,state,locked,assignee,milestone,comments,created_at,updated_at,closed_at,author_association,active_lock_reason,draft,pull_request,body,reactions,performed_via_github_app,state_reason,repo,type
257079041,MDU6SXNzdWUyNTcwNzkwNDE=,1571,to_netcdf fails for engine=h5netcdf when using dask-backed arrays,8982598,closed,0,,,2,2017-09-12T15:08:27Z,2019-02-12T05:39:19Z,2019-02-12T05:39:19Z,CONTRIBUTOR,,,,"When using dask-backed datasets/arrays it does not seem possible to use the 'h5netcdf' engine to write to disk:
```python
import xarray as xr

ds = xr.Dataset({'a': ('x', [1, 2])}, {'x': [3, 4]}).chunk()
ds.to_netcdf(""test.h5"", engine='h5netcdf')
```
results in the error:
```bash
...
h5py/h5a.pyx in h5py.h5a.open()
KeyError: ""Can't open attribute (can't locate attribute: 'dask')""
```
Not sure if this is an xarray or h5netcdf issue - or some inherent limitation, in which case apologies!","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1571/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue
130753818,MDU6SXNzdWUxMzA3NTM4MTg=,742,merge and align DataArrays/Datasets on different domains,8982598,closed,0,,,11,2016-02-02T17:27:17Z,2017-01-23T22:42:18Z,2017-01-23T22:42:18Z,CONTRIBUTOR,,,,"Firstly, I think `xarray` is great, and for the type of physics simulations I run, n-dimensional labelled arrays are _exactly_ what I need. But, and I may be missing something, is there a way to merge (or concatenate/update) DataArrays with _different_ domains on the same coordinates?
For example, consider this setup:
```python
import xarray as xr

x1 = [100]
y1 = [1, 2, 3, 4, 5]
dat1 = [[101, 102, 103, 104, 105]]

x2 = [200]
y2 = [3, 4, 5, 6]  # different size and domain
dat2 = [[203, 204, 205, 206]]

da1 = xr.DataArray(dat1, dims=['x', 'y'], coords={'x': x1, 'y': y1})
da2 = xr.DataArray(dat2, dims=['x', 'y'], coords={'x': x2, 'y': y2})
```
I would like to aggregate such DataArrays into a new, single DataArray with `nan` padding, such that:
```python
>>> merge(da1, da2, align=True)  # made up syntax
array([[ 101.,  102.,  103.,  104.,  105.,   nan],
       [  nan,   nan,  203.,  204.,  205.,  206.]])
Coordinates:
  * x        (x) int64 100 200
  * y        (y) int64 1 2 3 4 5 6
```
Here is a quick function I wrote to do this, but I would be worried about the performance of 'expanding' the new data to the old data's size every iteration (i.e. supposing that the first argument is a large DataArray that you are adding to, but which doesn't necessarily already contain all the dimensions).
```python
def xrmerge(*das, accept_new=True):
    da = das[0]
    for new_da in das[1:]:
        # Expand both to have same dimensions, padding with NaN
        da, new_da = xr.align(da, new_da, join='outer')
        # Fill NaNs one way or the other re. accept_new
        da = new_da.fillna(da) if accept_new else da.fillna(new_da)
    return da
```
Might this be (or is this already!) possible in simpler form in `xarray`? I know _Datasets_ have `merge` and `update` methods but I couldn't make them work as above. I also notice there are possible plans ( #417 ) to introduce a `merge` function for DataArrays.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/742/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue