id,node_id,number,title,user,state,locked,assignee,milestone,comments,created_at,updated_at,closed_at,author_association,active_lock_reason,draft,pull_request,body,reactions,performed_via_github_app,state_reason,repo,type
1110623911,I_kwDOAMm_X85CMsan,6183,[FEATURE]: dimension attributes are lost when stacking an xarray,32069530,closed,0,,,2,2022-01-21T15:49:47Z,2022-03-17T17:11:44Z,2022-03-17T17:11:44Z,NONE,,,,"### Is your feature request related to a problem?

No

### Describe the solution you'd like

Hi all,

When stacking an array with `my_xarray.stack(multi=('x','y'))`, the attributes of the stacked dimensions (`x` and `y` in the example) are lost. It would be nice to keep them. It would also be nice to have them back when unstacking.

Thanks

### Describe alternatives you've considered

_No response_

### Additional context

_No response_","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/6183/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue
371906566,MDU6SXNzdWUzNzE5MDY1NjY=,2494,Concurrent access with multiple processes using open_mfdataset,32069530,closed,0,,,4,2018-10-19T10:52:46Z,2018-10-26T12:37:30Z,2018-10-26T12:37:30Z,NONE,,,,"Hi everyone,

First: thanks to the developers for this amazing xarray library! Great piece of work!

Here come my troubles: I run several (about 500) independent processes (dask distributed) that need simultaneous read-only access to the same (group of) netCDF files. I only pass the file-path strings to the processes to avoid pickling a netCDF Python object ([issue](https://github.com/Unidata/netcdf4-python/issues/437)).

In each process, I run

```python
with xr.open_mfdataset(myfiles_path, concat_dim='t', engine='h5netcdf') as myfile:
    x = myfile['x'].data
    y = myfile['y'].data
```

but many of the concurrent accesses fail with typical errors such as `Invalid id` or `Exception: CancelledError(""('mul-484a58bf5830233021e08456b45eb60d', 0, 0)"",)`, ...

I was using the netCDF4 module with the parallel option set to True when working with a single netCDF file, and it ran fine:

```python
myfile = Dataset(seedsurf_path, 'r', parallel=True)
x = myfile['x']
y = myfile['y']
myfile.close()
```

The parallel option for open_mfdataset() seems to be dedicated to multithreaded access only. Is there something that can be done for multi-process access?

Thanks

#### Output of ``xr.show_versions()``
INSTALLED VERSIONS
------------------
commit: None
python: 3.6.6.final.0
python-bits: 64
OS: Linux
OS-release: 3.12.53-60.30-default
machine: x86_64
processor: x86_64
byteorder: little
LC_ALL: None
LANG: en_US.UTF-8
LOCALE: en_US.UTF-8

xarray: 0.10.8
pandas: 0.23.4
numpy: 1.12.1
scipy: 0.19.1
netCDF4: 1.2.4
h5netcdf: 0.6.2
h5py: 2.7.0
Nio: None
zarr: 2.2.0
bottleneck: 1.2.1
cyordereddict: None
dask: 0.19.0
distributed: 1.23.0
matplotlib: 2.2.3
cartopy: 0.16.0
seaborn: None
setuptools: 40.2.0
pip: 18.0
conda: None
pytest: None
IPython: 6.5.0
sphinx: None
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/2494/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue
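For the stacking report in issue 6183 above, a minimal sketch of one possible workaround is to record the coordinate attributes before `stack()` and re-attach them after `unstack()`. The array, dimension names, and attribute values below are invented for illustration; this is only a sketch under those assumptions, not necessarily the change that eventually closed the issue.

```python
import numpy as np
import xarray as xr

# Hypothetical array: dimension names and attributes are made up for illustration.
x = xr.DataArray([0, 1], dims="x", attrs={"units": "m"})
y = xr.DataArray([10, 20, 30], dims="y", attrs={"units": "s"})
da = xr.DataArray(np.arange(6).reshape(2, 3), dims=("x", "y"), coords={"x": x, "y": y})

# Remember the coordinate attributes, since stack() may drop them.
saved = {name: dict(da.coords[name].attrs) for name in ("x", "y")}

stacked = da.stack(multi=("x", "y"))
# ... work on the stacked array ...
unstacked = stacked.unstack("multi")

# Re-attach the attributes after unstacking.
for name, attrs in saved.items():
    unstacked.coords[name].attrs.update(attrs)

print(unstacked.coords["x"].attrs)  # {'units': 'm'}
```

Because `attrs` is an ordinary dict on the coordinate variable, updating it in place restores the metadata without touching the data.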
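For the concurrent-read problem in issue 2494, one pattern that often sidesteps shared-handle errors is to open, read, and close the files entirely inside each worker task and return plain NumPy arrays, so no open file handles or lazy dask arrays cross process boundaries. This is a sketch under that assumption, not the resolution recorded in the issue thread; `read_coords`, the number of tasks, and the `myfiles_path` placeholder are hypothetical.

```python
import xarray as xr
from dask.distributed import Client


def read_coords(paths):
    # Open inside the worker process, load the needed variables into memory,
    # and close the dataset before returning anything.
    # (The report also passes concat_dim='t'; recent xarray versions then
    # require combine='nested' as well.)
    with xr.open_mfdataset(paths, engine="h5netcdf") as ds:
        x = ds["x"].values  # .values forces the read while the files are open
        y = ds["y"].values
    return x, y


if __name__ == "__main__":
    client = Client()  # local stand-in for the ~500 distributed workers
    # "myfiles_path" is the placeholder glob pattern from the report.
    futures = [client.submit(read_coords, "myfiles_path", pure=False) for _ in range(8)]
    results = client.gather(futures)
    client.close()
```

Returning NumPy arrays rather than the lazy dataset keeps the pickled payload small and avoids the handle-pickling problem linked in the report.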