id,node_id,number,title,user,state,locked,assignee,milestone,comments,created_at,updated_at,closed_at,author_association,active_lock_reason,draft,pull_request,body,reactions,performed_via_github_app,state_reason,repo,type
377356113,MDU6SXNzdWUzNzczNTYxMTM=,2542,"full_like, ones_like, zeros_like should retain subclasses",500246,closed,0,,,4,2018-11-05T11:22:49Z,2023-11-05T06:27:31Z,2023-11-05T06:27:31Z,CONTRIBUTOR,,,,"#### Code Sample

```python
import numpy
import xarray

class MyDataArray(xarray.DataArray):
    pass

da = MyDataArray(numpy.arange(5))
da2 = xarray.zeros_like(da)
print(type(da), type(da2))
```

#### Problem description

I would expect that `type(da2) is type(da)`, but this is not the case. The type of `da2` is always `xarray.core.dataarray.DataArray`. Rather, the output of this script is:

```
<class '__main__.MyDataArray'> <class 'xarray.core.dataarray.DataArray'>
```

#### Expected Output

I would hope as an output:

```
<class '__main__.MyDataArray'> <class '__main__.MyDataArray'>
```

In principle changing this could break people's code, so if a change is implemented it should probably be through an optional keyword argument to the `full_like`/`ones_like`/`zeros_like` family.

#### Output of ``xr.show_versions()``
INSTALLED VERSIONS ------------------ commit: None python: 3.7.0.final.0 python-bits: 64 OS: Linux OS-release: 2.6.32-754.el6.x86_64 machine: x86_64 processor: x86_64 byteorder: little LC_ALL: None LANG: en_GB.UTF-8 LOCALE: en_GB.UTF-8 xarray: 0.10.7 pandas: 0.23.2 numpy: 1.15.2 scipy: 1.1.0 netCDF4: 1.4.0 h5netcdf: 0.6.1 h5py: 2.8.0 Nio: None zarr: None bottleneck: 1.2.1 cyordereddict: None dask: 0.18.1 distributed: 1.22.0 matplotlib: 3.0.0 cartopy: 0.16.0 seaborn: 0.9.0 setuptools: 39.2.0 pip: 18.0 conda: None pytest: 3.2.2 IPython: 6.4.0 sphinx: None
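
Until such a keyword argument exists, the subclass can be re-established by hand. A minimal sketch, where `zeros_like_keeping_type` is a hypothetical helper and not an xarray API:

```python
import numpy
import xarray

class MyDataArray(xarray.DataArray):
    __slots__ = ()  # recent xarray expects subclasses to declare __slots__

def zeros_like_keeping_type(da):
    # hypothetical helper: re-wrap the plain DataArray that
    # xarray.zeros_like returns in the caller's subclass
    return type(da)(xarray.zeros_like(da))

da = MyDataArray(numpy.arange(5))
da2 = zeros_like_keeping_type(da)
```

This only re-wraps the result; any extra state the subclass carries would still have to be copied explicitly.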
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/2542/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,not_planned,13221727,issue 352999600,MDU6SXNzdWUzNTI5OTk2MDA=,2377,Comparing scalar xarray with ma.masked fails with ValueError: assignment destination is read-only,500246,closed,0,,,5,2018-08-22T15:11:54Z,2023-05-17T16:06:01Z,2023-05-17T16:06:01Z,CONTRIBUTOR,,,,"#### Code Sample, a copy-pastable example if possible ```python xarray.DataArray(0) > numpy.ma.masked ``` #### Problem description This results in `ValueError: assignment destination is read-only`: ``` --------------------------------------------------------------------------- ValueError Traceback (most recent call last) in () ----> 1 xarray.DataArray(0) > numpy.ma.masked /group_workspaces/cems2/fiduceo/Users/gholl/anaconda3/envs/FCDR37a/lib/python3.7/site-packages/xarray/core/dataarray.py in func(self, other) 1808 1809 variable = (f(self.variable, other_variable) -> 1810 if not reflexive 1811 else f(other_variable, self.variable)) 1812 coords = self.coords._merge_raw(other_coords) /group_workspaces/cems2/fiduceo/Users/gholl/anaconda3/envs/FCDR37a/lib/python3.7/site-packages/xarray/core/variable.py in func(self, other) 1580 if not reflexive 1581 else f(other_data, self_data)) -> 1582 result = Variable(dims, new_data) 1583 return result 1584 return func /group_workspaces/cems2/fiduceo/Users/gholl/anaconda3/envs/FCDR37a/lib/python3.7/site-packages/xarray/core/variable.py in __init__(self, dims, data, attrs, encoding, fastpath) 260 unrecognized encoding items. 
261 """""" --> 262 self._data = as_compatible_data(data, fastpath=fastpath) 263 self._dims = self._parse_dimensions(dims) 264 self._attrs = None /group_workspaces/cems2/fiduceo/Users/gholl/anaconda3/envs/FCDR37a/lib/python3.7/site-packages/xarray/core/variable.py in as_compatible_data(data, fastpath) 177 dtype, fill_value = dtypes.maybe_promote(data.dtype) 178 data = np.asarray(data, dtype=dtype) --> 179 data[mask] = fill_value 180 else: 181 data = np.asarray(data) ValueError: assignment destination is read-only ``` #### Expected Output To be consistent, the result should be identical to the result of ``` (xarray.DataArray([0,0]) > numpy.ma.masked)[0] ``` which would be ``` xarray.DataArray(nan) ``` #### Output of ``xr.show_versions()``
xarray.show_versions() INSTALLED VERSIONS ------------------ commit: None python: 3.7.0.final.0 python-bits: 64 OS: Linux OS-release: 2.6.32-754.el6.x86_64 machine: x86_64 processor: x86_64 byteorder: little LC_ALL: None LANG: en_GB.UTF-8 LOCALE: en_GB.UTF-8 xarray: 0.10.7 pandas: 0.23.2 numpy: 1.14.5 scipy: 1.1.0 netCDF4: 1.4.0 h5netcdf: 0.6.1 h5py: 2.8.0 Nio: None zarr: None bottleneck: 1.2.1 cyordereddict: None dask: 0.18.1 distributed: 1.22.0 matplotlib: 2.2.2 cartopy: 0.16.0 seaborn: 0.9.0 setuptools: 39.2.0 pip: 18.0 conda: None pytest: 3.2.2 IPython: 6.4.0 sphinx: None
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/2377/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 1608352581,I_kwDOAMm_X85f3YNF,7581,xr.where loses attributes despite keep_attrs=True,500246,closed,0,,,1,2023-03-03T10:14:34Z,2023-04-06T01:58:45Z,2023-04-06T01:58:45Z,CONTRIBUTOR,,,,"### What happened? I'm using `xarray.where` to mask data: `xr.where(ds == ds.attrs[""_FillValue""], nan, ds)`. This loses the attributes on `ds` even if I pass `keep_attrs=True`. ### What did you expect to happen? I expect that if I use `keep_attrs=True`, either via `xr.set_options` or directly passed to `xr.where`, that attributes on the dataset are retained. ### Minimal Complete Verifiable Example ```Python import xarray as xr a = xr.DataArray([0], attrs={""a"": ""b""}) with xr.set_options(keep_attrs=True): a2 = xr.where(a==0, 0, a, keep_attrs=True) print(a2.attrs) ``` ### MVCE confirmation - [X] Minimal example — the example is as focused as reasonably possible to demonstrate the underlying issue in xarray. - [X] Complete example — the example is self-contained, including all data and the text of any traceback. - [X] Verifiable example — the example copy & pastes into an IPython prompt or [Binder notebook](https://mybinder.org/v2/gh/pydata/xarray/main?urlpath=lab/tree/doc/examples/blank_template.ipynb), returning the result. - [X] New issue — a search of GitHub Issues suggests this is not a duplicate. ### Relevant log output ```Python {} ``` ### Anything else we need to know? I can make a workaround by turning the logic around, such as `xr.where(a!=0, a, 0)`, which does retain attributes. The workaround works in this case, but `a!=0` is not always the same as `a==0`, so it would be preferable if the attributes were retained either way. ### Environment
INSTALLED VERSIONS ------------------ commit: None python: 3.11.0 | packaged by conda-forge | (main, Jan 14 2023, 12:27:40) [GCC 11.3.0] python-bits: 64 OS: Linux OS-release: 5.3.18-150300.59.76-default machine: x86_64 processor: x86_64 byteorder: little LC_ALL: None LANG: en_GB.UTF-8 LOCALE: ('en_GB', 'UTF-8') libhdf5: 1.12.2 libnetcdf: 4.9.1 xarray: 2023.2.0 pandas: 1.5.3 numpy: 1.24.2 scipy: 1.10.1 netCDF4: 1.6.2 pydap: None h5netcdf: 1.1.0 h5py: 3.8.0 Nio: None zarr: 2.13.6 cftime: 1.6.2 nc_time_axis: None PseudoNetCDF: None rasterio: 1.3.6 cfgrib: None iris: None bottleneck: 1.3.6 dask: 2023.2.1 distributed: 2023.2.1 matplotlib: 3.7.0 cartopy: 0.21.1 seaborn: None numbagg: None fsspec: 2023.1.0 cupy: None pint: 0.20.1 sparse: None flox: None numpy_groupies: None setuptools: 67.4.0 pip: 23.0.1 conda: None pytest: 7.2.1 mypy: None IPython: 8.7.0 sphinx: 5.3.0
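
Until `keep_attrs` is honoured here, the attributes can simply be copied back after the call; a minimal sketch of that manual workaround:

```python
import numpy as np
import xarray as xr

a = xr.DataArray([0, 1], attrs={'a': 'b'})
masked = xr.where(a == 0, np.nan, a)
# re-attach the attributes by hand, since xr.where may drop them
masked.attrs.update(a.attrs)
```

Unlike reversing the condition, this works regardless of how `xr.where` treats the branches.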
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7581/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 217216935,MDU6SXNzdWUyMTcyMTY5MzU=,1329,Cannot open NetCDF file if dimension with time coordinate has length 0 (`ValueError` when decoding CF datetime),500246,closed,0,,,7,2017-03-27T11:33:07Z,2022-08-10T17:25:20Z,2022-08-10T17:25:20Z,CONTRIBUTOR,,,,"If a data set has a zero-sized coordinate that is a time index, reading fails. A `ValueError` is triggered when xarray tries to decode the array, as shown below: ``` $ cat mwe.py #!/usr/bin/env python import numpy import xarray ds = xarray.Dataset( {""a"": (""x"", [])}, coords={""x"": numpy.zeros(shape=0, dtype=""M8[ns]"")}) ds.to_netcdf(""/tmp/test.nc"") xarray.open_dataset(""/tmp/test.nc"") $ ./mwe.py Traceback (most recent call last): File ""./mwe.py"", line 12, in xarray.open_dataset(""/tmp/test.nc"") File ""/dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/backends/api.py"", line 302, in open_dataset return maybe_decode_store(store, lock) File ""/dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/backends/api.py"", line 223, in maybe_decode_store drop_variables=drop_variables) File ""/dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/conventions.py"", line 952, in decode_cf ds = Dataset(vars, attrs=attrs) File ""/dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/core/dataset.py"", line 358, in __init__ self._set_init_vars_and_dims(data_vars, coords, compat) File ""/dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/core/dataset.py"", line 373, in _set_init_vars_and_dims data_vars, coords, compat=compat) File ""/dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/core/merge.py"", line 365, in merge_data_and_coords return merge_core(objs, compat, join, explicit_coords=explicit_coords) File 
""/dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/core/merge.py"", line 413, in merge_core expanded = expand_variable_dicts(aligned) File ""/dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/core/merge.py"", line 213, in expand_variable_dicts var = as_variable(var, name=name) File ""/dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/core/variable.py"", line 83, in as_variable obj = obj.to_index_variable() File ""/dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/core/variable.py"", line 322, in to_index_variable encoding=self._encoding, fastpath=True) File ""/dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/core/variable.py"", line 1173, in __init__ self._data = PandasIndexAdapter(self._data) File ""/dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/core/indexing.py"", line 497, in __init__ self.array = utils.safe_cast_to_index(array) File ""/dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/core/utils.py"", line 57, in safe_cast_to_index index = pd.Index(np.asarray(array), **kwargs) File ""/dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/numpy/core/numeric.py"", line 531, in asarray return array(a, dtype, copy=False, order=order) File ""/dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/core/indexing.py"", line 373, in __array__ return np.asarray(array[self.key], dtype=None) File ""/dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/conventions.py"", line 408, in __getitem__ calendar=self.calendar) File ""/dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/conventions.py"", line 151, in decode_cf_datetime pd.to_timedelta(flat_num_dates.min(), delta) + ref_date File ""/dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/numpy/core/_methods.py"", line 29, in _amin return umr_minimum(a, axis, None, out, keepdims) ValueError: zero-size array to reduction operation minimum which has no 
identity
$ ncdump /tmp/test.nc
netcdf test {
dimensions:
    x = UNLIMITED ; // (0 currently)
variables:
    double a(x) ;
        a:_FillValue = NaN ;
    int64 x(x) ;
        x:units = ""days since 1970-01-01 00:00:00"" ;
        x:calendar = ""proleptic_gregorian"" ;

// global attributes:
        :_NCProperties = ""version=1|netcdflibversion=4.4.1|hdf5libversion=1.8.18"" ;
data:
}
```
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1329/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue
376104925,MDU6SXNzdWUzNzYxMDQ5MjU=,2529,numpy.insert on DataArray may silently result in array inconsistent with its coordinates ,500246,closed,0,,,1,2018-10-31T18:33:23Z,2020-11-07T21:55:42Z,2020-11-07T21:55:42Z,CONTRIBUTOR,,,,"```python
import numpy
import xarray

da = xarray.DataArray(
    numpy.arange(10*3).reshape(10, 3),
    dims=(""x"", ""y""),
    coords={""foo"": ((""x"", ""y""), numpy.arange(3*10).reshape(10,3))})
print(da.shape == da[""foo""].shape)
da2 = numpy.insert(da, 3, 0, axis=0)
print(da2.shape == da2[""foo""].shape)
```

#### Problem description

Running the code snippet gives

```
True
False
```

and does not raise any exception. In the resulting `da2`, the shapes of `da2` and `da2['foo']` differ: we have changed the size of `da2` without changing the size of its corresponding `foo` coordinate. This happens silently; no exception is thrown. Inevitably, this is likely to result in problems at a later stage.

#### Expected Output

I would expect to get an exception, telling me that the insertion has failed because there are coordinates associated with the axis along which we are inserting values. It would be nice to have an `xarray.insert` that can handle this, for example, by forcing us to provide corresponding insertion values for the coordinates.

#### Output of ``xr.show_versions()``
INSTALLED VERSIONS ------------------ commit: None python: 3.7.0.final.0 python-bits: 64 OS: Linux OS-release: 2.6.32-754.el6.x86_64 machine: x86_64 processor: x86_64 byteorder: little LC_ALL: None LANG: en_GB.UTF-8 LOCALE: en_GB.UTF-8 xarray: 0.10.7 pandas: 0.23.2 numpy: 1.15.2 scipy: 1.1.0 netCDF4: 1.4.0 h5netcdf: 0.6.1 h5py: 2.8.0 Nio: None zarr: None bottleneck: 1.2.1 cyordereddict: None dask: 0.18.1 distributed: 1.22.0 matplotlib: 3.0.0 cartopy: 0.16.0 seaborn: 0.9.0 setuptools: 39.2.0 pip: 18.0 conda: None pytest: 3.2.2 IPython: 6.4.0 sphinx: None
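
In the meantime, a guard can be placed in front of `numpy.insert`; `checked_insert` below is a hypothetical helper, not a numpy or xarray API:

```python
import numpy
import xarray

def checked_insert(da, index, value, axis=0):
    # hypothetical guard: refuse to insert along a dimension that has
    # associated coordinates, which numpy.insert would leave unchanged
    dim = da.dims[axis]
    if any(dim in coord.dims for coord in da.coords.values()):
        raise ValueError(f'cannot insert along {dim!r}: it has associated coordinates')
    return numpy.insert(da, index, value, axis=axis)
```

With the `da` from the snippet above, the guard raises instead of silently desynchronising `foo`; without coordinates on the axis it falls through to `numpy.insert`.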
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/2529/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 618985094,MDU6SXNzdWU2MTg5ODUwOTQ=,4065,keep_attrs not respected for unary operators,500246,closed,0,,,2,2020-05-15T13:55:14Z,2020-10-14T16:29:51Z,2020-10-14T16:29:51Z,CONTRIBUTOR,,,,"The xarray global option `keep_attrs` (introduced in #2482 ) is not respected for unary operators. #### MCVE Code Sample ```python import xarray as xr x = xr.DataArray([1, 2, 3], attrs={""A"": ""B""}) with xr.set_options(keep_attrs=True): y = ~x print(x.attrs, y.attrs) ``` #### Expected Output I expect ``` {'A': 'B'} {'A': 'B'} ``` #### Problem Description I get: ``` {'A': 'B'} {} ``` I get the same for the other unary operators `+x`, `-x`, and `abs(x)`. #### Versions Tested with latest xarray master (see below for details).
Output of xr.show_versions() INSTALLED VERSIONS ------------------ commit: None python: 3.8.2 | packaged by conda-forge | (default, Mar 23 2020, 18:16:37) [GCC 7.3.0] python-bits: 64 OS: Linux OS-release: 4.12.14-lp150.12.82-default machine: x86_64 processor: x86_64 byteorder: little LC_ALL: None LANG: en_GB.UTF-8 LOCALE: en_GB.UTF-8 libhdf5: 1.10.5 libnetcdf: 4.7.4 xarray: 0.15.2.dev64+g2542a63f pandas: 1.0.3 numpy: 1.18.1 scipy: 1.4.1 netCDF4: 1.5.3 pydap: None h5netcdf: None h5py: 2.10.0 Nio: None zarr: 2.4.0 cftime: 1.1.1.2 nc_time_axis: None PseudoNetCDF: None rasterio: 1.1.3 cfgrib: None iris: None bottleneck: None dask: 2.14.0 distributed: 2.14.0 matplotlib: 3.2.1 cartopy: 0.17.0 seaborn: None numbagg: None pint: None setuptools: 46.1.3.post20200325 pip: 20.0.2 conda: installed pytest: 5.4.1 IPython: 7.13.0 sphinx: None
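
A manual workaround is to copy the attributes back after the unary operation, e.g. with `assign_attrs`:

```python
import xarray as xr

x = xr.DataArray([1, 2, 3], attrs={'A': 'B'})
# re-attach the attributes by hand after the unary operation,
# since the keep_attrs option is not honoured here
y = (~x).assign_attrs(x.attrs)
```

The same pattern applies to `+x`, `-x`, and `abs(x)`.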
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/4065/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 710876876,MDU6SXNzdWU3MTA4NzY4NzY=,4471,"Numeric scalar variable attributes (including fill_value, scale_factor, add_offset) are 1-d instead of 0-d with h5netcdf engine, triggering ValueError: non-broadcastable output on application when loading single elements",500246,closed,0,,,13,2020-09-29T08:15:48Z,2020-10-11T20:06:33Z,2020-10-11T20:06:33Z,CONTRIBUTOR,,,," **What happened**: When I try to open a NetCDF file using the `h5netcdf` engine, accessing a single data point before scale factors have been applied results in `ValueError: non-broadcastable output operand with shape () doesn't match the broadcast shape (1,)`. The MCVE (see below) results in: ``` Traceback (most recent call last): File ""mwe93.py"", line 4, in ds[""Rad""][400, 300].load() File ""/data/gholl/miniconda3/envs/py38b/lib/python3.8/site-packages/xarray/core/dataarray.py"", line 808, in load ds = self._to_temp_dataset().load(**kwargs) File ""/data/gholl/miniconda3/envs/py38b/lib/python3.8/site-packages/xarray/core/dataset.py"", line 662, in load v.load() File ""/data/gholl/miniconda3/envs/py38b/lib/python3.8/site-packages/xarray/core/variable.py"", line 439, in load self._data = np.asarray(self._data) File ""/data/gholl/miniconda3/envs/py38b/lib/python3.8/site-packages/numpy/core/_asarray.py"", line 83, in asarray return array(a, dtype, copy=False, order=order) File ""/data/gholl/miniconda3/envs/py38b/lib/python3.8/site-packages/xarray/core/indexing.py"", line 685, in __array__ self._ensure_cached() File ""/data/gholl/miniconda3/envs/py38b/lib/python3.8/site-packages/xarray/core/indexing.py"", line 682, in _ensure_cached self.array = NumpyIndexingAdapter(np.asarray(self.array)) File 
""/data/gholl/miniconda3/envs/py38b/lib/python3.8/site-packages/numpy/core/_asarray.py"", line 83, in asarray return array(a, dtype, copy=False, order=order) File ""/data/gholl/miniconda3/envs/py38b/lib/python3.8/site-packages/xarray/core/indexing.py"", line 655, in __array__ return np.asarray(self.array, dtype=dtype) File ""/data/gholl/miniconda3/envs/py38b/lib/python3.8/site-packages/numpy/core/_asarray.py"", line 83, in asarray return array(a, dtype, copy=False, order=order) File ""/data/gholl/miniconda3/envs/py38b/lib/python3.8/site-packages/xarray/core/indexing.py"", line 560, in __array__ return np.asarray(array[self.key], dtype=None) File ""/data/gholl/miniconda3/envs/py38b/lib/python3.8/site-packages/numpy/core/_asarray.py"", line 83, in asarray return array(a, dtype, copy=False, order=order) File ""/data/gholl/miniconda3/envs/py38b/lib/python3.8/site-packages/xarray/coding/variables.py"", line 70, in __array__ return self.func(self.array) File ""/data/gholl/miniconda3/envs/py38b/lib/python3.8/site-packages/xarray/coding/variables.py"", line 220, in _scale_offset_decoding data *= scale_factor ValueError: non-broadcastable output operand with shape () doesn't match the broadcast shape (1,) ``` **What you expected to happen**: I expect the data access to work similarly as when opening with other engines. **Minimal Complete Verifiable Example**: ```python import xarray fn = ""/data/gholl/cache/fogtools/abi/2017/03/14/20/06/7/OR_ABI-L1b-RadF-M3C07_G16_s20170732006100_e20170732016478_c20170732016514.nc"" with xarray.open_dataset(fn, engine=""h5netcdf"") as ds: ds[""Rad""][400, 300].load() ``` **Anything else we need to know?**: An earlier version of this issue, and some comments, refer to fsspec or working on open files, but that proved to have nothing to do with the problem. **Environment**: I've confirmed this issue installing xarray from latest master, which means xarray 0.16.2.dev11+gf821fe20 at the time of writing,
Output of xr.show_versions() INSTALLED VERSIONS ------------------ commit: None python: 3.8.5 | packaged by conda-forge | (default, Sep 24 2020, 16:55:52) [GCC 7.5.0] python-bits: 64 OS: Linux OS-release: 4.12.14-lp150.12.82-default machine: x86_64 processor: x86_64 byteorder: little LC_ALL: None LANG: en_GB.UTF-8 LOCALE: en_GB.UTF-8 libhdf5: 1.10.6 libnetcdf: 4.7.4 xarray: 0.16.2.dev11+gf821fe20 pandas: 1.1.2 numpy: 1.19.1 scipy: 1.5.2 netCDF4: 1.5.4 pydap: None h5netcdf: 0.8.1 h5py: 2.10.0 Nio: None zarr: 2.4.0 cftime: 1.2.1 nc_time_axis: None PseudoNetCDF: None rasterio: 1.1.6 cfgrib: None iris: None bottleneck: None dask: 2.27.0 distributed: 2.27.0 matplotlib: 3.3.2 cartopy: None seaborn: None numbagg: None pint: None setuptools: 49.6.0.post20200917 pip: 20.2.3 conda: None pytest: 6.0.2 IPython: None sphinx: None
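
The underlying mismatch — a 0-d data element scaled in place by a shape-(1,) `scale_factor` — can be avoided by squeezing the attribute down to a scalar before applying it. A sketch of that decoding step (`apply_scale_offset` is a hypothetical helper, not xarray's implementation):

```python
import numpy as np

def apply_scale_offset(data, scale_factor, add_offset):
    # squeeze shape-(1,) attribute arrays, as h5py/h5netcdf can return
    # them, down to scalars so they broadcast against 0-d data
    return np.asarray(data) * np.squeeze(scale_factor) + np.squeeze(add_offset)

# a 0-d raw element with shape-(1,) attributes, mirroring the failing case
decoded = apply_scale_offset(np.int16(100), np.array([0.5]), np.array([2.0]))
```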
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/4471/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 204071440,MDU6SXNzdWUyMDQwNzE0NDA=,1240,Cannot use xarrays own times for indexing,500246,closed,0,,,9,2017-01-30T17:12:08Z,2020-08-28T09:48:56Z,2018-03-18T21:04:07Z,CONTRIBUTOR,,,,"I need to get the first Δt from the start of my dataset, i.e. `ds.sel(start_time, start_time + timedelta)`. However, due to pandas using `M8[ns]` but datetime.datetime not supporting this, the index gets converted to an `int` and indexing fails. Inspection tells me that by the time the index reaches `pandas` it is already an int. This is ultimately due to the [numpy problem](https://github.com/numpy/numpy/issues/8546) that `timedelta64(0, 'ns').item()` is an `int`, but it would be very nice if `xarray` had a workaround so that we can use indexing such as shown below. ``` In [282]: time = pd.date_range('2000-01-01', freq='H', periods=365 * 24) In [283]: ds = xarray.Dataset({'foo': ('time', np.arange(365 * 24)), 'time': time}) In [284]: ds.sel(time=slice(ds[""time""][0], ds[""time""][10])) --------------------------------------------------------------------------- TypeError Traceback (most recent call last) in () ----> 1 ds.sel(time=slice(ds[""time""][0], ds[""time""][10])) /dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/core/dataset.py in sel(self, method, tolerance, drop, **indexers) 1180 """""" 1181 pos_indexers, new_indexes = indexing.remap_label_indexers( -> 1182 self, indexers, method=method, tolerance=tolerance 1183 ) 1184 result = self.isel(drop=drop, **pos_indexers) /dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/core/indexing.py in remap_label_indexers(data_obj, indexers, method, tolerance) 286 else: 287 idxr, new_idx = convert_label_indexer(index, label, --> 288 dim, method, tolerance) 289 
pos_indexers[dim] = idxr 290 if new_idx is not None: /dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/core/indexing.py in convert_label_indexer(index, label, index_name, method, tolerance) 183 indexer = index.slice_indexer(_try_get_item(label.start), 184 _try_get_item(label.stop), --> 185 _try_get_item(label.step)) 186 if not isinstance(indexer, slice): 187 # unlike pandas, in xarray we never want to silently convert a slice /dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/pandas/tseries/index.py in slice_indexer(self, start, end, step, kind) 1496 1497 try: -> 1498 return Index.slice_indexer(self, start, end, step, kind=kind) 1499 except KeyError: 1500 # For historical reasons DatetimeIndex by default supports /dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/pandas/indexes/base.py in slice_indexer(self, start, end, step, kind) 2995 """""" 2996 start_slice, end_slice = self.slice_locs(start, end, step=step, -> 2997 kind=kind) 2998 2999 # return a slice /dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/pandas/indexes/base.py in slice_locs(self, start, end, step, kind) 3174 start_slice = None 3175 if start is not None: -> 3176 start_slice = self.get_slice_bound(start, 'left', kind) 3177 if start_slice is None: 3178 start_slice = 0 /dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/pandas/indexes/base.py in get_slice_bound(self, label, side, kind) 3113 # For datetime indices label may be a string that has to be converted 3114 # to datetime boundary according to its resolution. 
-> 3115 label = self._maybe_cast_slice_bound(label, side, kind) 3116 3117 # we need to look up the label /dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/pandas/tseries/index.py in _maybe_cast_slice_bound(self, label, side, kind) 1444 1445 if is_float(label) or isinstance(label, time) or is_integer(label): -> 1446 self._invalid_indexer('slice', label) 1447 1448 if isinstance(label, compat.string_types): /dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/pandas/indexes/base.py in _invalid_indexer(self, form, key) 1282 ""indexers [{key}] of {kind}"".format( 1283 form=form, klass=type(self), key=key, -> 1284 kind=type(key))) 1285 1286 def get_duplicates(self): TypeError: cannot do slice indexing on with these indexers [946684800000000000] of ``` ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1240/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 212177054,MDU6SXNzdWUyMTIxNzcwNTQ=,1297,Encoding lost upon concatenation,500246,closed,0,,,6,2017-03-06T16:33:40Z,2020-04-05T19:12:41Z,2019-02-06T22:51:19Z,CONTRIBUTOR,,,,"When using `xarray.concat`, attributes are retained, but encoding information is not. I believe encoding information should be retained, at least optionally. 
```
In [64]: da = xarray.DataArray([1, 2, 3, 2, 1])

In [65]: da.attrs.update(foo=""bar"")

In [66]: da.encoding.update(complevel=5)

In [67]: da2 = xarray.concat((da, da), dim=""new"")

In [68]: print(da2.attrs)
OrderedDict([('foo', 'bar')])

In [69]: print(da2.encoding)
{}
```","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1297/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue
528154893,MDU6SXNzdWU1MjgxNTQ4OTM=,3572,Context manager `AttributeError` when engine='h5netcdf',500246,closed,0,,,2,2019-11-25T15:19:29Z,2019-11-25T16:12:37Z,2019-11-25T16:12:37Z,CONTRIBUTOR,,,,"Opening this NetCDF file works fine with the default engine, but fails with `AttributeError` with the h5netcdf engine:

#### MCVE Code Sample

Data available from EUMETSAT: https://www.eumetsat.int/website/home/Satellites/FutureSatellites/MeteosatThirdGeneration/MTGData/MTGUserTestData/index.html --> ftp://ftp.eumetsat.int/pub/OPS/out/test-data/Test-data-for-External-Users/MTG_FCI_Test-Data/ --> uncompressed

```python
import xarray

f = ""/path/to/.../W_XX-EUMETSAT-Darmstadt,IMG+SAT,MTI1+FCI-1C-RRAD-FDHSI-FD--CHK-BODY--L2P-NC4E_C_EUMT_20170410114434_GTT_DEV_20170410113925_20170410113934_N__C_0070_0067.nc""
ds = xarray.open_dataset(f, engine=""h5netcdf"")
```

#### Expected Output

No output at all.
#### Problem Description Results in `AttributeError`: ``` Traceback (most recent call last): File ""mwe4.py"", line 3, in with xarray.open_dataset(f, engine=""h5netcdf"") as ds: File ""/media/nas/x21324/miniconda3/envs/py37e/lib/python3.7/site-packages/xarray/backends/api.py"", line 535, in open_dataset ds = maybe_decode_store(store) File ""/media/nas/x21324/miniconda3/envs/py37e/lib/python3.7/site-packages/xarray/backends/api.py"", line 450, in maybe_decode_store use_cftime=use_cftime, File ""/media/nas/x21324/miniconda3/envs/py37e/lib/python3.7/site-packages/xarray/conventions.py"", line 570, in decode_cf vars, attrs = obj.load() File ""/media/nas/x21324/miniconda3/envs/py37e/lib/python3.7/site-packages/xarray/backends/common.py"", line 123, in load (_decode_variable_name(k), v) for k, v in self.get_variables().items() File ""/media/nas/x21324/miniconda3/envs/py37e/lib/python3.7/site-packages/xarray/backends/h5netcdf_.py"", line 156, in get_variables (k, self.open_store_variable(k, v)) for k, v in self.ds.variables.items() File ""/media/nas/x21324/miniconda3/envs/py37e/lib/python3.7/site-packages/xarray/core/utils.py"", line 402, in FrozenDict return Frozen(dict(*args, **kwargs)) File ""/media/nas/x21324/miniconda3/envs/py37e/lib/python3.7/site-packages/xarray/backends/h5netcdf_.py"", line 156, in (k, self.open_store_variable(k, v)) for k, v in self.ds.variables.items() File ""/media/nas/x21324/miniconda3/envs/py37e/lib/python3.7/site-packages/xarray/backends/h5netcdf_.py"", line 120, in open_store_variable dimensions = var.dimensions File ""/media/nas/x21324/miniconda3/envs/py37e/lib/python3.7/site-packages/h5netcdf/core.py"", line 114, in dimensions self._dimensions = self._lookup_dimensions() File ""/media/nas/x21324/miniconda3/envs/py37e/lib/python3.7/site-packages/h5netcdf/core.py"", line 98, in _lookup_dimensions for axis, dim in enumerate(self._h5ds.dims): AttributeError: 'Datatype' object has no attribute 'dims' ``` #### Output of ``xr.show_versions()``
INSTALLED VERSIONS ------------------ commit: None python: 3.7.3 | packaged by conda-forge | (default, Jul 1 2019, 21:52:21) [GCC 7.3.0] python-bits: 64 OS: Linux OS-release: 4.12.14-lp150.12.79-default machine: x86_64 processor: x86_64 byteorder: little LC_ALL: None LANG: en_GB.UTF-8 LOCALE: en_GB.UTF-8 libhdf5: 1.10.5 libnetcdf: 4.7.1 xarray: 0.14.1 pandas: 0.25.3 numpy: 1.17.3 scipy: 1.3.2 netCDF4: 1.5.3 pydap: None h5netcdf: 0.7.4 h5py: 2.10.0 Nio: None zarr: 2.3.2 cftime: 1.0.4.2 nc_time_axis: None PseudoNetCDF: None rasterio: 1.1.0 cfgrib: None iris: None bottleneck: None dask: 2.8.0 distributed: 2.8.0 matplotlib: 3.1.2 cartopy: 0.17.0 seaborn: None numbagg: None setuptools: 41.6.0.post20191101 pip: 19.3.1 conda: None pytest: 5.3.0 IPython: 7.9.0 sphinx: 2.2.1
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/3572/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 442617907,MDU6SXNzdWU0NDI2MTc5MDc=,2954,Segmentation fault reading many groups from many files,500246,closed,0,,,14,2019-05-10T09:12:34Z,2019-07-12T16:48:26Z,2019-07-12T16:48:26Z,CONTRIBUTOR,,,,"This is probably the wrong place to report it, but I haven't been able to reproduce this without using xarray. Repeatedly opening NetCDF4/HDF5 files and reading a group from them, triggers a Segmentation Fault after about 130–150 openings. See details below. #### Code Sample, a copy-pastable example if possible ```python from itertools import count, product import netCDF4 import glob import xarray files = sorted(glob.glob(""/media/nas/x21308/2019_05_Testdata/MTG/FCI/FDHSI/uncompressed/20170410_RC70/*BODY*.nc"")) # get all groups def get_groups(ds, pre=""""): for g in ds.groups.keys(): nm = pre + ""/"" + g yield from get_groups(ds[g], nm) yield nm with netCDF4.Dataset(files[0]) as ds: groups = sorted(list(get_groups(ds))) print(""total groups"", len(groups), ""total files"", len(files)) ds_all = [] ng = 20 nf = 20 print(""using groups"", ng, ""using files"", nf) for (i, (g, f)) in zip(count(), product(groups[:ng], files[:nf])): print(""attempting"", i, ""group"", g, ""from"", f) ds = xarray.open_dataset( f, group=g, decode_cf=False) ds_all.append(ds) ``` #### Problem description I have 70 NetCDF-4 files with 70 groups each. When I cycle through the files and read one group from them at the time, after about 130–150 times, the next opening fails with a Segmentation Fault. If I try to read one group from one file at the time, that would require a total of 70*70=4900 openings. If I limit to 20 groups from 20 files in total, it would require 400 openings. In either case, it fails after about 130–150 times. 
I'm using the Python xarray interface, but the error occurs in the HDF5 library. The message belows includes the traceback in Python: ``` HDF5-DIAG: Error detected in HDF5 (1.10.4) thread 140107218855616: [9/1985] #000: H5D.c line 485 in H5Dget_create_plist(): Can't get creation plist major: Dataset minor: Can't get value #001: H5Dint.c line 3159 in H5D__get_create_plist(): can't get dataset's creation property list major: Dataset minor: Can't get value #002: H5Dint.c line 3296 in H5D_get_create_plist(): datatype conversion failed major: Dataset minor: Can't convert datatypes #003: H5T.c line 5025 in H5T_convert(): datatype conversion failed major: Datatype minor: Can't convert datatypes #004: H5Tconv.c line 3227 in H5T__conv_vlen(): can't read VL data major: Datatype minor: Read failed #005: H5Tvlen.c line 853 in H5T_vlen_disk_read(): Unable to read VL information major: Datatype minor: Read failed #006: H5HG.c line 611 in H5HG_read(): unable to protect global heap major: Heap minor: Unable to protect metadata #007: H5HG.c line 264 in H5HG__protect(): unable to protect global heap major: Heap minor: Unable to protect metadata #008: H5AC.c line 1591 in H5AC_protect(): unable to get logging status major: Object cache minor: Internal error detected #009: H5Clog.c line 313 in H5C_get_logging_status(): cache magic value incorrect major: Invalid arguments to routine minor: Bad value HDF5-DIAG: Error detected in HDF5 (1.10.4) thread 140107218855616: #000: H5L.c line 1138 in H5Literate(): link iteration failed major: Links minor: Iteration failed #001: H5L.c line 3440 in H5L__iterate(): link iteration failed major: Links minor: Iteration failed #002: H5Gint.c line 893 in H5G_iterate(): error iterating over links major: Symbol table minor: Iteration failed #003: H5Gobj.c line 683 in H5G__obj_iterate(): can't iterate over dense links major: Symbol table minor: Iteration failed #004: H5Gdense.c line 1054 in H5G__dense_iterate(): iteration operator failed major: Symbol table 
minor: Can't move to next iterator location #005: H5Glink.c line 493 in H5G__link_iterate_table(): iteration operator failed major: Symbol table minor: Can't move to next iterator location Traceback (most recent call last): File ""/media/nas/x21324/miniconda3/envs/py37d/lib/python3.7/site-packages/xarray/backends/file_manager.py"", line 167, in acquire file = self._cache[self._key] File ""/media/nas/x21324/miniconda3/envs/py37d/lib/python3.7/site-packages/xarray/backends/lru_cache.py"", line 41, in __getitem__ value = self._cache[key] KeyError: [, ('/media/nas/x21308/2019_05_Testdata/MTG/FCI/FDHSI/uncompressed/20170410_RC70/W_XX-EUMETSAT-Darmstadt,IMG+SAT,MTI1+FCI-1C-RRAD-FDHSI-FD--CHK-BODY--L2P-NC4E_C_EUMT_20170410114417_GTT_DEV_20170410113908_20170410113917_N__C_0070_0065.nc', CombinedLock([, ])), 'r', (('clobber', True), ('diskless', False), ('format', 'NETCDF4'), ('group', '/data/vis_04/measured'), ('persist ', False))] During handling of the above exception, another exception occurred: Traceback (most recent call last): File ""/tmp/mwe9.py"", line 24, in f, group=g, decode_cf=False) File ""/media/nas/x21324/miniconda3/envs/py37d/lib/python3.7/site-packages/xarray/backends/api.py"", line 363, in open_dataset filename_or_obj, group=group, lock=lock, **backend_kwargs) File ""/media/nas/x21324/miniconda3/envs/py37d/lib/python3.7/site-packages/xarray/backends/netCDF4_.py"", line 352, in open return cls(manager, lock=lock, autoclose=autoclose) File ""/media/nas/x21324/miniconda3/envs/py37d/lib/python3.7/site-packages/xarray/backends/netCDF4_.py"", line 311, in __init__ self.format = self.ds.data_model File ""/media/nas/x21324/miniconda3/envs/py37d/lib/python3.7/site-packages/xarray/backends/netCDF4_.py"", line 356, in ds return self._manager.acquire().value File ""/media/nas/x21324/miniconda3/envs/py37d/lib/python3.7/site-packages/xarray/backends/file_manager.py"", line 173, in acquire file = self._opener(*self._args, **kwargs) File 
""/media/nas/x21324/miniconda3/envs/py37d/lib/python3.7/site-packages/xarray/backends/netCDF4_.py"", line 244, in _open_netcdf4_group ds = nc4.Dataset(filename, mode=mode, **kwargs) File ""netCDF4/_netCDF4.pyx"", line 2291, in netCDF4._netCDF4.Dataset.__init__ File ""netCDF4/_netCDF4.pyx"", line 1855, in netCDF4._netCDF4._ensure_nc_success OSError: [Errno -101] NetCDF: HDF error: b'/media/nas/x21308/2019_05_Testdata/MTG/FCI/FDHSI/uncompressed/20170410_RC70/W_XX-EUMETSAT-Darmstadt,IMG+SAT,MTI1+FCI-1C-RRAD-FDHSI-FD--CHK-BODY--L2P-NC4E_C_EUMT_20170410114417_GTT_DEV_20170410113908_20170410113917_N__C_0070_0065.nc' ``` More usually, however, it fails with a segmentation fault and no further information. The failure may happen in any file. The full output of my script might end with: ``` attempting 137 group /data/ir_123/measured from /media/nas/x21308/2019_05_Testdata/MTG/FCI/FDHSI/uncompressed/20170410_RC70/W_XX-EUMETSAT-Darmstadt,IMG+SAT,MTI1+FCI-1C-RRAD-FDHSI-FD--CHK-BODY--L2P-NC4E_C_EUMT_20170410113734_GTT_DEV_20170410113225_20170410113234_N__C_0070_0018.nc attempting 138 group /data/ir_123/measured from /media/nas/x21308/2019_05_Testdata/MTG/FCI/FDHSI/uncompressed/20170410_RC70/W_XX-EUMETSAT-Darmstadt,IMG+SAT,MTI1+FCI-1C-RRAD-FDHSI-FD--CHK-BODY--L2P-NC4E_C_EUMT_20170410113742_GTT_DEV_20170410113234_20170410113242_N__C_0070_0019.nc attempting 139 group /data/ir_123/measured from /media/nas/x21308/2019_05_Testdata/MTG/FCI/FDHSI/uncompressed/20170410_RC70/W_XX-EUMETSAT-Darmstadt,IMG+SAT,MTI1+FCI-1C-RRAD-FDHSI-FD--CHK-BODY--L2P-NC4E_C_EUMT_20170410113751_GTT_DEV_20170410113242_20170410113251_N__C_0070_0020.nc attempting 140 group /data/ir_123/quality_channel from /media/nas/x21308/2019_05_Testdata/MTG/FCI/FDHSI/uncompressed/20170410_RC70/W_XX-EUMETSAT-Darmstadt,IMG+SAT,MTI1+FCI-1C-RRAD-FDHSI-FD--CHK-BODY--L2P-NC4E_C_EUMT_20170410113508_GTT_DEV_20170410113000_20170410113008_N__C_0070_0001.nc Fatal Python error: Segmentation fault ``` prior to the segmentation fault. 
When running with `-X faulthandler` and a segmentation fault happens: ``` Fatal Python error: Segmentation fault Current thread 0x00007ff6ab89d6c0 (most recent call first): File ""/media/nas/x21324/miniconda3/envs/py37d/lib/python3.7/site-packages/xarray/backends/netCDF4_.py"", line 244 in _open_netcdf4_group File ""/media/nas/x21324/miniconda3/envs/py37d/lib/python3.7/site-packages/xarray/backends/file_manager.py"", line 173 in acquire File ""/media/nas/x21324/miniconda3/envs/py37d/lib/python3.7/site-packages/xarray/backends/netCDF4_.py"", line 356 in ds File ""/media/nas/x21324/miniconda3/envs/py37d/lib/python3.7/site-packages/xarray/backends/netCDF4_.py"", line 311 in __init__ File ""/media/nas/x21324/miniconda3/envs/py37d/lib/python3.7/site-packages/xarray/backends/netCDF4_.py"", line 352 in open File ""/media/nas/x21324/miniconda3/envs/py37d/lib/python3.7/site-packages/xarray/backends/api.py"", line 363 in open_dataset File ""/tmp/mwe9.py"", line 24 in Segmentation fault (core dumped) ``` #### Expected Output I expect no segmentation fault. #### Output of ``xr.show_versions()``
``` INSTALLED VERSIONS ------------------ commit: None python: 3.7.1 | packaged by conda-forge | (default, Feb 18 2019, 01:42:00) [GCC 7.3.0] python-bits: 64 OS: Linux OS-release: 4.12.14-lp150.12.58-default machine: x86_64 processor: x86_64 byteorder: little LC_ALL: None LANG: en_GB.UTF-8 LOCALE: en_GB.UTF-8 libhdf5: 1.10.4 libnetcdf: 4.6.2 xarray: 0.12.0 pandas: 0.24.2 numpy: 1.16.2 scipy: 1.2.1 netCDF4: 1.5.0.1 pydap: None h5netcdf: 0.7.1 h5py: 2.9.0 Nio: None zarr: None cftime: 1.0.3.4 nc_time_axis: None PseudonetCDF: None rasterio: 1.0.22 cfgrib: None iris: None bottleneck: None dask: 1.1.5 distributed: 1.26.1 matplotlib: 3.0.3 cartopy: 0.17.0 seaborn: None setuptools: 40.8.0 pip: 19.0.3 conda: None pytest: None IPython: 7.4.0 sphinx: 2.0.0 ``` The machine is running openSUSE 15.0 with `Linux oflws222 4.12.14-lp150.12.58-default #1 SMP Mon Apr 1 15:20:46 UTC 2019 (58fcc15) x86_64 x86_64 x86_64 GNU/Linux`. The problem has also been reported on other machines, such as one running CentOS Linux release 7.6.1810 (Core) with `Linux oflks333.dwd.de 3.10.0-957.5.1.el7.x86_64 #1 SMP Fri Feb 1 14:54:57 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux` The HDF5 installation on my machine is from the SuSe package. 
From `strings /usr/lib64/libhdf5.so`, I get: ``` SUMMARY OF THE HDF5 CONFIGURATION ================================= General Information: ------------------- HDF5 Version: 1.10.1 Host system: x86_64-suse-linux-gnu Byte sex: little-endian Installation point: /usr Compiling Options: ------------------ Build Mode: production Debugging Symbols: no Asserts: no Profiling: no Optimization Level: high Linking Options: ---------------- Libraries: static, shared Statically Linked Executables: LDFLAGS: H5_LDFLAGS: AM_LDFLAGS: Extra libraries: -lpthread -lz -ldl -lm Archiver: ar Ranlib: ranlib Languages: ---------- C: yes C Compiler: /usr/bin/gcc CPPFLAGS: H5_CPPFLAGS: -D_GNU_SOURCE -D_POSIX_C_SOURCE=200112L -DNDEBUG -UH5_DEBUG_API AM_CPPFLAGS: C Flags: -fmessage-length=0 -grecord-gcc-switches -O2 -Wall -D_FORTIFY_SOURCE=2 -fstack-protector-strong -funwind-tables -fasynchronous-unwind-tables -fstack-clash-protection -g H5 C Flags: -std=c99 -pedantic -Wall -W -Wundef -Wshadow -Wpointer-arith -Wbad-function-cast -Wcast-qual -Wcast-align -Wwrite-strings -Wconversion -Wstrict-prototypes -Wmissing-prototypes -Wmissing-declarations -Wredundant-decls -Wnested-externs -finline-functions -s -Wno-inline -Wno-aggregate-return -O AM C Flags: Shared C Library: yes Static C Library: yes Fortran: yes Fortran Compiler: /usr/bin/gfortran Fortran Flags: H5 Fortran Flags: -pedantic -Wall -Wextra -Wunderflow -Wimplicit-interface -Wsurprising -Wno-c-binding-type -s -O2 AM Fortran Flags: Shared Fortran Library: yes Static Fortran Library: yes C++: yes C++ Compiler: /usr/bin/g++ C++ Flags: -fmessage-length=0 -grecord-gcc-switches -O2 -Wall -D_FORTIFY_SOURCE=2 -fstack-protector-strong -funwind-tables -fasynchronous-unwind-tables -fstack-clash-protection -g H5 C++ Flags: -pedantic -Wall -W -Wundef -Wshadow -Wpointer-arith -Wcast-qual -Wcast-align -Wwrite-strings -Wconversion -Wredundant-decls -Winline -Wsign-promo -Woverloaded-virtual -Wold-style-cast -Weffc++ -Wreorder -Wnon-virtual-dtor 
-Wctor-dtor-privacy -Wabi -finline-functions -s -O AM C++ Flags: Shared C++ Library: yes Static C++ Library: yes Java: no Features: --------- Parallel HDF5: no High-level library: yes Threadsafety: yes Default API mapping: v110 With deprecated public symbols: yes I/O filters (external): deflate(zlib) MPE: no Direct VFD: no dmalloc: no Packages w/ extra debug output: none API tracing: no Using memory checker: no Memory allocation sanity checks: no Metadata trace file: no Function stack tracing: no Strict file format checks: no Optimization instrumentation: no ```
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/2954/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 246093122,MDU6SXNzdWUyNDYwOTMxMjI=,1494,AssertionError when storing datetime coordinates of wrong units,500246,closed,0,,,2,2017-07-27T16:11:48Z,2019-06-30T04:28:18Z,2019-06-30T04:28:17Z,CONTRIBUTOR,,,,"The following code should probably fail somewhere else than with an `AssertionError` triggered by `to_netcdf`: ``` $ cat mwe.py #!/usr/bin/env python3.6 import numpy import xarray x = xarray.DataArray( [1, 2, 3], dims=[""X""], coords={""X"": numpy.zeros(shape=3, dtype=""M8[ms]"")}) x.to_netcdf(""/tmp/test.nc"") $ python3.6 mwe.py Traceback (most recent call last): File ""mwe.py"", line 11, in x.to_netcdf(""/tmp/test.nc"") File ""/dev/shm/gerrit/venv/stable-3.6/lib/python3.6/site-packages/xarray/core/dataarray.py"", line 1351, in to_netcdf dataset.to_netcdf(*args, **kwargs) File ""/dev/shm/gerrit/venv/stable-3.6/lib/python3.6/site-packages/xarray/core/dataset.py"", line 977, in to_netcdf unlimited_dims=unlimited_dims) File ""/dev/shm/gerrit/venv/stable-3.6/lib/python3.6/site-packages/xarray/backends/api.py"", line 573, in to_netcdf unlimited_dims=unlimited_dims) File ""/dev/shm/gerrit/venv/stable-3.6/lib/python3.6/site-packages/xarray/core/dataset.py"", line 916, in dump_to_store unlimited_dims=unlimited_dims) File ""/dev/shm/gerrit/venv/stable-3.6/lib/python3.6/site-packages/xarray/backends/common.py"", line 244, in store cf_variables, cf_attrs = cf_encoder(variables, attributes) File ""/dev/shm/gerrit/venv/stable-3.6/lib/python3.6/site-packages/xarray/conventions.py"", line 1089, in cf_encoder for k, v in iteritems(variables)) File ""/dev/shm/gerrit/venv/stable-3.6/lib/python3.6/site-packages/xarray/conventions.py"", line 1089, in for k, v in iteritems(variables)) File 
""/dev/shm/gerrit/venv/stable-3.6/lib/python3.6/site-packages/xarray/conventions.py"", line 734, in encode_cf_variable var = maybe_encode_datetime(var) File ""/dev/shm/gerrit/venv/stable-3.6/lib/python3.6/site-packages/xarray/conventions.py"", line 585, in maybe_encode_datetime data, encoding.pop('units', None), encoding.pop('calendar', None)) File ""/dev/shm/gerrit/venv/stable-3.6/lib/python3.6/site-packages/xarray/conventions.py"", line 293, in encode_cf_datetime assert dates.dtype == 'datetime64[ns]' AssertionError ```","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1494/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 290572700,MDU6SXNzdWUyOTA1NzI3MDA=,1849,passing unlimited_dims to to_netcdf triggers RuntimeError: NetCDF: Invalid argument,500246,closed,0,,,12,2018-01-22T18:43:23Z,2019-06-04T20:41:50Z,2019-06-04T20:41:50Z,CONTRIBUTOR,,,,"For some datafiles with properties I cannot quite reproduce, `.to_netcdf` leads to a `RuntimeError: NetCDF: Invalid argument` if and only if I pass an `unlimited_dims` corresponding to `y`. The problem is hard to reproduce. It happens to this particular dataset, but not to seemingly identical ones created from scratch. I attach `sample.nc` (gzipped so github would let me upload it). 
``` $ cat mwe.py #!/usr/bin/env python3.6 import xarray ds = xarray.open_dataset(""sample.nc"") ds.to_netcdf(""sample2.nc"", unlimited_dims=[""y""]) $ ncdump sample.nc netcdf sample { dimensions: y = 6 ; variables: float x(y) ; x:_FillValue = NaNf ; int64 y(y) ; data: x = 0, 0, 0, 0, 0, 0 ; y = 0, 1, 2, 3, 4, 5 ; } $ ./mwe.py Traceback (most recent call last): File ""./mwe.py"", line 5, in ds.to_netcdf(""sample2.nc"", unlimited_dims=[""y""]) File ""/dev/shm/gerrit/venv/stable-3.6/lib/python3.6/site-packages/xarray/core/dataset.py"", line 1133, in to_netcdf unlimited_dims=unlimited_dims) File ""/dev/shm/gerrit/venv/stable-3.6/lib/python3.6/site-packages/xarray/backends/api.py"", line 627, in to_netcdf unlimited_dims=unlimited_dims) File ""/dev/shm/gerrit/venv/stable-3.6/lib/python3.6/site-packages/xarray/core/dataset.py"", line 1070, in dump_to_store unlimited_dims=unlimited_dims) File ""/dev/shm/gerrit/venv/stable-3.6/lib/python3.6/site-packages/xarray/backends/common.py"", line 254, in store *args, **kwargs) File ""/dev/shm/gerrit/venv/stable-3.6/lib/python3.6/site-packages/xarray/backends/common.py"", line 221, in store unlimited_dims=unlimited_dims) File ""/dev/shm/gerrit/venv/stable-3.6/lib/python3.6/site-packages/xarray/backends/netCDF4_.py"", line 339, in set_variables super(NetCDF4DataStore, self).set_variables(*args, **kwargs) File ""/dev/shm/gerrit/venv/stable-3.6/lib/python3.6/site-packages/xarray/backends/common.py"", line 233, in set_variables name, v, check, unlimited_dims=unlimited_dims) File ""/dev/shm/gerrit/venv/stable-3.6/lib/python3.6/site-packages/xarray/backends/netCDF4_.py"", line 385, in prepare_variable fill_value=fill_value) File ""netCDF4/_netCDF4.pyx"", line 2437, in netCDF4._netCDF4.Dataset.createVariable File ""netCDF4/_netCDF4.pyx"", line 3439, in netCDF4._netCDF4.Variable.__init__ File ""netCDF4/_netCDF4.pyx"", line 1638, in netCDF4._netCDF4._ensure_nc_success RuntimeError: NetCDF: Invalid argument ``` #### Output of 
``xr.show_versions()``
# Paste the output here xr.show_versions() here $ ./mwe.py INSTALLED VERSIONS ------------------ commit: None python: 3.6.1.final.0 python-bits: 64 OS: Linux OS-release: 2.6.32-696.6.3.el6.x86_64 machine: x86_64 processor: x86_64 byteorder: little LC_ALL: None LANG: en_GB.UTF-8 LOCALE: en_GB.UTF-8 xarray: 0.10.0+dev39.ge31cf43 pandas: 0.22.0 numpy: 1.14.0 scipy: 1.0.0 netCDF4: 1.3.1 h5netcdf: None Nio: None zarr: None bottleneck: 1.2.1 cyordereddict: None dask: 0.16.1 distributed: None matplotlib: 2.1.2 cartopy: 0.15.1 seaborn: 0.8.1 setuptools: 38.4.0 pip: 9.0.1 conda: 4.3.16 pytest: 3.1.2 IPython: 6.1.0 sphinx: 1.6.2 [sample.nc.gz](https://github.com/pydata/xarray/files/1653178/sample.nc.gz)
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1849/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 371990138,MDU6SXNzdWUzNzE5OTAxMzg=,2495,Confusing error message when using a set to pass coordinates,500246,closed,0,,,4,2018-10-19T14:48:04Z,2018-10-22T21:43:01Z,2018-10-22T20:32:48Z,CONTRIBUTOR,,,,"#### Code Sample, a copy-pastable example if possible ```python xarray.DataArray(numpy.arange(3), dims=(""x"",), coords={""x"": {""a"", ""b"", ""c""}}) ``` #### Problem description This results in a `MissingDimensionsError`, which really isn't the correct exception to raise here. ``` In [57]: xarray.DataArray(numpy.arange(3), dims=(""x"",), coords={""x"": {""a"", ""b"", ""c""}}) --------------------------------------------------------------------------- MissingDimensionsError Traceback (most recent call last) in () ----> 1 xarray.DataArray(numpy.arange(3), dims=(""x"",), coords={""x"": {""a"", ""b"", ""c""}}) /group_workspaces/cems2/fiduceo/Users/gholl/anaconda3/envs/FCDR37a/lib/python3.7/site-packages/xarray/core/dataarray.py in __init__(self, data, coords, dims, name, attrs, encoding, fastpath) 225 226 data = as_compatible_data(data) --> 227 coords, dims = _infer_coords_and_dims(data.shape, coords, dims) 228 variable = Variable(dims, data, attrs, encoding, fastpath=True) 229 /group_workspaces/cems2/fiduceo/Users/gholl/anaconda3/envs/FCDR37a/lib/python3.7/site-packages/xarray/core/dataarray.py in _infer_coords_and_dims(shape, coords, dims) 62 if utils.is_dict_like(coords): 63 for k, v in coords.items(): ---> 64 new_coords[k] = as_variable(v, name=k) 65 elif coords is not None: 66 for dim, coord in zip(dims, coords): /group_workspaces/cems2/fiduceo/Users/gholl/anaconda3/envs/FCDR37a/lib/python3.7/site-packages/xarray/core/variable.py in as_variable(obj, name) 99 'cannot set variable %r with %r-dimensional data ' 100 'without explicit 
dimension names. Pass a tuple of ' --> 101 '(dims, data) instead.' % (name, data.ndim)) 102 obj = Variable(name, obj, fastpath=True) 103 else: MissingDimensionsError: cannot set variable 'x' with 0-dimensional data without explicit dimension names. Pass a tuple of (dims, data) instead. ``` #### Expected Output It should probably raise a `TypeError`, because the values of the `coords` mapping must be ordered, and a set is an unordered type. If there comes a day when sets are ordered like dictionaries, then it should probably accept the values as they are. In fact, an ordered set would be even more appropriate than a list or array, seeing as coordinate values should be unique :) #### Output of ``xr.show_versions()``
INSTALLED VERSIONS ------------------ commit: None python: 3.7.0.final.0 python-bits: 64 OS: Linux OS-release: 2.6.32-754.el6.x86_64 machine: x86_64 processor: x86_64 byteorder: little LC_ALL: None LANG: en_GB.UTF-8 LOCALE: en_GB.UTF-8 xarray: 0.10.7 pandas: 0.23.2 numpy: 1.15.2 scipy: 1.1.0 netCDF4: 1.4.0 h5netcdf: 0.6.1 h5py: 2.8.0 Nio: None zarr: None bottleneck: 1.2.1 cyordereddict: None dask: 0.18.1 distributed: 1.22.0 matplotlib: 3.0.0 cartopy: 0.16.0 seaborn: 0.9.0 setuptools: 39.2.0 pip: 18.0 conda: None pytest: 3.2.2 IPython: 6.4.0 sphinx: None
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/2495/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 289790965,MDU6SXNzdWUyODk3OTA5NjU=,1838,DataArray.sum does not respect dtype keyword,500246,closed,0,,,2,2018-01-18T22:01:07Z,2018-01-20T18:29:02Z,2018-01-20T18:29:02Z,CONTRIBUTOR,,,,"#### Code Sample, a copy-pastable example if possible ```python # Your code here da = xarray.DataArray(arange(5, dtype=""i2"")) print(da.sum(dtype=""i4"").dtype) ``` #### Problem description The result is int64. This is a problem because I asked for int32. #### Expected Output Expected output `int32`. #### Output of ``xr.show_versions()``
# Paste the output here xr.show_versions() here INSTALLED VERSIONS ------------------ commit: None python: 3.6.1.final.0 python-bits: 64 OS: Linux OS-release: 2.6.32-696.6.3.el6.x86_64 machine: x86_64 processor: x86_64 byteorder: little LC_ALL: None LANG: en_GB.UTF-8 LOCALE: en_GB.UTF-8 xarray: 0.10.0+dev12.gf882a58 pandas: 0.22.0 numpy: 1.14.0 scipy: 1.0.0 netCDF4: 1.3.1 h5netcdf: None Nio: None bottleneck: 1.2.1 cyordereddict: None dask: 0.16.1 matplotlib: 2.1.1 cartopy: 0.15.1 seaborn: 0.8.1 setuptools: 38.4.0 pip: 9.0.1 conda: 4.3.16 pytest: 3.1.2 IPython: 6.1.0 sphinx: 1.6.2
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1838/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 212501628,MDU6SXNzdWUyMTI1MDE2Mjg=,1300,git version label yields version in violation of PEP 440,500246,closed,0,,,2,2017-03-07T17:23:00Z,2017-12-15T07:26:24Z,2017-12-15T07:26:24Z,CONTRIBUTOR,,,,"When an `xarray` installation does not match a released version ,it has a version number like `0.9.1-28-g769f120`. This violates [PEP 440](https://www.python.org/dev/peps/pep-0440/), which leads to multiple problems: * `pip install --update` will revert `xarray` back to `0.9.1`, because it does not recognise that `0.9.1-28-g769f120 > 0.9.1` * packages with an `xarray` dependencies will be considered not satisfied. Running a script through `setuptools` `load_entry_point` fails with `pkg_resources.ContextualVersionConflict: (xarray 0.9.1-28-g769f120 (/dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages), Requirement.parse('xarray>=0.8')` Instead, the version number above should be written as `0.9.1+r10345` or so, which would satisfy PEP 440 and not cause problems with `pip` or `setuptools`.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1300/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 269143043,MDU6SXNzdWUyNjkxNDMwNDM=,1663,ds.notnull() fails with AttributeError on pandas 0.21.0rc1,500246,closed,0,,,8,2017-10-27T15:19:33Z,2017-11-01T05:27:18Z,2017-11-01T05:27:18Z,CONTRIBUTOR,,,,"`xarray.Dataset({""A"": (""x"", arange(5))}).notnull()` fails with an 'AttributeError' when using `numpy` 1.13.3, `xarray` 0.9.6, and `pandas` 0.21.0rc1. The `AttributeError` is raised by `pandas`; see below. 
``` $ cat mwe.py #!/usr/bin/env python3.6 import numpy print(numpy.__version__) import xarray print(xarray.__version__) import pandas print(pandas.__version__) xarray.Dataset({""A"": (""x"", numpy.arange(5))}).notnull() $ ./mwe.py 1.13.3 0.9.6 0.21.0rc1 Traceback (most recent call last): File ""/dev/shm/gerrit/venv/stable-3.6/lib/python3.6/site-packages/xarray/core/ops.py"", line 193, in func return getattr(self, name)(*args, **kwargs) AttributeError: 'Variable' object has no attribute 'notna' During handling of the above exception, another exception occurred: Traceback (most recent call last): File ""./mwe.py"", line 10, in xarray.Dataset({""A"": (""x"", numpy.arange(5))}).notnull() File ""/dev/shm/gerrit/venv/stable-3.6/lib/python3.6/site-packages/xarray/core/dataset.py"", line 2485, in func ds._variables[k] = f(self._variables[k], *args, **kwargs) File ""/dev/shm/gerrit/venv/stable-3.6/lib/python3.6/site-packages/xarray/core/ops.py"", line 195, in func return f(self, *args, **kwargs) File ""/dev/shm/gerrit/venv/stable-3.6/lib/python3.6/site-packages/pandas/core/dtypes/missing.py"", line 212, in notna res = isna(obj) File ""/dev/shm/gerrit/venv/stable-3.6/lib/python3.6/site-packages/pandas/core/dtypes/missing.py"", line 45, in isna return _isna(obj) File ""/dev/shm/gerrit/venv/stable-3.6/lib/python3.6/site-packages/pandas/core/dtypes/missing.py"", line 60, in _isna_new return obj._constructor(obj._data.isna(func=isna)) AttributeError: 'Variable' object has no attribute '_constructor' ```","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1663/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 268487752,MDU6SXNzdWUyNjg0ODc3NTI=,1661,da.plot.pcolormesh fails when there is a datetime coordinate,500246,closed,0,,,9,2017-10-25T17:44:38Z,2017-10-29T17:28:55Z,2017-10-29T17:28:55Z,CONTRIBUTOR,,,," `da.plot.pcolormesh`, where `da` is a 
`DataArray`, fails with `TypeError: invalid type promotion` when one of the coordinates is a datetime array: ``` $ cat mwe.py #!/usr/bin/env python3.6 import xarray import numpy da = xarray.DataArray( numpy.arange(3*4).reshape(3,4), dims=(""x"", ""y""), coords={""x"": [1,2,3], ""y"": [numpy.datetime64(f""2000-01-{x:02d}"") for x in range(1, 5)]}) da.plot.pcolormesh() $ ./mwe.py Traceback (most recent call last): File ""./mwe.py"", line 13, in da.plot.pcolormesh() File ""/dev/shm/gerrit/venv/stable-3.6/lib/python3.6/site-packages/xarray/plot/plot.py"", line 547, in plotmethod return newplotfunc(**allargs) File ""/dev/shm/gerrit/venv/stable-3.6/lib/python3.6/site-packages/xarray/plot/plot.py"", line 500, in newplotfunc **kwargs) File ""/dev/shm/gerrit/venv/stable-3.6/lib/python3.6/site-packages/xarray/plot/plot.py"", line 667, in pcolormesh primitive = ax.pcolormesh(x, y, z, **kwargs) File ""/dev/shm/gerrit/venv/stable-3.6/lib/python3.6/site-packages/matplotlib/__init__.py"", line 1710, in inner return func(ax, *args, **kwargs) File ""/dev/shm/gerrit/venv/stable-3.6/lib/python3.6/site-packages/matplotlib/axes/_axes.py"", line 5636, in pcolormesh coords = np.column_stack((X, Y)).astype(float, copy=False) File ""/dev/shm/gerrit/venv/stable-3.6/lib/python3.6/site-packages/numpy/lib/shape_base.py"", line 353, in column_stack return _nx.concatenate(arrays, 1) TypeError: invalid type promotion ```","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1661/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 228023777,MDU6SXNzdWUyMjgwMjM3Nzc=,1405,Using uint64 for Dataset indexing gives ValueError,500246,closed,0,,,2,2017-05-11T15:05:20Z,2017-10-23T07:50:29Z,2017-10-23T07:50:29Z,CONTRIBUTOR,,,,"Trying to index a `Dataset` using an index array of dtype `uint64` yields a `ValueError`. `int64` works fine. 
See below: ``` In [13]: import xarray In [14]: ds = xarray.Dataset({""A"": ((""x"", ""y""), arange(5*6).reshape(5,6))}) In [15]: ds[{""x"": numpy.array([0], dtype=""int64"")}] Out[15]: Dimensions: (x: 1, y: 6) Dimensions without coordinates: x, y Data variables: A (x, y) int64 0 1 2 3 4 5 In [16]: ds[{""x"": numpy.array([0], dtype=""uint64"")}] --------------------------------------------------------------------------- ValueError Traceback (most recent call last) in () ----> 1 ds[{""x"": numpy.array([0], dtype=""uint64"")}] /dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/core/dataset.py in __getitem__(self, key) 722 """""" 723 if utils.is_dict_like(key): --> 724 return self.isel(**key) 725 726 if hashable(key): /dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/core/dataset.py in isel(self, drop, **indexers) 1147 for name, var in iteritems(self._variables): 1148 var_indexers = dict((k, v) for k, v in indexers if k in var.dims) -> 1149 new_var = var.isel(**var_indexers) 1150 if not (drop and name in var_indexers): 1151 variables[name] = new_var /dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/core/variable.py in isel(self, **indexers) 547 if dim in indexers: 548 key[i] = indexers[dim] --> 549 return self[tuple(key)] 550 551 def squeeze(self, dim=None): /dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/core/variable.py in __getitem__(self, key) 377 dims = tuple(dim for k, dim in zip(key, self.dims) 378 if not isinstance(k, integer_types)) --> 379 values = self._indexable_data[key] 380 # orthogonal indexing should ensure the dimensionality is consistent 381 if hasattr(values, 'ndim'): /dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/core/indexing.py in __getitem__(self, key) 467 468 def __getitem__(self, key): --> 469 key = self._convert_key(key) 470 return self._ensure_ndarray(self.array[key]) 471 /dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/core/indexing.py 
in _convert_key(self, key) 454 if any(not isinstance(k, integer_types + (slice,)) for k in key): 455 # key would trigger fancy indexing --> 456 key = orthogonal_indexer(key, self.shape) 457 return key 458 /dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/core/indexing.py in orthogonal_indexer(key, shape) 78 """""" 79 # replace Ellipsis objects with slices ---> 80 key = list(canonicalize_indexer(key, len(shape))) 81 # replace 1d arrays and slices with broadcast compatible arrays 82 # note: we treat integers separately (instead of turning them into 1d /dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/core/indexing.py in canonicalize_indexer(key, ndim) 66 return indexer 67 ---> 68 return tuple(canonicalize(k) for k in expanded_indexer(key, ndim)) 69 70 /dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/core/indexing.py in (.0) 66 return indexer 67 ---> 68 return tuple(canonicalize(k) for k in expanded_indexer(key, ndim)) 69 70 /dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/core/indexing.py in canonicalize(indexer) 63 'array indexing; all subkeys must be ' 64 'slices, integers or sequences of ' ---> 65 'integers or Booleans' % indexer) 66 return indexer 67 ValueError: invalid subkey array([0], dtype=uint64) for integer based array indexing; all subkeys must be slices, integers or sequences of integers or Booleans ```","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1405/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 203159853,MDU6SXNzdWUyMDMxNTk4NTM=,1229,opening NetCDF file fails with ValueError when time variable is multidimensional,500246,closed,0,,,3,2017-01-25T16:56:27Z,2017-01-26T05:13:12Z,2017-01-26T05:13:12Z,CONTRIBUTOR,,,,"I have a NetCDF file that includes a time field with multiple dimensions. 
This leads to a failure in `xarray.open_dataset`, because `first_n_items` returns an object with shape `(1,)`, but `last_item` returns an object with shape `(1,)*ndim` where `ndim` is the number of dimensions for the time variable. See the illustration below: ``` In [748]: ds = netCDF4.Dataset(""test.nc"", ""w"") In [749]: dim = ds.createDimension(""dim"", 5) In [750]: dim2 = ds.createDimension(""dim2"", 5) In [751]: time = ds.createVariable(""time"", ""u4"", (""dim"", ""dim2"")) In [752]: time.units = ""seconds since 1970-01-01"" In [753]: time.calendar = ""gregorian"" In [754]: time[:, :] = arange(25).reshape(5, 5) In [755]: ds.close() In [757]: xarray.open_dataset(""test.nc"") --------------------------------------------------------------------------- ValueError Traceback (most recent call last) in () ----> 1 xarray.open_dataset(""test.nc"") /dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/backends/api.py in open_dataset(filename_or_obj, group, decode_cf, mask_and_scale, decode_times, concat_characters, decode_coords, engine, chunks, lock, cache, drop_variables) 300 lock = _default_lock(filename_or_obj, engine) 301 with close_on_error(store): --> 302 return maybe_decode_store(store, lock) 303 else: 304 if engine is not None and engine != 'scipy': /dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/backends/api.py in maybe_decode_store(store, lock) 221 store, mask_and_scale=mask_and_scale, decode_times=decode_times, 222 concat_characters=concat_characters, decode_coords=decode_coords, --> 223 drop_variables=drop_variables) 224 225 _protect_dataset_variables_inplace(ds, cache) /dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/conventions.py in decode_cf(obj, concat_characters, mask_and_scale, decode_times, decode_coords, drop_variables) 947 vars, attrs, coord_names = decode_cf_variables( 948 vars, attrs, concat_characters, mask_and_scale, decode_times, --> 949 decode_coords, drop_variables=drop_variables) 950 ds = 
Dataset(vars, attrs=attrs) 951 ds = ds.set_coords(coord_names.union(extra_coords).intersection(vars)) /dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/conventions.py in decode_cf_variables(variables, attributes, concat_characters, mask_and_scale, decode_times, decode_coords, drop_variables) 882 new_vars[k] = decode_cf_variable( 883 v, concat_characters=concat, mask_and_scale=mask_and_scale, --> 884 decode_times=decode_times) 885 if decode_coords: 886 var_attrs = new_vars[k].attrs /dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/conventions.py in decode_cf_variable(var, concat_characters, mask_and_scale, decode_times, decode_endianness) 819 units = pop_to(attributes, encoding, 'units') 820 calendar = pop_to(attributes, encoding, 'calendar') --> 821 data = DecodedCFDatetimeArray(data, units, calendar) 822 elif attributes['units'] in TIME_UNITS: 823 # timedelta /dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/conventions.py in __init__(self, array, units, calendar) 384 # Dataset.__repr__ when users try to view their lazily decoded array. 385 example_value = np.concatenate([first_n_items(array, 1) or [0], --> 386 last_item(array) or [0]]) 387 388 try: ValueError: all the input arrays must have same number of dimensions ``` Closer look in the debugger: ``` In [758]: %debug xarray.open_dataset(""test.nc"") NOTE: Enter 'c' at the ipdb> prompt to continue execution. > (1)() ipdb> break /dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/conventions.py:385 Breakpoint 1 at /dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/conventions.py:385 ipdb> cont > /dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/conventions.py(385)__init__() 383 # successfully. Otherwise, tracebacks end up swallowed by 384 # Dataset.__repr__ when users try to view their lazily decoded array. 
1-> 385 example_value = np.concatenate([first_n_items(array, 1) or [0], 386 last_item(array) or [0]]) 387 ipdb> p first_n_items(array, 1).shape (1,) ipdb> p last_item(array).shape (1, 1) ```","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1229/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue