issues: 260279615


  • id: 260279615
  • node_id: MDU6SXNzdWUyNjAyNzk2MTU=
  • number: 1591
  • title: indexing/groupby fails on array opened with chunks from netcdf
  • user: 13190237
  • state: closed
  • comments: 2
  • created_at: 2017-09-25T13:37:43Z
  • updated_at: 2017-09-26T08:15:45Z
  • closed_at: 2017-09-26T05:36:26Z
  • author_association: CONTRIBUTOR
  • state_reason: completed
  • repo: 13221727
  • type: issue

Hi, since the last update of dask (to version 0.15.3), iterating over a groupby object and indexing with np.int64 fail when the DataArray was opened with chunks from a netCDF file.

I'm using xarray version 0.9.6 and the 'h5netcdf' engine for reading/writing.

To reproduce:

```python
import numpy as np
import xarray as xr

arr = xr.DataArray(np.random.rand(2, 3, 4), dims=['one', 'two', 'three'])
arr.to_netcdf('test.nc', engine='h5netcdf')

arr_disk = xr.open_dataarray('test.nc', engine='h5netcdf', chunks=dict(one=1))
```

This produces the error:

```python
[g for g in arr_disk.groupby('one')]
```

```
/usr/lib/python3.6/site-packages/xarray/core/groupby.py in _iter_grouped(self)
    296         """Iterate over each element in this group"""
    297         for indices in self._group_indices:
--> 298             yield self._obj.isel(**{self._group_dim: indices})
    299
    300     def _infer_concat_args(self, applied_example):

/usr/lib/python3.6/site-packages/xarray/core/dataarray.py in isel(self, drop, **indexers)
    677         DataArray.sel
    678         """
--> 679         ds = self._to_temp_dataset().isel(drop=drop, **indexers)
    680         return self._from_temp_dataset(ds)
    681

/usr/lib/python3.6/site-packages/xarray/core/dataset.py in isel(self, drop, **indexers)
   1141         for name, var in iteritems(self._variables):
   1142             var_indexers = dict((k, v) for k, v in indexers if k in var.dims)
-> 1143             new_var = var.isel(**var_indexers)
   1144             if not (drop and name in var_indexers):
   1145                 variables[name] = new_var

/usr/lib/python3.6/site-packages/xarray/core/variable.py in isel(self, **indexers)
    568             if dim in indexers:
    569                 key[i] = indexers[dim]
--> 570         return self[tuple(key)]
    571
    572     def squeeze(self, dim=None):

/usr/lib/python3.6/site-packages/xarray/core/variable.py in __getitem__(self, key)
    398         dims = tuple(dim for k, dim in zip(key, self.dims)
    399                      if not isinstance(k, integer_types))
--> 400         values = self._indexable_data[key]
    401         # orthogonal indexing should ensure the dimensionality is consistent
    402         if hasattr(values, 'ndim'):

/usr/lib/python3.6/site-packages/xarray/core/indexing.py in __getitem__(self, key)
    496                 value = value[(slice(None),) * axis + (subkey,)]
    497             else:
--> 498                 value = self.array[key]
    499         return value
    500

/home/herter/.local/lib/python3.6/site-packages/dask/array/core.py in __getitem__(self, index)
   1220
   1221         from .slicing import normalize_index, slice_with_dask_array
-> 1222         index2 = normalize_index(index, self.shape)
   1223
   1224         if any(isinstance(i, Array) for i in index2):

/home/herter/.local/lib/python3.6/site-packages/dask/array/slicing.py in normalize_index(idx, shape)
    760     idx = idx + (slice(None),) * (len(shape) - n_sliced_dims)
    761     if len([i for i in idx if i is not None]) > len(shape):
--> 762         raise IndexError("Too many indices for array")
    763
    764     none_shape = []

IndexError: Too many indices for array
```
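The check that raises here (dask/array/slicing.py, lines 760-762 in the traceback) can be reproduced in isolation. The following is a reconstruction of just that guard from the traceback above, not dask's actual function:

```python
def check_index_length(idx, shape):
    # Reconstruction of the guard shown in the traceback: raise if the
    # index tuple has more non-None entries than the array has dimensions
    # (None entries add new axes, so they are excluded from the count).
    if len([i for i in idx if i is not None]) > len(shape):
        raise IndexError("Too many indices for array")

shape = (2, 3, 4)
check_index_length((0, slice(None), slice(None)), shape)  # ok: 3 indices, 3 dims

try:
    check_index_length((0, 1, 2, 3), shape)  # 4 indices for a 3-d array
except IndexError as e:
    print(e)  # Too many indices for array
```

So somewhere upstream the np.int64 index must end up producing more index entries than the array has dimensions.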

I'm getting the same error when indexing directly with a NumPy integer: `arr_disk[np.int64(0)]`.
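For what it's worth, NumPy integer scalars are not instances of the built-in int on Python 3, which is a common way for index-normalization code to mishandle them. A minimal sketch with plain NumPy (the cast-to-int workaround here is just an assumption on my side, not a confirmed fix):

```python
import numpy as np

i = np.int64(0)
print(isinstance(i, int))         # False: np.int64 is not a built-in int
print(isinstance(i, np.integer))  # True

# Possible workaround: cast the NumPy scalar to a built-in int
# before using it as an index.
a = np.random.rand(2, 3, 4)
print(a[int(i)].shape)  # (3, 4)
```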

Thanks, Uli

