issue_comments

4 rows where issue = 724777139 sorted by updated_at descending

id | html_url | issue_url | node_id | user | created_at | updated_at | author_association | body | reactions | performed_via_github_app | issue
753607645 | https://github.com/pydata/xarray/issues/4524#issuecomment-753607645 | https://api.github.com/repos/pydata/xarray/issues/4524 | MDEyOklzc3VlQ29tbWVudDc1MzYwNzY0NQ== | alexamici 226037 | 2021-01-03T12:10:31Z | 2021-01-03T12:10:31Z | MEMBER

@mathause I think you are right: #4733 (the issue that #4737 aimed to fix) is probably not related to cfgrib. I didn't notice that no backend code was mentioned in the traceback.

I was fooled by the fact that we had a known bug in cfgrib (https://github.com/ecmwf/cfgrib/issues/157) and that the error message reported in #4733 was the same.

So PR #4737 is needed anyhow, but it doesn't really solve the issue it refers to :/

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  xr.DataArray.where(drop=True) crashes if the result is False everywhere (h5netcdf engine) 724777139
753587798 | https://github.com/pydata/xarray/issues/4524#issuecomment-753587798 | https://api.github.com/repos/pydata/xarray/issues/4524 | MDEyOklzc3VlQ29tbWVudDc1MzU4Nzc5OA== | mathause 10194086 | 2021-01-03T09:01:19Z | 2021-01-03T09:01:19Z | MEMBER

@alexamici could this be a similar issue to #4737? (we decided against #4453)

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  xr.DataArray.where(drop=True) crashes if the result is False everywhere (h5netcdf engine) 724777139
712455231 | https://github.com/pydata/xarray/issues/4524#issuecomment-712455231 | https://api.github.com/repos/pydata/xarray/issues/4524 | MDEyOklzc3VlQ29tbWVudDcxMjQ1NTIzMQ== | mathause 10194086 | 2020-10-19T21:33:53Z | 2020-10-19T23:01:20Z | MEMBER

git bisect tells me #4379 is the offending PR, so this likely has the same underlying cause as #4449. There is a draft for a fix (#4453), which may also fix this issue.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  xr.DataArray.where(drop=True) crashes if the result is False everywhere (h5netcdf engine) 724777139
712450334 | https://github.com/pydata/xarray/issues/4524#issuecomment-712450334 | https://api.github.com/repos/pydata/xarray/issues/4524 | MDEyOklzc3VlQ29tbWVudDcxMjQ1MDMzNA== | mathause 10194086 | 2020-10-19T21:23:06Z | 2020-10-19T21:23:06Z | MEMBER

This also fails for `engine='netcdf4'`:

```python
ds = xr.tutorial.open_dataset('ROMS_example', engine='netcdf4')
da = ds["zeta"].isel(eta_rho=0, xi_rho=0)
print(da.where(da>1000, drop=True))
```

with a different error: `ValueError: Cannot apply_along_axis when any iteration dimensions are 0`

Traceback

```python-traceback
ValueError                                Traceback (most recent call last)

<ipython-input-14-9df3a1c8f9e1> in <module>
----> 1 da.where(da>1000, drop=True)

~/conda/envs/xarray_dev/lib/python3.8/site-packages/xarray/core/common.py in where(self, cond, other, drop)
   1266     cond = cond.isel(**indexers)
   1267
-> 1268     return ops.where_method(self, cond, other)
   1269
   1270     def close(self: Any) -> None:

~/conda/envs/xarray_dev/lib/python3.8/site-packages/xarray/core/ops.py in where_method(self, cond, other)
    191     # alignment for three arguments is complicated, so don't support it yet
    192     join = "inner" if other is dtypes.NA else "exact"
--> 193     return apply_ufunc(
    194         duck_array_ops.where_method,
    195         self,

~/conda/envs/xarray_dev/lib/python3.8/site-packages/xarray/core/computation.py in apply_ufunc(func, input_core_dims, output_core_dims, exclude_dims, vectorize, join, dataset_join, dataset_fill_value, keep_attrs, kwargs, dask, output_dtypes, output_sizes, meta, dask_gufunc_kwargs, *args)
   1102     # feed DataArray apply_variable_ufunc through apply_dataarray_vfunc
   1103     elif any(isinstance(a, DataArray) for a in args):
-> 1104     return apply_dataarray_vfunc(
   1105         variables_vfunc,
   1106         *args,

~/conda/envs/xarray_dev/lib/python3.8/site-packages/xarray/core/computation.py in apply_dataarray_vfunc(func, signature, join, exclude_dims, keep_attrs, *args)
    257     else:
    258         name = result_name(args)
--> 259     result_coords = build_output_coords(args, signature, exclude_dims)
    260
    261     data_vars = [getattr(a, "variable", a) for a in args]

~/conda/envs/xarray_dev/lib/python3.8/site-packages/xarray/core/computation.py in build_output_coords(args, signature, exclude_dims)
    222     else:
    223         # TODO: save these merged indexes, instead of re-computing them later
--> 224     merged_vars, unused_indexes = merge_coordinates_without_align(
    225         coords_list, exclude_dims=exclude_dims
    226     )

~/conda/envs/xarray_dev/lib/python3.8/site-packages/xarray/core/merge.py in merge_coordinates_without_align(objects, prioritized, exclude_dims)
    327     filtered = collected
    328
--> 329     return merge_collected(filtered, prioritized)
    330
    331

~/conda/envs/xarray_dev/lib/python3.8/site-packages/xarray/core/merge.py in merge_collected(grouped, prioritized, compat)
    227     variables = [variable for variable, _ in elements_list]
    228     try:
--> 229     merged_vars[name] = unique_variable(name, variables, compat)
    230     except MergeError:
    231         if compat != "minimal":

~/conda/envs/xarray_dev/lib/python3.8/site-packages/xarray/core/merge.py in unique_variable(name, variables, compat, equals)
    118     if compat == "broadcast_equals":
    119         dim_lengths = broadcast_dimension_size(variables)
--> 120     out = out.set_dims(dim_lengths)
    121
    122     if compat == "no_conflicts":

~/conda/envs/xarray_dev/lib/python3.8/site-packages/xarray/core/variable.py in set_dims(self, dims, shape)
   1438     # don't use broadcast_to unless necessary so the result remains
   1439     # writeable if possible
-> 1440     expanded_data = self.data
   1441     elif shape is not None:
   1442         dims_map = dict(zip(dims, shape))

~/conda/envs/xarray_dev/lib/python3.8/site-packages/xarray/core/variable.py in data(self)
    357     return self._data
    358     else:
--> 359     return self.values
    360
    361     @data.setter

~/conda/envs/xarray_dev/lib/python3.8/site-packages/xarray/core/variable.py in values(self)
    508     def values(self):
    509         """The variable's data as a numpy.ndarray"""
--> 510     return _as_array_or_item(self._data)
    511
    512     @values.setter

~/conda/envs/xarray_dev/lib/python3.8/site-packages/xarray/core/variable.py in _as_array_or_item(data)
    270     data = data.get()
    271     else:
--> 272     data = np.asarray(data)
    273     if data.ndim == 0:
    274         if data.dtype.kind == "M":

~/conda/envs/xarray_dev/lib/python3.8/site-packages/numpy/core/_asarray.py in asarray(a, dtype, order)
     81
     82     """
---> 83     return array(a, dtype, copy=False, order=order)
     84
     85

~/conda/envs/xarray_dev/lib/python3.8/site-packages/xarray/core/indexing.py in __array__(self, dtype)
    683
    684     def __array__(self, dtype=None):
--> 685     self._ensure_cached()
    686     return np.asarray(self.array, dtype=dtype)
    687

~/conda/envs/xarray_dev/lib/python3.8/site-packages/xarray/core/indexing.py in _ensure_cached(self)
    680     def _ensure_cached(self):
    681         if not isinstance(self.array, NumpyIndexingAdapter):
--> 682     self.array = NumpyIndexingAdapter(np.asarray(self.array))
    683
    684     def __array__(self, dtype=None):

~/conda/envs/xarray_dev/lib/python3.8/site-packages/numpy/core/_asarray.py in asarray(a, dtype, order)
     81
     82     """
---> 83     return array(a, dtype, copy=False, order=order)
     84
     85

~/conda/envs/xarray_dev/lib/python3.8/site-packages/xarray/core/indexing.py in __array__(self, dtype)
    653
    654     def __array__(self, dtype=None):
--> 655     return np.asarray(self.array, dtype=dtype)
    656
    657     def __getitem__(self, key):

~/conda/envs/xarray_dev/lib/python3.8/site-packages/numpy/core/_asarray.py in asarray(a, dtype, order)
     81
     82     """
---> 83     return array(a, dtype, copy=False, order=order)
     84
     85

~/conda/envs/xarray_dev/lib/python3.8/site-packages/xarray/core/indexing.py in __array__(self, dtype)
    558     def __array__(self, dtype=None):
    559         array = as_indexable(self.array)
--> 560     return np.asarray(array[self.key], dtype=None)
    561
    562     def transpose(self, order):

~/conda/envs/xarray_dev/lib/python3.8/site-packages/xarray/backends/netCDF4_.py in __getitem__(self, key)
     70
     71     def __getitem__(self, key):
---> 72     return indexing.explicit_indexing_adapter(
     73         key, self.shape, indexing.IndexingSupport.OUTER, self._getitem
     74     )

~/conda/envs/xarray_dev/lib/python3.8/site-packages/xarray/core/indexing.py in explicit_indexing_adapter(key, shape, indexing_support, raw_indexing_method)
    843     """
    844     raw_key, numpy_indices = decompose_indexer(key, shape, indexing_support)
--> 845     result = raw_indexing_method(raw_key.tuple)
    846     if numpy_indices.tuple:
    847         # index the loaded np.ndarray

~/conda/envs/xarray_dev/lib/python3.8/site-packages/xarray/backends/netCDF4_.py in _getitem(self, key)
     83     with self.datastore.lock:
     84         original_array = self.get_array(needs_lock=False)
---> 85     array = getitem(original_array, key)
     86     except IndexError:
     87         # Catch IndexError in netCDF4 and return a more informative

netCDF4/_netCDF4.pyx in netCDF4._netCDF4.Variable.__getitem__()

~/conda/envs/xarray_dev/lib/python3.8/site-packages/netCDF4/utils.py in _StartCountStride(elem, shape, dimensions, grp, datashape, put, use_get_vars)
    435     # ITERABLE #
    436     elif np.iterable(e) and np.array(e).dtype.kind in 'i':  # Sequence of integers
--> 437     start[...,i] = np.apply_along_axis(lambda x: e*x, i, np.ones(sdim[:-1]))
    438     indices[...,i] = np.apply_along_axis(lambda x: np.arange(sdim[i])*x, i, np.ones(sdim[:-1], int))
    439

<__array_function__ internals> in apply_along_axis(*args, **kwargs)

~/conda/envs/xarray_dev/lib/python3.8/site-packages/numpy/lib/shape_base.py in apply_along_axis(func1d, axis, arr, *args, **kwargs)
    374     ind0 = next(inds)
    375     except StopIteration as e:
--> 376     raise ValueError(
    377         'Cannot apply_along_axis when any iteration dimensions are 0'
    378     ) from None

ValueError: Cannot apply_along_axis when any iteration dimensions are 0
```

It does not fail when you load the data beforehand:

```python
import xarray as xr

ds = xr.tutorial.open_dataset('ROMS_example', engine='h5netcdf')
da = ds["zeta"]
da.load()
print(da.where(da>1000, drop=True))
```

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  xr.DataArray.where(drop=True) crashes if the result is False everywhere (h5netcdf engine) 724777139
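
For reference, a minimal sketch of the failing call these comments discuss, assembled from the issue title and the snippets quoted in the comments above (the tutorial dataset and threshold are taken from those snippets; this is an illustrative reconstruction, not the exact reproducer from the issue report):

```python
import xarray as xr

# Open the tutorial dataset lazily with the h5netcdf engine, as named in the issue title.
ds = xr.tutorial.open_dataset("ROMS_example", engine="h5netcdf")
da = ds["zeta"]

# The condition is False everywhere, so drop=True selects zero elements along
# every dimension; on lazily indexed backend arrays this is the case reported
# as crashing. Calling da.load() first (as in the comment above) avoids it.
print(da.where(da > 1000, drop=True))
```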

CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [issue] INTEGER REFERENCES [issues]([id])
);
CREATE INDEX [idx_issue_comments_issue]
    ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user]
    ON [issue_comments] ([user]);
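
The header above ("4 rows where issue = 724777139 sorted by updated_at descending") corresponds to a simple filter on this table. Below is a minimal sketch of running the same query with Python's sqlite3 module, assuming the Datasette instance serves a SQLite file named github.db (the filename is an assumption; it is not shown on this page):

```python
import json
import sqlite3

# Assumption: the database file behind this Datasette instance is github.db;
# the actual filename is not shown on this page.
conn = sqlite3.connect("github.db")

# Same filter and ordering as the page header:
# 4 rows where issue = 724777139 sorted by updated_at descending.
rows = conn.execute(
    """
    SELECT id, user, updated_at, author_association, reactions
    FROM issue_comments
    WHERE issue = ?
    ORDER BY updated_at DESC
    """,
    (724777139,),
).fetchall()

for comment_id, user_id, updated_at, association, reactions in rows:
    # The reactions column is TEXT holding a JSON blob (see the rows above).
    total_reactions = json.loads(reactions)["total_count"] if reactions else 0
    print(comment_id, user_id, updated_at, association, total_reactions)
```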