
issues

5 rows where type = "issue" and user = 4441338 sorted by updated_at descending

Columns: id, node_id, number, title, user, state, locked, assignee, milestone, comments, created_at, updated_at, closed_at, author_association, active_lock_reason, draft, pull_request, body, reactions, performed_via_github_app, state_reason, repo, type

#7748 · diff('non existing dimension') does not raise exception
id: 1664193419 · node_id: I_kwDOAMm_X85jMZOL · user: LunarLanding (4441338) · state: open · locked: 0 · comments: 4 · created_at: 2023-04-12T09:29:58Z · updated_at: 2024-04-21T22:31:37Z · author_association: NONE

What happened?

Calling xr.DataArray.diff with a non-existing dimension does not raise an exception.

What did you expect to happen?

An exception to be raised.

Minimal Complete Verifiable Example

```Python
import xarray as xr
import numpy as np

xr.DataArray(np.arange(10), dims=('a',)).diff('b')
```

MVCE confirmation

  • [X] Minimal example — the example is as focused as reasonably possible to demonstrate the underlying issue in xarray.
  • [X] Complete example — the example is self-contained, including all data and the text of any traceback.
  • [X] Verifiable example — the example copy & pastes into an IPython prompt or Binder notebook, returning the result.
  • [X] New issue — a search of GitHub Issues suggests this is not a duplicate.
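As a rough sketch of a user-side guard (nothing in xarray itself does this; the helper name is hypothetical), one can check the dimension before calling diff so that the expected exception is raised:

```python
import numpy as np
import xarray as xr

def safe_diff(da, dim, **kwargs):
    # Hypothetical helper: reject unknown dimension names before delegating to diff().
    if dim not in da.dims:
        raise ValueError(f"{dim!r} is not a dimension of this DataArray; existing dims: {da.dims}")
    return da.diff(dim, **kwargs)

da = xr.DataArray(np.arange(10), dims=("a",))
safe_diff(da, "a")  # works as usual
safe_diff(da, "b")  # raises ValueError instead of the call passing without an error
```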

Relevant log output

No response

Anything else we need to know?

No response

Environment

INSTALLED VERSIONS ------------------ commit: None python: 3.10.9 | packaged by conda-forge | (main, Feb 2 2023, 20:20:04) [GCC 11.3.0] python-bits: 64 OS: Linux OS-release: 5.10.0-21-amd64 machine: x86_64 processor: byteorder: little LC_ALL: None LANG: en_US.UTF-8 LOCALE: ('en_US', 'UTF-8') libhdf5: 1.12.2 libnetcdf: 4.8.1 xarray: 2023.3.0 pandas: 1.5.3 numpy: 1.23.5 scipy: 1.10.1 netCDF4: 1.6.2 pydap: None h5netcdf: 1.1.0 h5py: 3.8.0 Nio: None zarr: 2.14.2 cftime: 1.6.2 nc_time_axis: None PseudoNetCDF: None rasterio: None cfgrib: None iris: None bottleneck: None dask: 2023.3.1 distributed: 2023.3.1 matplotlib: 3.7.1 cartopy: None seaborn: 0.12.2 numbagg: None fsspec: 2023.3.0 cupy: None pint: None sparse: 0.14.0 flox: 0.6.9 numpy_groupies: 0.9.20 setuptools: 67.6.0 pip: 23.0.1 conda: 23.1.0 pytest: 7.2.2 mypy: 1.1.1 IPython: 8.11.0 sphinx: 6.1.3
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/7748/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
repo: xarray (13221727) · type: issue

#5975 · Typing of dim in concat.py does not include Hashable
id: 1050937380 · node_id: I_kwDOAMm_X84-pAgk · user: LunarLanding (4441338) · state: closed · locked: 0 · comments: 2 · created_at: 2021-11-11T12:38:44Z · updated_at: 2022-05-17T20:53:07Z · closed_at: 2022-05-17T20:53:07Z · author_association: NONE

i.e. https://github.com/pydata/xarray/blob/5871637873cd83c3a656ee6f4df86ea6628cf68a/xarray/core/concat.py#L46

Aren't dims supposed to be any Hashable? In other xarray methods they generally are.
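A rough sketch of the widening this suggests, assuming (not verifying) what such an annotation could look like; this is not the actual xarray signature:

```python
from typing import Hashable, Union

import pandas as pd

# Hypothetical alias only: the suggestion is that `dim` accept any Hashable
# (as other xarray methods do), not just str; a pandas Index is also commonly allowed.
ConcatDim = Union[Hashable, pd.Index]
```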

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/5975/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
state_reason: completed · repo: xarray (13221727) · type: issue

#6449 · query on coords only dataset fails
id: 1195097342 · node_id: I_kwDOAMm_X85HO7z- · user: LunarLanding (4441338) · state: open · locked: 0 · comments: 1 · created_at: 2022-04-06T19:46:12Z · updated_at: 2022-04-14T22:16:00Z · author_association: NONE

What happened?

I make a dataset with some variables and set them all as coordinates. When I then try to query the dataset, an error ensues.

What did you expect to happen?

No error.

Minimal Complete Verifiable Example

```Python
import xarray as xr
import pandas as pd

x = xr.Dataset.from_dataframe(pd.DataFrame(data=[[0,1],[2,3]], columns=['a','b']))

display(x.query(index='a==0'))  # fine

y = x.set_coords(['a','b'])

display(y)  # fine

y.query(index='a==0')  # error
```

Relevant log output

```Python

KeyError Traceback (most recent call last) ~/miniconda3/lib/python3.9/site-packages/pandas/core/computation/scope.py in resolve(self, key, is_local) 199 assert not is_local and not self.has_resolvers --> 200 return self.scope[key] 201 except KeyError:

~/miniconda3/lib/python3.9/collections/__init__.py in __getitem__(self, key) 940 pass --> 941 return self.__missing__(key) # support subclasses that define __missing__ 942

~/miniconda3/lib/python3.9/collections/__init__.py in __missing__(self, key) 932 def __missing__(self, key): --> 933 raise KeyError(key) 934

KeyError: 'a'

During handling of the above exception, another exception occurred:

KeyError Traceback (most recent call last) ~/miniconda3/lib/python3.9/site-packages/pandas/core/computation/scope.py in resolve(self, key, is_local) 205 # e.g., df[df > 0] --> 206 return self.temps[key] 207 except KeyError as err:

KeyError: 'a'

The above exception was the direct cause of the following exception:

UndefinedVariableError Traceback (most recent call last) /tmp/ipykernel_23733/4091370488.py in <cell line: 7>() 5 y = x.set_coords(['a','b']) 6 display(y) ----> 7 y.query(index='a==0')

~/miniconda3/lib/python3.9/site-packages/xarray/core/dataset.py in query(self, queries, parser, engine, missing_dims, **queries_kwargs) 7605 7606 # evaluate the queries to create the indexers -> 7607 indexers = { 7608 dim: pd.eval(expr, resolvers=[self], parser=parser, engine=engine) 7609 for dim, expr in queries.items()

~/miniconda3/lib/python3.9/site-packages/xarray/core/dataset.py in <dictcomp>(.0) 7606 # evaluate the queries to create the indexers 7607 indexers = { -> 7608 dim: pd.eval(expr, resolvers=[self], parser=parser, engine=engine) 7609 for dim, expr in queries.items() 7610 }

~/miniconda3/lib/python3.9/site-packages/pandas/core/computation/eval.py in eval(expr, parser, engine, truediv, local_dict, global_dict, resolvers, level, target, inplace) 348 ) 349 --> 350 parsed_expr = Expr(expr, engine=engine, parser=parser, env=env) 351 352 # construct the engine and evaluate the parsed expression

~/miniconda3/lib/python3.9/site-packages/pandas/core/computation/expr.py in init(self, expr, engine, parser, env, level) 809 self.parser = parser 810 self._visitor = PARSERSparser --> 811 self.terms = self.parse() 812 813 @property

~/miniconda3/lib/python3.9/site-packages/pandas/core/computation/expr.py in parse(self) 828 Parse an expression. 829 """ --> 830 return self._visitor.visit(self.expr) 831 832 @property

~/miniconda3/lib/python3.9/site-packages/pandas/core/computation/expr.py in visit(self, node, kwargs) 413 method = "visit_" + type(node).name 414 visitor = getattr(self, method) --> 415 return visitor(node, kwargs) 416 417 def visit_Module(self, node, **kwargs):

~/miniconda3/lib/python3.9/site-packages/pandas/core/computation/expr.py in visit_Module(self, node, kwargs) 419 raise SyntaxError("only a single expression is allowed") 420 expr = node.body[0] --> 421 return self.visit(expr, kwargs) 422 423 def visit_Expr(self, node, **kwargs):

~/miniconda3/lib/python3.9/site-packages/pandas/core/computation/expr.py in visit(self, node, kwargs) 413 method = "visit_" + type(node).name 414 visitor = getattr(self, method) --> 415 return visitor(node, kwargs) 416 417 def visit_Module(self, node, **kwargs):

~/miniconda3/lib/python3.9/site-packages/pandas/core/computation/expr.py in visit_Expr(self, node, kwargs) 422 423 def visit_Expr(self, node, kwargs): --> 424 return self.visit(node.value, **kwargs) 425 426 def _rewrite_membership_op(self, node, left, right):

~/miniconda3/lib/python3.9/site-packages/pandas/core/computation/expr.py in visit(self, node, kwargs) 413 method = "visit_" + type(node).name 414 visitor = getattr(self, method) --> 415 return visitor(node, kwargs) 416 417 def visit_Module(self, node, **kwargs):

~/miniconda3/lib/python3.9/site-packages/pandas/core/computation/expr.py in visit_Compare(self, node, **kwargs) 721 op = self.translate_In(ops[0]) 722 binop = ast.BinOp(op=op, left=node.left, right=comps[0]) --> 723 return self.visit(binop) 724 725 # recursive case: we have a chained comparison, a CMP b CMP c, etc.

~/miniconda3/lib/python3.9/site-packages/pandas/core/computation/expr.py in visit(self, node, kwargs) 413 method = "visit_" + type(node).name 414 visitor = getattr(self, method) --> 415 return visitor(node, kwargs) 416 417 def visit_Module(self, node, **kwargs):

~/miniconda3/lib/python3.9/site-packages/pandas/core/computation/expr.py in visit_BinOp(self, node, kwargs) 534 535 def visit_BinOp(self, node, kwargs): --> 536 op, op_class, left, right = self._maybe_transform_eq_ne(node) 537 left, right = self._maybe_downcast_constants(left, right) 538 return self._maybe_evaluate_binop(op, op_class, left, right)

~/miniconda3/lib/python3.9/site-packages/pandas/core/computation/expr.py in _maybe_transform_eq_ne(self, node, left, right) 454 def _maybe_transform_eq_ne(self, node, left=None, right=None): 455 if left is None: --> 456 left = self.visit(node.left, side="left") 457 if right is None: 458 right = self.visit(node.right, side="right")

~/miniconda3/lib/python3.9/site-packages/pandas/core/computation/expr.py in visit(self, node, kwargs) 413 method = "visit_" + type(node).name 414 visitor = getattr(self, method) --> 415 return visitor(node, kwargs) 416 417 def visit_Module(self, node, **kwargs):

~/miniconda3/lib/python3.9/site-packages/pandas/core/computation/expr.py in visit_Name(self, node, kwargs) 547 548 def visit_Name(self, node, kwargs): --> 549 return self.term_type(node.id, self.env, kwargs) 550 551 def visit_NameConstant(self, node, kwargs):

~/miniconda3/lib/python3.9/site-packages/pandas/core/computation/ops.py in __init__(self, name, env, side, encoding) 96 tname = str(name) 97 self.is_local = tname.startswith(LOCAL_TAG) or tname in DEFAULT_GLOBALS ---> 98 self._value = self._resolve_name() 99 self.encoding = encoding 100

~/miniconda3/lib/python3.9/site-packages/pandas/core/computation/ops.py in _resolve_name(self) 113 114 def _resolve_name(self): --> 115 res = self.env.resolve(self.local_name, is_local=self.is_local) 116 self.update(res) 117

~/miniconda3/lib/python3.9/site-packages/pandas/core/computation/scope.py in resolve(self, key, is_local) 209 from pandas.core.computation.ops import UndefinedVariableError 210 --> 211 raise UndefinedVariableError(key, is_local) from err 212 213 def swapkey(self, old_key: str, new_key: str, new_value=None) -> None:

UndefinedVariableError: name 'a' is not defined ```

Anything else we need to know?

If the dataset has one data variable, then the error does not happen.
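A possible workaround sketch, untested here and assuming the data-variable path keeps working as shown above: demote one coordinate back to a data variable before querying, then restore it on the result.

```python
import pandas as pd
import xarray as xr

x = xr.Dataset.from_dataframe(pd.DataFrame(data=[[0, 1], [2, 3]], columns=["a", "b"]))
y = x.set_coords(["a", "b"])

# Demote 'a' to a data variable so the query expression can resolve it,
# run the query, then promote it back to a coordinate on the result.
result = y.reset_coords("a").query(index="a == 0").set_coords("a")
```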

Environment

INSTALLED VERSIONS

commit: None python: 3.9.12 | packaged by conda-forge | (main, Mar 24 2022, 23:25:59) [GCC 10.3.0] python-bits: 64 OS: Linux OS-release: 4.19.0-19-amd64 machine: x86_64 processor: byteorder: little LC_ALL: None LANG: en_US.UTF-8 LOCALE: ('en_US', 'UTF-8') libhdf5: 1.12.1 libnetcdf: 4.8.1

xarray: 2022.3.0 pandas: 1.4.1 numpy: 1.22.3 scipy: 1.8.0 netCDF4: 1.5.8 pydap: None h5netcdf: 1.0.0 h5py: 3.6.0 Nio: None zarr: 2.11.1 cftime: 1.5.2 nc_time_axis: None PseudoNetCDF: None rasterio: None cfgrib: None iris: None bottleneck: None dask: 2022.03.0 distributed: 2022.3.0 matplotlib: 3.5.1 cartopy: None seaborn: 0.11.2 numbagg: None fsspec: 2022.02.0 cupy: None pint: 0.18 sparse: 0.13.0 setuptools: 59.8.0 pip: 22.0.4 conda: 4.12.0 pytest: 7.1.1 IPython: 7.32.0 sphinx: None

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/6449/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
repo: xarray (13221727) · type: issue

#5228 · sel(dim=slice(a,b,c)) only accepts integers for c, uses c as isel does
id: 869786882 · node_id: MDU6SXNzdWU4Njk3ODY4ODI= · user: LunarLanding (4441338) · state: closed · locked: 0 · comments: 1 · created_at: 2021-04-28T10:22:40Z · updated_at: 2021-04-28T10:47:00Z · closed_at: 2021-04-28T10:46:59Z · author_association: NONE

Trying to slice with sel using label delta values as the interval, i.e. da.sel(mydim=slice(None, None, c)).

What happened: The label delta value is not accepted.

What you expected to happen: Selecting c-spaced samples from the DataArray da.

Minimal Complete Verifiable Example:

```python
import numpy as np
import xarray as xr

dim0_n = 100
d = xr.DataArray(np.arange(dim0_n), coords={'dim0': np.linspace(0, 1, num=dim0_n)}, dims=('dim0',))
d.sel(dim0=slice(None, None, 0.1))
# expected 0,10,20,...,90
# instead get a TypeError: 'float' object cannot be interpreted as an integer
# (full traceback below)
```

Full traceback

```
TypeError Traceback (most recent call last)
<ipython-input-43-9eb3e3c8bbe4> in <module> 1 dim0_n = 100 2 d = xr.DataArray(np.arange(dim0_n),coords={'dim0':np.linspace(0,1,num=dim0_n)},dims=('dim0',)) ----> 3 d.sel(dim0=slice(None,None,0.1)) 4 # expected 0,10,20,...,90 5 # instead get a TypeError (traceback below)

~/miniconda3/lib/python3.9/site-packages/xarray/core/dataarray.py in sel(self, indexers, method, tolerance, drop, **indexers_kwargs) 1252 Dimensions without coordinates: points 1253 """ -> 1254 ds = self._to_temp_dataset().sel( 1255 indexers=indexers, 1256 drop=drop,

~/miniconda3/lib/python3.9/site-packages/xarray/core/dataset.py in sel(self, indexers, method, tolerance, drop, **indexers_kwargs) 2231 self, indexers=indexers, method=method, tolerance=tolerance 2232 ) -> 2233 result = self.isel(indexers=pos_indexers, drop=drop) 2234 return result._overwrite_indexes(new_indexes) 2235

~/miniconda3/lib/python3.9/site-packages/xarray/core/dataset.py in isel(self, indexers, drop, missing_dims, **indexers_kwargs) 2093 var_indexers = {k: v for k, v in indexers.items() if k in var_value.dims} 2094 if var_indexers: -> 2095 var_value = var_value.isel(var_indexers) 2096 if drop and var_value.ndim == 0 and var_name in coord_names: 2097 coord_names.remove(var_name)

~/miniconda3/lib/python3.9/site-packages/xarray/core/variable.py in isel(self, indexers, missing_dims, **indexers_kwargs) 1164 1165 key = tuple(indexers.get(dim, slice(None)) for dim in self.dims) -> 1166 return self[key] 1167 1168 def squeeze(self, dim=None):

~/miniconda3/lib/python3.9/site-packages/xarray/core/variable.py in __getitem__(self, key) 808 array `x.values` directly. 809 """ --> 810 dims, indexer, new_order = self._broadcast_indexes(key) 811 data = as_indexable(self._data)[indexer] 812 if new_order:

~/miniconda3/lib/python3.9/site-packages/xarray/core/variable.py in _broadcast_indexes(self, key) 647 648 if all(isinstance(k, BASIC_INDEXING_TYPES) for k in key): --> 649 return self._broadcast_indexes_basic(key) 650 651 self._validate_indexers(key)

~/miniconda3/lib/python3.9/site-packages/xarray/core/variable.py in _broadcast_indexes_basic(self, key) 675 dim for k, dim in zip(key, self.dims) if not isinstance(k, integer_types) 676 ) --> 677 return dims, BasicIndexer(key), None 678 679 def _validate_indexers(self, key):

~/miniconda3/lib/python3.9/site-packages/xarray/core/indexing.py in __init__(self, key) 382 k = int(k) 383 elif isinstance(k, slice): --> 384 k = as_integer_slice(k) 385 else: 386 raise TypeError(

~/miniconda3/lib/python3.9/site-packages/xarray/core/indexing.py in as_integer_slice(value) 359 start = as_integer_or_none(value.start) 360 stop = as_integer_or_none(value.stop) --> 361 step = as_integer_or_none(value.step) 362 return slice(start, stop, step) 363

~/miniconda3/lib/python3.9/site-packages/xarray/core/indexing.py in as_integer_or_none(value) 353 354 def as_integer_or_none(value): --> 355 return None if value is None else operator.index(value) 356 357

TypeError: 'float' object cannot be interpreted as an integer
```

Anything else we need to know?:

I think a warning in the documentation would be nice. I was passing integer values for c, and xarray was subsetting datasets in an entirely different way than the one intended. It took noticing a downstream effect to find this behavior.
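For reference, a sketch of one way to get label-spaced selection without relying on the slice step, using nearest-neighbour label matching (the 0.1 spacing and the array construction are just the example from above):

```python
import numpy as np
import xarray as xr

dim0_n = 100
d = xr.DataArray(np.arange(dim0_n), coords={"dim0": np.linspace(0, 1, num=dim0_n)}, dims=("dim0",))

# Build the target labels explicitly and let sel() match each one to the nearest
# existing coordinate value; the slice step in sel() is positional, not a label delta.
targets = np.arange(0.0, 1.0, 0.1)
subset = d.sel(dim0=targets, method="nearest")
```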

Environment:

Output of xr.show_versions():

INSTALLED VERSIONS ------------------ commit: None python: 3.9.2 | packaged by conda-forge | (default, Feb 21 2021, 05:02:46) [GCC 9.3.0] python-bits: 64 OS: Linux OS-release: 3.10.0-1160.11.1.el7.x86_64 machine: x86_64 processor: x86_64 byteorder: little LC_ALL: None LANG: None LOCALE: en_US.UTF-8 libhdf5: 1.10.6 libnetcdf: 4.7.4 xarray: 0.17.0 pandas: 1.2.4 numpy: 1.20.2 scipy: 1.6.2 netCDF4: 1.5.6 pydap: None h5netcdf: 0.11.0 h5py: 3.1.0 Nio: None zarr: 2.8.1 cftime: 1.4.1 nc_time_axis: None PseudoNetCDF: None rasterio: None cfgrib: None iris: None bottleneck: None dask: 2021.04.1 distributed: 2021.04.1 matplotlib: 3.3.4 cartopy: None seaborn: 0.11.1 numbagg: None pint: None setuptools: 51.0.0.post20201207 pip: 21.1 conda: 4.10.1 pytest: None IPython: 7.22.0 sphinx: None
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/5228/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
state_reason: completed · repo: xarray (13221727) · type: issue

#5034 · output_dtypes needs to be a tuple, not a sequence
id: 831148018 · node_id: MDU6SXNzdWU4MzExNDgwMTg= · user: LunarLanding (4441338) · state: closed · locked: 0 · comments: 4 · created_at: 2021-03-14T12:47:49Z · updated_at: 2021-04-19T19:33:14Z · closed_at: 2021-04-19T19:33:14Z · author_association: NONE

https://github.com/pydata/xarray/blob/d4b7a608bab0e7c140937b0b59ca45115d205145/xarray/core/computation.py#L807

If output_dtypes is a sequence such as a list, I get: TypeError: Field elements must be 2- or 3-tuples, got 'dtype('float64')'

MVE (WIP, will finish later)

```
# p,v are chunked in 'time'
# M is a positive integer
# msd_w_one_mol takes two args, has two outputs

m,w = xr.apply_ufunc(
    msd_w_one_mol,
    p,v,
    input_core_dims=[('time','spatial'),('time',)],
    output_core_dims=[('interval',),('interval',)],
    vectorize=True,
    dask='parallelized',
    dask_gufunc_kwargs={
        'output_sizes':{'interval':M},
        'allow_rechunk':True,
        'meta':(np.empty((M,),dtype=p.dtype),np.empty((M,),dtype=np.min_scalar_type(M)))
        'output_dtypes':[p.dtype,np.min_scalar_type(M)],
    },
    output_dtypes=[p.dtype,np.min_scalar_type(M)] # error
    output_dtypes=(p.dtype,np.min_scalar_type(M)) # works
)
```
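Since the MVE above is unfinished, here is a self-contained sketch of the same call shape; the function, array, sizes, and dtypes are invented stand-ins (and dask must be installed), and per this report a tuple for output_dtypes works where a list triggered the TypeError:

```python
import numpy as np
import xarray as xr

def two_stats(x):
    # Toy stand-in for msd_w_one_mol: one float output and one integer output per 1-D input.
    return x.mean(), np.argmax(x)

p = xr.DataArray(np.random.rand(6, 50), dims=("mol", "time")).chunk({"mol": 2})

m, w = xr.apply_ufunc(
    two_stats,
    p,
    input_core_dims=[("time",)],
    output_core_dims=[(), ()],
    vectorize=True,
    dask="parallelized",
    output_dtypes=(p.dtype, np.intp),  # tuple, as the report says works; a list raised the TypeError
)
```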

Stack Trace:

```


ValueError Traceback (most recent call last) ~/dotfiles/conda/base/.conda/lib/python3.8/site-packages/dask/array/gufunc.py in apply_gufunc(func, signature, args, kwargs) 429 try: --> 430 tmp = blockwise( # First try to compute meta 431 func, loop_output_dims, arginds, concatenate=True, **kwargs

~/dotfiles/conda/base/.conda/lib/python3.8/site-packages/dask/array/blockwise.py in blockwise(func, out_ind, name, token, dtype, adjust_chunks, new_axes, align_arrays, concatenate, meta, args, kwargs) 278 --> 279 meta = compute_meta(func, dtype, args[::2], **kwargs) 280 return new_da_object(graph, out, chunks, meta=meta, dtype=dtype)

~/dotfiles/conda/base/.conda/lib/python3.8/site-packages/dask/array/utils.py in compute_meta(func, _dtype, args, kwargs) 138 if isinstance(func, np.vectorize): --> 139 meta = func(args_meta) 140 else:

~/dotfiles/conda/base/.conda/lib/python3.8/site-packages/numpy/lib/function_base.py in __call__(self, *args, **kwargs) 2112 -> 2113 return self._vectorize_call(func=func, args=vargs) 2114

~/dotfiles/conda/base/.conda/lib/python3.8/site-packages/numpy/lib/function_base.py in _vectorize_call(self, func, args) 2186 if self.signature is not None: -> 2187 res = self._vectorize_call_with_signature(func, args) 2188 elif not args:

~/dotfiles/conda/base/.conda/lib/python3.8/site-packages/numpy/lib/function_base.py in _vectorize_call_with_signature(self, func, args) 2260 for dim in dims): -> 2261 raise ValueError('cannot call vectorize with a signature ' 2262 'including new output dimensions on size 0 '

ValueError: cannot call vectorize with a signature including new output dimensions on size 0 inputs

During handling of the above exception, another exception occurred:

TypeError Traceback (most recent call last) <ipython-input-119-a2a3ac7df12d> in <module> ----> 1 m,w = xr.apply_ufunc( 2 msd_w_one_mol, 3 p,v, 4 input_core_dims=[('time','spatial'),('time',)], 5 output_core_dims=[('interval',),('interval',)],

~/dotfiles/conda/base/.conda/lib/python3.8/site-packages/xarray/core/computation.py in apply_ufunc(func, input_core_dims, output_core_dims, exclude_dims, vectorize, join, dataset_join, dataset_fill_value, keep_attrs, kwargs, dask, output_dtypes, output_sizes, meta, dask_gufunc_kwargs, args) 1126 # feed DataArray apply_variable_ufunc through apply_dataarray_vfunc 1127 elif any(isinstance(a, DataArray) for a in args): -> 1128 return apply_dataarray_vfunc( 1129 variables_vfunc, 1130 args,

~/dotfiles/conda/base/.conda/lib/python3.8/site-packages/xarray/core/computation.py in apply_dataarray_vfunc(func, signature, join, exclude_dims, keep_attrs, args) 269 270 data_vars = [getattr(a, "variable", a) for a in args] --> 271 result_var = func(data_vars) 272 273 if signature.num_outputs > 1:

~/dotfiles/conda/base/.conda/lib/python3.8/site-packages/xarray/core/computation.py in apply_variable_ufunc(func, signature, exclude_dims, dask, output_dtypes, vectorize, keep_attrs, dask_gufunc_kwargs, args) 722 ) 723 --> 724 result_data = func(input_data) 725 726 if signature.num_outputs == 1:

~/dotfiles/conda/base/.conda/lib/python3.8/site-packages/xarray/core/computation.py in func(*arrays) 690 import dask.array as da 691 --> 692 res = da.apply_gufunc( 693 numpy_func, 694 signature.to_gufunc_string(exclude_dims),

~/dotfiles/conda/base/.conda/lib/python3.8/site-packages/dask/array/gufunc.py in apply_gufunc(func, signature, args, *kwargs) 441 ) 442 else: --> 443 meta = tuple( 444 meta_from_array(sample, dtype=odt) 445 for ocd, odt in zip((output_coredimss,), (output_dtypes,))

~/dotfiles/conda/base/.conda/lib/python3.8/site-packages/dask/array/gufunc.py in <genexpr>(.0) 442 else: 443 meta = tuple( --> 444 meta_from_array(sample, dtype=odt) 445 for ocd, odt in zip((output_coredimss,), (output_dtypes,)) 446 )

~/dotfiles/conda/base/.conda/lib/python3.8/site-packages/dask/array/utils.py in meta_from_array(x, ndim, dtype) 104 meta = np.array(meta) 105 --> 106 if dtype and meta.dtype != dtype: 107 try: 108 meta = meta.astype(dtype)

TypeError: Field elements must be 2- or 3-tuples, got 'dtype('float64')'
```

Preliminary investigation using %debug:

```

~/dotfiles/conda/base/.conda/lib/python3.8/site-packages/dask/array/utils.py(106)meta_from_array() 104 meta = np.array(meta) 105 --> 106 if dtype and meta.dtype != dtype: 107 try: 108 meta = meta.astype(dtype)

ipdb> print(dtype)

[dtype('float64'), dtype('uint16')]

ipdb> print(meta.dtype)

float64

ipdb> print(dtype)

[dtype('float64'), dtype('uint16')]

```

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/5034/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
state_reason: completed · repo: xarray (13221727) · type: issue


CREATE TABLE [issues] (
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [number] INTEGER,
   [title] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [state] TEXT,
   [locked] INTEGER,
   [assignee] INTEGER REFERENCES [users]([id]),
   [milestone] INTEGER REFERENCES [milestones]([id]),
   [comments] INTEGER,
   [created_at] TEXT,
   [updated_at] TEXT,
   [closed_at] TEXT,
   [author_association] TEXT,
   [active_lock_reason] TEXT,
   [draft] INTEGER,
   [pull_request] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [state_reason] TEXT,
   [repo] INTEGER REFERENCES [repos]([id]),
   [type] TEXT
);
CREATE INDEX [idx_issues_repo]
    ON [issues] ([repo]);
CREATE INDEX [idx_issues_milestone]
    ON [issues] ([milestone]);
CREATE INDEX [idx_issues_assignee]
    ON [issues] ([assignee]);
CREATE INDEX [idx_issues_user]
    ON [issues] ([user]);