issues
3 rows where "closed_at" is on date 2021-04-19, comments = 4 and repo = 13221727 sorted by updated_at descending
id | node_id | number | title | user | state | locked | assignee | milestone | comments | created_at | updated_at ▲ | closed_at | author_association | active_lock_reason | draft | pull_request | body | reactions | performed_via_github_app | state_reason | repo | type |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
831148018 | MDU6SXNzdWU4MzExNDgwMTg= | 5034 | output_dtypes needs to be a tuple, not a sequence | LunarLanding 4441338 | closed | 0 | 4 | 2021-03-14T12:47:49Z | 2021-04-19T19:33:14Z | 2021-04-19T19:33:14Z | NONE | If a sequence, I get the error below. MVE (WIP, will finish later):
```
# p, v are chunked in 'time'
# M is a positive integer
# msd_w_one_mol takes two args, has two outputs
m, w = xr.apply_ufunc(
    msd_w_one_mol,
    p, v,
    input_core_dims=[('time', 'spatial'), ('time',)],
    output_core_dims=[('interval',), ('interval',)],
    vectorize=True,
    dask='parallelized',
    dask_gufunc_kwargs={
        'output_sizes': {'interval': M},
        'allow_rechunk': True,
        'meta': (np.empty((M,), dtype=p.dtype), np.empty((M,), dtype=np.min_scalar_type(M))),
        'output_dtypes': [p.dtype, np.min_scalar_type(M)],
    },
)
```
Stack Trace:
```
ValueError                                Traceback (most recent call last)
~/dotfiles/conda/base/.conda/lib/python3.8/site-packages/dask/array/gufunc.py in apply_gufunc(func, signature, args, kwargs)
    429     try:
--> 430         tmp = blockwise(  # First try to compute meta
    431             func, loop_output_dims, arginds, concatenate=True, **kwargs

~/dotfiles/conda/base/.conda/lib/python3.8/site-packages/dask/array/blockwise.py in blockwise(func, out_ind, name, token, dtype, adjust_chunks, new_axes, align_arrays, concatenate, meta, args, kwargs)
    278
--> 279     meta = compute_meta(func, dtype, args[::2], **kwargs)
    280     return new_da_object(graph, out, chunks, meta=meta, dtype=dtype)

~/dotfiles/conda/base/.conda/lib/python3.8/site-packages/dask/array/utils.py in compute_meta(func, _dtype, args, kwargs)
    138     if isinstance(func, np.vectorize):
--> 139         meta = func(args_meta)
    140     else:

~/dotfiles/conda/base/.conda/lib/python3.8/site-packages/numpy/lib/function_base.py in __call__(self, *args, **kwargs)
   2112
-> 2113         return self._vectorize_call(func=func, args=vargs)
   2114

~/dotfiles/conda/base/.conda/lib/python3.8/site-packages/numpy/lib/function_base.py in _vectorize_call(self, func, args)
   2186         if self.signature is not None:
-> 2187             res = self._vectorize_call_with_signature(func, args)
   2188         elif not args:

~/dotfiles/conda/base/.conda/lib/python3.8/site-packages/numpy/lib/function_base.py in _vectorize_call_with_signature(self, func, args)
   2260                     for dim in dims):
-> 2261                 raise ValueError('cannot call

ValueError: cannot call

During handling of the above exception, another exception occurred:

TypeError                                 Traceback (most recent call last)
<ipython-input-119-a2a3ac7df12d> in <module>
----> 1 m,w = xr.apply_ufunc(
      2     msd_w_one_mol,
      3     p,v,
      4     input_core_dims=[('time','spatial'),('time',)],
      5     output_core_dims=[('interval',),('interval',)],

~/dotfiles/conda/base/.conda/lib/python3.8/site-packages/xarray/core/computation.py in apply_ufunc(func, input_core_dims, output_core_dims, exclude_dims, vectorize, join, dataset_join, dataset_fill_value, keep_attrs, kwargs, dask, output_dtypes, output_sizes, meta, dask_gufunc_kwargs, args)
   1126     # feed DataArray apply_variable_ufunc through apply_dataarray_vfunc
   1127     elif any(isinstance(a, DataArray) for a in args):
-> 1128         return apply_dataarray_vfunc(
   1129             variables_vfunc,
   1130             args,

~/dotfiles/conda/base/.conda/lib/python3.8/site-packages/xarray/core/computation.py in apply_dataarray_vfunc(func, signature, join, exclude_dims, keep_attrs, args)
    269
    270     data_vars = [getattr(a, "variable", a) for a in args]
--> 271     result_var = func(data_vars)
    272
    273     if signature.num_outputs > 1:

~/dotfiles/conda/base/.conda/lib/python3.8/site-packages/xarray/core/computation.py in apply_variable_ufunc(func, signature, exclude_dims, dask, output_dtypes, vectorize, keep_attrs, dask_gufunc_kwargs, args)
    722         )
    723
--> 724     result_data = func(input_data)
    725
    726     if signature.num_outputs == 1:

~/dotfiles/conda/base/.conda/lib/python3.8/site-packages/xarray/core/computation.py in func(*arrays)
    690             import dask.array as da
    691
--> 692             res = da.apply_gufunc(
    693                 numpy_func,
    694                 signature.to_gufunc_string(exclude_dims),

~/dotfiles/conda/base/.conda/lib/python3.8/site-packages/dask/array/gufunc.py in apply_gufunc(func, signature, args, kwargs)
    441             )
    442         else:
--> 443             meta = tuple(
    444                 meta_from_array(sample, dtype=odt)
    445                 for ocd, odt in zip((output_coredimss,), (output_dtypes,))

~/dotfiles/conda/base/.conda/lib/python3.8/site-packages/dask/array/gufunc.py in <genexpr>(.0)
    442         else:
    443             meta = tuple(
--> 444                 meta_from_array(sample, dtype=odt)
    445                 for ocd, odt in zip((output_coredimss,), (output_dtypes,))
    446             )

~/dotfiles/conda/base/.conda/lib/python3.8/site-packages/dask/array/utils.py in meta_from_array(x, ndim, dtype)
    104         meta = np.array(meta)
    105
--> 106     if dtype and meta.dtype != dtype:
    107         try:
    108             meta = meta.astype(dtype)

TypeError: Field elements must be 2- or 3-tuples, got 'dtype('float64')'
```
Preliminary investigation using %debug:
```
ipdb> print(dtype)
[dtype('float64'), dtype('uint16')]
ipdb> print(meta.dtype)
float64
ipdb> print(dtype)
[dtype('float64'), dtype('uint16')]
``` |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/5034/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
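The %debug output above shows `dtype` reaching dask's `meta_from_array` as a plain Python list of two dtypes. A minimal sketch of the NumPy-level failure, assuming plain NumPy only (no xarray or dask involved; `output_dtypes` below is a hypothetical stand-in for the value seen in the debugger): NumPy reads a list as a structured-dtype specification whose elements must be `(name, type)` tuples, which produces the same `TypeError` as in the traceback.

```python
import numpy as np

# Hypothetical stand-in for the per-output dtypes that the issue passes as a list.
output_dtypes = [np.dtype("float64"), np.dtype("uint16")]

# Coercing a list of bare dtypes to a single dtype fails: NumPy expects a
# structured-dtype spec made of (name, type) tuples, not dtype objects.
try:
    np.dtype(output_dtypes)
except TypeError as exc:
    print(exc)  # e.g. "Field elements must be 2- or 3-tuples, got 'dtype('float64')'"
```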
733201109 | MDU6SXNzdWU3MzMyMDExMDk= | 4556 | quick overview example not working with `to_zarr` function with gcs store | skgbanga 8398696 | closed | 0 | 4 | 2020-10-30T13:54:43Z | 2021-04-19T03:18:50Z | 2021-04-19T03:18:50Z | NONE | Hello, consider the following code:
```py
import os

import xarray as xr
import numpy as np
import zarr
import gcsfs

from .helpers import project, credentials, bucketname  # project specific


def make_store(key):
    if key == "memory":
        return zarr.MemoryStore()
    if key == "disc":
        return zarr.DirectoryStore("example.zarr")
    if key == "gcs":
        gcs = gcsfs.GCSFileSystem(project=project(), token=credentials())
        root = os.path.join(bucketname, "xarray-testing")
        return gcsfs.GCSMap(root, gcs=gcs, check=False)


data = xr.DataArray(np.random.randn(2, 3), dims=("x", "y"), coords={"x": [10, 20]})
ds = xr.Dataset({"foo": data, "bar": ("x", [1, 2]), "baz": np.pi})
ds.to_zarr(make_store("gcs"), consolidated=True, mode="w")
```
The example dataset is from the quick overview example. The above code works fine for both the `memory` and `disc` stores, but fails with the `gcs` store:
```py
ipdb> p data
array(3.14159265)
```
I have also implemented a custom zarr store (details of which are present in this zarr issue), which gives more insight into the problem:
```py
~/.venv/valkyrie/lib/python3.8/site-packages/zarr/core.py in set_basic_selection(self, selection, value, fields)

~/.venv/valkyrie/lib/python3.8/site-packages/zarr/core.py in _set_basic_selection_zd(self, selection, value, fields)

~gcsstore.py in __setitem__(self, key, value)

~/.venv/valkyrie/lib/python3.8/site-packages/google/cloud/storage/blob.py in upload_from_string(self, data, content_type, client, predefined_acl, if_generation_match, if_generation_not_match, if_metageneration_match, if_metageneration_not_match, timeout, checksum)
   2437             "md5", "crc32c" and None. The default is None.
   2438         """
-> 2439         data = _to_bytes(data, encoding="utf-8")
   2440         string_buffer = BytesIO(data)
   2441         self.upload_from_file(

~/.venv/valkyrie/lib/python3.8/site-packages/google/cloud/_helpers.py in _to_bytes(value, encoding)
    368         return result
    369     else:
--> 370         raise TypeError("%r could not be converted to bytes" % (value,))
    371
    372

TypeError: array(3.14159265) could not be converted to bytes
```
It seems to me that zarr is not converting the data into its serialized representation (via its codec library) and is passing it directly to the MutableMapping, which results in an exception since the Google libraries don't know how to convert the passed data (np.pi) into bytes.
```py
ipdb> u
ipdb> p key
'baz/0'
ipdb> p value
array(3.14159265)
```
Please let me know if you think I should raise this issue in the zarr project rather than here.

Versions of xarray and zarr:
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4556/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
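For the report above, a minimal sketch of the failure mode as the reporter describes it, assuming a hypothetical `BytesOnlyStore` as a stand-in for a bytes-only mapping such as the GCS-backed store (no zarr or GCS involved): encoded chunk bytes are accepted, but a raw 0-d array that reaches the store unencoded cannot be converted to bytes.

```python
import numpy as np


class BytesOnlyStore(dict):
    """Hypothetical stand-in for a MutableMapping that only accepts bytes."""

    def __setitem__(self, key, value):
        if not isinstance(value, (bytes, bytearray)):
            raise TypeError(f"{value!r} could not be converted to bytes")
        super().__setitem__(key, value)


store = BytesOnlyStore()
store["foo/0.0"] = b"\x00" * 48  # encoded chunk bytes: accepted

try:
    store["baz/0"] = np.array(np.pi)  # raw 0-d array, as seen in the debugger output
except TypeError as exc:
    print(exc)  # similar to the error in the traceback above
```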
830930964 | MDU6SXNzdWU4MzA5MzA5NjQ= | 5032 | Error when using xarray where that results in zero dimensions. | imcslatte 5339008 | closed | 0 | 4 | 2021-03-13T16:09:30Z | 2021-04-19T02:48:39Z | 2021-04-19T02:48:38Z | NONE | Xarray 0.17.0. Executing the following code:
```python
import xarray as xray

ncurl='http://tds.coaps.fsu.edu/thredds/dodsC/samos/data/intermediate/ZMFR/2021/ZMFR_20210303v20001.nc'
dat = xray.open_dataset(ncurl)
# dat=dat.where(dat.lat>-90.0,drop=True)
dat=dat.where(dat.lat>-11.0,drop=True)
dat=dat.where(dat.lat<30.0,drop=True)
dat=dat.where(dat.lon>91.0,drop=True)
dat=dat.where(dat.lon<139.0,drop=True)
print(dat)
```
Results in the error:
```python
Traceback (most recent call last):
  File "/home/hunter/miniconda3/lib/python3.8/site-packages/xarray/backends/netCDF4_.py", line 98, in _getitem
    array = getitem(original_array, key)
  File "/home/hunter/miniconda3/lib/python3.8/site-packages/xarray/backends/common.py", line 53, in robust_getitem
    return array[key]
  File "src/netCDF4/_netCDF4.pyx", line 4397, in netCDF4._netCDF4.Variable.__getitem__
  File "/home/hunter/miniconda3/lib/python3.8/site-packages/netCDF4/utils.py", line 467, in _out_array_shape
    c = count[..., i].ravel()[0]  # All elements should be identical.
IndexError: index 0 is out of bounds for axis 0 with size 0

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "./test_xarray_error.py", line 7, in <module>
    dat=dat.where(dat.lat>-11.0,drop=True)
  File "/home/hunter/miniconda3/lib/python3.8/site-packages/xarray/core/common.py", line 1273, in where
    return ops.where_method(self, cond, other)
  File "/home/hunter/miniconda3/lib/python3.8/site-packages/xarray/core/ops.py", line 195, in where_method
    return apply_ufunc(
  File "/home/hunter/miniconda3/lib/python3.8/site-packages/xarray/core/computation.py", line 1116, in apply_ufunc
    return apply_dataset_vfunc(
  File "/home/hunter/miniconda3/lib/python3.8/site-packages/xarray/core/computation.py", line 428, in apply_dataset_vfunc
    result_vars = apply_dict_of_variables_vfunc(
  File "/home/hunter/miniconda3/lib/python3.8/site-packages/xarray/core/computation.py", line 373, in apply_dict_of_variables_vfunc
    result_vars[name] = func(*variable_args)
  File "/home/hunter/miniconda3/lib/python3.8/site-packages/xarray/core/computation.py", line 628, in apply_variable_ufunc
    input_data = [
  File "/home/hunter/miniconda3/lib/python3.8/site-packages/xarray/core/computation.py", line 629, in <listcomp>
    broadcast_compat_data(arg, broadcast_dims, core_dims)
  File "/home/hunter/miniconda3/lib/python3.8/site-packages/xarray/core/computation.py", line 542, in broadcast_compat_data
    data = variable.data
  File "/home/hunter/miniconda3/lib/python3.8/site-packages/xarray/core/variable.py", line 374, in data
    return self.values
  File "/home/hunter/miniconda3/lib/python3.8/site-packages/xarray/core/variable.py", line 554, in values
    return _as_array_or_item(self._data)
  File "/home/hunter/miniconda3/lib/python3.8/site-packages/xarray/core/variable.py", line 287, in _as_array_or_item
    data = np.asarray(data)
  File "/home/hunter/miniconda3/lib/python3.8/site-packages/numpy/core/_asarray.py", line 102, in asarray
    return array(a, dtype, copy=False, order=order)
  File "/home/hunter/miniconda3/lib/python3.8/site-packages/xarray/core/indexing.py", line 693, in __array__
    self._ensure_cached()
  File "/home/hunter/miniconda3/lib/python3.8/site-packages/xarray/core/indexing.py", line 690, in _ensure_cached
    self.array = NumpyIndexingAdapter(np.asarray(self.array))
  File "/home/hunter/miniconda3/lib/python3.8/site-packages/numpy/core/_asarray.py", line 102, in asarray
    return array(a, dtype, copy=False, order=order)
  File "/home/hunter/miniconda3/lib/python3.8/site-packages/xarray/core/indexing.py", line 663, in __array__
    return np.asarray(self.array, dtype=dtype)
  File "/home/hunter/miniconda3/lib/python3.8/site-packages/numpy/core/_asarray.py", line 102, in asarray
    return array(a, dtype, copy=False, order=order)
  File "/home/hunter/miniconda3/lib/python3.8/site-packages/xarray/core/indexing.py", line 568, in __array__
    return np.asarray(array[self.key], dtype=None)
  File "/home/hunter/miniconda3/lib/python3.8/site-packages/xarray/coding/strings.py", line 236, in __getitem__
    return _numpy_char_to_bytes(self.array[key])
  File "/home/hunter/miniconda3/lib/python3.8/site-packages/xarray/coding/strings.py", line 193, in _numpy_char_to_bytes
    arr = np.array(arr, copy=False, order="C")
  File "/home/hunter/miniconda3/lib/python3.8/site-packages/xarray/core/indexing.py", line 568, in __array__
    return np.asarray(array[self.key], dtype=None)
  File "/home/hunter/miniconda3/lib/python3.8/site-packages/xarray/backends/netCDF4_.py", line 85, in __getitem__
    return indexing.explicit_indexing_adapter(
  File "/home/hunter/miniconda3/lib/python3.8/site-packages/xarray/core/indexing.py", line 853, in explicit_indexing_adapter
    result = raw_indexing_method(raw_key.tuple)
  File "/home/hunter/miniconda3/lib/python3.8/site-packages/xarray/backends/netCDF4_.py", line 108, in _getitem
    raise IndexError(msg)
IndexError: The indexing operation you are attempting to perform is not valid on netCDF4.Variable object. Try loading your data into memory first by calling .load().
```
It seems to be related to the fact that the where() call produces a selection with zero size (note the inner IndexError above: axis 0 has size 0).
However, if I uncomment one line and run:
```python
import xarray as xray

ncurl='http://tds.coaps.fsu.edu/thredds/dodsC/samos/data/intermediate/ZMFR/2021/ZMFR_20210303v20001.nc'
dat = xray.open_dataset(ncurl)
dat=dat.where(dat.lat>-90.0,drop=True)
dat=dat.where(dat.lat>-11.0,drop=True)
dat=dat.where(dat.lat<30.0,drop=True)
dat=dat.where(dat.lon>91.0,drop=True)
dat=dat.where(dat.lon<139.0,drop=True)
print(dat)
```
There is no error, even though the line with `dat.lat>-90.0` should not change the selection. Perhaps this is not a bug and I am misunderstanding how where() works in xarray, but it seems inconsistent.

Eli Hunter
Rutgers University |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/5032/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue |
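For the report above, the final IndexError itself suggests a workaround: load the remote dataset into memory before the chained where() calls, so the zero-size selection is applied to in-memory NumPy arrays rather than to the lazily indexed netCDF4 variables. A sketch of that workaround (untested here; it assumes the THREDDS URL from the report is still reachable):

```python
import xarray as xr

ncurl = "http://tds.coaps.fsu.edu/thredds/dodsC/samos/data/intermediate/ZMFR/2021/ZMFR_20210303v20001.nc"

# Pull the data into memory first, as the error message recommends, then filter.
dat = xr.open_dataset(ncurl).load()
dat = dat.where(dat.lat > -11.0, drop=True)
dat = dat.where(dat.lat < 30.0, drop=True)
dat = dat.where(dat.lon > 91.0, drop=True)
dat = dat.where(dat.lon < 139.0, drop=True)
print(dat)
```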
CREATE TABLE [issues] (
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [number] INTEGER,
   [title] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [state] TEXT,
   [locked] INTEGER,
   [assignee] INTEGER REFERENCES [users]([id]),
   [milestone] INTEGER REFERENCES [milestones]([id]),
   [comments] INTEGER,
   [created_at] TEXT,
   [updated_at] TEXT,
   [closed_at] TEXT,
   [author_association] TEXT,
   [active_lock_reason] TEXT,
   [draft] INTEGER,
   [pull_request] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [state_reason] TEXT,
   [repo] INTEGER REFERENCES [repos]([id]),
   [type] TEXT
);
CREATE INDEX [idx_issues_repo] ON [issues] ([repo]);
CREATE INDEX [idx_issues_milestone] ON [issues] ([milestone]);
CREATE INDEX [idx_issues_assignee] ON [issues] ([assignee]);
CREATE INDEX [idx_issues_user] ON [issues] ([user]);
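For reference, a sketch of how the filter described at the top of this page could be reproduced against a local copy of the database (assumptions: the file name `github.db` is hypothetical, and the "closed_at is on date" facet is taken to mean SQLite's `date()` applied to the stored ISO timestamp):

```python
import sqlite3

# Hypothetical local copy of the issues database exported from this page.
conn = sqlite3.connect("github.db")
rows = conn.execute(
    """
    SELECT id, number, title, comments, updated_at, closed_at
    FROM issues
    WHERE date(closed_at) = '2021-04-19'
      AND comments = 4
      AND repo = 13221727
    ORDER BY updated_at DESC
    """
).fetchall()
for row in rows:
    print(row)
```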