issues
66 rows where type = "issue" and user = 14808389 sorted by updated_at descending
id | node_id | number | title | user | state | locked | assignee | milestone | comments | created_at | updated_at ▲ | closed_at | author_association | active_lock_reason | draft | pull_request | body | reactions | performed_via_github_app | state_reason | repo | type |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
590630281 | MDU6SXNzdWU1OTA2MzAyODE= | 3921 | issues discovered by the all-but-dask CI | keewis 14808389 | closed | 0 | 4 | 2020-03-30T22:08:46Z | 2024-04-25T14:48:15Z | 2024-02-10T02:57:34Z | MEMBER | After adding the |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/3921/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
2079089277 | I_kwDOAMm_X8577GJ9 | 8607 | allow computing just a small number of variables | keewis 14808389 | open | 0 | 4 | 2024-01-12T15:21:27Z | 2024-01-12T20:20:29Z | MEMBER | Is your feature request related to a problem? I frequently find myself computing a handful of variables of a dataset (typically coordinates) and assigning them back to the dataset, and wishing we had a method / function that allowed that. Describe the solution you'd like: I'd imagine something like
Describe alternatives you've considered: So far I've been using something like
Additional context: No response |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8607/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | issue | ||||||||
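The body above is truncated where the proposed API and the current workaround would appear; a minimal sketch of the kind of helper the issue asks for (the name `compute_subset` is purely illustrative, not a proposed xarray API):

```python
import xarray as xr


def compute_subset(ds: xr.Dataset, *names: str) -> xr.Dataset:
    """Compute only the named variables/coordinates and assign them back."""
    computed = {name: ds[name].compute() for name in names}
    ds = ds.assign_coords({k: v for k, v in computed.items() if k in ds.coords})
    return ds.assign({k: v for k, v in computed.items() if k in ds.data_vars})
```

With dask-backed data this leaves every other variable lazy instead of computing the whole dataset just to make a few coordinates concrete.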
1655290694 | I_kwDOAMm_X85iqbtG | 7721 | `as_shared_dtype` converts scalars to 0d `numpy` arrays if chunked `cupy` is involved | keewis 14808389 | open | 0 | 7 | 2023-04-05T09:48:34Z | 2023-12-04T10:45:43Z | MEMBER | I tried to run TypeError Traceback (most recent call last) Cell In[4], line 1 ----> 1 arr.chunk().where(mask).compute() File ~/repos/xarray/xarray/core/dataarray.py:1095, in DataArray.compute(self, kwargs) 1076 """Manually trigger loading of this array's data from disk or a 1077 remote source into memory and return a new array. The original is 1078 left unaltered. (...) 1092 dask.compute 1093 """ 1094 new = self.copy(deep=False) -> 1095 return new.load(kwargs) File ~/repos/xarray/xarray/core/dataarray.py:1069, in DataArray.load(self, kwargs) 1051 def load(self: T_DataArray, kwargs) -> T_DataArray: 1052 """Manually trigger loading of this array's data from disk or a 1053 remote source into memory and return this array. 1054 (...) 1067 dask.compute 1068 """ -> 1069 ds = self._to_temp_dataset().load(**kwargs) 1070 new = self._from_temp_dataset(ds) 1071 self._variable = new._variable File ~/repos/xarray/xarray/core/dataset.py:752, in Dataset.load(self, kwargs) 749 import dask.array as da 751 # evaluate all the dask arrays simultaneously --> 752 evaluated_data = da.compute(*lazy_data.values(), kwargs) 754 for k, data in zip(lazy_data, evaluated_data): 755 self.variables[k].data = data File ~/.local/opt/mambaforge/envs/xarray/lib/python3.10/site-packages/dask/base.py:600, in compute(traverse, optimize_graph, scheduler, get, args, kwargs) 597 keys.append(x.dask_keys()) 598 postcomputes.append(x.dask_postcompute()) --> 600 results = schedule(dsk, keys, kwargs) 601 return repack([f(r, a) for r, (f, a) in zip(results, postcomputes)]) File ~/.local/opt/mambaforge/envs/xarray/lib/python3.10/site-packages/dask/threaded.py:89, in get(dsk, keys, cache, num_workers, pool, kwargs) 86 elif isinstance(pool, multiprocessing.pool.Pool): 87 pool = MultiprocessingPoolExecutor(pool) ---> 89 results = get_async( 90 pool.submit, 91 pool._max_workers, 92 dsk, 93 keys, 94 cache=cache, 95 get_id=_thread_get_id, 96 pack_exception=pack_exception, 97 kwargs, 98 ) 100 # Cleanup pools associated to dead threads 101 with pools_lock: File ~/.local/opt/mambaforge/envs/xarray/lib/python3.10/site-packages/dask/local.py:511, in get_async(submit, num_workers, dsk, result, cache, get_id, rerun_exceptions_locally, pack_exception, raise_exception, callbacks, dumps, loads, chunksize, **kwargs) 509 _execute_task(task, data) # Re-execute locally 510 else: --> 511 raise_exception(exc, tb) 512 res, worker_id = loads(res_info) 513 state["cache"][key] = res File ~/.local/opt/mambaforge/envs/xarray/lib/python3.10/site-packages/dask/local.py:319, in reraise(exc, tb) 317 if exc.traceback is not tb: 318 raise exc.with_traceback(tb) --> 319 raise exc File ~/.local/opt/mambaforge/envs/xarray/lib/python3.10/site-packages/dask/local.py:224, in execute_task(key, task_info, dumps, loads, get_id, pack_exception) 222 try: 223 task, data = loads(task_info) --> 224 result = _execute_task(task, data) 225 id = get_id() 226 result = dumps((result, id)) File ~/.local/opt/mambaforge/envs/xarray/lib/python3.10/site-packages/dask/core.py:119, in _execute_task(arg, cache, dsk) 115 func, args = arg[0], arg[1:] 116 # Note: Don't assign the subtask results to a variable. numpy detects 117 # temporaries by their reference count and can execute certain 118 # operations in-place. 
--> 119 return func(*(_execute_task(a, cache) for a in args)) 120 elif not ishashable(arg): 121 return arg File ~/.local/opt/mambaforge/envs/xarray/lib/python3.10/site-packages/dask/optimization.py:990, in SubgraphCallable.call(self, *args) 988 if not len(args) == len(self.inkeys): 989 raise ValueError("Expected %d args, got %d" % (len(self.inkeys), len(args))) --> 990 return core.get(self.dsk, self.outkey, dict(zip(self.inkeys, args))) File ~/.local/opt/mambaforge/envs/xarray/lib/python3.10/site-packages/dask/core.py:149, in get(dsk, out, cache) 147 for key in toposort(dsk): 148 task = dsk[key] --> 149 result = _execute_task(task, cache) 150 cache[key] = result 151 result = _execute_task(out, cache) File ~/.local/opt/mambaforge/envs/xarray/lib/python3.10/site-packages/dask/core.py:119, in _execute_task(arg, cache, dsk) 115 func, args = arg[0], arg[1:] 116 # Note: Don't assign the subtask results to a variable. numpy detects 117 # temporaries by their reference count and can execute certain 118 # operations in-place. --> 119 return func(*(_execute_task(a, cache) for a in args)) 120 elif not ishashable(arg): 121 return arg File <array_function internals>:180, in where(args, *kwargs) File cupy/_core/core.pyx:1723, in cupy._core.core._ndarray_base.array_function() File ~/.local/opt/mambaforge/envs/xarray/lib/python3.10/site-packages/cupy/_sorting/search.py:211, in where(condition, x, y) 209 if fusion._is_fusing(): 210 return fusion._call_ufunc(_where_ufunc, condition, x, y) --> 211 return _where_ufunc(condition.astype('?'), x, y) File cupy/_core/_kernel.pyx:1287, in cupy._core._kernel.ufunc.call() File cupy/_core/_kernel.pyx:160, in cupy._core._kernel._preprocess_args() File cupy/_core/_kernel.pyx:146, in cupy._core._kernel._preprocess_arg() TypeError: Unsupported type <class 'numpy.ndarray'>
I think the reason is that this: https://github.com/pydata/xarray/blob/d4db16699f30ad1dc3e6861601247abf4ac96567/xarray/core/duck_array_ops.py#L195 is not sufficient to detect cc @jacobtomlinson |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/7721/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | issue | ||||||||
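The truncated report above describes `arr.chunk().where(mask).compute()` failing on `cupy`-backed data; a hedged sketch of the failing pattern (it requires `cupy` and a CUDA device, so it is illustrative only):

```python
import cupy
import xarray as xr

# cupy-backed DataArray and mask
arr = xr.DataArray(cupy.arange(4, dtype=float), dims="x")
mask = xr.DataArray(cupy.array([True, False, True, False]), dims="x")

# works eagerly, but once the data is wrapped in dask the scalar fill value
# reaches cupy as a 0d numpy array and raises the TypeError shown above
arr.where(mask)
arr.chunk().where(mask).compute()
```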
1730451312 | I_kwDOAMm_X85nJJdw | 7879 | occasional segfaults on CI | keewis 14808389 | closed | 0 | 3 | 2023-05-29T09:52:01Z | 2023-11-06T22:03:43Z | 2023-11-06T22:03:42Z | MEMBER | The upstream-dev CI currently fails sometimes due to a segfault (the normal CI crashes, too, but since we use I'm not sure why, and I can't reproduce locally, either. Given that log of the segfaulting CI job``` ============================= test session starts ============================== platform linux -- Python 3.10.11, pytest-7.3.1, pluggy-1.0.0 rootdir: /home/runner/work/xarray/xarray configfile: setup.cfg testpaths: xarray/tests, properties plugins: env-0.8.1, xdist-3.3.1, timeout-2.1.0, cov-4.1.0, reportlog-0.1.2, hypothesis-6.75.6 timeout: 60.0s timeout method: signal timeout func_only: False collected 16723 items / 2 skipped xarray/tests/test_accessor_dt.py ....................................... [ 0%] ........................................................................ [ 0%] ........................................................................ [ 1%] ........................................................................ [ 1%] ............................... [ 1%] xarray/tests/test_accessor_str.py ...................................... [ 1%] ........................................................................ [ 2%] ........................................................................ [ 2%] ............................................. [ 3%] xarray/tests/test_array_api.py ........... [ 3%] xarray/tests/test_backends.py ........................X........x........ [ 3%] ...................................s.........................X........x. [ 3%] .........................................s.........................X.... [ 4%] ....x.......................................X........................... [ 4%] ....................................X........x.......................... [ 5%] .............X.......................................................... [ 5%] ....X........x....................x.x................X.................. [ 5%] x..x..x..x...................................X........x................. [ 6%] ...x.x................X..................x..x..x..x..................... [ 6%] ..............X........x....................x.x................X........ [ 7%] ..........x..x..x..x.......................................X........x... [ 7%] ...........................................X........x................... [ 8%] ...ss........................X........x................................. [ 8%] .................X........x............................................. [ 8%] ..X........x.............................................X........x..... [ 9%] ............................................X........x.................. [ 9%] .........................................................X........x..... [ 10%] ......................................................................X. [ 10%] .......x................................................................ 
[ 11%] Fatal Python error: Segmentation fault Thread 0x00007f9c7b8ff640 (most recent call first): File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/concurrent/futures/thread.py", line 81 in _worker File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/threading.py", line 953 in run File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/threading.py", line 1016 in _bootstrap_inner File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/threading.py", line 973 in _bootstrap Current thread 0x00007f9c81f1d640 (most recent call first): File "/home/runner/work/xarray/xarray/xarray/backends/file_manager.py", line 216 in _acquire_with_cache_info File "/home/runner/work/xarray/xarray/xarray/backends/file_manager.py", line 198 in acquire_context File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/contextlib.py", line 135 in __enter__ File "/home/runner/work/xarray/xarray/xarray/backends/netCDF4_.py", line 392 in _acquire File "/home/runner/work/xarray/xarray/xarray/backends/netCDF4_.py", line 398 in ds File "/home/runner/work/xarray/xarray/xarray/backends/netCDF4_.py", line 336 in __init__ File "/home/runner/work/xarray/xarray/xarray/backends/netCDF4_.py", line 389 in open File "/home/runner/work/xarray/xarray/xarray/backends/netCDF4_.py", line 588 in open_dataset File "/home/runner/work/xarray/xarray/xarray/backends/api.py", line 566 in open_dataset File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/dask/utils.py", line 73 in apply File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/dask/core.py", line 121 in _execute_task File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/dask/local.py", line 224 in execute_task File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/dask/local.py", line 238 in <listcomp> File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/dask/local.py", line 238 in batch_execute_tasks File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/concurrent/futures/thread.py", line 58 in run File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/concurrent/futures/thread.py", line 83 in _worker File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/threading.py", line 953 in run File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/threading.py", line 1016 in _bootstrap_inner File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/threading.py", line 973 in _bootstrap Thread 0x00007f9c82f1e640 (most recent call first): File "/home/runner/work/xarray/xarray/xarray/backends/file_manager.py", line 216 in _acquire_with_cache_info File "/home/runner/work/xarray/xarray/xarray/backends/file_manager.py", line 198 in acquire_context File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/contextlib.py", line 135 in __enter__ File "/home/runner/work/xarray/xarray/xarray/backends/netCDF4_.py", line 392 in _acquire File "/home/runner/work/xarray/xarray/xarray/backends/netCDF4_.py", line 398 in ds File "/home/runner/work/xarray/xarray/xarray/backends/netCDF4_.py", line 336 in __init__ File "/home/runner/work/xarray/xarray/xarray/backends/netCDF4_.py", line 389 in open File "/home/runner/work/xarray/xarray/xarray/backends/netCDF4_.py", line 588 in open_dataset File "/home/runner/work/xarray/xarray/xarray/backends/api.py", line 566 in open_dataset File 
"/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/dask/utils.py", line 73 in apply File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/dask/core.py", line 121 in _execute_task File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/dask/local.py", line 224 in execute_task File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/dask/local.py", line 238 in <listcomp> File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/dask/local.py", line 238 in batch_execute_tasks File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/concurrent/futures/thread.py", line 58 in run File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/concurrent/futures/thread.py", line 83 in _worker File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/threading.py", line 953 in run File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/threading.py", line 1016 in _bootstrap_inner File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/threading.py", line 973 in _bootstrap Thread 0x00007f9ca575e740 (most recent call first): File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/threading.py", line 320 in wait File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/queue.py", line 171 in get File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/dask/local.py", line 137 in queue_get File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/dask/local.py", line 500 in get_async File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/dask/threaded.py", line 89 in get File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/dask/base.py", line 595 in compute File "/home/runner/work/xarray/xarray/xarray/backends/api.py", line 1046 in open_mfdataset File "/home/runner/work/xarray/xarray/xarray/tests/test_backends.py", line 3295 in test_open_mfdataset_manyfiles File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/_pytest/python.py", line 194 in pytest_pyfunc_call File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/pluggy/_callers.py", line 39 in _multicall File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/pluggy/_manager.py", line 80 in _hookexec File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/pluggy/_hooks.py", line 265 in __call__ File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/_pytest/python.py", line 1799 in runtest File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/_pytest/runner.py", line 169 in pytest_runtest_call File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/pluggy/_callers.py", line 39 in _multicall File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/pluggy/_manager.py", line 80 in _hookexec File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/pluggy/_hooks.py", line 265 in __call__ File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/_pytest/runner.py", line 262 in <lambda> File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/_pytest/runner.py", line 341 in from_call File 
"/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/_pytest/runner.py", line 261 in call_runtest_hook File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/_pytest/runner.py", line 222 in call_and_report File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/_pytest/runner.py", line 133 in runtestprotocol File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/_pytest/runner.py", line 114 in pytest_runtest_protocol File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/pluggy/_callers.py", line 39 in _multicall File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/pluggy/_manager.py", line 80 in _hookexec File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/pluggy/_hooks.py", line 265 in __call__ File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/_pytest/main.py", line 348 in pytest_runtestloop File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/pluggy/_callers.py", line 39 in _multicall File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/pluggy/_manager.py", line 80 in _hookexec File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/pluggy/_hooks.py", line 265 in __call__ File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/_pytest/main.py", line 323 in _main File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/_pytest/main.py", line 269 in wrap_session File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/_pytest/main.py", line 316 in pytest_cmdline_main File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/pluggy/_callers.py", line 39 in _multicall File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/pluggy/_manager.py", line 80 in _hookexec File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/pluggy/_hooks.py", line 265 in __call__ File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/_pytest/config/__init__.py", line 166 in main File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/_pytest/config/__init__.py", line 189 in console_main File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/site-packages/pytest/__main__.py", line 5 in <module> File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/runpy.py", line 86 in _run_code File "/home/runner/micromamba-root/envs/xarray-tests/lib/python3.10/runpy.py", line 196 in _run_module_as_main Extension modules: numpy.core._multiarray_umath, numpy.core._multiarray_tests, numpy.linalg._umath_linalg, numpy.fft._pocketfft_internal, numpy.random._common, numpy.random.bit_generator, numpy.random._bounded_integers, numpy.random._mt19937, numpy.random.mtrand, numpy.random._philox, numpy.random._pcg64, numpy.random._sfc64, numpy.random._generator, pandas._libs.tslibs.np_datetime, pandas._libs.tslibs.dtypes, pandas._libs.tslibs.base, pandas._libs.tslibs.nattype, pandas._libs.tslibs.timezones, pandas._libs.tslibs.ccalendar, pandas._libs.tslibs.fields, pandas._libs.tslibs.timedeltas, pandas._libs.tslibs.tzconversion, pandas._libs.tslibs.timestamps, pandas._libs.properties, pandas._libs.tslibs.offsets, pandas._libs.tslibs.strptime, pandas._libs.tslibs.parsing, pandas._libs.tslibs.conversion, 
pandas._libs.tslibs.period, pandas._libs.tslibs.vectorized, pandas._libs.ops_dispatch, pandas._libs.missing, pandas._libs.hashtable, pandas._libs.algos, pandas._libs.interval, pandas._libs.lib, pandas._libs.ops, numexpr.interpreter, bottleneck.move, bottleneck.nonreduce, bottleneck.nonreduce_axis, bottleneck.reduce, pandas._libs.arrays, pandas._libs.tslib, pandas._libs.sparse, pandas._libs.indexing, pandas._libs.index, pandas._libs.internals, pandas._libs.join, pandas._libs.writers, pandas._libs.window.aggregations, pandas._libs.window.indexers, pandas._libs.reshape, pandas._libs.groupby, pandas._libs.json, pandas._libs.parsers, pandas._libs.testing, cftime._cftime, yaml._yaml, cytoolz.utils, cytoolz.itertoolz, cytoolz.functoolz, cytoolz.dicttoolz, cytoolz.recipes, xxhash._xxhash, psutil._psutil_linux, psutil._psutil_posix, markupsafe._speedups, numpy.linalg.lapack_lite, matplotlib._c_internal_utils, PIL._imaging, matplotlib._path, kiwisolver._cext, scipy._lib._ccallback_c, _cffi_backend, unicodedata2, netCDF4._netCDF4, h5py._errors, h5py.defs, h5py._objects, h5py.h5, h5py.h5r, h5py.utils, h5py.h5s, h5py.h5ac, h5py.h5p, h5py.h5t, h5py._conv, h5py.h5z, h5py._proxy, h5py.h5a, h5py.h5d, h5py.h5ds, h5py.h5g, h5py.h5i, h5py.h5f, h5py.h5fd, h5py.h5pl, h5py.h5o, h5py.h5l, h5py._selector, pyproj._compat, pyproj._datadir, pyproj._network, pyproj._geod, pyproj.list, pyproj._crs, pyproj.database, pyproj._transformer, pyproj._sync, matplotlib._image, rasterio._version, rasterio._err, rasterio._filepath, rasterio._env, rasterio._transform, rasterio._base, rasterio.crs, rasterio._features, rasterio._warp, rasterio._io, numcodecs.compat_ext, numcodecs.blosc, numcodecs.zstd, numcodecs.lz4, numcodecs._shuffle, msgpack._cmsgpack, numcodecs.vlen, zstandard.backend_c, scipy.sparse._sparsetools, _csparsetools, scipy.sparse._csparsetools, scipy.sparse.linalg._isolve._iterative, scipy.linalg._fblas, scipy.linalg._flapack, scipy.linalg.cython_lapack, scipy.linalg._cythonized_array_utils, scipy.linalg._solve_toeplitz, scipy.linalg._flinalg, scipy.linalg._matfuncs_sqrtm_triu, scipy.linalg.cython_blas, scipy.linalg._matfuncs_expm, scipy.linalg._decomp_update, scipy.sparse.linalg._dsolve._superlu, scipy.sparse.linalg._eigen.arpack._arpack, scipy.sparse.csgraph._tools, scipy.sparse.csgraph._shortest_path, scipy.sparse.csgraph._traversal, scipy.sparse.csgraph._min_spanning_tree, scipy.sparse.csgraph._flow, scipy.sparse.csgraph._matching, scipy.sparse.csgraph._reordering, scipy.spatial._ckdtree, scipy._lib.messagestream, scipy.spatial._qhull, scipy.spatial._voronoi, scipy.spatial._distance_wrap, scipy.spatial._hausdorff, scipy.special._ufuncs_cxx, scipy.special._ufuncs, scipy.special._specfun, scipy.special._comb, scipy.special._ellip_harm_2, scipy.spatial.transform._rotation, scipy.ndimage._nd_image, _ni_label, scipy.ndimage._ni_label, scipy.optimize._minpack2, scipy.optimize._group_columns, scipy.optimize._trlib._trlib, scipy.optimize._lbfgsb, _moduleTNC, scipy.optimize._moduleTNC, scipy.optimize._cobyla, scipy.optimize._slsqp, scipy.optimize._minpack, scipy.optimize._lsq.givens_elimination, scipy.optimize._zeros, scipy.optimize.__nnls, scipy.optimize._highs.cython.src._highs_wrapper, scipy.optimize._highs._highs_wrapper, scipy.optimize._highs.cython.src._highs_constants, scipy.optimize._highs._highs_constants, scipy.linalg._interpolative, scipy.optimize._bglu_dense, scipy.optimize._lsap, scipy.optimize._direct, scipy.integrate._odepack, scipy.integrate._quadpack, scipy.integrate._vode, scipy.integrate._dop, 
scipy.integrate._lsoda, scipy.special.cython_special, scipy.stats._stats, scipy.stats.beta_ufunc, scipy.stats._boost.beta_ufunc, scipy.stats.binom_ufunc, scipy.stats._boost.binom_ufunc, scipy.stats.nbinom_ufunc, scipy.stats._boost.nbinom_ufunc, scipy.stats.hypergeom_ufunc, scipy.stats._boost.hypergeom_ufunc, scipy.stats.ncf_ufunc, scipy.stats._boost.ncf_ufunc, scipy.stats.ncx2_ufunc, scipy.stats._boost.ncx2_ufunc, scipy.stats.nct_ufunc, scipy.stats._boost.nct_ufunc, scipy.stats.skewnorm_ufunc, scipy.stats._boost.skewnorm_ufunc, scipy.stats.invgauss_ufunc, scipy.stats._boost.invgauss_ufunc, scipy.interpolate._fitpack, scipy.interpolate.dfitpack, scipy.interpolate._bspl, scipy.interpolate._ppoly, scipy.interpolate.interpnd, scipy.interpolate._rbfinterp_pythran, scipy.interpolate._rgi_cython, scipy.stats._biasedurn, scipy.stats._levy_stable.levyst, scipy.stats._stats_pythran, scipy._lib._uarray._uarray, scipy.stats._statlib, scipy.stats._sobol, scipy.stats._qmc_cy, scipy.stats._mvn, scipy.stats._rcont.rcont, scipy.cluster._vq, scipy.cluster._hierarchy, scipy.cluster._optimal_leaf_ordering, shapely.lib, shapely._geos, shapely._geometry_helpers, cartopy.trace, scipy.fftpack.convolve, tornado.speedups, cf_units._udunits2, scipy.io.matlab._mio_utils, scipy.io.matlab._streams, scipy.io.matlab._mio5_utils (total: 241) /home/runner/work/_temp/b3f3888c-5349-4d19-80f6-41d140b86db5.sh: line 3: 6114 Segmentation fault (core dumped) python -m pytest --timeout=60 -rf --report-log output-3.10-log.jsonl ``` |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/7879/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
not_planned | xarray 13221727 | issue | ||||||
1158378382 | I_kwDOAMm_X85FC3OO | 6323 | propagation of `encoding` | keewis 14808389 | open | 0 | 8 | 2022-03-03T12:57:29Z | 2023-10-25T23:20:31Z | MEMBER | What is your issue? We frequently get bug reports related to There are also a few discussions with more background:
- https://github.com/pydata/xarray/pull/5065#issuecomment-806154872
- https://github.com/pydata/xarray/issues/1614
- #5082
- #5336
We discussed this in the meeting yesterday and as far as I can remember agreed that the current default behavior is not ideal and decided to investigate #5336: a cc @rabernat, @shoyer |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/6323/reactions", "total_count": 2, "+1": 2, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | issue | ||||||||
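As background for the kind of bug report mentioned (and truncated) above, here is a sketch of how `encoding` silently travels with a variable and is re-applied on save; the exact propagation behavior varies between xarray versions, so treat this as illustrative:

```python
import xarray as xr

ds = xr.tutorial.open_dataset("air_temperature")
print(ds.air.encoding)  # dtype, scale_factor, source chunking, ... from the original file

# the encoding survives (some) operations ...
subset = ds.air.isel(time=slice(0, 10))
print(subset.encoding)

# ... and is re-applied when writing, which is what most of the linked reports trip
# over; clearing it is the usual workaround
subset.encoding = {}
subset.to_netcdf("subset.nc")
```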
1440494247 | I_kwDOAMm_X85V3DKn | 7270 | type checking CI is failing | keewis 14808389 | closed | 0 | 3 | 2022-11-08T16:10:24Z | 2023-04-15T18:31:59Z | 2023-04-15T18:31:59Z | MEMBER | The most recent runs of the type checking CI have started to fail with a segfault:
7269 pinned |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/7270/reactions", "total_count": 2, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 1, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
683142059 | MDU6SXNzdWU2ODMxNDIwNTk= | 4361 | restructure the contributing guide | keewis 14808389 | open | 0 | 5 | 2020-08-20T22:51:39Z | 2023-03-31T17:39:00Z | MEMBER | From #4355 @max-sixty:
We could also add a docstring guide since the |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4361/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | issue | ||||||||
1306795760 | I_kwDOAMm_X85N5B7w | 6793 | improve docstrings with examples and links | keewis 14808389 | open | 0 | 10 | 2022-07-16T12:30:33Z | 2023-03-24T16:33:28Z | MEMBER | This is an (incomplete) checklist for #5816 to make it easier to find methods that are in need of examples and links to the narrative docs with further information (of course, changes to the docstrings of all other methods / functions part of the public API are also appreciated). Good examples explicitly construct small xarray objects to make it easier to follow (e.g. use Use
To easily generate the expected output install To link to other documentation pages we can use
Top-level functions:
- [ ] I/O:
- [ ] Contents:
- [ ] Comparisons:
- [ ] Dask:
- [ ] Missing values:
- [ ] Indexing:
- [ ] Aggregations:
- [ ] |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/6793/reactions", "total_count": 4, "+1": 4, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | issue | ||||||||
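As a concrete illustration of the guidance above (small, explicitly constructed objects, doctest format, `:py:func:`-style links), a hypothetical docstring might look like the following; `center` is an invented example, and the exact repr in the doctest output depends on the xarray version:

```python
def center(obj, dim):
    """Remove the mean along a dimension.

    See Also
    --------
    :py:meth:`DataArray.mean`

    Examples
    --------
    >>> import numpy as np
    >>> import xarray as xr
    >>> da = xr.DataArray(np.array([1.0, 2.0, 3.0]), dims="x")
    >>> da - da.mean("x")
    <xarray.DataArray (x: 3)>
    array([-1.,  0.,  1.])
    Dimensions without coordinates: x
    """
    return obj - obj.mean(dim)
```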
760574919 | MDU6SXNzdWU3NjA1NzQ5MTk= | 4670 | increase the visibility of the upstream-dev PR CI | keewis 14808389 | closed | 0 | 3 | 2020-12-09T18:37:57Z | 2022-12-29T21:15:05Z | 2021-01-19T15:27:26Z | MEMBER | We currently have two upstream-dev PR CIs: the old pipelines CI and the new github actions CI added together with the scheduled upstream-dev ("nightly") CI. Since we don't need both, I think we should disable one of these, presumably the old pipelines CI. There's an issue with the CI result, though: since github doesn't have an icon for "passed with issues", we have to choose between "passed" or "failed" as the CI result (neither of which is optimal). The advantage of using "failed" is that a failure is easily visible, but often the total CI result on PRs is set to "failed" because we didn't get around to fixing bugs introduced by recent changes to dependencies (which is confusing for contributors). In #4584 I switched the pipelines upstream-dev CI to "allowed failure" so we get a warning instead of a failure. However, github doesn't print the number of warnings in the summary line, which means that if the icon is green nobody checks the status and upstream-dev CI failures are easily missed. Our new scheduled nightly CI improves the situation quite a bit since we automatically get an issue containing the failures, but that means we aren't able to catch these failures before actually merging. As pointed out in https://github.com/pydata/xarray/issues/4574#issuecomment-725795622 that might be acceptable, though. If we still want to fix this, we could have the PR CI automatically add a comment to the PR, which would contain a summary of the failures but also state that these failures can be ignored as long as they get the approval of a maintainer. This would increase the noise on PRs, though. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4670/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
532696790 | MDU6SXNzdWU1MzI2OTY3OTA= | 3594 | support for units with pint | keewis 14808389 | open | 0 | 7 | 2019-12-04T13:49:28Z | 2022-08-03T11:44:05Z | MEMBER |
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/3594/reactions", "total_count": 14, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 14, "rocket": 0, "eyes": 0 } |
xarray 13221727 | issue | ||||||||
1250755592 | I_kwDOAMm_X85KjQQI | 6645 | pre-release for v2022.06.0 | keewis 14808389 | closed | 0 | 14 | 2022-05-27T13:14:06Z | 2022-07-22T15:44:59Z | 2022-07-22T15:44:59Z | MEMBER | There are a few unreleased and potentially breaking changes in I am planning to create the pre-release tomorrow, but if there are any big changes that should be included please post here. cc @TomNicholas Edit: the version will be called |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/6645/reactions", "total_count": 5, "+1": 5, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
818957820 | MDU6SXNzdWU4MTg5NTc4MjA= | 4976 | reported version in the docs is misleading | keewis 14808389 | closed | 0 | 3 | 2021-03-01T15:08:12Z | 2022-07-10T13:00:46Z | 2022-07-10T13:00:46Z | MEMBER | The editable install on RTD is reported to have the version This is not something I can reproduce with We should try to get the version right. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4976/reactions", "total_count": 3, "+1": 3, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
1280027449 | I_kwDOAMm_X85MS6s5 | 6714 | `mypy` CI is failing on `main` | keewis 14808389 | closed | 0 | 1 | 2022-06-22T11:54:16Z | 2022-06-22T16:01:45Z | 2022-06-22T16:01:45Z | MEMBER | The most recent runs of the |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/6714/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
1265366275 | I_kwDOAMm_X85La_UD | 6678 | exception groups | keewis 14808389 | open | 0 | 1 | 2022-06-08T22:09:37Z | 2022-06-08T23:38:28Z | MEMBER | What is your issue? As I mentioned in the meeting today, we have a lot of features where the exception group support from PEP654 (which is scheduled for python 3.11 and consists of the class and a syntax change) might be useful. For example, we might want to collect all errors raised by For |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/6678/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | issue | ||||||||
624778130 | MDU6SXNzdWU2MjQ3NzgxMzA= | 4095 | merging non-dimension coordinates with the Dataset constructor | keewis 14808389 | open | 0 | 1 | 2020-05-26T10:30:37Z | 2022-04-19T13:54:43Z | MEMBER | When adding two This fails:
```python
In [1]: import xarray as xr
   ...: import numpy as np

In [2]: a = np.linspace(0, 1, 10)
   ...: b = np.linspace(-1, 0, 12)
   ...:
   ...: x_a = np.arange(10)
   ...: x_b = np.arange(12)
   ...:
   ...: y_a = x_a * 1000
   ...: y_b = x_b * 1000
   ...:
   ...: arr1 = xr.DataArray(data=a, coords={"x": x_a, "y": ("x", y_a)}, dims="x")
   ...: arr2 = xr.DataArray(data=b, coords={"x": x_b, "y": ("x", y_b)}, dims="x")
   ...:
   ...: xr.Dataset({"a": arr1, "b": arr2})
...
MergeError: conflicting values for variable 'y' on objects to be combined. You can skip this check by specifying compat='override'.
```
I can work around this by calling: |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4095/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | issue | ||||||||
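The workaround sentence above is cut off; following the hint in the `MergeError` message, one possibility (a sketch, not necessarily the call the author had in mind) is to merge explicitly and skip the compatibility check:

```python
import xarray as xr

# compat="override" skips the equality check and keeps "y" from the first object;
# join="outer" still aligns the differing "x" dimensions
ds = xr.merge([arr1.rename("a"), arr2.rename("b")], join="outer", compat="override")
```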
628436420 | MDU6SXNzdWU2Mjg0MzY0MjA= | 4116 | xarray ufuncs | keewis 14808389 | closed | 0 | 5 | 2020-06-01T13:25:54Z | 2022-04-19T03:26:53Z | 2022-04-19T03:26:53Z | MEMBER | The documentation warns that the universal functions in Since we only support Since there are also functions that are not true ufuncs (e.g. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4116/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
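For context on why the wrappers in `xarray.ufuncs` became redundant: with the numpy versions xarray now requires, applying a numpy ufunc directly to an xarray object dispatches through `__array_ufunc__` and preserves metadata. A small illustration:

```python
import numpy as np
import xarray as xr

da = xr.DataArray(np.linspace(0, np.pi, 5), dims="x", coords={"x": range(5)})

# dispatches via __array_ufunc__ and returns a DataArray with dims and coords
# intact, so xarray.ufuncs.sin is no longer needed
result = np.sin(da)
print(type(result), result.dims)
```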
1174386963 | I_kwDOAMm_X85F_7kT | 6382 | `reindex` drops attrs | keewis 14808389 | closed | 0 | 1 | 2022-03-19T22:37:46Z | 2022-03-21T07:53:05Z | 2022-03-21T07:53:04Z | MEMBER | What happened?
As far as I can tell, the new reindexing code in Minimal Complete Verifiable Example```Python before #5692In [1]: import xarray as xr ...: ...: xr.set_options(keep_attrs=True) ...: ...: ds = xr.tutorial.open_dataset("air_temperature") In [2]: ds.reindex(lat=range(10, 80, 5)).lat Out[2]: <xarray.DataArray 'lat' (lat: 14)> array([10, 15, 20, 25, 30, 35, 40, 45, 50, 55, 60, 65, 70, 75]) Coordinates: * lat (lat) int64 10 15 20 25 30 35 40 45 50 55 60 65 70 75 Attributes: standard_name: latitude long_name: Latitude units: degrees_north axis: Y In [3]: ds.reindex(lat=xr.DataArray(range(10, 80, 5), attrs={"attr": "value"}, dims="lat")).lat Out[3]: <xarray.DataArray 'lat' (lat: 14)> array([10, 15, 20, 25, 30, 35, 40, 45, 50, 55, 60, 65, 70, 75]) Coordinates: * lat (lat) int64 10 15 20 25 30 35 40 45 50 55 60 65 70 75 Attributes: standard_name: latitude long_name: Latitude units: degrees_north axis: Y after #5692In [3]: import xarray as xr ...: ...: xr.set_options(keep_attrs=True) ...: ...: ds = xr.tutorial.open_dataset("air_temperature") In [4]: ds.reindex(lat=range(10, 80, 5)).lat Out[4]: <xarray.DataArray 'lat' (lat: 14)> array([10, 15, 20, 25, 30, 35, 40, 45, 50, 55, 60, 65, 70, 75]) Coordinates: * lat (lat) int64 10 15 20 25 30 35 40 45 50 55 60 65 70 75 In [5]: ds.reindex(lat=xr.DataArray(range(10, 80, 5), attrs={"attr": "value"}, dims="lat")).lat Out[5]: <xarray.DataArray 'lat' (lat: 14)> array([10, 15, 20, 25, 30, 35, 40, 45, 50, 55, 60, 65, 70, 75]) Coordinates: * lat (lat) int64 10 15 20 25 30 35 40 45 50 55 60 65 70 75 ``` |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/6382/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
1125877338 | I_kwDOAMm_X85DG4Za | 6250 | failing docs builds because the `scipy` intersphinx registry is unreachable | keewis 14808389 | closed | 0 | 6 | 2022-02-07T11:44:44Z | 2022-02-08T21:49:48Z | 2022-02-08T21:49:47Z | MEMBER | What happened?
There's nothing we can do to really fix this, but we can try to avoid the docs build failures by disabling that intersphinx entry (not sure if that results in other errors, though). What did you expect to happen? No response. Minimal Complete Verifiable Example: No response. Relevant log output:
Anything else we need to know? No response. Environment: See RTD |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/6250/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
861860833 | MDU6SXNzdWU4NjE4NjA4MzM= | 5190 | PRs cancel CI on push | keewis 14808389 | closed | 0 | 1 | 2021-04-19T20:33:52Z | 2022-01-31T16:59:27Z | 2022-01-31T16:59:27Z | MEMBER | The |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/5190/reactions", "total_count": 2, "+1": 2, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
539181896 | MDU6SXNzdWU1MzkxODE4OTY= | 3638 | load_store and dump_to_store | keewis 14808389 | open | 0 | 1 | 2019-12-17T16:37:53Z | 2021-11-08T21:11:26Z | MEMBER | Continuing from #3602, what should we do with these? Are they obsolete and should be removed or just unmaintained (then we should properly document and test them). |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/3638/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | issue | ||||||||
789106802 | MDU6SXNzdWU3ODkxMDY4MDI= | 4825 | clean up the API for renaming and changing dimensions / coordinates | keewis 14808389 | open | 0 | 5 | 2021-01-19T15:11:55Z | 2021-09-10T15:04:14Z | MEMBER | From #4108: I wonder if it would be better to first "reorganize" all of the existing functions: we currently have I believe we currently have these use cases (not sure if that list is complete, though):
- rename a Sometimes, some of these can be emulated by combinations of others, for example:
```python
# x is a dimension without coordinates
assert_identical(ds.set_index({"x": "b"}), ds.swap_dims({"x": "b"}).rename({"b": "x"}))
assert_identical(ds.swap_dims({"x": "b"}), ds.set_index({"x": "b"}).rename({"x": "b"}))
```
In any case I think we should add a guide which explains which method to pick in which situation (or extend Originally posted by @keewis in https://github.com/pydata/xarray/issues/4108#issuecomment-761907785 |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4825/reactions", "total_count": 2, "+1": 2, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | issue | ||||||||
935531700 | MDU6SXNzdWU5MzU1MzE3MDA= | 5562 | hooks to "prepare" xarray objects for plotting | keewis 14808389 | open | 0 | 6 | 2021-07-02T08:14:02Z | 2021-07-04T08:46:34Z | MEMBER | From https://github.com/xarray-contrib/pint-xarray/pull/61#discussion_r662485351
While this is pretty neat, there are some issues:
- All of this makes me wonder: should we try to maintain our own mapping of hooks which "prepare" the object based on the data's type? My initial idea would be that the hook function receives a For example for xref #5561 |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/5562/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | issue | ||||||||
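A rough sketch of the hook-registry idea floated above; every name here is invented for illustration, and nothing like this exists in xarray:

```python
from typing import Callable

import xarray as xr

# maps duck-array types to functions that turn a DataArray into something plottable
_PREPARE_HOOKS: dict[type, Callable[[xr.DataArray], xr.DataArray]] = {}


def register_prepare_hook(data_type: type, hook: Callable[[xr.DataArray], xr.DataArray]) -> None:
    _PREPARE_HOOKS[data_type] = hook


def prepare_for_plotting(da: xr.DataArray) -> xr.DataArray:
    hook = _PREPARE_HOOKS.get(type(da.data))
    return hook(da) if hook is not None else da
```

pint-xarray, for example, could register a hook that strips units before the data is handed to matplotlib.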
589850951 | MDU6SXNzdWU1ODk4NTA5NTE= | 3917 | running numpy functions on xarray objects | keewis 14808389 | open | 0 | 1 | 2020-03-29T18:17:29Z | 2021-07-04T02:00:22Z | MEMBER | In the Some of these functions, like Should we define |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/3917/reactions", "total_count": 3, "+1": 3, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | issue | ||||||||
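To make the distinction above concrete: ufuncs already dispatch via `__array_ufunc__`, while general numpy functions coerce to `numpy.ndarray` because xarray objects do not define `__array_function__`. A small illustration (behavior as of the time of the issue):

```python
import numpy as np
import xarray as xr

da = xr.DataArray(np.arange(6).reshape(2, 3), dims=("x", "y"))

print(type(np.square(da)))        # DataArray: ufuncs keep dims/coords
print(type(np.flip(da, axis=0)))  # plain ndarray: metadata is lost
```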
928448281 | MDU6SXNzdWU5Mjg0NDgyODE= | 5517 | mypy issues on `main` | keewis 14808389 | closed | 0 | 3 | 2021-06-23T16:38:24Z | 2021-06-24T08:58:06Z | 2021-06-24T08:58:06Z | MEMBER | The does anyone know how to fix those? |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/5517/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
895470157 | MDU6SXNzdWU4OTU0NzAxNTc= | 5344 | failing cftime plot tests | keewis 14808389 | closed | 0 | 2 | 2021-05-19T13:48:49Z | 2021-06-12T12:57:52Z | 2021-06-12T12:57:52Z | MEMBER | This was reported in #5077 and seems to be an upstream issue (see https://github.com/pydata/xarray/issues/5077#issuecomment-808605313):
In order to get the upstream-dev CI open new issues for other failing tests on upstream-dev I've |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/5344/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
889162918 | MDU6SXNzdWU4ODkxNjI5MTg= | 5291 | ds = xr.tutorial.load_dataset("air_temperature") with 0.18 needs engine argument | keewis 14808389 | closed | 0 | 15 | 2021-05-11T23:42:58Z | 2021-05-18T21:23:00Z | 2021-05-18T21:23:00Z | MEMBER | From xarray-contrib/xarray-tutorial#43 by @scottyhq: Many notebooks out there start with the line
It's an easy fix though, just add |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/5291/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
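The fix mentioned (and truncated) above is presumably along these lines: with xarray 0.18 the backend is no longer guessed for the tutorial file, so the engine has to be passed explicitly:

```python
import xarray as xr

# any installed netCDF-capable backend works; netcdf4 is the common choice
ds = xr.tutorial.load_dataset("air_temperature", engine="netcdf4")
```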
873751431 | MDU6SXNzdWU4NzM3NTE0MzE= | 5241 | linters are not run after pre-commit autoupdate | keewis 14808389 | closed | 0 | 0 | 2021-05-01T19:33:47Z | 2021-05-16T11:21:40Z | 2021-05-16T11:21:40Z | MEMBER | #4906 added an autoupdate CI for the pre-commit hooks. However, once the hook versions are updated, the
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/5241/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
855330062 | MDU6SXNzdWU4NTUzMzAwNjI= | 5138 | failing CI | keewis 14808389 | closed | 0 | 13 | 2021-04-11T15:08:08Z | 2021-04-25T21:08:43Z | 2021-04-12T18:17:50Z | MEMBER | It seems Is this a cc @andersy005 @alexamici |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/5138/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
674445594 | MDU6SXNzdWU2NzQ0NDU1OTQ= | 4321 | push inline formatting functions upstream | keewis 14808389 | open | 0 | 0 | 2020-08-06T16:35:04Z | 2021-04-19T03:20:11Z | MEMBER | 4248 added a
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4321/reactions", "total_count": 2, "+1": 2, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | issue | ||||||||
675342733 | MDU6SXNzdWU2NzUzNDI3MzM= | 4324 | constructing nested inline reprs | keewis 14808389 | open | 0 | 9 | 2020-08-07T23:25:31Z | 2021-04-19T03:20:01Z | MEMBER | While implementing the new From that PR: @keewis
@jthielen
How should we deal with this? |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4324/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | issue | ||||||||
856831600 | MDU6SXNzdWU4NTY4MzE2MDA= | 5147 | failing CI due to CFTimeIndex errors | keewis 14808389 | closed | 0 | 0 | 2021-04-13T10:40:17Z | 2021-04-14T13:27:10Z | 2021-04-14T13:27:10Z | MEMBER | the normal CI started to fail:
traceback```pytb [gw2] linux -- Python 3.9.2 /usr/share/miniconda/envs/xarray-tests/bin/python > ??? E TypeError: Expected unicode, got datetime.timedelta pandas/_libs/tslibs/timedeltas.pyx:264: TypeError During handling of the above exception, another exception occurred: calendar = 'proleptic_gregorian' @requires_cftime @pytest.mark.parametrize("calendar", _CFTIME_CALENDARS) def test_distant_cftime_datetime_sub_cftimeindex(calendar): a = xr.cftime_range("2000", periods=5, calendar=calendar) with pytest.raises(ValueError, match="difference exceeds"): > a.date_type(1, 1, 1) - a /home/runner/work/xarray/xarray/xarray/tests/test_cftimeindex.py:833: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /home/runner/work/xarray/xarray/xarray/coding/cftimeindex.py:581: in __rsub__ return pd.TimedeltaIndex(other - np.array(self)) /usr/share/miniconda/envs/xarray-tests/lib/python3.9/site-packages/pandas/core/indexes/timedeltas.py:155: in __new__ tdarr = TimedeltaArray._from_sequence_not_strict( /usr/share/miniconda/envs/xarray-tests/lib/python3.9/site-packages/pandas/core/arrays/timedeltas.py:250: in _from_sequence_not_strict data, inferred_freq = sequence_to_td64ns(data, copy=copy, unit=unit) /usr/share/miniconda/envs/xarray-tests/lib/python3.9/site-packages/pandas/core/arrays/timedeltas.py:957: in sequence_to_td64ns data = objects_to_td64ns(data, unit=unit, errors=errors) /usr/share/miniconda/envs/xarray-tests/lib/python3.9/site-packages/pandas/core/arrays/timedeltas.py:1067: in objects_to_td64ns result = array_to_timedelta64(values, unit=unit, errors=errors) pandas/_libs/tslibs/timedeltas.pyx:269: in pandas._libs.tslibs.timedeltas.array_to_timedelta64 ??? pandas/_libs/tslibs/timedeltas.pyx:222: in pandas._libs.tslibs.timedeltas.convert_to_timedelta64 ??? _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ > ??? E OverflowError: Python int too large to convert to C long pandas/_libs/tslibs/timedeltas.pyx:167: OverflowError ```This seems to coincide with the release of cc @spencerkclark |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/5147/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
603497896 | MDU6SXNzdWU2MDM0OTc4OTY= | 3986 | building the visualization gallery is slow | keewis 14808389 | closed | 0 | 2 | 2020-04-20T20:00:51Z | 2021-03-24T17:56:49Z | 2021-03-24T17:56:49Z | MEMBER | When running sphinx to build the documentation, it frequently times out when trying to build the visualization gallery. Running
If instead I download the file manually and then load from disk, the whole notebook completes in about 10 seconds. Also, directly calling I do think we should try to fix this in the backend, but maybe we could also cache Edit: this is really flaky, I can't reliably reproduce this. Edit2: for now, I'm using an extra cell containing
```python
import pathlib
import shutil

import requests

cache_dir = pathlib.Path.home() / ".xarray_tutorial_data"
path = cache_dir / "RGB.byte.tif"
url = "https://github.com/mapbox/rasterio/raw/master/tests/data/RGB.byte.tif"

if not path.exists() or path.stat().st_size == 0:
    with requests.get(url) as r, path.open(mode="wb") as f:
        if r.status_code == requests.codes.ok:
            shutil.copyfileobj(r.raw, f)
        else:
            print(f"download failed: {r.status_code}")
            r.raise_for_status()

url = path
```
and modify both examples to use the new url |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/3986/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
657230088 | MDU6SXNzdWU2NTcyMzAwODg= | 4226 | failing upstream-dev CI: matplotlib | keewis 14808389 | closed | 0 | 1 | 2020-07-15T10:08:04Z | 2021-03-07T13:31:37Z | 2021-03-07T13:31:37Z | MEMBER | Recent changes in tracebacks``` ________________________ TestContour.test_single_level _________________________ self = <xarray.tests.test_plot.TestContour object at 0x7fc60f7c3e20> # add_colorbar defaults to false > self.plotmethod(levels=[0.1]) xarray/tests/test_plot.py:1561: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ xarray/plot/plot.py:827: in plotmethod return newplotfunc(**allargs) xarray/plot/plot.py:699: in newplotfunc cmap_params, cbar_kwargs = _process_cmap_cbar_kwargs( xarray/plot/utils.py:819: in _process_cmap_cbar_kwargs cmap_params = _determine_cmap_params(**cmap_kwargs) xarray/plot/utils.py:291: in _determine_cmap_params cmap, newnorm = _build_discrete_cmap(cmap, levels, extend, filled) xarray/plot/utils.py:77: in _build_discrete_cmap new_cmap, cnorm = mpl.colors.from_levels_and_colors(levels, pal, extend=extend) /usr/share/miniconda/envs/xarray-tests/lib/python3.8/site-packages/matplotlib/colors.py:2200: in from_levels_and_colors norm = BoundaryNorm(levels, ncolors=n_data_colors) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <matplotlib.colors.BoundaryNorm object at 0x7fc60f5048e0> boundaries = [0.1], ncolors = 0, clip = False def __init__(self, boundaries, ncolors, clip=False, *, extend='neither'): """ Parameters ---------- boundaries : array-like Monotonically increasing sequence of at least 2 boundaries. ncolors : int Number of colors in the colormap to be used. clip : bool, optional """ if clip and extend != 'neither': raise ValueError("'clip=True' is not compatible with 'extend'") self.clip = clip self.vmin = boundaries[0] self.vmax = boundaries[-1] self.boundaries = np.asarray(boundaries) self.N = len(self.boundaries) if self.N < 2: > raise ValueError("You must provide at least 2 boundaries " f"(1 region) but you passed in {boundaries!r}") E ValueError: You must provide at least 2 boundaries (1 region) but you passed in [0.1] /usr/share/miniconda/envs/xarray-tests/lib/python3.8/site-packages/matplotlib/colors.py:1508: ValueError ________________________ test_facetgrid_single_contour _________________________ @requires_matplotlib def test_facetgrid_single_contour(): @requires_matplotlib def test_facetgrid_single_contour(): # regression test for GH3569 x, y = np.meshgrid(np.arange(12), np.arange(12)) z = xr.DataArray(np.sqrt(x ** 2 + y ** 2)) z2 = xr.DataArray(np.sqrt(x ** 2 + y ** 2) + 1) ds = xr.concat([z, z2], dim="time") ds["time"] = [0, 1] > ds.plot.contour(col="time", levels=[4], colors=["k"]) xarray/tests/test_plot.py:2409: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ xarray/plot/plot.py:827: in plotmethod return newplotfunc(**allargs) xarray/plot/plot.py:638: in newplotfunc return _easy_facetgrid(darray, kind="dataarray", **allargs) xarray/plot/facetgrid.py:644: in _easy_facetgrid return g.map_dataarray(plotfunc, x, y, **kwargs) xarray/plot/facetgrid.py:248: in map_dataarray cmap_params, cbar_kwargs = _process_cmap_cbar_kwargs( xarray/plot/utils.py:819: in _process_cmap_cbar_kwargs cmap_params = _determine_cmap_params(**cmap_kwargs) xarray/plot/utils.py:291: in _determine_cmap_params cmap, newnorm = _build_discrete_cmap(cmap, levels, extend, filled) xarray/plot/utils.py:77: in _build_discrete_cmap new_cmap, cnorm = mpl.colors.from_levels_and_colors(levels, 
pal, extend=extend) /usr/share/miniconda/envs/xarray-tests/lib/python3.8/site-packages/matplotlib/colors.py:2200: in from_levels_and_colors norm = BoundaryNorm(levels, ncolors=n_data_colors) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <matplotlib.colors.BoundaryNorm object at 0x7fc60d47b280> boundaries = [4], ncolors = 0, clip = False def __init__(self, boundaries, ncolors, clip=False, *, extend='neither'): """ Parameters ---------- boundaries : array-like Monotonically increasing sequence of at least 2 boundaries. ncolors : int Number of colors in the colormap to be used. clip : bool, optional If clip is ``True``, out of range values are mapped to 0 if they are below ``boundaries[0]`` or mapped to ``ncolors - 1`` if they are above ``boundaries[-1]``. If clip is ``False``, out of range values are mapped to -1 if they are below ``boundaries[0]`` or mapped to *ncolors* if they are above ``boundaries[-1]``. These are then converted to valid indices by `Colormap.__call__`. extend : {'neither', 'both', 'min', 'max'}, default: 'neither' Extend the number of bins to include one or both of the regions beyond the boundaries. For example, if ``extend`` is 'min', then the color to which the region between the first pair of boundaries is mapped will be distinct from the first color in the colormap, and by default a `~matplotlib.colorbar.Colorbar` will be drawn with the triangle extension on the left or lower end. Returns ------- int16 scalar or array Notes ----- *boundaries* defines the edges of bins, and data falling within a bin is mapped to the color with the same index. If the number of bins, including any extensions, is less than *ncolors*, the color index is chosen by linear interpolation, mapping the ``[0, nbins - 1]`` range onto the ``[0, ncolors - 1]`` range. """ if clip and extend != 'neither': raise ValueError("'clip=True' is not compatible with 'extend'") self.clip = clip self.vmin = boundaries[0] self.vmax = boundaries[-1] self.boundaries = np.asarray(boundaries) self.N = len(self.boundaries) if self.N < 2: > raise ValueError("You must provide at least 2 boundaries " f"(1 region) but you passed in {boundaries!r}") E ValueError: You must provide at least 2 boundaries (1 region) but you passed in [4] /usr/share/miniconda/envs/xarray-tests/lib/python3.8/site-packages/matplotlib/colors.py:1508: ValueError ``` |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4226/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
550335922 | MDU6SXNzdWU1NTAzMzU5MjI= | 3697 | documentation build issues on RTD | keewis 14808389 | closed | 0 | 8 | 2020-01-15T17:46:13Z | 2021-02-25T13:52:03Z | 2021-02-21T22:21:55Z | MEMBER | It seems we have (seemingly) random failures on RTD. Some of these are the known memory issue: recreating my doc environment used about 1.4 GB of RAM, which might be too much for RTD, even with the extended memory. Much more often it is a timeout when building the docs, but I can't reproduce that locally. Any ideas? Edit: This really is random, I tried rerunning and the build passed. Also, a warning:
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/3697/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
791277757 | MDU6SXNzdWU3OTEyNzc3NTc= | 4837 | expose _to_temp_dataset / _from_temp_dataset as semi-public API? | keewis 14808389 | open | 0 | 5 | 2021-01-21T16:11:32Z | 2021-01-22T02:07:08Z | MEMBER | When writing accessors which behave the same for both Otherwise I guess it would be possible to use
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4837/reactions", "total_count": 2, "+1": 2, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | issue | ||||||||
784628736 | MDU6SXNzdWU3ODQ2Mjg3MzY= | 4801 | scheduled upstream-dev CI is skipped | keewis 14808389 | closed | 0 | 5 | 2021-01-12T22:08:19Z | 2021-01-14T00:09:43Z | 2021-01-13T21:35:44Z | MEMBER | The scheduled CI is being skipped since about 6 days ago (which means this is probably due to the merge of #4729, see this run before the merge and this run after the merge). This is really strange because Edit: it seems to be because of https://github.com/pydata/xarray/blob/f52a95cbe694336fe47bc5a42c713bee8ad74d64/.github/workflows/upstream-dev-ci.yaml#L34-L37 if I remove that the scheduled CI runs. cc @andersy005 |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4801/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
765675044 | MDU6SXNzdWU3NjU2NzUwNDQ= | 4688 | drop python 3.6 support | keewis 14808389 | closed | 0 | 0 | 2020-12-13T22:11:15Z | 2021-01-07T18:15:18Z | 2021-01-07T18:15:18Z | MEMBER | NEP 29 states that as of Jun 23, 2020 we can drop python 3.6 support. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4688/reactions", "total_count": 4, "+1": 4, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
749035078 | MDU6SXNzdWU3NDkwMzUwNzg= | 4603 | failing nightly CI | keewis 14808389 | closed | 0 | 1 | 2020-11-23T18:35:25Z | 2020-12-05T00:30:11Z | 2020-12-05T00:30:11Z | MEMBER | It seems the scheduled CI failed because no artifacts were uploaded (i.e. the CI passed). Is it possible to skip the cc @andersy005 |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4603/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
739281259 | MDU6SXNzdWU3MzkyODEyNTk= | 4571 | upgrading mypy to 0.790 fails | keewis 14808389 | closed | 0 | 3 | 2020-11-09T18:56:43Z | 2020-11-13T19:38:05Z | 2020-11-13T19:38:05Z | MEMBER | Continuing from #4567, trying to upgrade |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4571/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
736272283 | MDU6SXNzdWU3MzYyNzIyODM= | 4565 | failing upstream-dev polyfit warning test | keewis 14808389 | closed | 0 | 1 | 2020-11-04T16:56:26Z | 2020-11-09T19:08:35Z | 2020-11-09T19:08:35Z | MEMBER | In emits a warning with the current since the test makes sure only the first call emits a warning (a |
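As a sketch of the testing pattern described here (only the first of two calls is expected to warn), using made-up data and `warnings.catch_warnings` rather than the actual test from the suite:

```python
import warnings

import numpy as np
import xarray as xr

da = xr.DataArray(np.random.randn(10), dims="x", coords={"x": np.arange(10)})

with warnings.catch_warnings(record=True) as ws:
    warnings.simplefilter("always")
    da.polyfit("x", deg=8, full=False)  # a poorly conditioned fit may warn here
    da.polyfit("x", deg=8, full=True)   # ... but the second call should not warn again

# the test asserts that exactly one warning was recorded in total
print(len(ws))
```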
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4565/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
638211949 | MDU6SXNzdWU2MzgyMTE5NDk= | 4152 | signature in accessor methods | keewis 14808389 | closed | 0 | 5 | 2020-06-13T18:49:16Z | 2020-09-06T23:05:04Z | 2020-09-06T23:05:04Z | MEMBER | With the merge of #3988 we're now properly documenting the Also, we need to remove the |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4152/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
642039530 | MDU6SXNzdWU2NDIwMzk1MzA= | 4165 | allow specifying a fill value per variable | keewis 14808389 | closed | 0 | 1 | 2020-06-19T15:08:16Z | 2020-08-24T22:03:15Z | 2020-08-24T22:03:15Z | MEMBER | While working on #4163 I noticed that the fill value parameter for Consider this:
I could get there by passing the default ( |
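Since the concrete example was lost above, here is a sketch of the kind of per-variable fill value this issue asks for (recent xarray versions accept a mapping for `fill_value` in `reindex`, `concat` and friends, but whether this runs depends on the version):

```python
import numpy as np
import xarray as xr

ds = xr.Dataset(
    {"a": ("x", [0, 1]), "b": ("x", [2.0, 3.0])},
    coords={"x": [0, 1]},
)

# one fill value per variable instead of a single scalar for everything;
# variables not listed would fall back to the default fill value
ds.reindex(x=[0, 1, 2], fill_value={"a": -1, "b": np.nan})
```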
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4165/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
676827750 | MDU6SXNzdWU2NzY4Mjc3NTA= | 4334 | missing parameter in DataArray.str.get | keewis 14808389 | closed | 0 | 1 | 2020-08-11T12:13:58Z | 2020-08-15T10:28:05Z | 2020-08-15T10:28:05Z | MEMBER | While working on #4286 I noticed that the docstring of |
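For context, a small usage example of the accessor method in question (made-up data); the `default` parameter mentioned in the docstring is meant to be returned when the position is out of range for a given string:

```python
import xarray as xr

da = xr.DataArray(["abc", "de"], dims="items")

# extract the character at position 1 of every string -> ["b", "e"]
da.str.get(1)
```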
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4334/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
675602229 | MDU6SXNzdWU2NzU2MDIyMjk= | 4328 | failing docs CI | keewis 14808389 | closed | 0 | 1 | 2020-08-08T23:09:48Z | 2020-08-09T11:57:38Z | 2020-08-09T11:57:38Z | MEMBER | The RTD builds are timing out again. With my own setup I get this error instead:

Traceback
```pytb
AttributeError                            Traceback (most recent call last)
~/checkouts/readthedocs.org/user_builds/xarray-keewis/conda/latest/lib/python3.8/site-packages/IPython/core/formatters.py in __call__(self, obj)
    700                 type_pprinters=self.type_printers,
    701                 deferred_pprinters=self.deferred_printers)
--> 702             printer.pretty(obj)
    703             printer.flush()
    704             return stream.getvalue()

~/checkouts/readthedocs.org/user_builds/xarray-keewis/conda/latest/lib/python3.8/site-packages/IPython/lib/pretty.py in pretty(self, obj)
    392                 if cls is not object \
    393                         and callable(cls.__dict__.get('__repr__')):
--> 394                     return _repr_pprint(obj, self, cycle)
    395
    396             return _default_pprint(obj, self, cycle)

~/checkouts/readthedocs.org/user_builds/xarray-keewis/conda/latest/lib/python3.8/site-packages/IPython/lib/pretty.py in _repr_pprint(obj, p, cycle)
    698     """A pprint that just redirects to the normal repr function."""
    699     # Find newlines and replace them with p.break_()
--> 700     output = repr(obj)
    701     lines = output.splitlines()
    702     with p.group():

~/checkouts/readthedocs.org/user_builds/xarray-keewis/checkouts/latest/xarray/core/rolling.py in __repr__(self)
     99         """provide a nice str repr of our rolling object"""
    100
--> 101         attrs = [
    102             "{k}->{v}".format(k=k, v=getattr(self, k))
    103             for k in list(self.dim) + self.window + self.center + [self.min_periods]

~/checkouts/readthedocs.org/user_builds/xarray-keewis/checkouts/latest/xarray/core/rolling.py in <listcomp>(.0)
    100
    101         attrs = [
--> 102             "{k}->{v}".format(k=k, v=getattr(self, k))
    103             for k in list(self.dim) + self.window + self.center + [self.min_periods]
    104         ]

AttributeError: 'DataArrayRolling' object has no attribute 'y'
```

I think that was introduced in #4219. cc @fujiisoup Also, we should definitely ask support why those two behave differently. Edit: see readthedocs/readthedocs.org#7371 |
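A guess at a minimal reproducer for the `AttributeError` in the traceback above (not taken from the issue itself): building the repr of a rolling object created over a `y` dimension goes through `DataArrayRolling.__repr__`, which is where the failing `getattr(self, k)` lookup happens:

```python
import numpy as np
import xarray as xr

arr = xr.DataArray(np.arange(12).reshape(3, 4), dims=("x", "y"))

# in the affected version this raised AttributeError; on fixed versions
# it returns something like "DataArrayRolling [x->2,y->2]"
repr(arr.rolling(x=2, y=2))
```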
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4328/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
668166816 | MDU6SXNzdWU2NjgxNjY4MTY= | 4287 | failing docs CI | keewis 14808389 | closed | 0 | 4 | 2020-07-29T21:19:30Z | 2020-08-05T21:31:46Z | 2020-08-05T21:31:46Z | MEMBER | I'm not quite sure why (maybe a there are also sphinx warnings about malformed rst:
Edit: |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4287/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
650929228 | MDU6SXNzdWU2NTA5MjkyMjg= | 4202 | isort flags | keewis 14808389 | closed | 0 | 0 | 2020-07-04T17:39:17Z | 2020-07-16T19:13:57Z | 2020-07-16T19:13:57Z | MEMBER | Because I've been hit by this elsewhere: Not sure if we should pin Update: |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4202/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
657223618 | MDU6SXNzdWU2NTcyMjM2MTg= | 4225 | failing type checking CI | keewis 14808389 | closed | 0 | 2 | 2020-07-15T09:57:59Z | 2020-07-15T12:24:46Z | 2020-07-15T12:24:46Z | MEMBER | Due to the update of |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4225/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
650884355 | MDU6SXNzdWU2NTA4ODQzNTU= | 4200 | Upstream-dev matplotlib failure | keewis 14808389 | closed | 0 | 0 | 2020-07-04T12:37:02Z | 2020-07-04T17:24:15Z | 2020-07-04T17:24:15Z | MEMBER | The upstream-dev CI currently fails because |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4200/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
642990666 | MDU6SXNzdWU2NDI5OTA2NjY= | 4168 | upstream-dev failures due to numpy dtype deprecations | keewis 14808389 | closed | 0 | 0 | 2020-06-22T11:30:07Z | 2020-06-22T22:51:57Z | 2020-06-22T22:51:57Z | MEMBER | It seems Since we have a few tests that require zero warnings, this means our test suite fails (and more than 13000 warnings are not a good thing, anyways). |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4168/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
537156661 | MDU6SXNzdWU1MzcxNTY2NjE= | 3617 | diff formatting for assert_allclose | keewis 14808389 | closed | 0 | 0 | 2019-12-12T18:59:52Z | 2020-06-13T17:53:03Z | 2020-06-13T17:53:03Z | MEMBER | At the moment, the output of a failing

In [2]: da1 = xr.DataArray(data=[2, 5, 8], coords={"x": [0, 1, 2]}, dims="x")
   ...: da2 = xr.DataArray(data=[1, 5, 8], coords={"x": [0, 1, 3]}, dims="x")

In [3]: assert_allclose(da1, da2)
AssertionError                            Traceback (most recent call last)
<ipython-input-6-55cbcc1eeb58> in <module>
----> 1 assert_allclose(da1, da2)

~/Documents/Programming/xarray/xarray/testing.py in assert_allclose(a, b, rtol, atol, decode_bytes)
    123         assert allclose, f"{a.values}\n{b.values}"
    124     elif isinstance(a, DataArray):
--> 125         assert_allclose(a.variable, b.variable, **kwargs)
    126         assert set(a.coords) == set(b.coords)
    127         for v in a.coords.variables:

~/Documents/Programming/xarray/xarray/testing.py in assert_allclose(a, b, rtol, atol, decode_bytes)
    121         assert a.dims == b.dims
    122         allclose = _data_allclose_or_equiv(a.values, b.values, **kwargs)
--> 123         assert allclose, f"{a.values}\n{b.values}"
    124     elif isinstance(a, DataArray):
    125         assert_allclose(a.variable, b.variable, **kwargs)

AssertionError: [2 5 8]
[1 5 8]

In [4]: assert_identical(da1, da2)
AssertionError                            Traceback (most recent call last)
<ipython-input-7-127e2b1ca0dc> in <module>
----> 1 assert_identical(da1, da2)

~/Documents/Programming/xarray/xarray/testing.py in assert_identical(a, b)
     83     elif isinstance(a, DataArray):
     84         assert a.name == b.name
---> 85         assert a.identical(b), formatting.diff_array_repr(a, b, "identical")
     86     elif isinstance(a, (Dataset, Variable)):
     87         assert a.identical(b), formatting.diff_dataset_repr(a, b, "identical")

AssertionError: Left and right DataArray objects are not identical
Differing values:
L
array([2, 5, 8])
R
array([1, 5, 8])
Differing coordinates:
L * x (x) int64 0 1 2
R * x (x) int64 0 1 3
I found a reference to this: https://github.com/pydata/xarray/pull/1690#issuecomment-455540813, but it's not being actively worked on. |
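A sketch of the direction suggested here: reuse the diff helper that `assert_identical` already calls (visible in the traceback above) to build the failure message for `assert_allclose`. The helper is private API, and the exact `compat` label to pass is an assumption:

```python
import xarray as xr
from xarray.core import formatting

da1 = xr.DataArray(data=[2, 5, 8], coords={"x": [0, 1, 2]}, dims="x")
da2 = xr.DataArray(data=[1, 5, 8], coords={"x": [0, 1, 3]}, dims="x")

# produces a "Left and right DataArray objects are not ..." style report
# with differing values and coordinates listed side by side
print(formatting.diff_array_repr(da1, da2, "identical"))
```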
{ "url": "https://api.github.com/repos/pydata/xarray/issues/3617/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
538092307 | MDU6SXNzdWU1MzgwOTIzMDc= | 3625 | Documentation for injected methods | keewis 14808389 | closed | 0 | 4 | 2019-12-15T19:09:47Z | 2020-06-13T17:52:46Z | 2020-06-13T17:52:46Z | MEMBER | At the moment, the documentation for methods introduced by accessors is broken. We work around this issue by directly referencing the corresponding accessor method or the underlying function. Of course, this is not how we recommend calling them, so we should try to fix this. It seems this is mostly a sphinx issue since this is known (see #3333 and #3602), but we didn't find a way to resolve this. I'd like use this issue to collect ideas for solutions. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/3625/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
637227979 | MDU6SXNzdWU2MzcyMjc5Nzk= | 4147 | readthedocs build / documentation build time | keewis 14808389 | closed | 0 | 1 | 2020-06-11T18:17:53Z | 2020-06-12T15:03:20Z | 2020-06-12T15:03:20Z | MEMBER | It seems that after the merge of #3818, the RTD builds started to time out while the As a reference, before #3818 our I'm not sure if that is due to #3818 or because of updated dependencies ( |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4147/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
634979933 | MDU6SXNzdWU2MzQ5Nzk5MzM= | 4133 | upstream-dev failure when installing pandas | keewis 14808389 | closed | 0 | 3 | 2020-06-08T22:39:01Z | 2020-06-11T02:14:49Z | 2020-06-11T02:14:49Z | MEMBER | So after #4124 has been fixed upstream, we now have problems with installing I'm not sure if I read the logs correctly, but it seems For reference, see the logs of the last passing build. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4133/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
631860723 | MDU6SXNzdWU2MzE4NjA3MjM= | 4124 | upstream-dev CI fails on import of dask.array | keewis 14808389 | closed | 0 | 9 | 2020-06-05T19:08:31Z | 2020-06-10T12:41:44Z | 2020-06-10T12:41:44Z | MEMBER | I'm not sure why, but our upstream-dev CI fails when importing |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4124/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
618234617 | MDU6SXNzdWU2MTgyMzQ2MTc= | 4062 | clean up of the branches on the main repository | keewis 14808389 | closed | 0 | 3 | 2020-05-14T13:31:57Z | 2020-05-18T23:24:47Z | 2020-05-18T14:03:19Z | MEMBER | We have several branches on our main repository that appear to be either outdated or used to build documentation on RTD (these are no longer necessary since RTD fixed the Edit: I'm going to delete those branches on Monday |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4062/reactions", "total_count": 4, "+1": 4, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
603594157 | MDU6SXNzdWU2MDM1OTQxNTc= | 3987 | failing distributed tests on upstream-dev | keewis 14808389 | closed | 0 | 0 | 2020-04-20T23:18:08Z | 2020-04-21T12:53:45Z | 2020-04-21T12:53:45Z | MEMBER | Since dask/distributed#3706 I think this should be a pretty straightforward fix: replace |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/3987/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
569789152 | MDU6SXNzdWU1Njk3ODkxNTI= | 3794 | all-dependencies-but-dask CI | keewis 14808389 | closed | 0 | 1 | 2020-02-24T11:15:54Z | 2020-04-03T19:48:19Z | 2020-04-03T19:48:19Z | MEMBER | We recently got a few reports about tests failing with most optional dependencies installed but not `dask`. Since |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/3794/reactions", "total_count": 3, "+1": 3, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
592823921 | MDU6SXNzdWU1OTI4MjM5MjE= | 3928 | failing cftime upstream-dev tests | keewis 14808389 | closed | 0 | 0 | 2020-04-02T18:00:54Z | 2020-04-03T19:35:18Z | 2020-04-03T19:35:18Z | MEMBER | Another set of failing

```pytb
__ test_use_cftime_standard_calendar_default_in_range[gregorian] ___

calendar = 'gregorian'

        for v in ["x", "time"]:
            original[v].attrs["units"] = units
            original[v].attrs["calendar"] = calendar
xarray/tests/test_backends.py:4378: AssertionError ``` |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/3928/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
552896124 | MDU6SXNzdWU1NTI4OTYxMjQ= | 3711 | PseudoNetCDF tests failing randomly | keewis 14808389 | open | 0 | 6 | 2020-01-21T14:01:49Z | 2020-03-23T20:32:32Z | MEMBER | The

self = <xarray.tests.test_backends.TestPseudoNetCDFFormat object at 0x000002E11FF2DC08>

xarray\tests\test_backends.py:3532:
xarray\core\formatting.py:628: in diff_dataset_repr
    summary.append(diff_attrs_repr(a.attrs, b.attrs, compat))

a_mapping = {'CPROJ': 0, 'FILEDESC': 'CAMx ', 'FTYPE': 1, 'GDNAM': 'CAMx ', ...}
b_mapping = {'CPROJ': 0, 'FILEDESC': 'CAMx ', 'FTYPE': 1, 'GDNAM': 'CAMx ', ...}
compat = 'identical', title = 'Attributes'
summarizer = <function summarize_attr at 0x000002E1156813A8>, col_width = None
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/3711/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | issue | ||||||||
572789017 | MDU6SXNzdWU1NzI3ODkwMTc= | 3809 | warning in the MacOSX CI | keewis 14808389 | closed | 0 | 0 | 2020-02-28T14:26:31Z | 2020-03-07T04:49:22Z | 2020-03-07T04:49:22Z | MEMBER | Running the MacOSX py38 CI job gives us this warning: ``` [warning]This pipeline uses a Microsoft-hosted agent image that will be removed on March 23, 2020 (MacOS-10.13). You must make changes to your pipeline before that date, or else your pipeline will fail. Learn more (https://aka.ms/removing-older-images-hosted-pools).``` |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/3809/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
538065011 | MDU6SXNzdWU1MzgwNjUwMTE= | 3623 | RTD configuration file version | keewis 14808389 | closed | 0 | 0 | 2019-12-15T15:18:42Z | 2019-12-17T23:22:17Z | 2019-12-17T23:22:17Z | MEMBER | RTD released a new version of the configuration file format a bit more than a year ago and deprecated the old version. I think we should try to migrate our setup to the new format. Thoughts? |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/3623/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
537152819 | MDU6SXNzdWU1MzcxNTI4MTk= | 3616 | formatting of nep-18 compatible arrays with assert_* | keewis 14808389 | closed | 0 | 2 | 2019-12-12T18:51:33Z | 2019-12-13T11:27:55Z | 2019-12-13T11:27:54Z | MEMBER | At the moment, the What should we do about this? Does xref #3611 |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/3616/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
527755536 | MDU6SXNzdWU1Mjc3NTU1MzY= | 3567 | version collisions on readthedocs | keewis 14808389 | closed | 0 | 3 | 2019-11-24T20:58:24Z | 2019-12-06T13:36:03Z | 2019-12-03T18:59:41Z | MEMBER | In #3199 the documentation on Fortunately, It might be that this is a problem with how versioneer constructs versions based on git commit hashes, in which case a fix to #2853 would also fix this, but it certainly needs more investigation. #3369 is related, but more about catching this sort of issue before merging. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/3567/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
522498720 | MDU6SXNzdWU1MjI0OTg3MjA= | 3525 | DatasetGroupBy does not implement quantile | keewis 14808389 | closed | 0 | 6 | 2019-11-13T22:02:55Z | 2019-11-15T19:58:02Z | 2019-11-15T19:58:02Z | MEMBER | The docs claim
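Since the quoted docs example was lost in the export, this is a hedged reconstruction of the kind of call the report is about; at the time it raised `AttributeError` on `DatasetGroupBy`, even though `DataArrayGroupBy` already had `quantile`:

```python
import xarray as xr

ds = xr.Dataset({"a": ("x", [0.0, 1.0, 2.0, 3.0])}, coords={"x": [0, 0, 1, 1]})

# group by the "x" coordinate and take the median of each group
ds.groupby("x").quantile(0.5)
```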
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/3525/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
517195073 | MDU6SXNzdWU1MTcxOTUwNzM= | 3483 | assign_coords with mixed DataArray / array args removes coords | keewis 14808389 | open | 0 | 5 | 2019-11-04T14:38:40Z | 2019-11-07T15:46:15Z | MEMBER | I'm not sure if using
I would expect the result to be the same regardless of the type of the new coords. |
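Because the original example is missing above, this is only a guess at the kind of call being described: assigning one new coordinate as a `DataArray` and another as a plain dimension/array pair in the same `assign_coords` call, and comparing against the all-`DataArray` version:

```python
import numpy as np
import xarray as xr

ds = xr.Dataset(coords={"x": [1, 2, 3], "y": ("x", [0.1, 0.2, 0.3])})

# mixed: one coord from a DataArray, one from a plain (dim, values) pair
mixed = ds.assign_coords(a=ds["x"] * 2, b=("x", np.arange(3)))
# for comparison: both new coords passed as DataArray objects
same_type = ds.assign_coords(a=ds["x"] * 2, b=ds["x"] + 10)

print(mixed.coords)
print(same_type.coords)
```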
{ "url": "https://api.github.com/repos/pydata/xarray/issues/3483/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | issue | ||||||||
484097190 | MDU6SXNzdWU0ODQwOTcxOTA= | 3241 | aggregation functions treat duck arrays differently depending on dtype | keewis 14808389 | closed | 0 | 6 | 2019-08-22T16:30:56Z | 2019-09-02T22:52:47Z | 2019-09-02T22:52:47Z | MEMBER | While working on #3238, I tried replacing
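The concrete comparison was lost in the export, so this is only an illustrative sketch using `pint` as the duck array; the premise (taken from the issue title) is that reductions behave differently for integer and float data, presumably because a different code path (e.g. bottleneck) is taken for floats:

```python
import numpy as np
import pint
import xarray as xr

ureg = pint.UnitRegistry()

ints = xr.DataArray(ureg.Quantity(np.array([1, 2, 3]), "m"), dims="x")
floats = xr.DataArray(ureg.Quantity(np.array([1.0, 2.0, 3.0]), "m"), dims="x")

# inspect what the aggregation returns in each case: ideally both keep the
# pint.Quantity (and its units), but the report is that they differ by dtype
print(type(ints.sum().data))
print(type(floats.sum().data))
```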
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/3241/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue |
CREATE TABLE [issues] (
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [number] INTEGER,
   [title] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [state] TEXT,
   [locked] INTEGER,
   [assignee] INTEGER REFERENCES [users]([id]),
   [milestone] INTEGER REFERENCES [milestones]([id]),
   [comments] INTEGER,
   [created_at] TEXT,
   [updated_at] TEXT,
   [closed_at] TEXT,
   [author_association] TEXT,
   [active_lock_reason] TEXT,
   [draft] INTEGER,
   [pull_request] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [state_reason] TEXT,
   [repo] INTEGER REFERENCES [repos]([id]),
   [type] TEXT
);
CREATE INDEX [idx_issues_repo] ON [issues] ([repo]);
CREATE INDEX [idx_issues_milestone] ON [issues] ([milestone]);
CREATE INDEX [idx_issues_assignee] ON [issues] ([assignee]);
CREATE INDEX [idx_issues_user] ON [issues] ([user]);