issues
331 rows where user = 14808389 sorted by updated_at descending
id | node_id | number | title | user | state | locked | assignee | milestone | comments | created_at | updated_at | closed_at | author_association | active_lock_reason | draft | pull_request | body | reactions | performed_via_github_app | state_reason | repo | type |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
2194953062 | PR_kwDOAMm_X85qFqp1 | 8854 | array api-related upstream-dev failures | keewis 14808389 | open | 0 | 15 | 2024-03-19T13:17:09Z | 2024-05-03T22:46:41Z | MEMBER | 0 | pydata/xarray/pulls/8854 |
This "fixes" the upstream-dev failures related to the removal of |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8854/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | ||||||
2269295936 | PR_kwDOAMm_X85uBwtv | 8983 | fixes for the `pint` tests | keewis 14808389 | open | 0 | 0 | 2024-04-29T15:09:28Z | 2024-05-03T18:30:06Z | MEMBER | 0 | pydata/xarray/pulls/8983 | This removes the use of the deprecated |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8983/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | ||||||
2275404926 | PR_kwDOAMm_X85uWjVP | 8993 | call `np.cross` with 3D vectors only | keewis 14808389 | closed | 0 | 1 | 2024-05-02T12:21:30Z | 2024-05-03T15:56:49Z | 2024-05-03T15:22:26Z | MEMBER | 0 | pydata/xarray/pulls/8993 |
In the tests, we've been calling For a later PR: add tests to check if |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8993/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
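The PR above changes the tests to pass 3-component vectors to `np.cross`, since recent NumPy versions deprecate the 2-dimensional-vector code path. A minimal sketch of the idea (not the PR's actual test code):

```python
import numpy as np

# 3-component vectors: the fully supported np.cross code path
a = np.array([1.0, 0.0, 0.0])
b = np.array([0.0, 1.0, 0.0])
print(np.cross(a, b))  # -> [0. 0. 1.]

# a 2-component vector can be padded with a zero third component
# instead of relying on the deprecated 2D cross-product path
a2 = np.array([1.0, 2.0])
a3 = np.pad(a2, (0, 1))  # [1., 2., 0.]
```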
2241526039 | PR_kwDOAMm_X85skMs0 | 8939 | avoid a couple of warnings in `polyfit` | keewis 14808389 | closed | 0 | 14 | 2024-04-13T11:49:13Z | 2024-05-01T16:42:06Z | 2024-05-01T15:34:20Z | MEMBER | 0 | pydata/xarray/pulls/8939 | - [x] towards #8844
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8939/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
2270984193 | PR_kwDOAMm_X85uHk70 | 8986 | clean up the upstream-dev setup script | keewis 14808389 | closed | 0 | 1 | 2024-04-30T09:34:04Z | 2024-04-30T23:26:13Z | 2024-04-30T20:59:56Z | MEMBER | 0 | pydata/xarray/pulls/8986 | In trying to install packages that are compatible with As it seems
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8986/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
2267711587 | PR_kwDOAMm_X85t8VWy | 8978 | more engine environment tricks in preparation for `numpy>=2` | keewis 14808389 | closed | 0 | 7 | 2024-04-28T17:54:38Z | 2024-04-29T14:56:22Z | 2024-04-29T14:56:21Z | MEMBER | 0 | pydata/xarray/pulls/8978 | Turns out And finally, the
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8978/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
590630281 | MDU6SXNzdWU1OTA2MzAyODE= | 3921 | issues discovered by the all-but-dask CI | keewis 14808389 | closed | 0 | 4 | 2020-03-30T22:08:46Z | 2024-04-25T14:48:15Z | 2024-02-10T02:57:34Z | MEMBER | After adding the |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/3921/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
2234142680 | PR_kwDOAMm_X85sK0g8 | 8923 | `"source"` encoding for datasets opened from `fsspec` objects | keewis 14808389 | open | 0 | 5 | 2024-04-09T19:12:45Z | 2024-04-23T16:54:09Z | MEMBER | 0 | pydata/xarray/pulls/8923 | When opening files from path-like objects ( In this PR, I'm extracting the If this sounds like a good idea, I'll update the documentation of the
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8923/reactions", "total_count": 2, "+1": 2, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | ||||||
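The PR above proposes recording a `"source"` encoding when a dataset is opened from an `fsspec` file object. A hypothetical sketch of the idea, with `FsspecLikeFile` and `extract_source` invented for illustration (file objects opened via `fsspec` expose the original location on a `.path` attribute):

```python
# Hypothetical stand-in for an fsspec OpenFile/AbstractBufferedFile,
# which carries the original location on a `.path` attribute.
class FsspecLikeFile:
    path = "s3://bucket/data.nc"

def extract_source(filename_or_obj):
    """Best-effort recovery of a "source" string for the encoding."""
    if isinstance(filename_or_obj, (str, bytes)):
        return filename_or_obj
    # fall back to the object's path attribute, if it exposes one
    return getattr(filename_or_obj, "path", None)

print(extract_source(FsspecLikeFile()))  # s3://bucket/data.nc
print(extract_source("local.nc"))       # local.nc
```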
2255271332 | PR_kwDOAMm_X85tSKJs | 8961 | use `nan` instead of `NaN` | keewis 14808389 | closed | 0 | 0 | 2024-04-21T21:26:18Z | 2024-04-21T22:01:04Z | 2024-04-21T22:01:03Z | MEMBER | 0 | pydata/xarray/pulls/8961 | FYI @aulemahal,
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8961/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
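Context for the rename above: the `np.NaN` alias was removed in the NumPy 2 namespace cleanup, so `np.nan` is the spelling that keeps working everywhere. A minimal sketch:

```python
import numpy as np

# `np.nan` is the lowercase spelling that survives numpy>=2.0;
# the `np.NaN` alias was removed in the numpy 2 namespace cleanup
x = np.array([1.0, np.nan, 3.0])
print(np.isnan(x))  # [False  True False]
```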
2241492018 | PR_kwDOAMm_X85skF_A | 8937 | drop support for `python=3.9` | keewis 14808389 | open | 0 | 3 | 2024-04-13T10:18:04Z | 2024-04-15T15:07:39Z | MEMBER | 0 | pydata/xarray/pulls/8937 | According to our policy (and NEP-29) we can drop support for We could delay this until we have a release that is compatible with
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8937/reactions", "total_count": 3, "+1": 3, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | ||||||
2241528898 | PR_kwDOAMm_X85skNON | 8940 | adapt more tests to the copy-on-write behavior of pandas | keewis 14808389 | closed | 0 | 1 | 2024-04-13T11:57:10Z | 2024-04-13T19:36:30Z | 2024-04-13T14:44:50Z | MEMBER | 0 | pydata/xarray/pulls/8940 |
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8940/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
2241499231 | PR_kwDOAMm_X85skHW9 | 8938 | use `pd.to_timedelta` instead of `TimedeltaIndex` | keewis 14808389 | closed | 0 | 0 | 2024-04-13T10:38:12Z | 2024-04-13T12:32:14Z | 2024-04-13T12:32:13Z | MEMBER | 0 | pydata/xarray/pulls/8938 |
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8938/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
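The replacement above can be sketched as follows: `pd.to_timedelta` covers the array-plus-unit construction that used to go through the `TimedeltaIndex` constructor, parts of which pandas has been deprecating. A minimal example (not the PR's actual code):

```python
import pandas as pd

# pd.to_timedelta accepts numeric arrays plus a unit, covering cases
# previously written via the TimedeltaIndex constructor directly
idx = pd.to_timedelta([1, 2, 3], unit="D")
print(idx[0])  # 1 days 00:00:00
```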
2181644595 | PR_kwDOAMm_X85pYPWY | 8823 | try to get the `upstream-dev` CI to complete again | keewis 14808389 | closed | 0 | 2 | 2024-03-12T13:36:20Z | 2024-03-12T16:59:12Z | 2024-03-12T16:04:53Z | MEMBER | 0 | pydata/xarray/pulls/8823 | There's a couple of accumulated failures now, including a crash because |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8823/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
2141273710 | PR_kwDOAMm_X85nOs6t | 8767 | new whats-new section | keewis 14808389 | closed | 0 | 0 | 2024-02-19T00:39:59Z | 2024-02-20T10:35:35Z | 2024-02-20T10:35:35Z | MEMBER | 0 | pydata/xarray/pulls/8767 | { "url": "https://api.github.com/repos/pydata/xarray/issues/8767/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | ||||||
2141229636 | PR_kwDOAMm_X85nOjve | 8766 | release v2024.02.0 | keewis 14808389 | closed | 0 | 0 | 2024-02-18T23:06:06Z | 2024-02-18T23:06:22Z | 2024-02-18T23:06:22Z | MEMBER | 0 | pydata/xarray/pulls/8766 | { "url": "https://api.github.com/repos/pydata/xarray/issues/8766/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | ||||||
2141111970 | PR_kwDOAMm_X85nOMmO | 8764 | release summary for 2024.02.0 | keewis 14808389 | closed | 0 | 3 | 2024-02-18T17:45:01Z | 2024-02-18T23:00:26Z | 2024-02-18T22:52:14Z | MEMBER | 0 | pydata/xarray/pulls/8764 |
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8764/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1505375386 | PR_kwDOAMm_X85F6MBQ | 7395 | implement `isnull` using `full_like` instead of `zeros_like` | keewis 14808389 | closed | 0 | 2 | 2022-12-20T22:07:30Z | 2024-01-28T14:19:26Z | 2024-01-23T18:29:14Z | MEMBER | 0 | pydata/xarray/pulls/7395 | After changing the behavior of the implementation of I'd argue that
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/7395/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
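A minimal sketch of the distinction discussed above: for dtypes that cannot hold missing values (integers, booleans), the null mask is all-`False` either way, but `full_like` states the intended fill value explicitly instead of relying on zero meaning "not null":

```python
import numpy as np

x = np.array([1, 2, 3])  # integer dtype: cannot contain NaN

# zeros_like relies on 0 being interpreted as "not null"
mask_zeros = np.zeros_like(x, dtype=bool)

# full_like spells out the intended boolean fill value
mask_full = np.full_like(x, fill_value=False, dtype=bool)

assert (mask_zeros == mask_full).all()
```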
2010594399 | PR_kwDOAMm_X85gWlAz | 8483 | import from the new location of `normalize_axis_index` if possible | keewis 14808389 | closed | 0 | 13 | 2023-11-25T12:19:32Z | 2024-01-18T16:52:02Z | 2024-01-18T15:34:57Z | MEMBER | 0 | pydata/xarray/pulls/8483 | Another one of the Since as far as I remember
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8483/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
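A hedged sketch of the compatibility import described above: NumPy 2 exposes `normalize_axis_index` in `numpy.lib.array_utils`, while older versions kept it in the private `numpy.core.multiarray`, so a try/except picks whichever location exists:

```python
# prefer the public numpy>=2 location, fall back to the old private one
try:
    from numpy.lib.array_utils import normalize_axis_index
except ImportError:
    from numpy.core.multiarray import normalize_axis_index

# negative axes are normalized against the number of dimensions
print(normalize_axis_index(-1, 3))  # -> 2
```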
2078559800 | PR_kwDOAMm_X85j6NsC | 8605 | run CI on `python=3.12` | keewis 14808389 | closed | 0 | 7 | 2024-01-12T10:47:18Z | 2024-01-17T21:54:13Z | 2024-01-17T21:54:12Z | MEMBER | 0 | pydata/xarray/pulls/8605 |
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8605/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
2079089277 | I_kwDOAMm_X8577GJ9 | 8607 | allow computing just a small number of variables | keewis 14808389 | open | 0 | 4 | 2024-01-12T15:21:27Z | 2024-01-12T20:20:29Z | MEMBER |
Is your feature request related to a problem?
I frequently find myself computing a handful of variables of a dataset (typically coordinates) and assigning them back to the dataset, and wishing we had a method / function that allowed that.
Describe the solution you'd like
I'd imagine something like
Describe alternatives you've considered
So far I've been using something like
Additional context
No response |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8607/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | issue | ||||||||
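The requested semantics above can be illustrated with a hypothetical stand-in (`LazyValue` and `compute_some` are invented for illustration; in xarray the equivalent would assign `.compute()`-ed variables back onto the dataset): compute only the named variables, leave everything else lazy.

```python
# Invented stand-in for a lazy (e.g. dask-backed) variable
class LazyValue:
    def __init__(self, func):
        self.func = func

    def compute(self):
        return self.func()

def compute_some(dataset, *names):
    """Compute only the named entries; keep the rest lazy."""
    return {
        name: value.compute() if name in names else value
        for name, value in dataset.items()
    }

ds = {"lon": LazyValue(lambda: [1, 2]), "temp": LazyValue(lambda: [9, 9])}
result = compute_some(ds, "lon")
print(result["lon"])         # [1, 2] -- computed
print(type(result["temp"]))  # still LazyValue
```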
1977836822 | PR_kwDOAMm_X85enruo | 8416 | migrate the other CI to python 3.11 | keewis 14808389 | closed | 0 | 3 | 2023-11-05T15:26:31Z | 2024-01-03T20:17:11Z | 2023-11-17T15:27:21Z | MEMBER | 0 | pydata/xarray/pulls/8416 | (namely, additional CI and upstream-dev CI)
Regarding We still have the special environment files for |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8416/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
2060807644 | PR_kwDOAMm_X85i-Lpn | 8576 | ignore a `DeprecationWarning` emitted by `seaborn` | keewis 14808389 | closed | 0 | 0 | 2023-12-30T17:30:28Z | 2023-12-30T22:10:08Z | 2023-12-30T22:10:08Z | MEMBER | 0 | pydata/xarray/pulls/8576 | Not sure if this is something that we'll only see on I also moved the |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8576/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
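A minimal sketch of ignoring a warning by origin module, as the PR above does for `seaborn` (in a test suite this usually lives in pytest's `filterwarnings` configuration rather than in code):

```python
import warnings

# ignore DeprecationWarnings raised from modules matching "seaborn"
warnings.filterwarnings(
    "ignore", category=DeprecationWarning, module="seaborn"
)
```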
2028193332 | PR_kwDOAMm_X85hSQNW | 8526 | explicitly skip using `__array_namespace__` for `numpy.ndarray` | keewis 14808389 | closed | 0 | 3 | 2023-12-06T10:09:48Z | 2023-12-07T09:18:05Z | 2023-12-06T17:58:46Z | MEMBER | 0 | pydata/xarray/pulls/8526 |
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8526/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
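Context for the skip above: once `numpy.ndarray` itself grows an `__array_namespace__` attribute, a pure `hasattr` check starts matching plain numpy arrays, so the detection has to special-case them to keep numpy on the existing code path. A hedged sketch (`is_array_api_obj` is an invented helper name, not xarray's actual function):

```python
import numpy as np

def is_array_api_obj(x):
    # explicitly skip np.ndarray: newer numpy gives it an
    # __array_namespace__ attribute, but plain numpy arrays should
    # keep going through the existing numpy-specific path
    return not isinstance(x, np.ndarray) and hasattr(x, "__array_namespace__")

print(is_array_api_obj(np.zeros(3)))  # False
```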
1655290694 | I_kwDOAMm_X85iqbtG | 7721 | `as_shared_dtype` converts scalars to 0d `numpy` arrays if chunked `cupy` is involved | keewis 14808389 | open | 0 | 7 | 2023-04-05T09:48:34Z | 2023-12-04T10:45:43Z | MEMBER | I tried to run

```
TypeError                                 Traceback (most recent call last)
Cell In[4], line 1
----> 1 arr.chunk().where(mask).compute()

File ~/repos/xarray/xarray/core/dataarray.py:1095, in DataArray.compute(self, **kwargs)
-> 1095 return new.load(**kwargs)

File ~/repos/xarray/xarray/core/dataarray.py:1069, in DataArray.load(self, **kwargs)
-> 1069 ds = self._to_temp_dataset().load(**kwargs)

File ~/repos/xarray/xarray/core/dataset.py:752, in Dataset.load(self, **kwargs)
--> 752 evaluated_data = da.compute(*lazy_data.values(), **kwargs)

... (dask scheduler frames elided)

File ~/.local/opt/mambaforge/envs/xarray/lib/python3.10/site-packages/cupy/_sorting/search.py:211, in where(condition, x, y)
    209 if fusion._is_fusing():
    210     return fusion._call_ufunc(_where_ufunc, condition, x, y)
--> 211 return _where_ufunc(condition.astype('?'), x, y)

File cupy/_core/_kernel.pyx:1287, in cupy._core._kernel.ufunc.__call__()
File cupy/_core/_kernel.pyx:160, in cupy._core._kernel._preprocess_args()
File cupy/_core/_kernel.pyx:146, in cupy._core._kernel._preprocess_arg()

TypeError: Unsupported type <class 'numpy.ndarray'>
```

I think the reason is that this: https://github.com/pydata/xarray/blob/d4db16699f30ad1dc3e6861601247abf4ac96567/xarray/core/duck_array_ops.py#L195 is not sufficient to detect

cc @jacobtomlinson |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/7721/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | issue | ||||||||
1988047821 | PR_kwDOAMm_X85fKfB6 | 8441 | remove `cdms2` | keewis 14808389 | closed | 0 | 4 | 2023-11-10T17:25:50Z | 2023-11-14T17:34:57Z | 2023-11-14T17:15:49Z | MEMBER | 0 | pydata/xarray/pulls/8441 |
This also appears to allow us to remove the special
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8441/reactions", "total_count": 3, "+1": 3, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1730451312 | I_kwDOAMm_X85nJJdw | 7879 | occasional segfaults on CI | keewis 14808389 | closed | 0 | 3 | 2023-05-29T09:52:01Z | 2023-11-06T22:03:43Z | 2023-11-06T22:03:42Z | MEMBER | The upstream-dev CI currently fails sometimes due to a segfault (the normal CI crashes, too, but since we use

I'm not sure why, and I can't reproduce locally, either. Given that

log of the segfaulting CI job

```
============================= test session starts ==============================
platform linux -- Python 3.10.11, pytest-7.3.1, pluggy-1.0.0
rootdir: /home/runner/work/xarray/xarray
configfile: setup.cfg
testpaths: xarray/tests, properties
plugins: env-0.8.1, xdist-3.3.1, timeout-2.1.0, cov-4.1.0, reportlog-0.1.2, hypothesis-6.75.6
timeout: 60.0s
timeout method: signal
timeout func_only: False
collected 16723 items / 2 skipped

xarray/tests/test_accessor_dt.py ....................................... [  0%]
...
xarray/tests/test_backends.py ........................X........x........ [  3%]
...

Fatal Python error: Segmentation fault

Current thread 0x00007f9c81f1d640 (most recent call first):
  File "/home/runner/work/xarray/xarray/xarray/backends/file_manager.py", line 216 in _acquire_with_cache_info
  File "/home/runner/work/xarray/xarray/xarray/backends/file_manager.py", line 198 in acquire_context
  File "/home/runner/work/xarray/xarray/xarray/backends/netCDF4_.py", line 392 in _acquire
  File "/home/runner/work/xarray/xarray/xarray/backends/netCDF4_.py", line 398 in ds
  File "/home/runner/work/xarray/xarray/xarray/backends/netCDF4_.py", line 336 in __init__
  File "/home/runner/work/xarray/xarray/xarray/backends/netCDF4_.py", line 389 in open
  File "/home/runner/work/xarray/xarray/xarray/backends/netCDF4_.py", line 588 in open_dataset
  File "/home/runner/work/xarray/xarray/xarray/backends/api.py", line 566 in open_dataset
  ... (dask worker frames elided)

Thread 0x00007f9ca575e740 (most recent call first):
  ...
  File "/home/runner/work/xarray/xarray/xarray/backends/api.py", line 1046 in open_mfdataset
  File "/home/runner/work/xarray/xarray/xarray/tests/test_backends.py", line 3295 in test_open_mfdataset_manyfiles
  ... (pytest runner frames elided)

Extension modules: numpy.core._multiarray_umath, ... (total: 241)

/home/runner/work/_temp/b3f3888c-5349-4d19-80f6-41d140b86db5.sh: line 3:  6114 Segmentation fault      (core dumped) python -m pytest --timeout=60 -rf --report-log output-3.10-log.jsonl
``` |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/7879/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
not_planned | xarray 13221727 | issue | ||||||
1158378382 | I_kwDOAMm_X85FC3OO | 6323 | propagation of `encoding` | keewis 14808389 | open | 0 | 8 | 2022-03-03T12:57:29Z | 2023-10-25T23:20:31Z | MEMBER | What is your issue?We frequently get bug reports related to There are also a few discussions with more background: - https://github.com/pydata/xarray/pull/5065#issuecomment-806154872 - https://github.com/pydata/xarray/issues/1614 - #5082 - #5336 We discussed this in the meeting yesterday and as far as I can remember agreed that the current default behavior is not ideal and decided to investigate #5336: a cc @rabernat, @shoyer |
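The ambiguity described above can be illustrated with a toy stand-in (these are not xarray's actual classes; `ToyArray` is hypothetical): `encoding` describes how the data was stored on disk, so once an operation changes the values it is unclear whether the dict still applies.

```python
class ToyArray:
    """Toy stand-in for an xarray object carrying an `encoding` dict."""

    def __init__(self, data, encoding=None):
        self.data = data
        # e.g. {"dtype": "int16", "scale_factor": 0.01, "zlib": True}
        self.encoding = dict(encoding or {})

    def __add__(self, other):
        # The open design question: should the result inherit
        # self.encoding?  Propagating it can silently re-apply stale
        # on-disk settings (dtype, compression, _FillValue) on the next
        # write; dropping it surprises users who round-trip files.
        # This sketch drops it.
        return ToyArray([a + b for a, b in zip(self.data, other.data)])


a = ToyArray([1, 2], encoding={"dtype": "int16"})
b = a + a
print(b.data, b.encoding)  # -> [2, 4] {}
```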
{ "url": "https://api.github.com/repos/pydata/xarray/issues/6323/reactions", "total_count": 2, "+1": 2, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | issue | ||||||||
1845449919 | PR_kwDOAMm_X85Xp1U1 | 8064 | adapt to NEP 51 | keewis 14808389 | closed | 0 | 7 | 2023-08-10T15:43:13Z | 2023-09-30T09:27:25Z | 2023-09-25T04:46:49Z | MEMBER | 0 | pydata/xarray/pulls/8064 | With NEP 51 (and the changes to
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8064/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1898193938 | PR_kwDOAMm_X85abbJ4 | 8188 | fix the failing docs | keewis 14808389 | closed | 0 | 7 | 2023-09-15T11:01:42Z | 2023-09-20T11:04:03Z | 2023-09-15T13:26:24Z | MEMBER | 0 | pydata/xarray/pulls/8188 | The docs have been failing because of a malformatted docstring we inherit from
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8188/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1869782001 | PR_kwDOAMm_X85Y76lw | 8117 | fix miscellaneous `numpy=2.0` errors | keewis 14808389 | closed | 0 | 9 | 2023-08-28T13:34:56Z | 2023-09-13T15:34:15Z | 2023-09-11T03:55:52Z | MEMBER | 0 | pydata/xarray/pulls/8117 |
Edit: looking at the relevant |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8117/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1845114193 | PR_kwDOAMm_X85Xorkf | 8061 | unpin `numpy` | keewis 14808389 | closed | 0 | 8 | 2023-08-10T12:43:32Z | 2023-08-17T18:14:22Z | 2023-08-17T18:14:21Z | MEMBER | 0 | pydata/xarray/pulls/8061 |
It seems in a previous PR I "temporarily" pinned |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8061/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1423972935 | PR_kwDOAMm_X85BlCII | 7225 | join together duplicate entries in the text `repr` | keewis 14808389 | closed | 0 | 4 | 2022-10-26T12:53:49Z | 2023-07-24T18:37:05Z | 2023-07-20T21:13:57Z | MEMBER | 0 | pydata/xarray/pulls/7225 |
The formatting options we were able to come up with:
1. separate with just newlines (see e29aeb9085bc677e16dabd4a2b94cf63d06c155e):
For the unicode box components, we can choose between the light and heavy variants. @benbovy and I think the unicode variants (especially variant 3) are the easiest to understand, but we would need to decide whether we care about terminals that don't support unicode. Edit: in the meeting we decided that support for this subset of unicode should be common enough that we can use it. I'll clean up this PR to implement option 3, then. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/7225/reactions", "total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 1, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1789429376 | PR_kwDOAMm_X85Uso19 | 7961 | manually unshallow the repository on RTD | keewis 14808389 | closed | 0 | 0 | 2023-07-05T12:15:31Z | 2023-07-11T13:24:18Z | 2023-07-05T15:44:09Z | MEMBER | 0 | pydata/xarray/pulls/7961 | RTD is deprecating the feature flag we made use of before. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/7961/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1745794965 | PR_kwDOAMm_X85SaJCg | 7899 | use trusted publishers instead of an API token | keewis 14808389 | closed | 0 | 4 | 2023-06-07T12:30:56Z | 2023-06-16T08:58:05Z | 2023-06-16T08:37:07Z | MEMBER | 0 | pydata/xarray/pulls/7899 | PyPI introduced the concept of "trusted publishers" a few months ago, which allows requesting short-lived API tokens for trusted publishing services (such as GHA, in our case). Someone with the appropriate rights will have to enable this on PyPI, and I will do the same for TestPyPI. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/7899/reactions", "total_count": 2, "+1": 2, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1730664352 | PR_kwDOAMm_X85RmgD2 | 7880 | don't use `CacheFileManager.__del__` on interpreter shutdown | keewis 14808389 | closed | 0 | 9 | 2023-05-29T12:16:06Z | 2023-06-06T20:37:40Z | 2023-06-06T15:14:37Z | MEMBER | 0 | pydata/xarray/pulls/7880 | Storing a reference to the function on the class tells the garbage collector to not collect the function before the class, such that any instance can safely complete its No tests because I don't know how to properly test this. Any ideas?
|
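A minimal, pure-Python sketch of the pattern described above (names like `Resource` are hypothetical, not xarray's actual code): keeping the cleanup function reachable through a class attribute so that `__del__` does not depend on module globals that may already be cleared at interpreter shutdown.

```python
closed = []

class Resource:
    # Storing a reference to the cleanup function on the class means the
    # garbage collector cannot release the function before the class
    # itself, so __del__ can always reach it -- even late during
    # interpreter shutdown, when module globals may already be cleared.
    _cleanup = staticmethod(closed.append)

    def __init__(self, name):
        self.name = name

    def __del__(self):
        # looked up via the class attribute, not via a module global
        self._cleanup(self.name)


r = Resource("file-a")
del r  # CPython's refcounting runs __del__ immediately
print(closed)  # -> ['file-a']
```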
{ "url": "https://api.github.com/repos/pydata/xarray/issues/7880/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1738586208 | PR_kwDOAMm_X85SBfsz | 7889 | retire the TestPyPI workflow | keewis 14808389 | closed | 0 | 1 | 2023-06-02T17:54:04Z | 2023-06-04T19:58:08Z | 2023-06-04T18:46:14Z | MEMBER | 0 | pydata/xarray/pulls/7889 | With the recent addition of the workflow to upload nightly releases to
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/7889/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1721896187 | PR_kwDOAMm_X85RIyh6 | 7867 | add `numba` to the py3.11 environment | keewis 14808389 | closed | 0 | 1 | 2023-05-23T11:49:37Z | 2023-06-03T11:36:10Z | 2023-05-28T06:30:10Z | MEMBER | 0 | pydata/xarray/pulls/7867 |
I'm not sure what to do about |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/7867/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1730414479 | PR_kwDOAMm_X85RlpAe | 7878 | move to `setup-micromamba` | keewis 14808389 | closed | 0 | 0 | 2023-05-29T09:27:15Z | 2023-06-01T16:21:57Z | 2023-06-01T16:21:56Z | MEMBER | 0 | pydata/xarray/pulls/7878 | The
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/7878/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1729709527 | PR_kwDOAMm_X85RjPc9 | 7876 | deprecate the `cdms2` conversion methods | keewis 14808389 | closed | 0 | 2 | 2023-05-28T22:18:55Z | 2023-05-30T20:59:48Z | 2023-05-29T19:01:20Z | MEMBER | 0 | pydata/xarray/pulls/7876 | As the cc @tomvothecoder for visibility
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/7876/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1726529405 | PR_kwDOAMm_X85RYfGo | 7875 | defer to `numpy` for the expected result | keewis 14808389 | closed | 0 | 1 | 2023-05-25T21:48:18Z | 2023-05-27T19:53:08Z | 2023-05-27T19:53:07Z | MEMBER | 0 | pydata/xarray/pulls/7875 |
I'm not really sure what the best fix is, so I split the changes into two parts: the first commit uses
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/7875/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1718144679 | PR_kwDOAMm_X85Q8Hne | 7855 | adapt the `pint` + `dask` test to the newest version of `pint` | keewis 14808389 | closed | 0 | 0 | 2023-05-20T11:35:47Z | 2023-05-25T17:25:01Z | 2023-05-25T17:24:34Z | MEMBER | 0 | pydata/xarray/pulls/7855 |
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/7855/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1688716198 | PR_kwDOAMm_X85PZRyC | 7793 | adjust the deprecation policy for python | keewis 14808389 | closed | 0 | 2 | 2023-04-28T15:03:51Z | 2023-05-02T11:51:27Z | 2023-05-01T22:26:55Z | MEMBER | 0 | pydata/xarray/pulls/7793 | As discussed in #7765, this extends the policy window by 6 months to a total of 30 months. With that, support for a python version can be removed as soon as the next version is at least 30 months old. Together with python's 12-month release cycle, this yields the 42-month support window from NEP 29. Note that this is still missing the release overview proposed in #7765; I'm still thinking about how to best implement the automatic update / formatting, and how to coordinate it with the (still manual) version overrides.
|
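The policy arithmetic above can be sketched with naive month math (the dates below are illustrative, not official release dates):

```python
from datetime import date

POLICY_MONTHS = 30  # extended by 6 from the previous 24


def add_months(d: date, months: int) -> date:
    # naive month arithmetic; ignores day-overflow edge cases
    y, m = divmod(d.month - 1 + months, 12)
    return date(d.year + y, m + 1, d.day)


def earliest_drop(next_release: date) -> date:
    # a python version can be dropped once the *next* version
    # is at least POLICY_MONTHS old
    return add_months(next_release, POLICY_MONTHS)


# illustrative only: with python's ~12-month cadence, a version stays
# supported for roughly 12 + 30 = 42 months, matching NEP 29
print(earliest_drop(date(2021, 10, 4)))  # -> 2024-04-04
```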
{ "url": "https://api.github.com/repos/pydata/xarray/issues/7793/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1440494247 | I_kwDOAMm_X85V3DKn | 7270 | type checking CI is failing | keewis 14808389 | closed | 0 | 3 | 2022-11-08T16:10:24Z | 2023-04-15T18:31:59Z | 2023-04-15T18:31:59Z | MEMBER | The most recent runs of the type checking CI have started to fail with a segfault:
7269 pinned |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/7270/reactions", "total_count": 2, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 1, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
1668326257 | PR_kwDOAMm_X85OVLQA | 7756 | remove the `black` hook | keewis 14808389 | closed | 0 | 0 | 2023-04-14T14:10:36Z | 2023-04-14T17:42:49Z | 2023-04-14T16:36:18Z | MEMBER | 0 | pydata/xarray/pulls/7756 | Apparently, in addition to formatting notebooks, |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/7756/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1668319039 | PR_kwDOAMm_X85OVJv5 | 7755 | reword the what's new entry for the `pandas` 2.0 dtype changes | keewis 14808389 | closed | 0 | 0 | 2023-04-14T14:06:54Z | 2023-04-14T14:30:51Z | 2023-04-14T14:30:50Z | MEMBER | 0 | pydata/xarray/pulls/7755 | As a follow-up to #7724, this makes the what's new entry a bit more precise. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/7755/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1655782486 | PR_kwDOAMm_X85Nr3hH | 7724 | `pandas=2.0` support | keewis 14808389 | closed | 0 | 8 | 2023-04-05T14:52:30Z | 2023-04-12T13:24:07Z | 2023-04-12T13:04:11Z | MEMBER | 0 | pydata/xarray/pulls/7724 | As mentioned in https://github.com/pydata/xarray/issues/7716#issuecomment-1497623839, this tries to unpin |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/7724/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
683142059 | MDU6SXNzdWU2ODMxNDIwNTk= | 4361 | restructure the contributing guide | keewis 14808389 | open | 0 | 5 | 2020-08-20T22:51:39Z | 2023-03-31T17:39:00Z | MEMBER | From #4355 @max-sixty:
We could also add a docstring guide since the |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4361/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | issue | ||||||||
1637616804 | PR_kwDOAMm_X85MvWza | 7664 | use the `files` interface instead of the deprecated `read_binary` | keewis 14808389 | closed | 0 | 2 | 2023-03-23T14:06:36Z | 2023-03-30T14:59:22Z | 2023-03-30T14:58:43Z | MEMBER | 0 | pydata/xarray/pulls/7664 | Apparently, |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/7664/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1635470616 | PR_kwDOAMm_X85MoK6O | 7657 | add timeouts for tests | keewis 14808389 | closed | 0 | 9 | 2023-03-22T10:20:04Z | 2023-03-24T16:42:48Z | 2023-03-24T15:49:22Z | MEMBER | 0 | pydata/xarray/pulls/7657 | The Tests that time out raise an error, which might help us figure out which test is stalling, and whether we can do anything about it. |
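On Unix, signal-based interruption is the usual mechanism behind per-test timeouts like this; a rough stand-alone sketch of that mechanism (not the plugin's actual code, Unix-only):

```python
import signal
import time


class Timeout(Exception):
    pass


def run_with_timeout(fn, seconds):
    """Run fn(), raising Timeout if it takes longer than `seconds` (Unix only)."""
    def handler(signum, frame):
        raise Timeout(f"timed out after {seconds}s")

    old = signal.signal(signal.SIGALRM, handler)
    signal.alarm(seconds)  # schedule SIGALRM
    try:
        return fn()
    finally:
        signal.alarm(0)  # cancel any pending alarm
        signal.signal(signal.SIGALRM, old)


print(run_with_timeout(lambda: "finished", 2))  # -> finished

try:
    run_with_timeout(lambda: time.sleep(3), 1)
except Timeout as e:
    print(e)  # -> timed out after 1s
```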
{ "url": "https://api.github.com/repos/pydata/xarray/issues/7657/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1306795760 | I_kwDOAMm_X85N5B7w | 6793 | improve docstrings with examples and links | keewis 14808389 | open | 0 | 10 | 2022-07-16T12:30:33Z | 2023-03-24T16:33:28Z | MEMBER | This is an (incomplete) checklist for #5816 to make it easier to find methods that need examples and links to the narrative docs with further information (of course, changes to the docstrings of all other methods / functions that are part of the public API are also appreciated). Good examples explicitly construct small xarray objects to make them easier to follow (e.g. use Use
To easily generate the expected output install To link to other documentation pages we can use
Top-level functions:
- [ ] I/O:
- [ ] Contents:
- [ ] Comparisons:
- [ ] Dask:
- [ ] Missing values:
- [ ] Indexing:
- [ ] Aggregations:
- [ ] |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/6793/reactions", "total_count": 4, "+1": 4, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | issue | ||||||||
1615259652 | PR_kwDOAMm_X85LkhcR | 7594 | ignore the `pkg_resources` deprecation warning | keewis 14808389 | closed | 0 | 0 | 2023-03-08T13:18:10Z | 2023-03-08T13:41:55Z | 2023-03-08T13:41:54Z | MEMBER | 0 | pydata/xarray/pulls/7594 | In one of the recent |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/7594/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1534634670 | PR_kwDOAMm_X85Hc1wx | 7442 | update the docs environment | keewis 14808389 | closed | 0 | 5 | 2023-01-16T09:58:43Z | 2023-03-03T10:17:14Z | 2023-03-03T10:14:13Z | MEMBER | 0 | pydata/xarray/pulls/7442 | Most notably:
- bump ~Edit: it seems this is blocked by |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/7442/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1556739109 | PR_kwDOAMm_X85IhJib | 7477 | RTD maintenance | keewis 14808389 | closed | 0 | 0 | 2023-01-25T14:20:27Z | 2023-01-25T14:58:50Z | 2023-01-25T14:58:46Z | MEMBER | 0 | pydata/xarray/pulls/7477 | Since the last time the RTD configuration was updated, a few things have changed:
- the OS image is now a bit old
- we can tell git (and thus If I read the documentation / history of RTD correctly, they removed the |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/7477/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1556471464 | PR_kwDOAMm_X85IgPzN | 7476 | fix the RTD build skipping feature | keewis 14808389 | closed | 0 | 0 | 2023-01-25T11:15:26Z | 2023-01-25T11:18:00Z | 2023-01-25T11:17:57Z | MEMBER | 0 | pydata/xarray/pulls/7476 | We can't use the example |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/7476/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1553155277 | PR_kwDOAMm_X85IVH-_ | 7470 | allow skipping RTD builds | keewis 14808389 | closed | 0 | 1 | 2023-01-23T14:02:30Z | 2023-01-24T16:09:54Z | 2023-01-24T16:09:51Z | MEMBER | 0 | pydata/xarray/pulls/7470 | RTD somewhat recently introduced the build.jobs setting, which allows skipping builds (technically it's more of an "automated cancel" with a special error code than a skip) on user-defined conditions. We can use that to manually "skip" builds for which we don't need a documentation build. Edit: the only downside seems to be that the build is not actually marked as "skipped". Edit2: apparently that's a bug that's being worked on, see readthedocs/readthedocs.org#9807 |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/7470/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1552997940 | PR_kwDOAMm_X85IUl4t | 7469 | create separate environment files for `python=3.11` | keewis 14808389 | closed | 0 | 0 | 2023-01-23T12:17:08Z | 2023-01-23T14:03:36Z | 2023-01-23T13:03:13Z | MEMBER | 0 | pydata/xarray/pulls/7469 | This builds on #7353, which found that In order to still test that we support |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/7469/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1519058102 | PR_kwDOAMm_X85GoVw9 | 7415 | install `numbagg` from `conda-forge` | keewis 14808389 | closed | 0 | 5 | 2023-01-04T14:17:44Z | 2023-01-20T19:46:46Z | 2023-01-20T19:46:43Z | MEMBER | 0 | pydata/xarray/pulls/7415 | It seems there is a Not sure what to do about the |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/7415/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1519154372 | PR_kwDOAMm_X85Goq2J | 7416 | remove `numbagg` and `numba` from the upstream-dev CI | keewis 14808389 | closed | 0 | 3 | 2023-01-04T15:18:51Z | 2023-01-04T20:07:33Z | 2023-01-04T20:07:29Z | MEMBER | 0 | pydata/xarray/pulls/7416 |
Using the |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/7416/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
760574919 | MDU6SXNzdWU3NjA1NzQ5MTk= | 4670 | increase the visibility of the upstream-dev PR CI | keewis 14808389 | closed | 0 | 3 | 2020-12-09T18:37:57Z | 2022-12-29T21:15:05Z | 2021-01-19T15:27:26Z | MEMBER | We currently have two upstream-dev PR CIs: the old pipelines CI and the new github actions CI, added together with the scheduled upstream-dev ("nightly") CI. Since we don't need both, I think we should disable one of these, presumably the old pipelines CI.

There's an issue with the CI result, though: since github doesn't have an icon for "passed with issues", we have to choose between "passed" or "failed" as the CI result (neither of which is optimal). The advantage of using "failed" is that a failure is easily visible, but often the total CI result on PRs is set to "failed" because we didn't get around to fixing bugs introduced by recent changes to dependencies (which is confusing for contributors). In #4584 I switched the pipelines upstream-dev CI to "allowed failure" so we get a warning instead of a failure. However, github doesn't print the number of warnings in the summary line, which means that if the icon is green nobody checks the status and upstream-dev CI failures are easily missed.

Our new scheduled nightly CI improves the situation quite a bit since we automatically get an issue containing the failures, but that means we aren't able to catch these failures before actually merging. As pointed out in https://github.com/pydata/xarray/issues/4574#issuecomment-725795622 that might be acceptable, though. If we still want to fix this, we could have the PR CI automatically add a comment to the PR, which would contain a summary of the failures but also state that these failures can be ignored as long as they get the approval of a maintainer. This would increase the noise on PRs, though. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4670/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
1440354343 | PR_kwDOAMm_X85Cbqgn | 7267 | `keep_attrs` for pad | keewis 14808389 | closed | 0 | 6 | 2022-11-08T14:55:05Z | 2022-12-12T15:59:46Z | 2022-12-12T15:59:42Z | MEMBER | 0 | pydata/xarray/pulls/7267 | I ran into this while trying
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/7267/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1440486521 | PR_kwDOAMm_X85CcHQ5 | 7269 | pin mypy to a known good version | keewis 14808389 | closed | 0 | 0 | 2022-11-08T16:06:47Z | 2022-11-08T16:49:16Z | 2022-11-08T16:49:13Z | MEMBER | 0 | pydata/xarray/pulls/7269 | The type checking CI has started to fail with the new upgrade. In order not to disturb unrelated PRs, this pins |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/7269/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1438416173 | PR_kwDOAMm_X85CVEkR | 7260 | use the moving release tag of `issue-from-pytest-log` | keewis 14808389 | closed | 0 | 0 | 2022-11-07T14:01:35Z | 2022-11-07T14:40:28Z | 2022-11-07T14:32:02Z | MEMBER | 0 | pydata/xarray/pulls/7260 | Like most github actions (or at least all the official In order to decrease the number of update PRs, this makes use of that tag. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/7260/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1429769840 | PR_kwDOAMm_X85B4VCq | 7242 | fix the environment setup: actually use the python version | keewis 14808389 | closed | 0 | 0 | 2022-10-31T12:30:37Z | 2022-10-31T13:24:36Z | 2022-10-31T13:16:46Z | MEMBER | 0 | pydata/xarray/pulls/7242 | While working on #7241, I realized that I forgot to specify the python version in our main CI, with the effect that we've been testing python 3.10 twice for each OS since the introduction of the |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/7242/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1422482990 | PR_kwDOAMm_X85BgEYZ | 7212 | use the new action to create an issue from the output of reportlog | keewis 14808389 | closed | 0 | 0 | 2022-10-25T13:37:42Z | 2022-10-26T09:34:09Z | 2022-10-26T09:12:42Z | MEMBER | 0 | pydata/xarray/pulls/7212 | I'm not sure whether we actually need the dedicated "report" job, or whether adding an additional step to the main CI job would suffice. I can see two reasons for a separate job:

1. we get to control the python version independently from the version of the main job
2. we upload the reportlog files as artifacts

I think if we can change the action to abort when it is run on a python version it does not support, the first concern would not matter anymore, and for 2 we might just keep the "upload artifact" action (but is it even possible to manually access artifacts? If not, we might not even need the upload). Since I don't think either is a major concern, I went ahead and joined the jobs.
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/7212/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1422502185 | PR_kwDOAMm_X85BgIjA | 7213 | use the moving release tag of ci-trigger | keewis 14808389 | closed | 0 | 0 | 2022-10-25T13:50:13Z | 2022-10-25T14:29:54Z | 2022-10-25T14:29:51Z | MEMBER | 0 | pydata/xarray/pulls/7213 | In order to follow the minor releases of |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/7213/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1413425793 | PR_kwDOAMm_X85BBvaI | 7185 | indexes section in the HTML repr | keewis 14808389 | closed | 0 | 4 | 2022-10-18T15:25:34Z | 2022-10-20T06:59:05Z | 2022-10-19T21:12:46Z | MEMBER | 0 | pydata/xarray/pulls/7185 | To see the effect, try this:
```python
import xarray as xr
from xarray.core.indexes import Index


class CustomIndex(Index):
    def __init__(self, names, options):
        self.names = names
        self.options = options

    @classmethod
    def from_variables(cls, variables, options):
        names = list(variables.keys())
        return cls(names, options)

    def __repr__(self):
        options = (
            {"names": repr(self.names)}
            | {str(k): str(v) for k, v in self.options.items()}
        )
        return f"CustomIndex({', '.join(k + '=' + v for k, v in options.items())})"

    def _repr_html_(self):
        header_row = "<tr><td>KDTree params</td></tr>"
        option_rows = [
            f"<tr><td>{option}</td><td>{value}</td></tr>"
            for option, value in self.options.items()
        ]
        return f"<left><table>{header_row}{''.join(option_rows)}</table></left>"


ds = xr.tutorial.open_dataset("rasm")
ds1 = ds.set_xindex(["xc", "yc"], CustomIndex, param1="a", param2="b")

with xr.set_options(display_style="text"):
    display(ds1)
with xr.set_options(display_style="html"):
    display(ds1)
```
~The repr looks a bit strange because I've been borrowing the variable CSS classes.~ Edit: @benbovy fixed that for me. Also, the discussion about what
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/7185/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1412926287 | PR_kwDOAMm_X85BADV9 | 7183 | use `_repr_inline_` for indexes that define it | keewis 14808389 | closed | 0 | 6 | 2022-10-18T10:00:47Z | 2022-10-19T14:06:51Z | 2022-10-19T14:06:47Z | MEMBER | 0 | pydata/xarray/pulls/7183 | Also, some tests for the index summarizer.
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/7183/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1306887842 | PR_kwDOAMm_X847g7WQ | 6795 | display the indexes in the string reprs | keewis 14808389 | closed | 0 | 7 | 2022-07-16T19:42:19Z | 2022-10-15T18:28:36Z | 2022-10-12T16:52:53Z | MEMBER | 0 | pydata/xarray/pulls/6795 | With the flexible indexes refactor, indexes have become much more important, which means we should include them in the reprs of This is an initial attempt, covering only the string reprs, with a few unanswered questions:
- how do we format indexes? Do we delegate to their (also, how do we best test this?)
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/6795/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1404894283 | PR_kwDOAMm_X85AlZGn | 7153 | use a hook to synchronize the versions of `black` | keewis 14808389 | closed | 0 | 5 | 2022-10-11T16:07:05Z | 2022-10-12T08:00:10Z | 2022-10-12T08:00:07Z | MEMBER | 0 | pydata/xarray/pulls/7153 | We started to pin the version of |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/7153/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
818059250 | MDExOlB1bGxSZXF1ZXN0NTgxNDIzNTIx | 4972 | Automatic duck array testing - reductions | keewis 14808389 | open | 0 | 23 | 2021-02-27T23:57:23Z | 2022-08-16T13:47:05Z | MEMBER | 1 | pydata/xarray/pulls/4972 | This is the first of a series of PRs to add a framework to make testing the integration of duck arrays as simple as possible. It uses
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4972/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | ||||||
1327082521 | PR_kwDOAMm_X848kdpQ | 6873 | rely on `numpy`'s version of `nanprod` and `nansum` | keewis 14808389 | closed | 0 | 1 | 2022-08-03T11:33:35Z | 2022-08-09T17:31:27Z | 2022-08-09T14:55:21Z | MEMBER | 0 | pydata/xarray/pulls/6873 | At the moment, I'm proposing to rely on |
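For reference, the `numpy` functions this proposes to rely on handle missing values by construction: `nansum` counts `NaN` as `0`, while `nanprod` counts it as `1`. A quick sketch of that behavior:

```python
import numpy as np

a = np.array([[1.0, np.nan], [2.0, 3.0]])

total = np.nansum(a)     # NaN treated as 0 -> 1 + 2 + 3
product = np.nanprod(a)  # NaN treated as 1 -> 1 * 2 * 3
```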
{ "url": "https://api.github.com/repos/pydata/xarray/issues/6873/reactions", "total_count": 2, "+1": 2, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1326997649 | PR_kwDOAMm_X848kLEu | 6872 | skip creating a cupy-backed IndexVariable | keewis 14808389 | closed | 0 | 0 | 2022-08-03T10:21:06Z | 2022-08-03T15:34:18Z | 2022-08-03T15:04:56Z | MEMBER | 0 | pydata/xarray/pulls/6872 | We could probably replace the default indexes with |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/6872/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
532696790 | MDU6SXNzdWU1MzI2OTY3OTA= | 3594 | support for units with pint | keewis 14808389 | open | 0 | 7 | 2019-12-04T13:49:28Z | 2022-08-03T11:44:05Z | MEMBER |
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/3594/reactions", "total_count": 14, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 14, "rocket": 0, "eyes": 0 } |
xarray 13221727 | issue | ||||||||
1250755592 | I_kwDOAMm_X85KjQQI | 6645 | pre-release for v2022.06.0 | keewis 14808389 | closed | 0 | 14 | 2022-05-27T13:14:06Z | 2022-07-22T15:44:59Z | 2022-07-22T15:44:59Z | MEMBER | There are a few unreleased and potentially breaking changes in I am planning to create the pre-release tomorrow, but if there are any big changes that should be included please post here. cc @TomNicholas Edit: the version will be called |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/6645/reactions", "total_count": 5, "+1": 5, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
818957820 | MDU6SXNzdWU4MTg5NTc4MjA= | 4976 | reported version in the docs is misleading | keewis 14808389 | closed | 0 | 3 | 2021-03-01T15:08:12Z | 2022-07-10T13:00:46Z | 2022-07-10T13:00:46Z | MEMBER | The editable install on RTD is reported to have the version This is not something I can reproduce with We should try to get the version right. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4976/reactions", "total_count": 3, "+1": 3, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
1284543780 | PR_kwDOAMm_X846Wk8u | 6727 | resolve the timeouts on RTD | keewis 14808389 | closed | 0 | 1 | 2022-06-25T10:38:36Z | 2022-06-30T01:00:21Z | 2022-06-25T11:00:50Z | MEMBER | 0 | pydata/xarray/pulls/6727 | The reason the changes introduced in #6542 caused timeouts is that they redefined The fix is to explicitly set the dataset before calling
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/6727/reactions", "total_count": 4, "+1": 0, "-1": 0, "laugh": 0, "hooray": 1, "confused": 0, "heart": 3, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1280027449 | I_kwDOAMm_X85MS6s5 | 6714 | `mypy` CI is failing on `main` | keewis 14808389 | closed | 0 | 1 | 2022-06-22T11:54:16Z | 2022-06-22T16:01:45Z | 2022-06-22T16:01:45Z | MEMBER | The most recent runs of the |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/6714/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
1274838660 | PR_kwDOAMm_X8451-Gp | 6701 | try to import `UndefinedVariableError` from the new location | keewis 14808389 | closed | 0 | 1 | 2022-06-17T10:05:18Z | 2022-06-22T10:33:28Z | 2022-06-22T10:33:25Z | MEMBER | 0 | pydata/xarray/pulls/6701 |
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/6701/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1271869460 | PR_kwDOAMm_X845sEYR | 6699 | use `pytest-reportlog` to generate upstream-dev CI failure reports | keewis 14808389 | closed | 0 | 0 | 2022-06-15T08:31:00Z | 2022-06-16T08:12:28Z | 2022-06-16T08:11:22Z | MEMBER | 0 | pydata/xarray/pulls/6699 | We currently use the output of Instead, we can use The new script will collect failure summaries like the old version, but it should be fairly easy to create a fancy report with more information. |
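`pytest-reportlog` writes one JSON object per line, so collecting failures reduces to filtering the serialized `TestReport` entries. A minimal sketch (field names follow `pytest-reportlog`'s output format; the actual script in this PR may look different):

```python
import json


def collect_failures(path):
    """Return the node ids of all tests that failed during the call phase."""
    failures = []
    with open(path) as f:
        for line in f:
            entry = json.loads(line)
            if (
                entry.get("$report_type") == "TestReport"
                and entry.get("when") == "call"
                and entry.get("outcome") == "failed"
            ):
                failures.append(entry["nodeid"])
    return failures
```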
{ "url": "https://api.github.com/repos/pydata/xarray/issues/6699/reactions", "total_count": 3, "+1": 3, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1264767977 | PR_kwDOAMm_X845Uh71 | 6674 | use micromamba instead of mamba | keewis 14808389 | closed | 0 | 3 | 2022-06-08T13:38:15Z | 2022-06-10T11:33:05Z | 2022-06-10T11:33:00Z | MEMBER | 0 | pydata/xarray/pulls/6674 | I'm not sure if this is exactly equal to what we had before, but we might be able to save 3-4 minutes of CI time with this.
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/6674/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1264868027 | PR_kwDOAMm_X845U3bZ | 6675 | install the development version of `matplotlib` into the upstream-dev CI | keewis 14808389 | closed | 0 | 1 | 2022-06-08T14:42:15Z | 2022-06-10T11:25:33Z | 2022-06-10T11:25:31Z | MEMBER | 0 | pydata/xarray/pulls/6675 |
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/6675/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
597566530 | MDExOlB1bGxSZXF1ZXN0NDAxNjU2MTc1 | 3960 | examples for special methods on accessors | keewis 14808389 | open | 0 | 6 | 2020-04-09T21:34:30Z | 2022-06-09T14:50:17Z | MEMBER | 0 | pydata/xarray/pulls/3960 | This starts adding the parametrized accessor examples from #3829 to the accessor documentation as suggested by @jhamman. Since then the Also, this feature can be abused to add functions to the main (~When trying to build the docs locally, sphinx keeps complaining about a code block without code. Not sure what that is about~ seems the
|
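As a hedged illustration of the feature being documented here (the accessor name `toy` and everything else below is made up for the example, not taken from the PR): registering an accessor class whose instances define a special method such as `__call__` makes the accessor itself callable.

```python
import xarray as xr


@xr.register_dataset_accessor("toy")  # "toy" is a made-up accessor name
class ToyAccessor:
    """Accessor defining a special method: instances become callable."""

    def __init__(self, ds):
        self._ds = ds

    def __call__(self, name):
        # look up a variable through the accessor
        return self._ds[name]


ds = xr.Dataset({"a": ("x", [1, 2, 3])})
selected = ds.toy("a")
```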
{ "url": "https://api.github.com/repos/pydata/xarray/issues/3960/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | ||||||
801728730 | MDExOlB1bGxSZXF1ZXN0NTY3OTkzOTI3 | 4863 | apply to dataset | keewis 14808389 | open | 0 | 14 | 2021-02-05T00:05:22Z | 2022-06-09T14:50:17Z | MEMBER | 0 | pydata/xarray/pulls/4863 | As discussed in #4837, this adds a method that applies a function to a This function is really similar to
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4863/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | ||||||
959063390 | MDExOlB1bGxSZXF1ZXN0NzAyMjM0ODc1 | 5668 | create the context objects passed to custom `combine_attrs` functions | keewis 14808389 | open | 0 | 1 | 2021-08-03T12:24:50Z | 2022-06-09T14:50:16Z | MEMBER | 0 | pydata/xarray/pulls/5668 | Follow-up to #4896: this creates the context object in reduce methods and passes it to
Note that for now this is a bit inconvenient to use for provenance tracking (as discussed in the |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/5668/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | ||||||
1264971669 | PR_kwDOAMm_X845VNfZ | 6676 | release notes for the pre-release | keewis 14808389 | closed | 0 | 2 | 2022-06-08T15:59:28Z | 2022-06-09T14:41:34Z | 2022-06-09T14:41:32Z | MEMBER | 0 | pydata/xarray/pulls/6676 | The only thing it contains so far is the known regression; do we want to have a summary of the most notable changes, like in the previous releases?
cc @pydata/xarray |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/6676/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1265366275 | I_kwDOAMm_X85La_UD | 6678 | exception groups | keewis 14808389 | open | 0 | 1 | 2022-06-08T22:09:37Z | 2022-06-08T23:38:28Z | MEMBER | What is your issue? As I mentioned in the meeting today, we have a lot of features where the exception group support from PEP 654 (which is scheduled for Python 3.11 and consists of the class and a syntax change) might be useful. For example, we might want to collect all errors raised by For |
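Until exception groups are available everywhere, the pattern can be sketched with plain exceptions: apply a function to every variable, collect all failures, and raise a single summary error instead of stopping at the first one (names below are illustrative, not xarray API):

```python
def apply_collecting_errors(func, variables):
    """Apply ``func`` to each variable, reporting every failure at once."""
    results, errors = {}, []
    for name, value in variables.items():
        try:
            results[name] = func(value)
        except Exception as exc:
            errors.append((name, exc))
    if errors:
        summary = "; ".join(f"{name}: {exc}" for name, exc in errors)
        # with PEP 654 this would become: raise ExceptionGroup(..., [exc, ...])
        raise ValueError(f"{len(errors)} variable(s) failed: {summary}")
    return results
```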
{ "url": "https://api.github.com/repos/pydata/xarray/issues/6678/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | issue | ||||||||
1264494714 | PR_kwDOAMm_X845TmsM | 6673 | more testpypi workflow fixes | keewis 14808389 | closed | 0 | 0 | 2022-06-08T09:57:50Z | 2022-06-08T13:59:29Z | 2022-06-08T13:52:08Z | MEMBER | 0 | pydata/xarray/pulls/6673 | Hopefully the final follow-up to #6660. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/6673/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1262891004 | PR_kwDOAMm_X845OOpR | 6671 | try to finally fix the TestPyPI workflow | keewis 14808389 | closed | 0 | 0 | 2022-06-07T08:04:32Z | 2022-06-07T08:33:15Z | 2022-06-07T08:33:01Z | MEMBER | 0 | pydata/xarray/pulls/6671 | As per https://github.com/pydata/xarray/issues/6659#issuecomment-1148285401, don't invoke
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/6671/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1262480325 | PR_kwDOAMm_X845M3wk | 6669 | pin setuptools in the testpypi configure script | keewis 14808389 | closed | 0 | 1 | 2022-06-06T22:04:12Z | 2022-06-06T22:38:48Z | 2022-06-06T22:33:16Z | MEMBER | 0 | pydata/xarray/pulls/6669 | This works around an incompatibility between
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/6669/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1262396039 | PR_kwDOAMm_X845MlHs | 6668 | fix the python version for the TestPyPI release workflow | keewis 14808389 | closed | 0 | 0 | 2022-06-06T20:42:51Z | 2022-06-06T21:16:01Z | 2022-06-06T21:12:23Z | MEMBER | 0 | pydata/xarray/pulls/6668 | follow-up to #6660 |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/6668/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1259827097 | PR_kwDOAMm_X845EJs5 | 6660 | upload wheels from `main` to TestPyPI | keewis 14808389 | closed | 0 | 4 | 2022-06-03T12:00:02Z | 2022-06-06T19:49:08Z | 2022-06-06T19:49:02Z | MEMBER | 0 | pydata/xarray/pulls/6660 | This adds a new workflow that uploads every commit to main as a new wheel to TestPyPI. No tests, though, so those wheels might be broken (but that's fine, I guess). Should we document this somewhere, like the install guide?
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/6660/reactions", "total_count": 4, "+1": 4, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
624778130 | MDU6SXNzdWU2MjQ3NzgxMzA= | 4095 | merging non-dimension coordinates with the Dataset constructor | keewis 14808389 | open | 0 | 1 | 2020-05-26T10:30:37Z | 2022-04-19T13:54:43Z | MEMBER | When adding two This fails: ```python In [1]: import xarray as xr ...: import numpy as np In [2]: a = np.linspace(0, 1, 10)
...: b = np.linspace(-1, 0, 12)
...:
...: x_a = np.arange(10)
...: x_b = np.arange(12)
...:
...: y_a = x_a * 1000
...: y_b = x_b * 1000
...:
...: arr1 = xr.DataArray(data=a, coords={"x": x_a, "y": ("x", y_a)}, dims="x")
...: arr2 = xr.DataArray(data=b, coords={"x": x_b, "y": ("x", y_b)}, dims="x")
...:
...: xr.Dataset({"a": arr1, "b": arr2})
...
MergeError: conflicting values for variable 'y' on objects to be combined. You can skip this check by specifying compat='override'.
```
I can work around this by calling:
|
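The error message hints at the difference in merge strategies; a sketch consistent with it, assuming `xr.merge`'s default `compat="no_conflicts"` (which treats the NaN-padded values produced by alignment as non-conflicting), rather than the constructor's stricter check:

```python
import numpy as np
import xarray as xr

x_a, x_b = np.arange(10), np.arange(12)
arr1 = xr.DataArray(
    np.linspace(0, 1, 10), coords={"x": x_a, "y": ("x", x_a * 1000)}, dims="x"
)
arr2 = xr.DataArray(
    np.linspace(-1, 0, 12), coords={"x": x_b, "y": ("x", x_b * 1000)}, dims="x"
)

# xr.merge aligns "x" first, so the shorter "y" is NaN-padded and the
# overlapping values agree -- no MergeError is raised
merged = xr.merge([arr1.rename("a"), arr2.rename("b")])
```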
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4095/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | issue | ||||||||
628436420 | MDU6SXNzdWU2Mjg0MzY0MjA= | 4116 | xarray ufuncs | keewis 14808389 | closed | 0 | 5 | 2020-06-01T13:25:54Z | 2022-04-19T03:26:53Z | 2022-04-19T03:26:53Z | MEMBER | The documentation warns that the universal functions in Since we only support Since there are also functions that are not true ufuncs (e.g. |
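The mechanism that makes a dedicated `xarray.ufuncs` module unnecessary is NEP 13's `__array_ufunc__` protocol: numpy ufuncs such as `np.sin` dispatch to any object implementing it. A toy sketch of the protocol (not xarray's actual implementation):

```python
import numpy as np


class Wrapped:
    """Minimal duck array: numpy ufuncs dispatch to it via __array_ufunc__."""

    def __init__(self, data):
        self.data = np.asarray(data)

    def __array_ufunc__(self, ufunc, method, *inputs, **kwargs):
        if method != "__call__":  # reductions etc. not handled in this sketch
            return NotImplemented
        arrays = [x.data if isinstance(x, Wrapped) else x for x in inputs]
        return Wrapped(getattr(ufunc, method)(*arrays, **kwargs))


w = Wrapped([0.0, np.pi / 2])
result = np.sin(w)  # numpy hands the call to Wrapped.__array_ufunc__
```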
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4116/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
1183366218 | PR_kwDOAMm_X841Jvka | 6419 | unpin `jinja2` | keewis 14808389 | closed | 0 | 0 | 2022-03-28T12:27:48Z | 2022-03-30T13:54:51Z | 2022-03-30T13:54:50Z | MEMBER | 0 | pydata/xarray/pulls/6419 |
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/6419/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1181704575 | PR_kwDOAMm_X841EMNH | 6414 | use the `DaskIndexingAdapter` for `duck dask` arrays | keewis 14808389 | closed | 0 | 2 | 2022-03-26T12:00:34Z | 2022-03-27T20:38:43Z | 2022-03-27T20:38:40Z | MEMBER | 0 | pydata/xarray/pulls/6414 | (detected while trying to implement a This fixes position-based indexing of ```python In [1]: import xarray as xr ...: import dask.array as da ...: import pint ...: ...: ureg = pint.UnitRegistry(force_ndarray_like=True) ...: ...: a = da.zeros((20, 20), chunks=(10, 10)) ...: q = ureg.Quantity(a, "m") ...: ...: arr1 = xr.DataArray(a, dims=("x", "y")) ...: arr2 = xr.DataArray(q, dims=("x", "y")) In [2]: arr1.isel(x=[0, 2, 4], y=[1, 3, 5]) Out[2]: <xarray.DataArray 'zeros_like-d81259c3a77e6dff3e60975e2afe4ff9' (x: 3, y: 3)> dask.array<getitem, shape=(3, 3), dtype=float64, chunksize=(3, 3), chunktype=numpy.ndarray> Dimensions without coordinates: x, y In [3]: arr2.isel(x=[0, 2, 4], y=[1, 3, 5]) NotImplementedError Traceback (most recent call last) Input In [3], in <module> ----> 1 arr2.isel(x=[0, 2, 4], y=[1, 3, 5]) File .../xarray/core/dataarray.py:1220, in DataArray.isel(self, indexers, drop, missing_dims, **indexers_kwargs) 1215 return self._from_temp_dataset(ds) 1217 # Much faster algorithm for when all indexers are ints, slices, one-dimensional 1218 # lists, or zero or one-dimensional np.ndarray's -> 1220 variable = self._variable.isel(indexers, missing_dims=missing_dims) 1221 indexes, index_variables = isel_indexes(self.xindexes, indexers) 1223 coords = {} File .../xarray/core/variable.py:1172, in Variable.isel(self, indexers, missing_dims, **indexers_kwargs) 1169 indexers = drop_dims_from_indexers(indexers, self.dims, missing_dims) 1171 key = tuple(indexers.get(dim, slice(None)) for dim in self.dims) -> 1172 return self[key] File .../xarray/core/variable.py:765, in Variable.__getitem__(self, key)
752 """Return a new Variable object whose contents are consistent with
753 getting the provided key from the underlying data.
754
(...)
762 array File .../xarray/core/indexing.py:1269, in NumpyIndexingAdapter.getitem(self, key) 1267 def getitem(self, key): 1268 array, key = self._indexing_array_and_key(key) -> 1269 return array[key] File .../lib/python3.9/site-packages/pint/quantity.py:1899, in Quantity.getitem(self, key) 1897 def getitem(self, key): 1898 try: -> 1899 return type(self)(self._magnitude[key], self._units) 1900 except PintTypeError: 1901 raise File .../lib/python3.9/site-packages/dask/array/core.py:1892, in Array.getitem(self, index) 1889 return self 1891 out = "getitem-" + tokenize(self, index2) -> 1892 dsk, chunks = slice_array(out, self.name, self.chunks, index2, self.itemsize) 1894 graph = HighLevelGraph.from_collections(out, dsk, dependencies=[self]) 1896 meta = meta_from_array(self._meta, ndim=len(chunks)) File .../lib/python3.9/site-packages/dask/array/slicing.py:174, in slice_array(out_name, in_name, blockdims, index, itemsize) 171 index += (slice(None, None, None),) * missing 173 # Pass down to next function --> 174 dsk_out, bd_out = slice_with_newaxes(out_name, in_name, blockdims, index, itemsize) 176 bd_out = tuple(map(tuple, bd_out)) 177 return dsk_out, bd_out File .../lib/python3.9/site-packages/dask/array/slicing.py:196, in slice_with_newaxes(out_name, in_name, blockdims, index, itemsize) 193 where_none[i] -= n 195 # Pass down and do work --> 196 dsk, blockdims2 = slice_wrap_lists(out_name, in_name, blockdims, index2, itemsize) 198 if where_none: 199 expand = expander(where_none) File .../lib/python3.9/site-packages/dask/array/slicing.py:242, in slice_wrap_lists(out_name, in_name, blockdims, index, itemsize) 238 where_list = [ 239 i for i, ind in enumerate(index) if is_arraylike(ind) and ind.ndim > 0 240 ] 241 if len(where_list) > 1: --> 242 raise NotImplementedError("Don't yet support nd fancy indexing") 243 # Is the single list an empty list? 
In this case just treat it as a zero 244 # length slice 245 if where_list and not index[where_list[0]].size: NotImplementedError: Don't yet support nd fancy indexing ```
|
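The underlying limitation is dask's lack of n-d "fancy" indexing; the `DaskIndexingAdapter` route avoids it by applying an orthogonal (outer) index one axis at a time, so each step is a supported 1-d fancy index. The decomposition can be sketched with numpy (the behavior it must reproduce is `np.ix_`-style outer indexing):

```python
import numpy as np


def outer_index(array, *indices):
    """Apply an orthogonal index by indexing one axis at a time."""
    for axis, index in enumerate(indices):
        array = np.take(array, index, axis=axis)
    return array


a = np.arange(20 * 20).reshape(20, 20)
result = outer_index(a, [0, 2, 4], [1, 3, 5])
```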
{ "url": "https://api.github.com/repos/pydata/xarray/issues/6414/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1181715207 | PR_kwDOAMm_X841EOjL | 6415 | upgrade `sphinx` | keewis 14808389 | closed | 0 | 2 | 2022-03-26T12:16:08Z | 2022-03-26T22:13:53Z | 2022-03-26T22:13:50Z | MEMBER | 0 | pydata/xarray/pulls/6415 |
Along with it, this updates our RTD configuration. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/6415/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1174386963 | I_kwDOAMm_X85F_7kT | 6382 | `reindex` drops attrs | keewis 14808389 | closed | 0 | 1 | 2022-03-19T22:37:46Z | 2022-03-21T07:53:05Z | 2022-03-21T07:53:04Z | MEMBER | What happened?
As far as I can tell, the new reindexing code in Minimal Complete Verifiable Example```Python before #5692In [1]: import xarray as xr ...: ...: xr.set_options(keep_attrs=True) ...: ...: ds = xr.tutorial.open_dataset("air_temperature") In [2]: ds.reindex(lat=range(10, 80, 5)).lat Out[2]: <xarray.DataArray 'lat' (lat: 14)> array([10, 15, 20, 25, 30, 35, 40, 45, 50, 55, 60, 65, 70, 75]) Coordinates: * lat (lat) int64 10 15 20 25 30 35 40 45 50 55 60 65 70 75 Attributes: standard_name: latitude long_name: Latitude units: degrees_north axis: Y In [3]: ds.reindex(lat=xr.DataArray(range(10, 80, 5), attrs={"attr": "value"}, dims="lat")).lat Out[3]: <xarray.DataArray 'lat' (lat: 14)> array([10, 15, 20, 25, 30, 35, 40, 45, 50, 55, 60, 65, 70, 75]) Coordinates: * lat (lat) int64 10 15 20 25 30 35 40 45 50 55 60 65 70 75 Attributes: standard_name: latitude long_name: Latitude units: degrees_north axis: Y after #5692In [3]: import xarray as xr ...: ...: xr.set_options(keep_attrs=True) ...: ...: ds = xr.tutorial.open_dataset("air_temperature") In [4]: ds.reindex(lat=range(10, 80, 5)).lat Out[4]: <xarray.DataArray 'lat' (lat: 14)> array([10, 15, 20, 25, 30, 35, 40, 45, 50, 55, 60, 65, 70, 75]) Coordinates: * lat (lat) int64 10 15 20 25 30 35 40 45 50 55 60 65 70 75 In [5]: ds.reindex(lat=xr.DataArray(range(10, 80, 5), attrs={"attr": "value"}, dims="lat")).lat Out[5]: <xarray.DataArray 'lat' (lat: 14)> array([10, 15, 20, 25, 30, 35, 40, 45, 50, 55, 60, 65, 70, 75]) Coordinates: * lat (lat) int64 10 15 20 25 30 35 40 45 50 55 60 65 70 75 ``` |
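A hedged stop-gap while the regression is present: re-attach the original coordinate attrs after reindexing. The dataset below is constructed inline rather than loaded from the tutorial, and the manual `attrs.update` is harmless once the bug is fixed:

```python
import numpy as np
import xarray as xr

lat_attrs = {"units": "degrees_north", "standard_name": "latitude"}
ds = xr.Dataset(coords={"lat": ("lat", np.arange(15.0, 80.0, 2.5), lat_attrs)})

with xr.set_options(keep_attrs=True):
    reindexed = ds.reindex(lat=range(10, 80, 5))

# while the bug is present, copy the original coordinate attrs back manually
reindexed["lat"].attrs.update(ds["lat"].attrs)
```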
{ "url": "https://api.github.com/repos/pydata/xarray/issues/6382/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
1170003912 | PR_kwDOAMm_X840etlq | 6361 | Revert "explicitly install `ipython_genutils`" | keewis 14808389 | closed | 0 | 1 | 2022-03-15T17:57:29Z | 2022-03-15T19:06:32Z | 2022-03-15T19:06:31Z | MEMBER | 0 | pydata/xarray/pulls/6361 | Since the dependency issue has been fixed upstream, this reverts pydata/xarray#6350 |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/6361/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1166353506 | PR_kwDOAMm_X840S8Yo | 6350 | explicitly install `ipython_genutils` | keewis 14808389 | closed | 0 | 2 | 2022-03-11T12:19:49Z | 2022-03-15T17:56:41Z | 2022-03-11T14:54:45Z | MEMBER | 0 | pydata/xarray/pulls/6350 | This can be reverted once the |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/6350/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1125877338 | I_kwDOAMm_X85DG4Za | 6250 | failing docs builds because the `scipy` intersphinx registry is unreachable | keewis 14808389 | closed | 0 | 6 | 2022-02-07T11:44:44Z | 2022-02-08T21:49:48Z | 2022-02-08T21:49:47Z | MEMBER | What happened?
There's nothing we can do to really fix this, but we can try to avoid the docs build failures by disabling that intersphinx entry (not sure if that results in other errors, though). What did you expect to happen? No response Minimal Complete Verifiable Example: No response Relevant log output
Anything else we need to know? No response Environment: See RTD |
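The stop-gap in `sphinx`'s `conf.py` would look roughly like this (the surviving entries are illustrative; xarray's real `intersphinx_mapping` is larger):

```python
# conf.py -- drop the unreachable scipy entry until docs.scipy.org recovers
intersphinx_mapping = {
    "python": ("https://docs.python.org/3", None),
    "numpy": ("https://numpy.org/doc/stable", None),
    # "scipy": ("https://docs.scipy.org/doc/scipy", None),  # temporarily disabled
}
```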
{ "url": "https://api.github.com/repos/pydata/xarray/issues/6250/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue |
CREATE TABLE [issues] ( [id] INTEGER PRIMARY KEY, [node_id] TEXT, [number] INTEGER, [title] TEXT, [user] INTEGER REFERENCES [users]([id]), [state] TEXT, [locked] INTEGER, [assignee] INTEGER REFERENCES [users]([id]), [milestone] INTEGER REFERENCES [milestones]([id]), [comments] INTEGER, [created_at] TEXT, [updated_at] TEXT, [closed_at] TEXT, [author_association] TEXT, [active_lock_reason] TEXT, [draft] INTEGER, [pull_request] TEXT, [body] TEXT, [reactions] TEXT, [performed_via_github_app] TEXT, [state_reason] TEXT, [repo] INTEGER REFERENCES [repos]([id]), [type] TEXT ); CREATE INDEX [idx_issues_repo] ON [issues] ([repo]); CREATE INDEX [idx_issues_milestone] ON [issues] ([milestone]); CREATE INDEX [idx_issues_assignee] ON [issues] ([assignee]); CREATE INDEX [idx_issues_user] ON [issues] ([user]);