issues
201 rows where state = "closed" and user = 10194086 sorted by updated_at descending
id | node_id | number | title | user | state | locked | assignee | milestone | comments | created_at | updated_at ▲ | closed_at | author_association | active_lock_reason | draft | pull_request | body | reactions | performed_via_github_app | state_reason | repo | type |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
2137065741 | PR_kwDOAMm_X85nAXC5 | 8756 | suppress base & loffset deprecation warnings | mathause 10194086 | closed | 0 | 2 | 2024-02-15T17:23:27Z | 2024-02-16T09:44:32Z | 2024-02-15T19:11:10Z | MEMBER | 0 | pydata/xarray/pulls/8756 | Suppress some more internal warnings in the test suite. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8756/reactions", "total_count": 3, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 3, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
2090265314 | PR_kwDOAMm_X85kiCi8 | 8627 | unify freq strings (independent of pd version) | mathause 10194086 | closed | 0 | 4 | 2024-01-19T10:57:04Z | 2024-02-15T17:53:42Z | 2024-02-15T16:53:36Z | MEMBER | 0 | pydata/xarray/pulls/8627 |
Probably not ready for review yet. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8627/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
2083501344 | I_kwDOAMm_X858L7Ug | 8612 | more frequency string updates? | mathause 10194086 | closed | 0 | 5 | 2024-01-16T09:56:48Z | 2024-02-15T16:53:37Z | 2024-02-15T16:53:37Z | MEMBER | What is your issue? I looked a bit into the frequency string update & found 3 issues we could improve upon.
I have played around with 2. and 3. and can open a PR if you are on board. @spencerkclark @aulemahal
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8612/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
2130313810 | PR_kwDOAMm_X85mpS8i | 8737 | unstack: require unique MultiIndex | mathause 10194086 | closed | 0 | 2 | 2024-02-12T14:58:06Z | 2024-02-13T09:48:51Z | 2024-02-13T09:48:36Z | MEMBER | 0 | pydata/xarray/pulls/8737 |
Unstacking non-unique MultiIndex can lead to silent data loss, so we raise an error. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8737/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1918089795 | I_kwDOAMm_X85yU7pD | 8252 | cannot use negative step to sel from zarr (without dask) | mathause 10194086 | closed | 0 | 0 | 2023-09-28T18:52:07Z | 2024-02-10T02:57:33Z | 2024-02-10T02:57:33Z | MEMBER | What happened? As per: https://github.com/pydata/xarray/pull/8246#discussion_r1340357405 Passing a negative step in a
What did you expect to happen? zarr should allow negative step (probably?)
Minimal Complete Verifiable Example
```Python
import xarray as xr

# create a zarr dataset
air = xr.tutorial.open_dataset("air_temperature")
air.to_zarr("test.zarr")

ds = xr.open_dataset("test.zarr", engine="zarr")
ds.air[::-1, ].load()

# note that this works if the dataset is backed by dask
ds_dask = xr.open_dataset("test.zarr", engine="zarr", chunks="auto")
ds_dask.air[::-1, ].load()
```
MVCE confirmation
Relevant log output
```Python
File ~/code/xarray/xarray/core/parallelcompat.py:93, in guess_chunkmanager(manager)
     91 if isinstance(manager, str):
     92     if manager not in chunkmanagers:
---> 93         raise ValueError(
     94             f"unrecognized chunk manager {manager} - must be one of: {list(chunkmanagers)}"
     95         )
     97     return chunkmanagers[manager]
     98 elif isinstance(manager, ChunkManagerEntrypoint):
     99     # already a valid ChunkManager so just pass through

ValueError: unrecognized chunk manager dask - must be one of: []
```
Anything else we need to know? The error comes from https://github.com/zarr-developers/zarr-python/blob/6ec746ef1242dd9fec26b128cc0b3455d28ad6f0/zarr/indexing.py#L174 so it would need an upstream fix first. cc @dcherian is this what you had in mind?
Environment
INSTALLED VERSIONS
------------------
commit: f6d69a1f6d952dcd67609c97f3fb3069abdda586
python: 3.10.12 | packaged by conda-forge | (main, Jun 23 2023, 22:40:32) [GCC 12.3.0]
python-bits: 64
OS: Linux
OS-release: 6.2.0-33-generic
machine: x86_64
processor: x86_64
byteorder: little
LC_ALL: None
LANG: en_US.UTF-8
LOCALE: ('en_US', 'UTF-8')
libhdf5: 1.14.2
libnetcdf: 4.9.2
xarray: 2023.9.1.dev8+gf6d69a1f
pandas: 2.1.1
numpy: 1.24.4
scipy: 1.11.3
netCDF4: 1.6.4
pydap: installed
h5netcdf: 1.2.0
h5py: 3.9.0
Nio: None
zarr: 2.16.1
cftime: 1.6.2
nc_time_axis: 1.4.1
PseudoNetCDF: 3.2.2
iris: 3.7.0
bottleneck: 1.3.7
dask: 2023.9.2
distributed: None
matplotlib: 3.8.0
cartopy: 0.22.0
seaborn: 0.12.2
numbagg: 0.2.2
fsspec: 2023.9.2
cupy: None
pint: 0.20.1
sparse: 0.14.0
flox: 0.7.2
numpy_groupies: 0.10.1
setuptools: 68.2.2
pip: 23.2.1
conda: None
pytest: 7.4.2
mypy: None
IPython: 8.15.0
sphinx: None
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8252/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
2126795367 | PR_kwDOAMm_X85mdn7J | 8727 | ruff: move some config to lint section | mathause 10194086 | closed | 0 | 0 | 2024-02-09T09:48:17Z | 2024-02-09T15:49:03Z | 2024-02-09T15:49:03Z | MEMBER | 0 | pydata/xarray/pulls/8727 | Fix a warning from ruff concerning the config: warning: The top-level linter settings are deprecated in favour of their counterparts in the |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8727/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
2098122391 | PR_kwDOAMm_X85k8eI1 | 8651 | allow negative freq strings | mathause 10194086 | closed | 0 | 2 | 2024-01-24T12:04:39Z | 2024-02-01T09:17:11Z | 2024-02-01T09:01:44Z | MEMBER | 0 | pydata/xarray/pulls/8651 |
This allows negative freq strings as discussed in https://github.com/pydata/xarray/pull/8627#issuecomment-1905981660 Deciding which tests to update was not easy. The pandas
I am slightly nervous about this but all the tests still pass... Once again cc @spencerkclark |
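As a purely illustrative aside (not taken from the PR itself): a negative frequency string is one that steps backwards in time, which pandas already accepts and which this PR extends to xarray's frequency handling. The exact dates and frequency below are made up for illustration.
```python
import pandas as pd

# expected (assumption): with a negative frequency, dates are generated
# backwards from the start
pd.date_range("2001-01-03", periods=3, freq="-1D")
```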
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8651/reactions", "total_count": 2, "+1": 2, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
2105738254 | PR_kwDOAMm_X85lVsgz | 8680 | use ruff.flake8-tidy-imports to enforce absolute imports | mathause 10194086 | closed | 0 | 1 | 2024-01-29T15:19:34Z | 2024-01-30T16:42:46Z | 2024-01-30T16:38:48Z | MEMBER | 0 | pydata/xarray/pulls/8680 | use ruff.flake8-tidy-imports to enforce absolute imports
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8680/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
2098131640 | PR_kwDOAMm_X85k8gJe | 8652 | new whats-new section | mathause 10194086 | closed | 0 | 2 | 2024-01-24T12:10:07Z | 2024-01-26T10:07:39Z | 2024-01-24T12:59:49Z | MEMBER | 0 | pydata/xarray/pulls/8652 |
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8652/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
2097971637 | PR_kwDOAMm_X85k789- | 8649 | ruff: use extend-exclude | mathause 10194086 | closed | 0 | 1 | 2024-01-24T10:39:46Z | 2024-01-24T18:32:20Z | 2024-01-24T15:59:11Z | MEMBER | 0 | pydata/xarray/pulls/8649 | I think we should use From https://docs.astral.sh/ruff/settings/#exclude:
(I really dislike how github formats toml files... What would be the correct syntax, then?) |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8649/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
2094542307 | PR_kwDOAMm_X85kwUlb | 8642 | infer_freq: return 'YE' (#8629 follow-up) | mathause 10194086 | closed | 0 | 0 | 2024-01-22T18:53:52Z | 2024-01-23T12:44:14Z | 2024-01-23T12:44:14Z | MEMBER | 0 | pydata/xarray/pulls/8642 | I realized that the return value of Sorry for all the spam @spencerkclark |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8642/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
2090340727 | PR_kwDOAMm_X85kiTjg | 8629 | rename "Y" freq string to "YE" (pandas parity) | mathause 10194086 | closed | 0 | 10 | 2024-01-19T11:31:58Z | 2024-01-22T18:38:06Z | 2024-01-22T08:01:24Z | MEMBER | 0 | pydata/xarray/pulls/8629 |
This renames the frequency string Let me know what you think @spencerkclark @aulemahal |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8629/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
2070895451 | PR_kwDOAMm_X85jf-2J | 8600 | fix and test empty CFTimeIndex | mathause 10194086 | closed | 0 | 1 | 2024-01-08T17:11:43Z | 2024-01-17T12:29:11Z | 2024-01-15T21:49:34Z | MEMBER | 0 | pydata/xarray/pulls/8600 |
Otherwise |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8600/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1455395909 | I_kwDOAMm_X85Wv5RF | 7298 | html repr fails for empty cftime arrays | mathause 10194086 | closed | 0 | 1 | 2022-11-18T16:09:00Z | 2024-01-15T21:49:36Z | 2024-01-15T21:49:35Z | MEMBER | What happened? The html repr of a cftime array wants to display the "calendar", which it cannot if it is empty.
What did you expect to happen? No error.
Minimal Complete Verifiable Example
```Python
import numpy as np
import xarray as xr

data_obs = np.random.randn(3)
time_obs = xr.date_range("2000-01-01", periods=3, freq="YS", calendar="noleap")

obs = xr.DataArray(data_obs, coords={"time": time_obs})

o = obs[:0]

xr.core.formatting_html.array_repr(o)
```
MVCE confirmation
Relevant log output```Python ValueError Traceback (most recent call last) Input In [1], in <cell line: 12>() 8 obs = xr.DataArray(data_obs, coords={"time": time_obs}) 10 o = obs[:0] ---> 12 xr.core.formatting_html.array_repr(o) File ~/code/xarray/xarray/core/formatting_html.py:318, in array_repr(arr) 316 if hasattr(arr, "xindexes"): 317 indexes = _get_indexes_dict(arr.xindexes) --> 318 sections.append(index_section(indexes)) 320 sections.append(attr_section(arr.attrs)) 322 return _obj_repr(arr, header_components, sections) File ~/code/xarray/xarray/core/formatting_html.py:195, in _mapping_section(mapping, name, details_func, max_items_collapse, expand_option_name, enabled) 188 expanded = _get_boolean_with_default( 189 expand_option_name, n_items < max_items_collapse 190 ) 191 collapsed = not expanded 193 return collapsible_section( 194 name, --> 195 details=details_func(mapping), 196 n_items=n_items, 197 enabled=enabled, 198 collapsed=collapsed, 199 ) File ~/code/xarray/xarray/core/formatting_html.py:155, in summarize_indexes(indexes) 154 def summarize_indexes(indexes): --> 155 indexes_li = "".join( 156 f"
File ~/code/xarray/xarray/core/formatting_html.py:156, in <genexpr>(.0) 154 def summarize_indexes(indexes): 155 indexes_li = "".join( --> 156 f"
File ~/code/xarray/xarray/core/formatting_html.py:140, in summarize_index(coord_names, index) 138 index_id = f"index-{uuid.uuid4()}" 139 preview = escape(inline_index_repr(index)) --> 140 details = short_index_repr_html(index) 142 data_icon = _icon("icon-database") 144 return ( 145 f" {name} {preview} "
(...)
150 f"{details} "
151 )
File ~/code/xarray/xarray/core/formatting_html.py:132, in short_index_repr_html(index) 129 if hasattr(index, "repr_html"): 130 return index.repr_html() --> 132 return f" {escape(repr(index))}" File ~/code/xarray/xarray/core/indexes.py:547, in PandasIndex.repr(self) 546 def repr(self): --> 547 return f"PandasIndex({repr(self.index)})" File ~/code/xarray/xarray/coding/cftimeindex.py:353, in CFTimeIndex.repr(self) 345 end_str = format_times( 346 self.values[-REPR_ELLIPSIS_SHOW_ITEMS_FRONT_END:], 347 display_width, 348 offset=offset, 349 first_row_offset=offset, 350 ) 351 datastr = "\n".join([front_str, f"{' '*offset}...", end_str]) --> 353 attrs_str = format_attrs(self) 354 # oneliner only if smaller than display_width 355 full_repr_str = f"{klass_name}([{datastr}], {attrs_str})" File ~/code/xarray/xarray/coding/cftimeindex.py:272, in format_attrs(index, separator) 267 def format_attrs(index, separator=", "): 268 """Format attributes of CFTimeIndex for repr.""" 269 attrs = { 270 "dtype": f"'{index.dtype}'", 271 "length": f"{len(index)}", --> 272 "calendar": f"'{index.calendar}'", 273 "freq": f"'{index.freq}'" if len(index) >= 3 else None, 274 } 276 attrs_str = [f"{k}={v}" for k, v in attrs.items()] 277 attrs_str = f"{separator}".join(attrs_str) File ~/code/xarray/xarray/coding/cftimeindex.py:698, in CFTimeIndex.calendar(self) 695 """The calendar used by the datetimes in the index.""" 696 from .times import infer_calendar_name --> 698 return infer_calendar_name(self) File ~/code/xarray/xarray/coding/times.py:374, in infer_calendar_name(dates) 371 return sample.calendar 373 # Error raise if dtype is neither datetime or "O", if cftime is not importable, and if element of 'O' dtype is not cftime. --> 374 raise ValueError("Array does not contain datetime objects.") ValueError: Array does not contain datetime objects. ``` Anything else we need to know?Bisected to 7379923de756a2bcc59044d548f8ab7a68b91d4e use Environment |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/7298/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
2070231449 | PR_kwDOAMm_X85jdtRr | 8597 | _infer_dtype: remove duplicated code | mathause 10194086 | closed | 0 | 0 | 2024-01-08T11:12:18Z | 2024-01-08T19:40:06Z | 2024-01-08T19:40:06Z | MEMBER | 0 | pydata/xarray/pulls/8597 | By chance I saw that in #4700 the same code block was added twice. I think this can be removed. cc @andersy005 |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8597/reactions", "total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 1, "eyes": 0 } |
xarray 13221727 | pull | |||||
2070561434 | PR_kwDOAMm_X85je1rK | 8598 | small string fixes | mathause 10194086 | closed | 0 | 1 | 2024-01-08T14:20:56Z | 2024-01-08T16:59:27Z | 2024-01-08T16:53:00Z | MEMBER | 0 | pydata/xarray/pulls/8598 | { "url": "https://api.github.com/repos/pydata/xarray/issues/8598/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | ||||||
2025652693 | PR_kwDOAMm_X85hJh0D | 8521 | test and fix empty xindexes repr | mathause 10194086 | closed | 0 | 4 | 2023-12-05T08:54:56Z | 2024-01-08T10:58:09Z | 2023-12-06T17:06:15Z | MEMBER | 0 | pydata/xarray/pulls/8521 |
Uses |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8521/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1959175248 | I_kwDOAMm_X850xqRQ | 8367 | `da.xindexes` or `da.indexes` raises an error if there are none (in the repr) | mathause 10194086 | closed | 0 | 1 | 2023-10-24T12:45:12Z | 2023-12-06T17:06:16Z | 2023-12-06T17:06:16Z | MEMBER | What happened?
What did you expect to happen? Displaying an empty Mappable?
Minimal Complete Verifiable Example
MVCE confirmation
Relevant log output```Python Out[9]: --------------------------------------------------------------------------- ValueError Traceback (most recent call last) File ~/.conda/envs/xarray_dev/lib/python3.10/site-packages/IPython/core/formatters.py:708, in PlainTextFormatter.call(self, obj) 701 stream = StringIO() 702 printer = pretty.RepresentationPrinter(stream, self.verbose, 703 self.max_width, self.newline, 704 max_seq_length=self.max_seq_length, 705 singleton_pprinters=self.singleton_printers, 706 type_pprinters=self.type_printers, 707 deferred_pprinters=self.deferred_printers) --> 708 printer.pretty(obj) 709 printer.flush() 710 return stream.getvalue() File ~/.conda/envs/xarray_dev/lib/python3.10/site-packages/IPython/lib/pretty.py:410, in RepresentationPrinter.pretty(self, obj) 407 return meth(obj, self, cycle) 408 if cls is not object \ 409 and callable(cls.dict.get('repr')): --> 410 return _repr_pprint(obj, self, cycle) 412 return _default_pprint(obj, self, cycle) 413 finally: File ~/.conda/envs/xarray_dev/lib/python3.10/site-packages/IPython/lib/pretty.py:778, in repr_pprint(obj, p, cycle) 776 """A pprint that just redirects to the normal repr function.""" 777 # Find newlines and replace them with p.break() --> 778 output = repr(obj) 779 lines = output.splitlines() 780 with p.group(): File ~/code/xarray/xarray/core/indexes.py:1659, in Indexes.repr(self) 1657 def repr(self): 1658 indexes = formatting._get_indexes_dict(self) -> 1659 return formatting.indexes_repr(indexes) File ~/code/xarray/xarray/core/formatting.py:474, in indexes_repr(indexes, max_rows) 473 def indexes_repr(indexes, max_rows: int | None = None) -> str: --> 474 col_width = _calculate_col_width(chain.from_iterable(indexes)) 476 return _mapping_repr( 477 indexes, 478 "Indexes", (...) 482 max_rows=max_rows, 483 ) File ~/code/xarray/xarray/core/formatting.py:341, in _calculate_col_width(col_items) 340 def _calculate_col_width(col_items): --> 341 max_name_length = max(len(str(s)) for s in col_items) if col_items else 0 342 col_width = max(max_name_length, 7) + 6 343 return col_width ValueError: max() arg is an empty sequence ``` Anything else we need to know?No response Environment
INSTALLED VERSIONS
------------------
commit: ccc8f9987b553809fb6a40c52fa1a8a8095c8c5f
python: 3.10.12 | packaged by conda-forge | (main, Jun 23 2023, 22:40:32) [GCC 12.3.0]
python-bits: 64
OS: Linux
OS-release: 6.2.0-35-generic
machine: x86_64
processor: x86_64
byteorder: little
LC_ALL: None
LANG: en_US.UTF-8
LOCALE: ('en_US', 'UTF-8')
libhdf5: 1.14.2
libnetcdf: 4.9.2
xarray: 2023.9.1.dev8+gf6d69a1f
pandas: 2.1.1
numpy: 1.24.4
scipy: 1.11.3
netCDF4: 1.6.4
pydap: installed
h5netcdf: 1.2.0
h5py: 3.9.0
Nio: None
zarr: 2.16.1
cftime: 1.6.2
nc_time_axis: 1.4.1
PseudoNetCDF: 3.2.2
iris: 3.7.0
bottleneck: 1.3.7
dask: 2023.9.2
distributed: None
matplotlib: 3.8.0
cartopy: 0.22.0
seaborn: 0.12.2
numbagg: 0.2.2
fsspec: 2023.9.2
cupy: None
pint: 0.20.1
sparse: 0.14.0
flox: 0.7.2
numpy_groupies: 0.10.1
setuptools: 68.2.2
pip: 23.2.1
conda: None
pytest: 7.4.2
mypy: 1.5.1
IPython: 8.15.0
sphinx: None
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8367/reactions", "total_count": 2, "+1": 2, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
722168932 | MDU6SXNzdWU3MjIxNjg5MzI= | 4513 | where should keep_attrs be set in groupby, resample, weighted etc.? | mathause 10194086 | closed | 0 | 2 | 2020-10-15T09:36:43Z | 2023-11-10T16:58:35Z | 2023-11-10T16:58:35Z | MEMBER | I really should not open this can of worms but per https://github.com/pydata/xarray/issues/4450#issuecomment-697507489:
Also as I try to fix the
So the working consensus seems to be to (Edit: looking at this it is only half as bad, "only" Detailed analysis
```python
import numpy as np
import xarray as xr
ds = xr.tutorial.open_dataset("air_temperature")
da = ds.air
```
### coarsen
```python
ds.coarsen(time=2, keep_attrs=True).mean() # keeps global attributes
ds.coarsen(time=2).mean(keep_attrs=True) # keeps DataArray attributes
ds.coarsen(time=2, keep_attrs=True).mean(keep_attrs=True) # keeps both
da.coarsen(time=2).mean(keep_attrs=True) # error
da.coarsen(time=2, keep_attrs=True).mean() # keeps DataArray attributes
```
### groupby
```python
ds.groupby("time.month").mean(keep_attrs=True) # keeps both
da.groupby("time.month").mean(keep_attrs=True) # keeps DataArray attributes
ds.groupby("time.month", keep_attrs=True).mean() # error
da.groupby("time.month", keep_attrs=True).mean() # error
```
### groupby_bins
```python
ds.groupby_bins(ds.lat, np.arange(0, 90, 10)).mean(keep_attrs=True) # keeps both
da.groupby_bins(ds.lat, np.arange(0, 90, 10)).mean(keep_attrs=True) # keeps DataArray attrs
ds.groupby_bins(ds.lat, np.arange(0, 90, 10), keep_attrs=True) # errors
da.groupby_bins(ds.lat, np.arange(0, 90, 10), keep_attrs=True) # errors
```
### resample
```python
ds.resample(time="A").mean(keep_attrs=True) # keeps both
da.resample(time="A").mean(keep_attrs=True) # keeps DataArray attributes
ds.resample(time="A", keep_attrs=False).mean() # ignored
da.resample(time="A", keep_attrs=False).mean() # ignored
```
### rolling
```python
ds.rolling(time=2).mean(keep_attrs=True) # keeps both
da.rolling(time=2).mean(keep_attrs=True) # keeps DataArray attributes
ds.rolling(time=2, keep_attrs=True).mean() # DeprecationWarning; keeps both
da.rolling(time=2, keep_attrs=True).mean() # DeprecationWarning; keeps DataArray attributes
```
see #4510
### rolling_exp
```python
ds.rolling_exp(time=5, keep_attrs=True).mean() # ignored
da.rolling_exp(time=5, keep_attrs=True).mean() # ignored
ds.rolling_exp(time=5).mean(keep_attrs=True) # keeps both
da.rolling_exp(time=5).mean(keep_attrs=True) # keeps DataArray attributes
```
### weighted
```python
ds.weighted(ds.lat).mean(keep_attrs=True) # keeps both
da.weighted(ds.lat).mean(keep_attrs=True) # keeps DataArray attrs
```
edit: moved |
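(Not part of the analysis above: a global default is another way around the per-method question. A minimal sketch, assuming a recent xarray where `set_options` accepts `keep_attrs`:)
```python
import xarray as xr

ds = xr.tutorial.open_dataset("air_temperature")

# set the default globally ...
xr.set_options(keep_attrs=True)
ds.groupby("time.month").mean()  # keeps attrs without passing keep_attrs anywhere

# ... or only within a block
with xr.set_options(keep_attrs=True):
    ds.rolling(time=2).mean()
```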
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4513/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
1986324822 | I_kwDOAMm_X852ZOlW | 8436 | align fails when more than one xindex is set | mathause 10194086 | closed | 0 | 2 | 2023-11-09T20:07:52Z | 2023-11-10T12:53:49Z | 2023-11-10T12:53:49Z | MEMBER | What happened? I tried a DataArray with more than one dimension coordinate. Unfortunately
What did you expect to happen? No response
Minimal Complete Verifiable Example
```Python
import numpy as np
import xarray as xr

data = np.arange(12).reshape(3, 4)
y = [10, 20, 30]
s = ["a", "b", "c"]
x = [1, 2, 3, 4]

da = xr.DataArray(data, dims=("y", "x"), coords={"x": x, "y": y, "s": ("y", s)})
da = da.set_xindex("s")

xr.align(da, da.y)  # errors

da + da  # errors
da + da.x  # errors
```
MVCE confirmation
Relevant log output```PythonValueError Traceback (most recent call last) /home/mathause/code/mesmer/devel/prepare_for_surfer.ipynb Cell 28 line 1 12 da = xr.DataArray(data, dims=("y", "x"), coords={"x": x, "y": y, "s": ("y", s)}) 13 da = da.set_xindex("s") ---> 15 xr.align(da, da.y) # errors 17 da + da.x # errors File ~/.conda/envs/mesmer_dev/lib/python3.9/site-packages/xarray/core/alignment.py:888, in align(join, copy, indexes, exclude, fill_value, *objects) 692 """ 693 Given any number of Dataset and/or DataArray objects, returns new 694 objects with aligned indexes and dimension sizes. ref='~/.conda/envs/mesmer_dev/lib/python3.9/site-packages/xarray/core/alignment.py:0'>0;32m (...) 878 879 """ 880 aligner = Aligner( 881 objects, 882 join=join, ref='~/.conda/envs/mesmer_dev/lib/python3.9/site-packages/xarray/core/alignment.py:0'>0;32m (...) 886 fill_value=fill_value, 887 ) --> 888 aligner.align() 889 return aligner.results File ~/.conda/envs/mesmer_dev/lib/python3.9/site-packages/xarray/core/alignment.py:573, in Aligner.align(self) 571 self.find_matching_indexes() 572 self.find_matching_unindexed_dims() --> 573 self.assert_no_index_conflict() 574 self.align_indexes() 575 self.assert_unindexed_dim_sizes_equal() File ~/.conda/envs/mesmer_dev/lib/python3.9/site-packages/xarray/core/alignment.py:318, in Aligner.assert_no_index_conflict(self) 314 if dup: 315 items_msg = ", ".join( 316 f"{k!r} ({v} conflicting indexes)" for k, v in dup.items() 317 ) --> 318 raise ValueError( 319 "cannot re-index or align objects with conflicting indexes found for " 320 f"the following {msg}: {items_msg}\n" 321 "Conflicting indexes may occur when\n" 322 "- they relate to different sets of coordinate and/or dimension names\n" 323 "- they don't have the same type\n" 324 "- they may be used to reindex data along common dimensions" 325 ) ValueError: cannot re-index or align objects with conflicting indexes found for the following dimensions: 'y' (2 conflicting indexes) Conflicting indexes may occur when - they relate to different sets of coordinate and/or dimension names - they don't have the same type - they may be used to reindex data along common dimensions ``` Anything else we need to know?No response Environment
INSTALLED VERSIONS
------------------
commit: feba6984aa914327408fee3c286dae15969d2a2f
python: 3.10.12 | packaged by conda-forge | (main, Jun 23 2023, 22:40:32) [GCC 12.3.0]
python-bits: 64
OS: Linux
OS-release: 6.2.0-36-generic
machine: x86_64
processor: x86_64
byteorder: little
LC_ALL: None
LANG: en_US.UTF-8
LOCALE: ('en_US', 'UTF-8')
libhdf5: 1.14.2
libnetcdf: 4.9.2
xarray: 2023.9.1.dev8+gf6d69a1f
pandas: 2.1.1
numpy: 1.24.4
scipy: 1.11.3
netCDF4: 1.6.4
pydap: installed
h5netcdf: 1.2.0
h5py: 3.9.0
Nio: None
zarr: 2.16.1
cftime: 1.6.2
nc_time_axis: 1.4.1
PseudoNetCDF: 3.2.2
iris: 3.7.0
bottleneck: 1.3.7
dask: 2023.9.2
distributed: None
matplotlib: 3.8.0
cartopy: 0.22.0
seaborn: 0.12.2
numbagg: 0.2.2
fsspec: 2023.9.2
cupy: None
pint: 0.20.1
sparse: 0.14.0
flox: 0.7.2
numpy_groupies: 0.10.1
setuptools: 68.2.2
pip: 23.2.1
conda: None
pytest: 7.4.2
mypy: 1.5.1
IPython: 8.15.0
sphinx: None
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8436/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
1657036222 | I_kwDOAMm_X85ixF2- | 7730 | flox performance regression for cftime resampling | mathause 10194086 | closed | 0 | 8 | 2023-04-06T09:38:03Z | 2023-10-15T03:48:44Z | 2023-10-15T03:48:44Z | MEMBER | What happened? Running an in-memory
What did you expect to happen? flox to be at least on par with our naive implementation
Minimal Complete Verifiable Example
```Python
import numpy as np
import xarray as xr

arr = np.random.randn(10, 10, 365*30)
time = xr.date_range("2000", periods=30*365, calendar="noleap")

da = xr.DataArray(arr, dims=("y", "x", "time"), coords={"time": time})

# using max
print("max:")
xr.set_options(use_flox=True)
%timeit da.groupby("time.year").max("time")
%timeit da.groupby("time.year").max("time", engine="flox")
xr.set_options(use_flox=False)
%timeit da.groupby("time.year").max("time")

# as reference
%timeit [da.sel(time=str(year)).max("time") for year in range(2000, 2030)]

# using mean
print("mean:")
xr.set_options(use_flox=True)
%timeit da.groupby("time.year").mean("time")
%timeit da.groupby("time.year").mean("time", engine="flox")
xr.set_options(use_flox=False)
%timeit da.groupby("time.year").mean("time")

# as reference
%timeit [da.sel(time=str(year)).mean("time") for year in range(2000, 2030)]
```
MVCE confirmation
Relevant log output
```Python
max:
158 ms ± 4.41 ms per loop (mean ± std. dev. of 7 runs, 1 loop each)
28.1 ms ± 318 µs per loop (mean ± std. dev. of 7 runs, 10 loops each)
11.5 ms ± 52.3 µs per loop (mean ± std. dev. of 7 runs, 100 loops each)

mean:
95.6 ms ± 10.8 ms per loop (mean ± std. dev. of 7 runs, 10 loops each)
34.8 ms ± 2.88 ms per loop (mean ± std. dev. of 7 runs, 10 loops each)
15.2 ms ± 232 µs per loop (mean ± std. dev. of 7 runs, 100 loops each)
```
Anything else we need to know? No response
Environment
INSTALLED VERSIONS
------------------
commit: f8127fc9ad24fe8b41cce9f891ab2c98eb2c679a
python: 3.10.10 | packaged by conda-forge | (main, Mar 24 2023, 20:08:06) [GCC 11.3.0]
python-bits: 64
OS: Linux
OS-release: 5.15.0-69-generic
machine: x86_64
processor: x86_64
byteorder: little
LC_ALL: None
LANG: en_US.UTF-8
LOCALE: ('en_US', 'UTF-8')
libhdf5: 1.12.2
libnetcdf: 4.9.1
xarray: main
pandas: 1.5.3
numpy: 1.23.5
scipy: 1.10.1
netCDF4: 1.6.3
pydap: installed
h5netcdf: 1.1.0
h5py: 3.8.0
Nio: None
zarr: 2.14.2
cftime: 1.6.2
nc_time_axis: 1.4.1
PseudoNetCDF: 3.2.2
iris: 3.4.1
bottleneck: 1.3.7
dask: 2023.3.2
distributed: 2023.3.2.1
matplotlib: 3.7.1
cartopy: 0.21.1
seaborn: 0.12.2
numbagg: 0.2.2
fsspec: 2023.3.0
cupy: None
pint: 0.20.1
sparse: 0.14.0
flox: 0.6.10
numpy_groupies: 0.9.20
setuptools: 67.6.1
pip: 23.0.1
conda: None
pytest: 7.2.2
mypy: None
IPython: 8.12.0
sphinx: None
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/7730/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
1177919687 | PR_kwDOAMm_X8403yVS | 6403 | make more args kw only (except 'dim') | mathause 10194086 | closed | 0 | 9 | 2022-03-23T10:28:02Z | 2023-10-05T20:38:49Z | 2023-10-05T20:38:49Z | MEMBER | 0 | pydata/xarray/pulls/6403 |
This makes many arguments keyword-only, except for Question: do we want a deprecation cycle? Currently it just errors for
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/6403/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1917660013 | PR_kwDOAMm_X85bc7Pv | 8246 | update pytest config and un-xfail some tests | mathause 10194086 | closed | 0 | 1 | 2023-09-28T14:21:58Z | 2023-09-30T01:26:39Z | 2023-09-30T01:26:35Z | MEMBER | 0 | pydata/xarray/pulls/8246 |
This partly updates the pytest config as suggested in #8239 and un-xfails some tests (or xfails the tests more precisely). See https://github.com/pydata/xarray/issues/8239#issuecomment-1739363809 for why we cannot exactly follow the suggestions given in #8239 |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8246/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
235224055 | MDU6SXNzdWUyMzUyMjQwNTU= | 1449 | time.units truncated when saving to_netcdf | mathause 10194086 | closed | 0 | 6 | 2017-06-12T12:58:37Z | 2023-09-13T13:25:25Z | 2023-09-13T13:25:24Z | MEMBER | When I manually specify the
Some programs seem to require the hours to be present to interpret the time properly (e.g. panoply). When specifying the hour, a 'T' is added.
When xarray defines the
xarray version 0.9.6 |
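(Not from the original report: a minimal sketch of the workaround implied here, writing the reference time with an explicit hour so downstream tools such as Panoply parse it; the file name, variable, and units string are illustrative only.)
```python
import numpy as np
import pandas as pd
import xarray as xr

time = pd.date_range("2000-01-01", periods=3)
ds = xr.Dataset({"tas": ("time", np.arange(3.0))}, coords={"time": time})

# spell out the hours in the units string instead of relying on the default
ds.to_netcdf(
    "example.nc",
    encoding={"time": {"units": "days since 2000-01-01 00:00:00"}},
)
```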
{ "url": "https://api.github.com/repos/pydata/xarray/issues/1449/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
1876205625 | PR_kwDOAMm_X85ZRl7U | 8130 | to_stacked_array: better error msg & refactor | mathause 10194086 | closed | 0 | 0 | 2023-08-31T19:51:08Z | 2023-09-10T15:33:41Z | 2023-09-10T15:33:37Z | MEMBER | 0 | pydata/xarray/pulls/8130 |
I found the error message in |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8130/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1596025651 | PR_kwDOAMm_X85Kj_KM | 7548 | supress namespace_package deprecation warning (doctests) | mathause 10194086 | closed | 0 | 0 | 2023-02-23T00:15:41Z | 2023-02-23T18:38:16Z | 2023-02-23T18:38:15Z | MEMBER | 0 | pydata/xarray/pulls/7548 | Suppress the |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/7548/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1086346755 | PR_kwDOAMm_X84wKOjC | 6096 | Replace distutils.version with packaging.version | mathause 10194086 | closed | 0 | 9 | 2021-12-22T00:51:21Z | 2023-01-20T21:00:42Z | 2021-12-24T14:50:48Z | MEMBER | 0 | pydata/xarray/pulls/6096 |
One change is that it is no longer possible to compare to a string, i.e. As mentioned in #6092 there are 3 options - if there is a preference I am happy to update this PR.
```python
from distutils.version import LooseVersion
from packaging import version

LooseVersion(xr.__version__)
version.parse(xr.__version__)
version.Version(xr.__version__)

# currently:
if LooseVersion(mod.__version__) < LooseVersion(minversion):
    pass

# options:
if version.parse(mod.__version__) < version.parse(minversion):
    pass

if version.Version(mod.__version__) < version.Version(minversion):
    pass

if Version(mod.__version__) < Version(minversion):
    pass
```
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/6096/reactions", "total_count": 3, "+1": 3, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1466191758 | PR_kwDOAMm_X85Dylku | 7326 | fix doctests: supress urllib3 warning | mathause 10194086 | closed | 0 | 1 | 2022-11-28T10:40:46Z | 2022-12-05T20:11:16Z | 2022-11-28T19:31:03Z | MEMBER | 0 | pydata/xarray/pulls/7326 |
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/7326/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1285767883 | PR_kwDOAMm_X846ahUs | 6730 | move da and ds fixtures to conftest.py | mathause 10194086 | closed | 0 | 9 | 2022-06-27T12:56:05Z | 2022-12-05T20:11:08Z | 2022-07-11T12:44:55Z | MEMBER | 0 | pydata/xarray/pulls/6730 | This PR renames the Removing the flake8 error ignores also unearthed some unused imports: https://github.com/pydata/xarray/blob/787a96c15161c9025182291b672b3d3c5548a6c7/setup.cfg#L155-L156 (What I actually wanted to do is move the tests for |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/6730/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1344222732 | PR_kwDOAMm_X849c2Wu | 6934 | deprecate_positional_args: remove stray print | mathause 10194086 | closed | 0 | 0 | 2022-08-19T09:58:53Z | 2022-12-05T20:11:08Z | 2022-08-19T10:25:32Z | MEMBER | 0 | pydata/xarray/pulls/6934 | I forgot to remove some debug print statements in #6910 - thanks for noting @shoyer & @dcherian |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/6934/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1464824094 | PR_kwDOAMm_X85DuSjU | 7321 | fix flake8 config | mathause 10194086 | closed | 0 | 2 | 2022-11-25T18:16:07Z | 2022-11-28T10:36:29Z | 2022-11-28T10:33:00Z | MEMBER | 0 | pydata/xarray/pulls/7321 | flake8 v6 now errors on inline comments in the config file. I don't like it but oh well... |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/7321/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
715730538 | MDU6SXNzdWU3MTU3MzA1Mzg= | 4491 | deprecate pynio backend | mathause 10194086 | closed | 0 | 21 | 2020-10-06T14:27:20Z | 2022-11-26T15:40:37Z | 2022-11-26T15:40:37Z | MEMBER | We are currently not testing with the newest version of netCDF4 because it is incompatible with pynio (the newest version is 1.5.4, we are at 1.5.3). This is unlikely to be fixed, see conda-forge/pynio-feedstock#90. Therefore we need to think how to setup the tests so we use the newest version of netCDF4. Maybe just remove it from And long term what to do with the pynio backend? Deprecate? Move to an external repo? |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4491/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
1372729718 | I_kwDOAMm_X85R0jF2 | 7036 | index refactor: more `_coord_names` than `_variables` on Dataset | mathause 10194086 | closed | 0 | 3 | 2022-09-14T10:19:00Z | 2022-09-27T10:35:40Z | 2022-09-27T10:35:40Z | MEMBER | What happened?
What did you expect to happen? Well it seems this assumption is now wrong.
Minimal Complete Verifiable Example
MVCE confirmation
Relevant log output
Anything else we need to know? The error comes from here Bisected to #5692 - which probably does not help too much.
Environment
{ "url": "https://api.github.com/repos/pydata/xarray/issues/7036/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
1118802352 | PR_kwDOAMm_X84xzhTi | 6212 | better warning filter for assert_* | mathause 10194086 | closed | 0 | 1 | 2022-01-31T00:22:37Z | 2022-09-05T07:52:09Z | 2022-09-05T07:52:06Z | MEMBER | 0 | pydata/xarray/pulls/6212 | In #4864 I added a decorator for the https://github.com/pydata/xarray/blob/5470d933452d88deb17cc9294a164c4a03f55dec/xarray/testing.py#L32 However, this is sub-optimal because this now removes all I thought of setting
but this could suppress warnings we want to keep. So now I remove all I am not sure I expressed myself very clearly... let me know and I can try again. @keewis you had a look at #4864 maybe you can review this PR as well? |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/6212/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1355581692 | PR_kwDOAMm_X84-Cbgk | 6967 | fix _deprecate_positional_args helper | mathause 10194086 | closed | 0 | 0 | 2022-08-30T11:02:33Z | 2022-09-02T21:54:07Z | 2022-09-02T21:54:03Z | MEMBER | 0 | pydata/xarray/pulls/6967 | I tried to use the |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/6967/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1355361572 | PR_kwDOAMm_X84-Brev | 6966 | enable pydap in tests again | mathause 10194086 | closed | 0 | 1 | 2022-08-30T08:18:07Z | 2022-09-01T10:16:05Z | 2022-09-01T10:16:03Z | MEMBER | 0 | pydata/xarray/pulls/6966 | 5844 excluded pydap from our tests - but the new version has been released in the meantime (on conda not on pypi, though, pydap/pydap#268) - so let's see if this still works. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/6966/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1355349486 | PR_kwDOAMm_X84-Bo54 | 6965 | no longer install pydap for 'io' extras in py3.10 | mathause 10194086 | closed | 0 | 2 | 2022-08-30T08:08:12Z | 2022-09-01T10:15:30Z | 2022-09-01T10:15:27Z | MEMBER | 0 | pydata/xarray/pulls/6965 |
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/6965/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1331969418 | PR_kwDOAMm_X8480cLZ | 6890 | tests don't use `pytest.warns(None)` | mathause 10194086 | closed | 0 | 0 | 2022-08-08T14:36:01Z | 2022-08-30T12:15:33Z | 2022-08-08T17:27:53Z | MEMBER | 0 | pydata/xarray/pulls/6890 | Get rid of some warnings in the tests. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/6890/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1344900323 | PR_kwDOAMm_X849fIGC | 6937 | terminology: fix italics [skip-ci] | mathause 10194086 | closed | 0 | 0 | 2022-08-19T21:13:52Z | 2022-08-20T07:30:41Z | 2022-08-20T07:30:41Z | MEMBER | 0 | pydata/xarray/pulls/6937 |
@zmoon - obviously it would be nice if we had a linter for this but this is for another time. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/6937/reactions", "total_count": 2, "+1": 2, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1337166287 | PR_kwDOAMm_X849FuuD | 6910 | decorator to deprecate positional arguments | mathause 10194086 | closed | 0 | 7 | 2022-08-12T12:48:47Z | 2022-08-18T18:14:09Z | 2022-08-18T15:59:52Z | MEMBER | 0 | pydata/xarray/pulls/6910 |
Adds a helper function to deprecate positional arguments. IMHO this offers a good trade-off between magic and complexity. (As mentioned this was adapted from scikit-learn). edit: I suggest to actually deprecate positional arguments in another PR. |
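(For context, a minimal sketch of the scikit-learn-style approach described above; this is illustrative only and not the actual helper added in the PR. The wrapper maps positional values that land on keyword-only parameters back to keywords and emits a FutureWarning.)
```python
import functools
import inspect
import warnings


def _deprecate_positional_args(func):
    # illustrative sketch, not xarray's implementation
    sig = inspect.signature(func)
    pos_or_kw = [
        name for name, p in sig.parameters.items()
        if p.kind is inspect.Parameter.POSITIONAL_OR_KEYWORD
    ]
    kwonly = [
        name for name, p in sig.parameters.items()
        if p.kind is inspect.Parameter.KEYWORD_ONLY
    ]

    @functools.wraps(func)
    def inner(*args, **kwargs):
        n_extra = len(args) - len(pos_or_kw)
        if n_extra > 0:
            # arguments passed positionally although they are keyword-only
            extra = dict(zip(kwonly, args[len(pos_or_kw):]))
            warnings.warn(
                f"passing {list(extra)} positionally is deprecated, "
                "pass them as keyword arguments instead",
                FutureWarning,
            )
            kwargs.update(extra)
            args = args[: len(pos_or_kw)]
        return func(*args, **kwargs)

    return inner


@_deprecate_positional_args
def mean(dim, *, skipna=True):
    return dim, skipna


mean("time", False)  # warns, but still forwards skipna=False
```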
{ "url": "https://api.github.com/repos/pydata/xarray/issues/6910/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1155634014 | PR_kwDOAMm_X84zvnTl | 6316 | fix typos (using codespell) | mathause 10194086 | closed | 0 | 2 | 2022-03-01T17:52:24Z | 2022-07-18T13:33:02Z | 2022-03-02T13:57:29Z | MEMBER | 0 | pydata/xarray/pulls/6316 | fix some typos (using codespell). Called using:
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/6316/reactions", "total_count": 2, "+1": 2, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
144630996 | MDU6SXNzdWUxNDQ2MzA5OTY= | 810 | correct DJF mean | mathause 10194086 | closed | 0 | 4 | 2016-03-30T15:36:42Z | 2022-04-06T16:19:47Z | 2016-05-04T12:56:30Z | MEMBER | This started as a question and I add it as reference. Maybe you have a comment. There are several ways to calculate time series of seasonal data (starting from monthly or daily data):
```
# load libraries
import pandas as pd
import matplotlib.pyplot as plt
import numpy as np
import xarray as xr

# Create Example Dataset
time = pd.date_range('2000.01.01', '2010.12.31', freq='M')
data = np.random.rand(*time.shape)
ds = xr.DataArray(data, coords=dict(time=time))

# (1) using resample
ds_res = ds.resample('Q-FEB', 'time')
ds_res = ds_res.sel(time=ds_res['time.month'] == 2)
ds_res = ds_res.groupby('time.year').mean('time')

# (2) this is wrong
ds_season = ds.where(ds['time.season'] == 'DJF').groupby('time.year').mean('time')

# (3) using where and rolling
# mask other months with nan
ds_DJF = ds.where(ds['time.season'] == 'DJF')

# rolling mean -> only Jan is not nan
# however, we lose Jan/ Feb in the first year and Dec in the last
ds_DJF = ds_DJF.rolling(min_periods=3, center=True, time=3).mean()

# make annual mean
ds_DJF = ds_DJF.groupby('time.year').mean('time')

ds_res.plot(marker='*')
ds_season.plot()
ds_DJF.plot()
plt.show()
```
(1) The first is to use resample with 'Q-FEB' as argument. This works fine. It does include Jan/ Feb in the first year, and Dec in the last year + 1. Whether this makes sense can be debated. One case where this does not work is when you have, say, two regions in your data set, for one you want to calculate DJF and for the other you want NovDecJan. (2) Using 'time.season' is wrong as it combines Jan, Feb and Dec from the same year. (3) The third uses |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/810/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
1150224882 | PR_kwDOAMm_X84zdCrl | 6303 | quantile: use skipna=None | mathause 10194086 | closed | 0 | 0 | 2022-02-25T09:24:05Z | 2022-03-03T09:43:38Z | 2022-03-03T09:43:35Z | MEMBER | 0 | pydata/xarray/pulls/6303 |
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/6303/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1149708477 | PR_kwDOAMm_X84zbVnG | 6302 | from_dict: doctest | mathause 10194086 | closed | 0 | 0 | 2022-02-24T20:17:24Z | 2022-02-28T09:11:05Z | 2022-02-28T09:11:02Z | MEMBER | 0 | pydata/xarray/pulls/6302 |
Convert the code block in |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/6302/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1126086052 | PR_kwDOAMm_X84yLQ48 | 6251 | use `warnings.catch_warnings(record=True)` instead of `pytest.warns(None)` | mathause 10194086 | closed | 0 | 4 | 2022-02-07T14:42:26Z | 2022-02-18T16:51:58Z | 2022-02-18T16:51:55Z | MEMBER | 0 | pydata/xarray/pulls/6251 | pytest v7.0.0 no longer wants us to use |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/6251/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1088615118 | PR_kwDOAMm_X84wRifr | 6108 | quantile: rename interpolation arg to method | mathause 10194086 | closed | 0 | 3 | 2021-12-25T15:06:44Z | 2022-02-08T17:09:47Z | 2022-02-07T09:40:05Z | MEMBER | 0 | pydata/xarray/pulls/6108 | numpy/numpy#20327 introduces some changes to
(Side note in TODO: need to import |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/6108/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1125661464 | PR_kwDOAMm_X84yJ3Rz | 6248 | test bottleneck master in upstream CI [test-upstream] [skip-ci] | mathause 10194086 | closed | 0 | 1 | 2022-02-07T08:25:35Z | 2022-02-07T09:05:28Z | 2022-02-07T09:05:24Z | MEMBER | 0 | pydata/xarray/pulls/6248 |
pydata/bottleneck#378 was merged - so this should work again. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/6248/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1111644832 | I_kwDOAMm_X85CQlqg | 6186 | upstream dev CI: enable bottleneck again | mathause 10194086 | closed | 0 | 2 | 2022-01-22T18:11:25Z | 2022-02-07T09:05:24Z | 2022-02-07T09:05:24Z | MEMBER | bottleneck cannot be built with python 3.10. See https://github.com/pydata/xarray/actions/runs/1731371015 |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/6186/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
1118836906 | PR_kwDOAMm_X84xzojx | 6213 | fix or suppress test warnings | mathause 10194086 | closed | 0 | 1 | 2022-01-31T01:34:20Z | 2022-02-01T09:40:15Z | 2022-02-01T09:40:11Z | MEMBER | 0 | pydata/xarray/pulls/6213 | Fixes or suppresses a number of warnings that turn up in our upstream CI.
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/6213/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1118168483 | PR_kwDOAMm_X84xxms4 | 6208 | Revert "MNT: prepare h5netcdf backend for (coming) change in dimension handling" | mathause 10194086 | closed | 0 | 8 | 2022-01-29T10:27:11Z | 2022-01-29T13:48:17Z | 2022-01-29T13:20:51Z | MEMBER | 0 | pydata/xarray/pulls/6208 | Reverts pydata/xarray#6200 |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/6208/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1114414215 | PR_kwDOAMm_X84xlfet | 6194 | doc: fix pd datetime parsing warning [skip-ci] | mathause 10194086 | closed | 0 | 0 | 2022-01-25T22:12:53Z | 2022-01-28T08:37:18Z | 2022-01-28T05:41:49Z | MEMBER | 0 | pydata/xarray/pulls/6194 | And another tiny one... The somewhat ambiguous date string triggers a warning in pandas which makes our doc build fail. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/6194/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1115026697 | PR_kwDOAMm_X84xneFL | 6195 | MAINT: pandas 1.4: no longer use get_loc with method | mathause 10194086 | closed | 0 | 5 | 2022-01-26T13:35:04Z | 2022-01-27T22:11:04Z | 2022-01-27T21:06:40Z | MEMBER | 0 | pydata/xarray/pulls/6195 |
Fixed as per @shoyer & @spencerkclark suggestion from https://github.com/pydata/xarray/issues/5721#issuecomment-903095007 Now that pandas 1.4 is out it would be good to get this fixed (there are about 5000 warnings in our tests, mostly because of |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/6195/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
975385095 | MDU6SXNzdWU5NzUzODUwOTU= | 5721 | pandas deprecates Index.get_loc with method | mathause 10194086 | closed | 0 | 7 | 2021-08-20T08:24:16Z | 2022-01-27T21:06:40Z | 2022-01-27T21:06:40Z | MEMBER | pandas deprecates the
We should fix this before pandas releases because the warning will not be silent (
We use this here:
https://github.com/pydata/xarray/blob/4bb9d9c6df77137f05e85c7cc6508fe7a93dc0e4/xarray/core/indexes.py#L233-L235
Is this only ever called with one item? Then we might be able to use
```python
indexer = self.index.get_indexer(
[label_value], method=method, tolerance=tolerance
).item()
if indexer == -1:
raise KeyError(label_value)
```
---
https://github.com/pydata/xarray/blob/3956b73a7792f41e4410349f2c40b9a9a80decd2/xarray/core/missing.py#L571-L572
This one could be easy to fix (replace with `imin = index.get_indexer([minval], method="nearest").item()`)
---
It is also defined in `CFTimeIndex`, which complicates things:
https://github.com/pydata/xarray/blob/eea76733770be03e78a0834803291659136bca31/xarray/coding/cftimeindex.py#L461-L466
because `get_indexer` expects an iterable and thus the `if isinstance(key, str)` test no longer works.
@benbovy @spencerkclark |
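(As a standalone illustration of the suggested replacement, with made-up example values: `get_indexer` takes a list and returns -1 for labels it cannot match, so the single-item case can be handled like this.)
```python
import pandas as pd

index = pd.Index([10, 20, 30])

# deprecated pattern: index.get_loc(21, method="nearest")
indexer = index.get_indexer([21], method="nearest").item()
if indexer == -1:
    raise KeyError(21)

print(index[indexer])  # 20
```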
{ "url": "https://api.github.com/repos/pydata/xarray/issues/5721/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
1114392372 | PR_kwDOAMm_X84xla15 | 6192 | fix cftime doctests | mathause 10194086 | closed | 0 | 0 | 2022-01-25T21:43:55Z | 2022-01-26T21:45:19Z | 2022-01-26T21:45:17Z | MEMBER | 0 | pydata/xarray/pulls/6192 | Fixes the doctests for the newest version of cftime. @spencerkclark This of course means that the doctests will fail for environments with older versions of cftime present. I don't think there is anything we can do. Thanks for pytest-accept b.t.w @max-sixty |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/6192/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1039272725 | PR_kwDOAMm_X84t1ecc | 5914 | #5740 follow up: supress xr.ufunc warnings in tests | mathause 10194086 | closed | 0 | 2 | 2021-10-29T07:53:07Z | 2022-01-26T08:41:41Z | 2021-10-29T15:16:03Z | MEMBER | 0 | pydata/xarray/pulls/5914 | 5740 changed
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/5914/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1083281083 | PR_kwDOAMm_X84wATnw | 6082 | cftime: 'gregorian' -> 'standard' [test-upstream] | mathause 10194086 | closed | 0 | 3 | 2021-12-17T13:51:07Z | 2022-01-26T08:41:33Z | 2021-12-22T11:40:05Z | MEMBER | 0 | pydata/xarray/pulls/6082 |
cftime 1.5.2 renames "gregorian" to "standard". AFAIK this only changes the repr of cftime indices and does not seem to influence the creation of cftime indices. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/6082/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1088419434 | PR_kwDOAMm_X84wQ-nD | 6107 | is_dask_collection: micro optimization | mathause 10194086 | closed | 0 | 1 | 2021-12-24T15:04:42Z | 2022-01-26T08:41:28Z | 2021-12-29T16:27:55Z | MEMBER | 0 | pydata/xarray/pulls/6107 | In #6096 I realized that
```python
import xarray as xr

%timeit xr.core.pycompat.DuckArrayModule("dask").available
%timeit xr.core.pycompat.dsk.available
```
Which leads to an incredible speed up of our tests of about 2.7 s :grin: ((18.9 - 0.0771) * 145835 / 1e6). |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/6107/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
752870062 | MDExOlB1bGxSZXF1ZXN0NTI5MDc4NDA0 | 4616 | don't type check __getattr__ | mathause 10194086 | closed | 0 | 4 | 2020-11-29T08:53:09Z | 2022-01-26T08:41:18Z | 2021-10-18T14:06:30Z | MEMBER | 1 | pydata/xarray/pulls/4616 |
It's not pretty as I had to define a number of empty methods... I think this should wait for 0.17 |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4616/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
778069594 | MDExOlB1bGxSZXF1ZXN0NTQ4MjI1MDQz | 4760 | WIP: testing.assert_* check dtype | mathause 10194086 | closed | 0 | 8 | 2021-01-04T12:45:00Z | 2022-01-26T08:41:17Z | 2021-10-18T14:06:38Z | MEMBER | 1 | pydata/xarray/pulls/4760 |
This adds a dtype check for When I set
```python
import numpy as np
import xarray as xr

da = xr.DataArray(np.array([0, np.nan], dtype=object)).chunk()

da.prod().dtype            # -> dtype('O')
da.prod().compute().dtype  # -> dtype('int64')
```
```python
da0 = xr.DataArray(np.array([0], dtype=object))
da1 = xr.DataArray(np.array([0.], dtype=object))

xr.testing.assert_equal(da0, da1, check_dtype=True)
```
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4760/reactions", "total_count": 2, "+1": 2, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1107431006 | PR_kwDOAMm_X84xOsZX | 6171 | unpin dask again | mathause 10194086 | closed | 0 | 1 | 2022-01-18T22:37:31Z | 2022-01-26T08:41:02Z | 2022-01-18T23:39:12Z | MEMBER | 0 | pydata/xarray/pulls/6171 |
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/6171/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1109572055 | PR_kwDOAMm_X84xVqMq | 6177 | remove no longer necessary version checks | mathause 10194086 | closed | 0 | 2 | 2022-01-20T17:24:21Z | 2022-01-26T08:40:55Z | 2022-01-21T18:00:51Z | MEMBER | 0 | pydata/xarray/pulls/6177 | I hunted down some version checks that should no longer be necessary as we have moved beyond the minimum versions. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/6177/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1114401347 | PR_kwDOAMm_X84xlcvk | 6193 | don't install bottleneck wheel for upstream CI | mathause 10194086 | closed | 0 | 3 | 2022-01-25T21:55:49Z | 2022-01-26T08:31:42Z | 2022-01-26T08:31:39Z | MEMBER | 0 | pydata/xarray/pulls/6193 |
I think it would be good to re-enable the upstream CI, even if this means we have to stick to py3.9 for the moment. I just subscribed to pydata/bottleneck#378, so I should see when we can switch to 3.10. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/6193/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1099288617 | PR_kwDOAMm_X84wzh1F | 6155 | typing fixes for mypy 0.931 and numpy 1.22 | mathause 10194086 | closed | 0 | 2 | 2022-01-11T15:19:43Z | 2022-01-13T17:13:00Z | 2022-01-13T17:12:57Z | MEMBER | 0 | pydata/xarray/pulls/6155 | typing fixes for mypy 0.931 and numpy 1.22. Also tested with numpy 1.20 which probably many still have installed. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/6155/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
432074821 | MDU6SXNzdWU0MzIwNzQ4MjE= | 2889 | nansum vs nanmean for all-nan vectors | mathause 10194086 | closed | 0 | 3 | 2019-04-11T15:04:39Z | 2022-01-05T21:59:48Z | 2019-04-11T16:08:02Z | MEMBER | ```python
import xarray as xr
import numpy as np

ds = xr.DataArray([np.NaN, np.NaN])

ds.mean()
ds.sum()
```
Problem description
Expected Output

I would expect both to return

Output of
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/2889/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
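Note on #2889: a short runnable illustration of the behavior the issue contrasts (the stated outputs reflect NumPy's and xarray's documented nan-aggregation semantics, not a re-run against this exact version):

```python
import numpy as np
import xarray as xr

da = xr.DataArray([np.nan, np.nan])

# with skipna=True (the default), an all-NaN input reduces over zero elements:
print(da.sum().item())   # 0.0 -- the empty sum is defined as 0
print(da.mean().item())  # nan -- the empty mean is undefined

# the numpy nan-aggregations behave the same way
print(np.nansum([np.nan, np.nan]))   # 0.0
print(np.nanmean([np.nan, np.nan]))  # nan (emits a RuntimeWarning)

# min_count lets sum() return NaN as well when too few valid values remain
print(da.sum(min_count=1).item())    # nan
```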
1078998718 | PR_kwDOAMm_X84vyLHe | 6077 | disable pytest-xdist (to check CI failure) | mathause 10194086 | closed | 0 | 3 | 2021-12-13T20:43:38Z | 2022-01-03T08:30:02Z | 2021-12-22T12:55:23Z | MEMBER | 0 | pydata/xarray/pulls/6077 | Our CI fails with some pytest-xdist error. Let's see if we get a clearer picture when disabling parallel tests. (Maybe some interaction between dask and pytest-xdist?). |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/6077/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1090752550 | PR_kwDOAMm_X84wYT5m | 6127 | Revert "disable pytest-xdist (to check CI failure)" | mathause 10194086 | closed | 0 | 2 | 2021-12-29T21:15:36Z | 2022-01-03T08:29:52Z | 2022-01-03T08:29:49Z | MEMBER | 0 | pydata/xarray/pulls/6127 |
Reverts pydata/xarray#6077 (after dask has been pinned in #6111) |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/6127/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1086797050 | I_kwDOAMm_X85AxzT6 | 6101 | enable pytest-xdist again (after dask release) | mathause 10194086 | closed | 0 | 0 | 2021-12-22T12:57:03Z | 2022-01-03T08:29:48Z | 2022-01-03T08:29:48Z | MEMBER | I disabled pytest-xdist because a dask issue renders our CI unusable. As soon as dask releases a new version we should revert #6077 again. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/6101/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
1086360190 | PR_kwDOAMm_X84wKRVp | 6097 | fix tests for h5netcdf v0.12 | mathause 10194086 | closed | 0 | 6 | 2021-12-22T01:22:09Z | 2021-12-23T20:29:33Z | 2021-12-23T20:29:12Z | MEMBER | 0 | pydata/xarray/pulls/6097 | h5netcdf no longer warns for invalid netCDF (unless passing @kmuehlbauer edit: I added h5netcdf to the upstream tests - I can also revert this change if you prefer. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/6097/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1036287825 | PR_kwDOAMm_X84tryph | 5899 | [test-upstream] fix pd skipna=None | mathause 10194086 | closed | 0 | 2 | 2021-10-26T13:16:21Z | 2021-10-28T11:54:49Z | 2021-10-28T11:46:04Z | MEMBER | 0 | pydata/xarray/pulls/5899 |
pandas will disallow |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/5899/reactions", "total_count": 2, "+1": 2, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
1029142676 | PR_kwDOAMm_X84tVCEd | 5875 | fix test with pseudonetcdf 3.2 | mathause 10194086 | closed | 0 | 5 | 2021-10-18T13:49:23Z | 2021-10-22T21:24:09Z | 2021-10-22T21:23:34Z | MEMBER | 0 | pydata/xarray/pulls/5875 |
Fixes one part of #5872

pseudoNETCDF adds two attrs to ict files, which breaks the following two tests:

Test 1: https://github.com/pydata/xarray/blob/07de257c5884df49335496ee6347fb633a7c302c/xarray/tests/test_backends.py#L3944
Test 2:

I reproduced the test file so that the tests pass again. To reproduce the file I used the following bit of code:

```python
import xarray as xr
from xarray.tests import test_backends

fN = "xarray/tests/data/example.ict"
fmtkw = {"format": "ffi1001"}

ds = xr.open_dataset(fN, engine="pseudonetcdf", backend_kwargs={"format": "ffi1001"})

c = test_backends.TestPseudoNetCDFFormat()
c.save(ds, fN, **fmtkw)
```

The @barronh I would appreciate your review here - I am not sure if this is the right approach. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/5875/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
877166445 | MDExOlB1bGxSZXF1ZXN0NjMxMTcwNzI4 | 5265 | Warn ignored keep attrs | mathause 10194086 | closed | 0 | 1 | 2021-05-06T07:20:16Z | 2021-10-18T14:06:37Z | 2021-05-06T16:31:05Z | MEMBER | 0 | pydata/xarray/pulls/5265 |
This PR warns when passing |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/5265/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
869763597 | MDExOlB1bGxSZXF1ZXN0NjI1MDc0NjA5 | 5227 | coarsen: better keep_attrs | mathause 10194086 | closed | 0 | 0 | 2021-04-28T09:56:45Z | 2021-10-18T14:06:35Z | 2021-04-29T17:40:57Z | MEMBER | 0 | pydata/xarray/pulls/5227 |
As per https://github.com/pydata/xarray/issues/3891#issuecomment-612522628 I also changed the default to |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/5227/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
758033677 | MDExOlB1bGxSZXF1ZXN0NTMzMjc0NDY3 | 4656 | unpin pip 20.2 again | mathause 10194086 | closed | 0 | 7 | 2020-12-06T22:00:12Z | 2021-10-18T14:06:34Z | 2021-04-18T21:42:25Z | MEMBER | 0 | pydata/xarray/pulls/4656 | Another enormous PR from my side ;) unpin pip again. numpy probably fixed the issue re the name of the nightly build. But I also need to doublecheck if scipy is ok. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4656/reactions", "total_count": 2, "+1": 2, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
802400938 | MDExOlB1bGxSZXF1ZXN0NTY4NTUwNDEx | 4865 | fix da.pad example for numpy 1.20 | mathause 10194086 | closed | 0 | 4 | 2021-02-05T19:00:04Z | 2021-10-18T14:06:33Z | 2021-02-07T21:57:34Z | MEMBER | 0 | pydata/xarray/pulls/4865 |
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4865/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
794344392 | MDExOlB1bGxSZXF1ZXN0NTYxODc2OTg5 | 4845 | iris update doc url | mathause 10194086 | closed | 0 | 1 | 2021-01-26T15:51:18Z | 2021-10-18T14:06:31Z | 2021-01-26T17:30:20Z | MEMBER | 0 | pydata/xarray/pulls/4845 | iris moved its documentation to https://scitools-iris.readthedocs.io/en/stable/ |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4845/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
738958305 | MDExOlB1bGxSZXF1ZXN0NTE3NzA0OTI2 | 4569 | pin h5py to v2.10 | mathause 10194086 | closed | 0 | 0 | 2020-11-09T11:46:39Z | 2021-10-18T14:06:28Z | 2020-11-09T12:52:27Z | MEMBER | 0 | pydata/xarray/pulls/4569 | There is a compatibility issue with h5py v3. Pin h5py to version 2 for the moment. I can open an issue shortly. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4569/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
724975973 | MDExOlB1bGxSZXF1ZXN0NTA2Mjc3OTk4 | 4525 | unpin eccodes again | mathause 10194086 | closed | 0 | 2 | 2020-10-19T21:07:23Z | 2021-10-18T14:06:27Z | 2020-10-19T22:21:13Z | MEMBER | 0 | pydata/xarray/pulls/4525 |
That was fast - eccodes already fixed the issue. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4525/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
684430261 | MDExOlB1bGxSZXF1ZXN0NDcyMzE4MzUw | 4371 | mention all ignored flake8 errors | mathause 10194086 | closed | 0 | 1 | 2020-08-24T07:17:03Z | 2021-10-18T14:06:18Z | 2020-08-24T10:45:05Z | MEMBER | 0 | pydata/xarray/pulls/4371 | and put the comment on the same line |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4371/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
577830239 | MDExOlB1bGxSZXF1ZXN0Mzg1NTIyOTEy | 3849 | update installation instruction | mathause 10194086 | closed | 0 | 6 | 2020-03-09T11:14:13Z | 2021-10-18T14:06:16Z | 2020-03-09T14:07:03Z | MEMBER | 0 | pydata/xarray/pulls/3849 |
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/3849/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
572269093 | MDExOlB1bGxSZXF1ZXN0MzgxMDAyMTU2 | 3805 | un-xfail tests that append to netCDF files with scipy | mathause 10194086 | closed | 0 | 3 | 2020-02-27T18:23:56Z | 2021-10-18T14:06:14Z | 2020-03-09T07:18:07Z | MEMBER | 0 | pydata/xarray/pulls/3805 |
Let's see if this passes.... |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/3805/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
539059754 | MDExOlB1bGxSZXF1ZXN0MzU0MDk5Mzkz | 3635 | Fix/quantile wrong errmsg | mathause 10194086 | closed | 0 | 2 | 2019-12-17T13:16:40Z | 2021-10-18T14:06:13Z | 2019-12-17T13:50:06Z | MEMBER | 0 | pydata/xarray/pulls/3635 |
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/3635/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
928539812 | MDExOlB1bGxSZXF1ZXN0Njc2NTI5NjQ4 | 5522 | typing for numpy 1.21 | mathause 10194086 | closed | 0 | 2 | 2021-06-23T18:40:28Z | 2021-10-18T14:05:47Z | 2021-06-24T08:58:07Z | MEMBER | 0 | pydata/xarray/pulls/5522 |
The minimal typing for numpy 1.21. As always I am by no means a typing specialist. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/5522/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
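Note on #5522: not the PR's actual diff, but as an illustration of what "typing for numpy 1.21" involves — numpy 1.21 ships `numpy.typing.NDArray`, which lets annotations carry the array dtype:

```python
import numpy as np
import numpy.typing as npt

def normalize(values: npt.ArrayLike) -> npt.NDArray[np.float64]:
    # ArrayLike accepts anything coercible to an array (lists, scalars, arrays);
    # NDArray[np.float64] states the dtype of the returned array
    arr = np.asarray(values, dtype=np.float64)
    return (arr - arr.mean()) / arr.std()

normalize([1, 2, 3])
```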
235542564 | MDExOlB1bGxSZXF1ZXN0MTI1MzU1MTI5 | 1451 | inconsistent time.units fmt in encode_cf_datetime | mathause 10194086 | closed | 0 | 7 | 2017-06-13T12:49:31Z | 2021-06-24T08:45:18Z | 2021-06-23T16:14:27Z | MEMBER | 0 | pydata/xarray/pulls/1451 |
This is my naïve approach.
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/1451/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
913958248 | MDExOlB1bGxSZXF1ZXN0NjYzOTE2NDQw | 5451 | Silence some test warnings | mathause 10194086 | closed | 0 | 1 | 2021-06-07T21:12:50Z | 2021-06-09T17:55:48Z | 2021-06-09T17:27:21Z | MEMBER | 0 | pydata/xarray/pulls/5451 | Silences a number of warnings that accumulated in our test suite (c.f. #3266). The changes are mostly unrelated but small. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/5451/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
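Note on #5451: the PR touches individual tests, but as a generic sketch of how warnings are usually silenced or surfaced in a pytest suite (the warning message below is made up for illustration):

```python
import warnings

import pytest

# silence one specific, known-irrelevant warning for a single test
@pytest.mark.filterwarnings("ignore:some deprecated feature:DeprecationWarning")
def test_quiet():
    warnings.warn("some deprecated feature", DeprecationWarning)

# or turn every warning into an error so new ones cannot accumulate unnoticed
def test_no_warnings():
    with warnings.catch_warnings():
        warnings.simplefilter("error")
        assert 1 + 1 == 2
```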
913916040 | MDExOlB1bGxSZXF1ZXN0NjYzODgwMjI1 | 5450 | plt.gca() no longer accepts kwargs | mathause 10194086 | closed | 0 | 0 | 2021-06-07T20:10:57Z | 2021-06-09T17:27:02Z | 2021-06-09T17:26:58Z | MEMBER | 0 | pydata/xarray/pulls/5450 | matplotlib warns: This only uses |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/5450/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
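Note on #5450: the body is cut off, but the change the title refers to is that newer matplotlib deprecates passing keyword arguments to `plt.gca()`. A minimal sketch of the replacement pattern (the `polar` example is only for illustration):

```python
import matplotlib.pyplot as plt

fig = plt.figure()

# deprecated in recent matplotlib versions:
# ax = plt.gca(polar=True)

# instead, create the axes explicitly with the desired keyword arguments ...
ax = fig.add_subplot(polar=True)

# ... and call gca() without arguments, only to fetch the current axes
assert plt.gca() is ax
```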
913830070 | MDExOlB1bGxSZXF1ZXN0NjYzODA1MDQy | 5449 | fix dask meta and output_dtypes error | mathause 10194086 | closed | 0 | 8 | 2021-06-07T18:25:20Z | 2021-06-08T07:51:50Z | 2021-06-07T21:05:24Z | MEMBER | 0 | pydata/xarray/pulls/5449 |
This was changed in dask/dask#7669. Looks like they did not deprecate this behavior (i.e. passing both |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/5449/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
834641104 | MDU6SXNzdWU4MzQ2NDExMDQ= | 5053 | ImportError: module 'xarray.backends.*' has no attribute '*_backend' | mathause 10194086 | closed | 0 | 3 | 2021-03-18T10:44:33Z | 2021-04-25T16:23:20Z | 2021-04-25T16:23:19Z | MEMBER |
What happened: I could not open the test dataset on master. It's a bit strange that this is not picked up by the tests, so probably something to do with the environment I have (I just updated all packages). @alexamici @aurghs does that tell you anything? I can also try to figure it out.

Minimal Complete Verifiable Example: calling
Anything else we need to know?: And the traceback:
```python-traceback
---------------------------------------------------------------------------
AttributeError Traceback (most recent call last)
~/conda/envs/xarray_dev/lib/python3.8/site-packages/pkg_resources/__init__.py in resolve(self)
2479 try:
-> 2480 return functools.reduce(getattr, self.attrs, module)
2481 except AttributeError as exc:
AttributeError: module 'xarray.backends.cfgrib_' has no attribute 'cfgrib_backend'
The above exception was the direct cause of the following exception:
ImportError Traceback (most recent call last)
<ipython-input-2-16bed41155fa> in <module>
----> 1 air = xr.tutorial.open_dataset("air_temperature")
~/code/xarray/xarray/tutorial.py in open_dataset(name, cache, cache_dir, github_url, branch, **kws)
93 raise OSError(msg)
94
---> 95 ds = _open_dataset(localfile, **kws)
96
97 if not cache:
~/code/xarray/xarray/backends/api.py in open_dataset(filename_or_obj, engine, chunks, cache, decode_cf, mask_and_scale, decode_times, decode_timedelta, use_cftime, concat_characters, decode_coords, drop_variables, backend_kwargs, *args, **kwargs)
491
492 if engine is None:
--> 493 engine = plugins.guess_engine(filename_or_obj)
494
495 backend = plugins.get_backend(engine)
~/code/xarray/xarray/backends/plugins.py in guess_engine(store_spec)
99
100 def guess_engine(store_spec):
--> 101 engines = list_engines()
102
103 for engine, backend in engines.items():
~/code/xarray/xarray/backends/plugins.py in list_engines()
95 def list_engines():
96 pkg_entrypoints = pkg_resources.iter_entry_points("xarray.backends")
---> 97 return build_engines(pkg_entrypoints)
98
99
~/code/xarray/xarray/backends/plugins.py in build_engines(pkg_entrypoints)
82 backend_entrypoints = BACKEND_ENTRYPOINTS.copy()
83 pkg_entrypoints = remove_duplicates(pkg_entrypoints)
---> 84 external_backend_entrypoints = backends_dict_from_pkg(pkg_entrypoints)
85 backend_entrypoints.update(external_backend_entrypoints)
86 backend_entrypoints = sort_backends(backend_entrypoints)
~/code/xarray/xarray/backends/plugins.py in backends_dict_from_pkg(pkg_entrypoints)
56 for pkg_ep in pkg_entrypoints:
57 name = pkg_ep.name
---> 58 backend = pkg_ep.load()
59 backend_entrypoints[name] = backend
60 return backend_entrypoints
~/conda/envs/xarray_dev/lib/python3.8/site-packages/pkg_resources/__init__.py in load(self, require, *args, **kwargs)
2470 if require:
2471 self.require(*args, **kwargs)
-> 2472 return self.resolve()
2473
2474 def resolve(self):
~/conda/envs/xarray_dev/lib/python3.8/site-packages/pkg_resources/__init__.py in resolve(self)
2480 return functools.reduce(getattr, self.attrs, module)
2481 except AttributeError as exc:
-> 2482 raise ImportError(str(exc)) from exc
2483
2484 def require(self, env=None, installer=None):
ImportError: module 'xarray.backends.cfgrib_' has no attribute 'cfgrib_backend'
```
Environment:

Output of <tt>xr.show_versions()</tt>

```
INSTALLED VERSIONS
------------------
commit: a6f51c680f4e4c3ed5101b9c1111f0b94d28a781
python: 3.8.6 | packaged by conda-forge | (default, Jan 25 2021, 23:21:18) [GCC 9.3.0]
python-bits: 64
OS: Linux
OS-release: 5.4.0-67-generic
machine: x86_64
processor: x86_64
byteorder: little
LC_ALL: None
LANG: en_US.UTF-8
LOCALE: en_US.UTF-8
libhdf5: 1.10.6
libnetcdf: 4.7.4

xarray: 0.16.2.dev111+g0d93c4f9.d20201219
pandas: 1.2.3
numpy: 1.20.1
scipy: 1.6.1
netCDF4: 1.5.6
pydap: installed
h5netcdf: 0.10.0
h5py: 3.1.0
Nio: None
zarr: 2.6.1
cftime: 1.4.1
nc_time_axis: 1.2.0
PseudoNetCDF: installed
rasterio: 1.2.1
cfgrib: 0.9.8.5
iris: 2.4.0
bottleneck: 1.3.2
dask: 2021.03.0
distributed: 2021.03.0
matplotlib: 3.3.4
cartopy: 0.18.0
seaborn: 0.11.1
numbagg: installed
pint: 0.16.1
setuptools: 49.6.0.post20210108
pip: 21.0.1
conda: None
pytest: 6.2.2
IPython: 7.21.0
sphinx: None
```

and my conda list:
```
# packages in environment at /home/mathause/conda/envs/xarray_dev:
#
# Name Version Build Channel
_libgcc_mutex 0.1 conda_forge conda-forge
_openmp_mutex 4.5 1_gnu conda-forge
affine 2.3.0 py_0 conda-forge
antlr-python-runtime 4.7.2 py38h578d9bd_1002 conda-forge
apipkg 1.5 py_0 conda-forge
appdirs 1.4.4 pyh9f0ad1d_0 conda-forge
asciitree 0.3.3 py_2 conda-forge
attrs 20.3.0 pyhd3deb0d_0 conda-forge
backcall 0.2.0 pyh9f0ad1d_0 conda-forge
backports 1.0 py_2 conda-forge
backports.functools_lru_cache 1.6.1 py_0 conda-forge
beautifulsoup4 4.9.3 pyhb0f4dca_0 conda-forge
black 20.8b1 py_1 conda-forge
bokeh 2.3.0 py38h578d9bd_0 conda-forge
boost-cpp 1.72.0 h9d3c048_4 conda-forge
boto3 1.17.30 pyhd8ed1ab_0 conda-forge
botocore 1.20.30 pyhd8ed1ab_0 conda-forge
bottleneck 1.3.2 py38h5c078b8_3 conda-forge
brotlipy 0.7.0 py38h497a2fe_1001 conda-forge
bzip2 1.0.8 h7f98852_4 conda-forge
c-ares 1.17.1 h7f98852_1 conda-forge
ca-certificates 2020.12.5 ha878542_0 conda-forge
cached-property 1.5.2 hd8ed1ab_1 conda-forge
cached_property 1.5.2 pyha770c72_1 conda-forge
cairo 1.16.0 h7979940_1007 conda-forge
cartopy 0.18.0 py38hab71064_13 conda-forge
cdat_info 8.2.1 pyh9f0ad1d_1 conda-forge
cdms2 3.1.5 pypi_0 pypi
cdtime 3.1.4 py38h49bcaf2_2 conda-forge
certifi 2020.12.5 py38h578d9bd_1 conda-forge
cf-units 2.1.4 py38hab2c0dc_2 conda-forge
cffi 1.14.5 py38ha65f79e_0 conda-forge
cfgrib 0.9.8.5 pyhd8ed1ab_0 conda-forge
cfgv 3.2.0 py_0 conda-forge
cfitsio 3.470 hb418390_7 conda-forge
cftime 1.4.1 py38h5c078b8_0 conda-forge
chardet 4.0.0 py38h578d9bd_1 conda-forge
click 7.1.2 pyh9f0ad1d_0 conda-forge
click-plugins 1.1.1 py_0 conda-forge
cligj 0.7.1 pyhd8ed1ab_0 conda-forge
cloudpickle 1.6.0 py_0 conda-forge
coverage 5.5 py38h497a2fe_0 conda-forge
coveralls 3.0.1 pyhd8ed1ab_0 conda-forge
cryptography 3.4.6 py38ha5dfef3_0 conda-forge
curl 7.75.0 h979ede3_0 conda-forge
cycler 0.10.0 py_2 conda-forge
cytoolz 0.11.0 py38h497a2fe_3 conda-forge
dask 2021.3.0 pyhd8ed1ab_0 conda-forge
dask-core 2021.3.0 pyhd8ed1ab_0 conda-forge
dataclasses 0.8 pyhc8e2a94_1 conda-forge
dbus 1.13.6 hfdff14a_1 conda-forge
decorator 4.4.2 py_0 conda-forge
distarray 2.12.2 py_1 conda-forge
distlib 0.3.1 pyh9f0ad1d_0 conda-forge
distributed 2021.3.0 py38h578d9bd_0 conda-forge
docopt 0.6.2 py_1 conda-forge
eccodes 2.20.0 ha0e6eb6_0 conda-forge
editdistance 0.5.3 py38h709712a_3 conda-forge
esmf 8.0.1 mpi_mpich_h3cbecb6_102 conda-forge
esmpy 8.0.1 mpi_mpich_py38h6f0bf2d_102 conda-forge
execnet 1.8.0 pyh44b312d_0 conda-forge
expat 2.2.10 h9c3ff4c_0 conda-forge
fasteners 0.14.1 py_3 conda-forge
filelock 3.0.12 pyh9f0ad1d_0 conda-forge
flake8 3.9.0 pyhd8ed1ab_0 conda-forge
fontconfig 2.13.1 hba837de_1004 conda-forge
freetype 2.10.4 h0708190_1 conda-forge
freexl 1.0.6 h7f98852_0 conda-forge
fsspec 0.8.7 pyhd8ed1ab_0 conda-forge
future 0.18.2 py38h578d9bd_3 conda-forge
g2clib 1.6.0 hf3f1b0b_9 conda-forge
geos 3.9.1 h9c3ff4c_2 conda-forge
geotiff 1.6.0 h11d48b3_4 conda-forge
gettext 0.19.8.1 h0b5b191_1005 conda-forge
giflib 5.2.1 h516909a_2 conda-forge
glib 2.66.7 h9c3ff4c_1 conda-forge
glib-tools 2.66.7 h9c3ff4c_1 conda-forge
gprof2dot 2019.11.30 py_0 conda-forge
gst-plugins-base 1.18.4 h29181c9_0 conda-forge
gstreamer 1.18.4 h76c114f_0 conda-forge
h5netcdf 0.10.0 pyhd8ed1ab_0 conda-forge
h5py 3.1.0 nompi_py38hafa665b_100 conda-forge
hdf4 4.2.13 h10796ff_1004 conda-forge
hdf5 1.10.6 mpi_mpich_h996c276_1014 conda-forge
heapdict 1.0.1 py_0 conda-forge
hypothesis 6.8.1 pyhd8ed1ab_0 conda-forge
icu 68.1 h58526e2_0 conda-forge
identify 2.1.3 pyhd8ed1ab_0 conda-forge
idna 2.10 pyh9f0ad1d_0 conda-forge
importlib-metadata 3.7.3 py38h578d9bd_0 conda-forge
importlib_metadata 3.7.3 hd8ed1ab_0 conda-forge
importlib_resources 5.1.2 py38h578d9bd_0 conda-forge
iniconfig 1.1.1 pyh9f0ad1d_0 conda-forge
ipython 7.21.0 py38h81c977d_0 conda-forge
ipython_genutils 0.2.0 py_1 conda-forge
iris 2.4.0 py38h578d9bd_1 conda-forge
isort 5.7.0 pyhd8ed1ab_0 conda-forge
jasper 1.900.1 h07fcdf6_1006 conda-forge
jedi 0.18.0 py38h578d9bd_2 conda-forge
jinja2 2.11.3 pyh44b312d_0 conda-forge
jmespath 0.10.0 pyh9f0ad1d_0 conda-forge
jpeg 9d h516909a_0 conda-forge
json-c 0.15 h98cffda_0 conda-forge
jsonschema 3.2.0 py38h32f6830_1 conda-forge
jupyter_core 4.7.1 py38h578d9bd_0 conda-forge
kealib 1.4.14 hcc255d8_2 conda-forge
kiwisolver 1.3.1 py38h1fd1430_1 conda-forge
krb5 1.17.2 h926e7f8_0 conda-forge
lazy-object-proxy 1.5.2 py38h497a2fe_1 conda-forge
lcms2 2.12 hddcbb42_0 conda-forge
ld_impl_linux-64 2.35.1 hea4e1c9_2 conda-forge
libaec 1.0.4 he1b5a44_1 conda-forge
libblas 3.8.0 17_openblas conda-forge
libcblas 3.8.0 17_openblas conda-forge
libcdms 3.1.2 h981a4fd_113 conda-forge
libcf 1.0.3 py38h88b7cc0_109 conda-forge
libclang 11.1.0 default_ha53f305_0 conda-forge
libcst 0.3.17 py38h578d9bd_0 conda-forge
libcurl 7.75.0 hc4aaa36_0 conda-forge
libdap4 3.20.6 hd7c4107_1 conda-forge
libdrs 3.1.2 h7918d09_113 conda-forge
libdrs_f 3.1.2 h5026c31_111 conda-forge
libedit 3.1.20191231 he28a2e2_2 conda-forge
libev 4.33 h516909a_1 conda-forge
libevent 2.1.10 hcdb4288_3 conda-forge
libffi 3.3 h58526e2_2 conda-forge
libgcc-ng 9.3.0 h2828fa1_18 conda-forge
libgdal 3.2.1 h38ff51b_7 conda-forge
libgfortran-ng 9.3.0 hff62375_18 conda-forge
libgfortran5 9.3.0 hff62375_18 conda-forge
libglib 2.66.7 h3e27bee_1 conda-forge
libgomp 9.3.0 h2828fa1_18 conda-forge
libiconv 1.16 h516909a_0 conda-forge
libkml 1.3.0 hd79254b_1012 conda-forge
liblapack 3.8.0 17_openblas conda-forge
libllvm10 10.0.1 he513fc3_3 conda-forge
libllvm11 11.1.0 hf817b99_0 conda-forge
libnetcdf 4.7.4 mpi_mpich_hdef422e_7 conda-forge
libnghttp2 1.43.0 h812cca2_0 conda-forge
libopenblas 0.3.10 pthreads_h4812303_5 conda-forge
libpng 1.6.37 hed695b0_2 conda-forge
libpq 13.1 hfd2b0eb_2 conda-forge
librttopo 1.1.0 h1185371_6 conda-forge
libspatialite 5.0.1 he52d314_3 conda-forge
libssh2 1.9.0 ha56f1ee_6 conda-forge
libstdcxx-ng 9.3.0 h6de172a_18 conda-forge
libtiff 4.2.0 hdc55705_0 conda-forge
libuuid 2.32.1 h14c3975_1000 conda-forge
libwebp-base 1.2.0 h7f98852_2 conda-forge
libxcb 1.13 h7f98852_1003 conda-forge
libxkbcommon 1.0.3 he3ba5ed_0 conda-forge
libxml2 2.9.10 h72842e0_3 conda-forge
libxslt 1.1.33 h15afd5d_2 conda-forge
line_profiler 3.1.0 py38h82cb98a_1 conda-forge
llvmlite 0.36.0 py38h4630a5e_0 conda-forge
locket 0.2.0 py_2 conda-forge
lxml 4.6.2 py38hf1fe3a4_1 conda-forge
lz4-c 1.9.3 h9c3ff4c_0 conda-forge
markupsafe 1.1.1 py38h497a2fe_3 conda-forge
matplotlib 3.3.4 py38h578d9bd_0 conda-forge
matplotlib-base 3.3.4 py38h0efea84_0 conda-forge
mccabe 0.6.1 py_1 conda-forge
mechanicalsoup 1.0.0 pyhd8ed1ab_0 conda-forge
monkeytype 20.5.0 pyh516909a_0 conda-forge
monotonic 1.5 py_0 conda-forge
more-itertools 8.7.0 pyhd8ed1ab_0 conda-forge
mpi 1.0 mpich conda-forge
mpi4py 3.0.3 py38he865349_5 conda-forge
mpich 3.4.1 h846660c_104 conda-forge
msgpack-python 1.0.2 py38h1fd1430_1 conda-forge
mypy 0.812 pyhd8ed1ab_0 conda-forge
mypy_extensions 0.4.3 py38h578d9bd_3 conda-forge
mysql-common 8.0.23 ha770c72_1 conda-forge
mysql-libs 8.0.23 h935591d_1 conda-forge
nbformat 5.1.2 pyhd8ed1ab_1 conda-forge
nc-time-axis 1.2.0 py_1 conda-forge
ncurses 6.2 h58526e2_4 conda-forge
netcdf-fortran 4.5.3 mpi_mpich_h7ad8bfe_1 conda-forge
netcdf4 1.5.6 nompi_py38h1cdf482_100 conda-forge
nodeenv 1.5.0 pyh9f0ad1d_0 conda-forge
nspr 4.30 h9c3ff4c_0 conda-forge
nss 3.62 hb5efdd6_0 conda-forge
numba 0.53.0 py38h5e62926_1 conda-forge
numbagg 0.1 pypi_0 pypi
numcodecs 0.7.3 py38h709712a_0 conda-forge
numpy 1.20.1 py38h18fd61f_0 conda-forge
olefile 0.46 pyh9f0ad1d_1 conda-forge
openblas 0.3.10 pthreads_h04b7a96_5 conda-forge
openjpeg 2.4.0 hf7af979_0 conda-forge
openssl 1.1.1j h7f98852_0 conda-forge
packaging 20.9 pyh44b312d_0 conda-forge
pandas 1.2.3 py38h51da96c_0 conda-forge
parso 0.8.1 pyhd8ed1ab_0 conda-forge
partd 1.1.0 py_0 conda-forge
pathspec 0.8.1 pyhd3deb0d_0 conda-forge
patsy 0.5.1 py_0 conda-forge
pcre 8.44 he1b5a44_0 conda-forge
pexpect 4.8.0 py38h32f6830_1 conda-forge
pickleshare 0.7.5 py38h32f6830_1002 conda-forge
pillow 8.1.2 py38ha0e1e83_0 conda-forge
pint 0.16.1 py_0 conda-forge
pip 21.0.1 pyhd8ed1ab_0 conda-forge
pixman 0.40.0 h36c2ea0_0 conda-forge
pluggy 0.13.1 py38h578d9bd_4 conda-forge
poppler 0.89.0 h2de54a5_5 conda-forge
poppler-data 0.4.10 0 conda-forge
postgresql 13.1 h6303168_2 conda-forge
pre-commit 2.11.1 py38h578d9bd_0 conda-forge
proj 7.2.0 h277dcde_2 conda-forge
prompt-toolkit 3.0.17 pyha770c72_0 conda-forge
pseudonetcdf 3.1.0 py_1 conda-forge
psutil 5.8.0 py38h497a2fe_1 conda-forge
pthread-stubs 0.4 h36c2ea0_1001 conda-forge
ptyprocess 0.7.0 pyhd3deb0d_0 conda-forge
py 1.10.0 pyhd3deb0d_0 conda-forge
pycodestyle 2.7.0 pyhd8ed1ab_0 conda-forge
pycparser 2.20 pyh9f0ad1d_2 conda-forge
pydap 3.2.2 py38_1000 conda-forge
pyflakes 2.3.0 pyhd8ed1ab_0 conda-forge
pygments 2.8.1 pyhd8ed1ab_0 conda-forge
pyke 1.1.1 py38h578d9bd_1003 conda-forge
pyopenssl 20.0.1 pyhd8ed1ab_0 conda-forge
pyparsing 2.4.7 pyh9f0ad1d_0 conda-forge
pyqt 5.12.3 py38h578d9bd_7 conda-forge
pyqt-impl 5.12.3 py38h7400c14_7 conda-forge
pyqt5-sip 4.19.18 py38h709712a_7 conda-forge
pyqtchart 5.12 py38h7400c14_7 conda-forge
pyqtwebengine 5.12.1 py38h7400c14_7 conda-forge
pyrsistent 0.17.3 py38h497a2fe_2 conda-forge
pyshp 2.1.3 pyh44b312d_0 conda-forge
pysocks 1.7.1 py38h578d9bd_3 conda-forge
pytest 6.2.2 py38h578d9bd_0 conda-forge
pytest-cov 2.11.1 pyh44b312d_0 conda-forge
pytest-env 0.6.2 py_0 conda-forge
pytest-forked 1.3.0 pyhd3deb0d_0 conda-forge
pytest-profiling 1.7.0 py_1 conda-forge
pytest-xdist 2.2.1 pyhd8ed1ab_0 conda-forge
python 3.8.6 hffdb5ce_5_cpython conda-forge
python-dateutil 2.8.1 py_0 conda-forge
python-xxhash 2.0.0 py38h497a2fe_1 conda-forge
python_abi 3.8 1_cp38 conda-forge
pytz 2021.1 pyhd8ed1ab_0 conda-forge
pyyaml 5.4.1 py38h497a2fe_0 conda-forge
qt 5.12.9 hda022c4_4 conda-forge
rasterio 1.2.1 py38h57accd2_2 conda-forge
readline 8.0 he28a2e2_2 conda-forge
regex 2020.11.13 py38h497a2fe_1 conda-forge
regrid2 3.1.5 pypi_0 pypi
requests 2.25.1 pyhd3deb0d_0 conda-forge
s3transfer 0.3.4 pyhd8ed1ab_0 conda-forge
scipy 1.6.1 py38hb2138dd_0 conda-forge
seaborn 0.11.1 ha770c72_0 conda-forge
seaborn-base 0.11.1 pyhd8ed1ab_1 conda-forge
setuptools 49.6.0 py38h578d9bd_3 conda-forge
shapely 1.7.1 py38h4fc1155_4 conda-forge
six 1.15.0 pyh9f0ad1d_0 conda-forge
snuggs 1.4.7 py_0 conda-forge
sortedcontainers 2.3.0 pyhd8ed1ab_0 conda-forge
soupsieve 2.0.1 py38h32f6830_0 conda-forge
sparse 0.11.2 py_0 conda-forge
sqlite 3.34.0 h74cdb3f_0 conda-forge
statsmodels 0.12.2 py38h5c078b8_0 conda-forge
tblib 1.6.0 py_0 conda-forge
tiledb 2.2.5 h91fcb0e_0 conda-forge
tk 8.6.10 hed695b0_1 conda-forge
toml 0.10.2 pyhd8ed1ab_0 conda-forge
toolz 0.11.1 py_0 conda-forge
tornado 6.1 py38h497a2fe_1 conda-forge
traitlets 5.0.5 py_0 conda-forge
typed-ast 1.4.2 py38h497a2fe_0 conda-forge
typing_extensions 3.7.4.3 py_0 conda-forge
typing_inspect 0.6.0 pyh9f0ad1d_0 conda-forge
tzcode 2021a h7f98852_1 conda-forge
tzdata 2021a he74cb21_0 conda-forge
udunits2 2.2.27.27 h360fe7b_0 conda-forge
urllib3 1.26.4 pyhd8ed1ab_0 conda-forge
virtualenv 20.4.3 py38h578d9bd_0 conda-forge
wcwidth 0.2.5 pyh9f0ad1d_2 conda-forge
webob 1.8.6 py_0 conda-forge
wheel 0.36.2 pyhd3deb0d_0 conda-forge
xarray 0.16.2.dev111+g0d93c4f9.d20201219 dev_0 <develop>
xerces-c 3.2.3 h9d8b166_2 conda-forge
xorg-kbproto 1.0.7 h14c3975_1002 conda-forge
xorg-libice 1.0.10 h516909a_0 conda-forge
xorg-libsm 1.2.3 hd9c2040_1000 conda-forge
xorg-libx11 1.7.0 h36c2ea0_0 conda-forge
xorg-libxau 1.0.9 h14c3975_0 conda-forge
xorg-libxdmcp 1.1.3 h516909a_0 conda-forge
xorg-libxext 1.3.4 h7f98852_1 conda-forge
xorg-libxrender 0.9.10 h7f98852_1003 conda-forge
xorg-renderproto 0.11.1 h14c3975_1002 conda-forge
xorg-xextproto 7.3.0 h14c3975_1002 conda-forge
xorg-xproto 7.0.31 h14c3975_1007 conda-forge
xxhash 0.8.0 h7f98852_3 conda-forge
xz 5.2.5 h516909a_1 conda-forge
yaml 0.2.5 h516909a_0 conda-forge
zarr 2.6.1 pyhd8ed1ab_0 conda-forge
zict 2.0.0 py_0 conda-forge
zipp 3.4.1 pyhd8ed1ab_0 conda-forge
zlib 1.2.11 h516909a_1010 conda-forge
zstd 1.4.9 ha95c52a_0 conda-forge
```
edit: added the traceback |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/5053/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
771482993 | MDExOlB1bGxSZXF1ZXN0NTQyOTk3MjQx | 4716 | .coveragerc omit: wildcards | mathause 10194086 | closed | 0 | 2 | 2020-12-20T00:26:59Z | 2021-04-19T20:34:07Z | 2020-12-20T00:48:43Z | MEMBER | 0 | pydata/xarray/pulls/4716 |
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4716/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
845248555 | MDExOlB1bGxSZXF1ZXN0NjA0NDI2MTYx | 5096 | type: ignore - use error codes | mathause 10194086 | closed | 0 | 2 | 2021-03-30T20:53:50Z | 2021-04-01T10:23:56Z | 2021-04-01T10:23:53Z | MEMBER | 0 | pydata/xarray/pulls/5096 |
Adds error codes to all |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/5096/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
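Note on #5096: a small illustration of the difference between a bare ignore and a scoped one (not the PR's actual code; the error code shown is the one mypy reports for this kind of assignment):

```python
# a bare ignore silences *every* error mypy finds on this line
x: int = None  # type: ignore

# an ignore with an error code only silences that specific error, and
# (with --warn-unused-ignores) mypy flags it once it becomes stale
y: int = None  # type: ignore[assignment]
```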
843217462 | MDExOlB1bGxSZXF1ZXN0NjAyNjMyMjgw | 5090 | ensure combine_by_coords raises on different types | mathause 10194086 | closed | 0 | 3 | 2021-03-29T10:13:34Z | 2021-03-31T15:53:23Z | 2021-03-31T13:36:44Z | MEMBER | 0 | pydata/xarray/pulls/5090 |
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/5090/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
308039063 | MDExOlB1bGxSZXF1ZXN0MTc3MDc3MTU5 | 2011 | rolling: periodic | mathause 10194086 | closed | 0 | 9 | 2018-03-23T13:57:25Z | 2021-03-30T15:08:22Z | 2021-03-30T15:08:18Z | MEMBER | 0 | pydata/xarray/pulls/2011 |
Ok, this was easier to do than initially thought, we can use I added an initial test, but could use some pointers where else you want this to be tested. Questions:
* is <sup>*</sup> |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/2011/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
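Note on #2011: the PR body is truncated, so the sketch below is only an illustration of how a periodic (wrap-around) rolling window can be built with NumPy's wrap padding, not the PR's implementation:

```python
import numpy as np

a = np.arange(6)  # [0 1 2 3 4 5]

# pad one element on each side using periodic (wrap-around) boundaries
padded = np.pad(a, pad_width=1, mode="wrap")
print(padded)  # [5 0 1 2 3 4 5 0]

# a centered rolling sum of window 3 with periodic boundaries, written as a
# plain python loop for clarity
rolled = np.array([padded[i:i + 3].sum() for i in range(len(a))])
print(rolled)  # [6 3 6 9 12 9]
```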
819884612 | MDExOlB1bGxSZXF1ZXN0NTgyOTE5ODUy | 4982 | pin netCDF4=1.5.3 in min-all-deps | mathause 10194086 | closed | 0 | 1 | 2021-03-02T10:36:18Z | 2021-03-08T09:10:20Z | 2021-03-08T00:20:38Z | MEMBER | 0 | pydata/xarray/pulls/4982 |
The clean thing here would be to update |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4982/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
817951965 | MDU6SXNzdWU4MTc5NTE5NjU= | 4970 | minimum version and non-semantic versioning (netCDF4) | mathause 10194086 | closed | 0 | 1 | 2021-02-27T15:33:48Z | 2021-03-08T00:20:38Z | 2021-03-08T00:20:38Z | MEMBER | We currently pin netCDF4 to version 1.5. However, I think netCDF4 does not really follow semantic versioning, e.g. python 2 support was dropped in version 1.5.6. So they may actually be doing something like So I wonder if we would need to pin netCDF to version to version 1.5.4. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4970/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
814813503 | MDExOlB1bGxSZXF1ZXN0NTc4NzQwMzM2 | 4946 | Upstream CI: limit runtime | mathause 10194086 | closed | 0 | 5 | 2021-02-23T20:40:34Z | 2021-02-24T14:37:04Z | 2021-02-23T22:37:07Z | MEMBER | 0 | pydata/xarray/pulls/4946 |
Try to limit the time of "CI Upstream" to avoid a silent failure. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4946/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
814806676 | MDU6SXNzdWU4MTQ4MDY2NzY= | 4945 | Upstream CI failing silently | mathause 10194086 | closed | 0 | 1 | 2021-02-23T20:30:29Z | 2021-02-24T08:14:00Z | 2021-02-24T08:14:00Z | MEMBER | The last 5 days our Upstream CI failed silently with a timeout after 6h: This was probably caused by #4934. As mentioned in dask/dask#4934 this is probably dask/dask#6738 which was merged 5 days ago. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4945/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
803049855 | MDExOlB1bGxSZXF1ZXN0NTY5MDQ0NzMy | 4878 | typing for numpy 1.20 | mathause 10194086 | closed | 0 | 2 | 2021-02-07T20:32:27Z | 2021-02-23T20:52:50Z | 2021-02-23T20:52:47Z | MEMBER | 0 | pydata/xarray/pulls/4878 | numpy v1.20.0 introduced type hints which leads to some mypy errors in xarray. This is the minimum set of changes to make mypy happy again. I tried to avoid I am sure there is much more fun to be had with numpy typing ;-) |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4878/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
812214755 | MDExOlB1bGxSZXF1ZXN0NTc2NjE2MzM1 | 4929 | CI: run mypy in full env | mathause 10194086 | closed | 0 | 3 | 2021-02-19T17:47:28Z | 2021-02-22T16:42:09Z | 2021-02-22T16:33:51Z | MEMBER | 0 | pydata/xarray/pulls/4929 |
I only added one run for py3.8 latest. To be entirely sure we could also check the typing

Ok, looks good - the failure is expected - see #4878.
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4929/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
803402841 | MDU6SXNzdWU4MDM0MDI4NDE= | 4881 | check mypy at the end of some CI runs? | mathause 10194086 | closed | 0 | 2 | 2021-02-08T10:04:44Z | 2021-02-22T16:33:50Z | 2021-02-22T16:33:50Z | MEMBER | We currently run mypy in the I think we should at least add this to |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4881/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
811379942 | MDExOlB1bGxSZXF1ZXN0NTc1OTE3NjYz | 4923 | [skip-ci] doc: fix pynio warning | mathause 10194086 | closed | 0 | 1 | 2021-02-18T19:09:00Z | 2021-02-18T19:23:23Z | 2021-02-18T19:23:20Z | MEMBER | 0 | pydata/xarray/pulls/4923 | Small doc fix, see http://xarray.pydata.org/en/stable/io.html#formats-supported-by-pynio (the |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4923/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
739008382 | MDU6SXNzdWU3MzkwMDgzODI= | 4570 | fix compatibility with h5py version 3 and unpin tests | mathause 10194086 | closed | 0 | 6 | 2020-11-09T13:00:01Z | 2021-02-17T08:41:20Z | 2021-02-17T08:41:20Z | MEMBER | h5py version 3.1 broke our tests. I pinned it to version 2.10 in #4569. We should therefore
The failures could be related to a change in how strings are read: https://docs.h5py.org/en/latest/strings.html

I am not sure if this has to be fixed in xarray or in h5netcdf. I'd be happy if someone else took this one.

Failed tests:
```
FAILED xarray/tests/test_backends.py::TestH5NetCDFData::test_zero_dimensional_variable
FAILED xarray/tests/test_backends.py::TestH5NetCDFData::test_write_store - As...
FAILED xarray/tests/test_backends.py::TestH5NetCDFData::test_roundtrip_test_data
FAILED xarray/tests/test_backends.py::TestH5NetCDFData::test_load - Assertion...
FAILED xarray/tests/test_backends.py::TestH5NetCDFData::test_dataset_compute
FAILED xarray/tests/test_backends.py::TestH5NetCDFData::test_roundtrip_object_dtype
FAILED xarray/tests/test_backends.py::TestH5NetCDFData::test_roundtrip_string_data
FAILED xarray/tests/test_backends.py::TestH5NetCDFData::test_orthogonal_indexing
FAILED xarray/tests/test_backends.py::TestH5NetCDFData::test_vectorized_indexing
FAILED xarray/tests/test_backends.py::TestH5NetCDFData::test_isel_dataarray
FAILED xarray/tests/test_backends.py::TestH5NetCDFData::test_array_type_after_indexing
FAILED xarray/tests/test_backends.py::TestH5NetCDFData::test_append_write - A...
FAILED xarray/tests/test_backends.py::TestH5NetCDFData::test_append_overwrite_values
FAILED xarray/tests/test_backends.py::TestH5NetCDFData::test_write_groups - A...
FAILED xarray/tests/test_backends.py::TestH5NetCDFData::test_encoding_kwarg_vlen_string
FAILED xarray/tests/test_backends.py::TestH5NetCDFData::test_compression_encoding
FAILED xarray/tests/test_backends.py::TestH5NetCDFFileObject::test_zero_dimensional_variable
FAILED xarray/tests/test_backends.py::TestH5NetCDFFileObject::test_write_store
FAILED xarray/tests/test_backends.py::TestH5NetCDFFileObject::test_roundtrip_test_data
FAILED xarray/tests/test_backends.py::TestH5NetCDFFileObject::test_load - Ass...
FAILED xarray/tests/test_backends.py::TestH5NetCDFFileObject::test_dataset_compute
FAILED xarray/tests/test_backends.py::TestH5NetCDFFileObject::test_roundtrip_object_dtype
FAILED xarray/tests/test_backends.py::TestH5NetCDFViaDaskData::test_encoding_kwarg_vlen_string
FAILED xarray/tests/test_backends.py::TestH5NetCDFViaDaskData::test_compression_encoding
FAILED xarray/tests/test_distributed.py::test_dask_distributed_netcdf_roundtrip[h5netcdf-NETCDF4]
FAILED xarray/tests/test_distributed.py::test_dask_distributed_read_netcdf_integration_test[h5netcdf-NETCDF4]
```
Example failure: ```python traceback
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4570/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue |
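Note on #4570: the issue links the h5py 3 change in string handling. A minimal sketch of the difference, assuming h5py >= 3 (the file name and data are made up): in h5py 2.10 the same read returned `str`, while h5py 3 returns `bytes` unless decoded via `.asstr()`.

```python
import h5py

# write a small variable-length string dataset (made-up example file)
with h5py.File("strings.h5", "w") as f:
    f.create_dataset("label", data=["foo", "bar"], dtype=h5py.string_dtype())

with h5py.File("strings.h5", "r") as f:
    raw = f["label"][:]           # h5py >= 3: array of bytes, e.g. [b'foo', b'bar']
    text = f["label"].asstr()[:]  # decode explicitly to str (h5py >= 3 only)
    print(raw, text)
```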
CREATE TABLE [issues] (
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [number] INTEGER,
   [title] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [state] TEXT,
   [locked] INTEGER,
   [assignee] INTEGER REFERENCES [users]([id]),
   [milestone] INTEGER REFERENCES [milestones]([id]),
   [comments] INTEGER,
   [created_at] TEXT,
   [updated_at] TEXT,
   [closed_at] TEXT,
   [author_association] TEXT,
   [active_lock_reason] TEXT,
   [draft] INTEGER,
   [pull_request] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [state_reason] TEXT,
   [repo] INTEGER REFERENCES [repos]([id]),
   [type] TEXT
);
CREATE INDEX [idx_issues_repo] ON [issues] ([repo]);
CREATE INDEX [idx_issues_milestone] ON [issues] ([milestone]);
CREATE INDEX [idx_issues_assignee] ON [issues] ([assignee]);
CREATE INDEX [idx_issues_user] ON [issues] ([user]);