issues

6 rows where state = "closed", type = "issue" and user = 13301940 sorted by updated_at descending


#8788 · CI Failure in Xarray test suite post-Dask tokenization update · opened by andersy005 · state: closed · assignee: crusaderky · 1 comment · created 2024-02-27T21:23:48Z · updated 2024-03-01T03:29:52Z · closed 2024-03-01T03:29:52Z · MEMBER

What is your issue?

Recent changes in Dask's tokenization process (https://github.com/dask/dask/pull/10876) seem to have introduced unexpected behavior in Xarray's test suite. This has led to CI failures, specifically in tests related to tokenization.

  • https://github.com/pydata/xarray/actions/runs/8069874717/job/22045898877

```python
---------- coverage: platform linux, python 3.12.2-final-0 -----------
Coverage XML written to file coverage.xml

=========================== short test summary info ============================
FAILED xarray/tests/test_dask.py::test_token_identical[obj0-<lambda>1] - AssertionError: assert 'bbd9679bdaf2...d3db65e29a72d' == '6352792990cf...e8004a9055314'
  - 6352792990cfe23adb7e8004a9055314
  + bbd9679bdaf284c371cd3db65e29a72d
FAILED xarray/tests/test_dask.py::test_token_identical[obj0-<lambda>2] - AssertionError: assert 'bbd9679bdaf2...d3db65e29a72d' == '6352792990cf...e8004a9055314'
  - 6352792990cfe23adb7e8004a9055314
  + bbd9679bdaf284c371cd3db65e29a72d
FAILED xarray/tests/test_dask.py::test_token_identical[obj1-<lambda>1] - AssertionError: assert 'c520b8516da8...0e9e0d02b79d0' == '9e2ab1c44990...6ac737226fa02'
  - 9e2ab1c44990adb4fb76ac737226fa02
  + c520b8516da8b6a98c10e9e0d02b79d0
FAILED xarray/tests/test_dask.py::test_token_identical[obj1-<lambda>2] - AssertionError: assert 'c520b8516da8...0e9e0d02b79d0' == '9e2ab1c44990...6ac737226fa02'
  - 9e2ab1c44990adb4fb76ac737226fa02
  + c520b8516da8b6a98c10e9e0d02b79d0
= 4 failed, 16293 passed, 628 skipped, 90 xfailed, 71 xpassed, 213 warnings in 472.07s (0:07:52) =
Error: Process completed with exit code 1.
```
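The failing tests assert that `dask.base.tokenize` returns the same deterministic hash for equivalent objects. As a rough stdlib sketch of that contract (illustrative only; Dask's real implementation dispatches per type and normalizes NumPy arrays, chunk graphs, etc.):

```python
import hashlib

def toy_tokenize(obj) -> str:
    """Hash a nested structure deterministically -- a toy analogue of
    dask.base.tokenize. Real tokenize dispatches on type; this sketch
    only canonicalizes dicts/lists/tuples and hashes their repr."""
    def normalize(o):
        if isinstance(o, dict):
            # Sort items so insertion order does not change the token.
            return ("dict", sorted((k, normalize(v)) for k, v in o.items()))
        if isinstance(o, (list, tuple)):
            return (type(o).__name__, [normalize(v) for v in o])
        return repr(o)
    return hashlib.md5(repr(normalize(obj)).encode()).hexdigest()

a = {"dims": ("x", "y"), "attrs": {"test": "test"}}
b = dict(a)  # shallow copy, equivalent content
assert toy_tokenize(a) == toy_tokenize(b)  # the invariant the tests exercise
```

The xarray tests check exactly this invariant between an object and its shallow/deep copies; the Dask change altered which intermediate representation gets hashed.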

Previously, the following code snippet would pass, verifying the consistency of tokenization of Xarray objects:

```python
In [1]: import xarray as xr, numpy as np

In [2]: def make_da():
   ...:     da = xr.DataArray(
   ...:         np.ones((10, 20)),
   ...:         dims=["x", "y"],
   ...:         coords={"x": np.arange(10), "y": np.arange(100, 120)},
   ...:         name="a",
   ...:     ).chunk({"x": 4, "y": 5})
   ...:     da.x.attrs["long_name"] = "x"
   ...:     da.attrs["test"] = "test"
   ...:     da.coords["c2"] = 0.5
   ...:     da.coords["ndcoord"] = da.x * 2
   ...:     da.coords["cxy"] = (da.x * da.y).chunk({"x": 4, "y": 5})
   ...:
   ...:     return da
   ...:

In [3]: da = make_da()

In [4]: import dask.base

In [5]: assert dask.base.tokenize(da) == dask.base.tokenize(da.copy(deep=False))

In [6]: assert dask.base.tokenize(da) == dask.base.tokenize(da.copy(deep=True))

In [9]: dask.__version__
Out[9]: '2023.3.0'
```

However, post-update in Dask version '2024.2.1', the same code fails:

```python
In [55]: def make_da():
    ...:     da = xr.DataArray(
    ...:         np.ones((10, 20)),
    ...:         dims=["x", "y"],
    ...:         coords={"x": np.arange(10), "y": np.arange(100, 120)},
    ...:         name="a",
    ...:     ).chunk({"x": 4, "y": 5})
    ...:     da.x.attrs["long_name"] = "x"
    ...:     da.attrs["test"] = "test"
    ...:     da.coords["c2"] = 0.5
    ...:     da.coords["ndcoord"] = da.x * 2
    ...:     da.coords["cxy"] = (da.x * da.y).chunk({"x": 4, "y": 5})
    ...:
    ...:     return da
    ...:

In [56]: da = make_da()
```

```python
In [57]: assert dask.base.tokenize(da) == dask.base.tokenize(da.copy(deep=False))
---------------------------------------------------------------------------
AssertionError                            Traceback (most recent call last)
Cell In[57], line 1
----> 1 assert dask.base.tokenize(da) == dask.base.tokenize(da.copy(deep=False))

AssertionError:

In [58]: dask.base.tokenize(da)
Out[58]: 'bbd9679bdaf284c371cd3db65e29a72d'

In [59]: dask.base.tokenize(da.copy(deep=False))
Out[59]: '6352792990cfe23adb7e8004a9055314'

In [61]: dask.__version__
Out[61]: '2024.2.1'
```

Additionally, a deeper dive into `dask.base.normalize_token()` across the two Dask versions revealed that the latest version includes additional state or metadata in tokenization that was not present in earlier versions.

Old version:

```python
In [29]: dask.base.normalize_token((type(da), da._variable, da._coords, da._name))
Out[29]: ('tuple', [xarray.core.dataarray.DataArray, ('tuple', [xarray.core.variable.Variable, ('tuple', ['x', 'y']), 'xarray-<this-array>-14cc91345e4b75c769b9032d473f6f6e', ('list', [('tuple', ['test', 'test'])])]), ('list', [('tuple', ['c2', ('tuple', [xarray.core.variable.Variable, ('tuple', []), (0.5, dtype('float64')), ('list', [])])]), ('tuple', ['cxy', ('tuple', [xarray.core.variable.Variable, ('tuple', ['x', 'y']), 'xarray-<this-array>-8e98950eca22c69d304f0a48bc6c2df9', ('list', [])])]), ('tuple', ['ndcoord', ('tuple', [xarray.core.variable.Variable, ('tuple', ['x']), 'xarray-ndcoord-82411ea5e080aa9b9f554554befc2f39', ('list', [])])]), ('tuple', ['x', ('tuple', [xarray.core.variable.IndexVariable, ('tuple', ['x']), ['x', ('603944b9792513fa0c686bb494a66d96c667f879', dtype('int64'), (10,), (8,))], ('list', [('tuple', ['long_name', 'x'])])])]), ('tuple', ['y', ('tuple', [xarray.core.variable.IndexVariable, ('tuple', ['y']), ['y', ('fc411db876ae0f4734dac8b64152d5c6526a537a', dtype('int64'), (20,), (8,))], ('list', [])])])]), 'a'])
```

Most recent version:

```python
In [44]: dask.base.normalize_token((type(da), da._variable, da._coords, da._name))
Out[44]: ('tuple', [('7b61e7593a274e48', []), ('tuple', [('215b115b265c420c', []), ('tuple', ['x', 'y']), 'xarray-<this-array>-980383b18aab94069bdb02e9e0956184', ('dict', [('tuple', ['test', 'test'])])]), ('dict', [('tuple', ['c2', ('tuple', [('__seen', 2), ('tuple', []), ('6825817183edbca7', ['48cb5e118059da42']), ('dict', [])])]), ('tuple', ['cxy', ('tuple', [('__seen', 2), ('tuple', ['x', 'y']), 'xarray-<this-array>-6babb4e95665a53f34a3e337129d54b5', ('dict', [])])]), ('tuple', ['ndcoord', ('tuple', [('__seen', 2), ('tuple', ['x']), 'xarray-ndcoord-8636fac37e5e6f4401eab2aef399f402', ('dict', [])])]), ('tuple', ['x', ('tuple', [('abc1995cae8530ae', []), ('tuple', ['x']), ['x', ('99b2df4006e7d28a', ['04673d65c892b5ba'])], ('dict', [('tuple', ['long_name', 'x'])])])]), ('tuple', ['y', ('tuple', [('__seen', 25), ('tuple', ['y']), ['y', ('88974ea603e15c49', ['a6c0f2053e85c87e'])], ('dict', [])])])]), 'a'])
```
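To pin down exactly where the two `normalize_token` outputs diverge, a small recursive diff over the nested tuples and lists can help. This is a hypothetical helper, not part of Dask:

```python
def first_divergence(old, new, path=()):
    """Walk two nested tuple/list structures in parallel and return the
    index path to the first point where they differ, or None if they match."""
    if type(old) is not type(new):
        return path
    if isinstance(old, (list, tuple)):
        if len(old) != len(new):
            return path
        for i, (o, n) in enumerate(zip(old, new)):
            hit = first_divergence(o, n, path + (i,))
            if hit is not None:
                return hit
        return None
    return None if old == new else path

# Simplified stand-ins for the attrs encoding seen above:
old = ("tuple", ["x", ("list", [("tuple", ["test", "test"])])])
new = ("tuple", ["x", ("dict", [("tuple", ["test", "test"])])])
assert first_divergence(old, new) == (1, 1, 0)  # the 'list' -> 'dict' marker
```

Running it on the two `Out[]` structures above would point at the first changed marker (for instance, attrs being normalized as `('list', ...)` in the old version versus `('dict', ...)` in the new one).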

Cc @dcherian / @crusaderky for visibility

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/8788/reactions",
    "total_count": 2,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 1
}
#8681 · CI Failures Associated with Pytest v8.0.0 Release · opened by andersy005 · state: closed · 2 comments · created 2024-01-29T22:45:26Z · updated 2024-01-31T16:53:46Z · closed 2024-01-31T16:53:46Z · MEMBER

What is your issue?

A recent release of pytest (v8.0.0) appears to have broken our CI.

```bash
pytest                                   8.0.0  pyhd8ed1ab_0  conda-forge
pytest-cov                               4.1.0  pyhd8ed1ab_0  conda-forge
pytest-env                               1.1.3  pyhd8ed1ab_0  conda-forge
pytest-github-actions-annotate-failures  0.2.0  pypi_0        pypi
pytest-timeout                           2.2.0  pyhd8ed1ab_0  conda-forge
pytest-xdist                             3.5.0  pyhd8ed1ab_0  conda-forge
```

Strangely, the issue doesn't seem to occur when using previous versions (e.g. v7.4.4). Our last successful CI run used pytest v7.4.4:

```bash
pytest                                   7.4.4  pyhd8ed1ab_0  conda-forge
pytest-cov                               4.1.0  pyhd8ed1ab_0  conda-forge
pytest-env                               1.1.3  pyhd8ed1ab_0  conda-forge
pytest-github-actions-annotate-failures  0.2.0  pypi_0        pypi
pytest-timeout                           2.2.0  pyhd8ed1ab_0  conda-forge
pytest-xdist                             3.5.0  pyhd8ed1ab_0  conda-forge
```

I recreated the environment and successfully ran the tests locally, so the CI failures appear to be connected to the latest release of pytest. I haven't yet had a chance to explore in depth which changes in pytest could be causing this disruption, so I wanted to open an issue to track what is going on. In the meantime, I'm going to pin pytest to an earlier version.
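As a sketch of the pin's intent (a hypothetical helper, not xarray's actual CI config), a guard that compares version tuples and rejects anything at or above the release that broke CI:

```python
def version_tuple(v: str) -> tuple:
    """Parse a simple 'X.Y.Z' version string into a comparable tuple of ints."""
    return tuple(int(part) for part in v.split("."))

def pytest_pin_ok(installed: str, ceiling: str = "8.0.0") -> bool:
    """Return True if the installed pytest predates the problematic release."""
    return version_tuple(installed) < version_tuple(ceiling)

# The regression appeared between these two releases:
assert pytest_pin_ok("7.4.4")
assert not pytest_pin_ok("8.0.0")
```

In practice the pin itself lives in the environment files (e.g. constraining pytest to `<8`), not in code like this.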

Any insights, especially from those familiar with the changes in the pytest v8.0.0 release, are warmly welcomed.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/8681/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
#5898 · Update docs for Dataset `reduce` methods to indicate that non-numeric data variables are dropped · opened by andersy005 · state: closed · 2 comments · created 2021-10-25T22:48:49Z · updated 2022-03-12T08:17:48Z · closed 2022-03-12T08:17:48Z · MEMBER

xr.Dataset reduce methods such as mean drop non-numeric data variables prior to the reduction. However, as far as I can tell this info isn't mentioned anywhere in the documentation/docstrings. I think this would be useful information to include here for example:

```python
In [47]: import xarray as xr

In [48]: import numpy as np, pandas as pd

In [50]: ds['foo'] = xr.DataArray(np.arange(6).reshape(2, 3), dims=['x', 'y'])

In [53]: ds['bar'] = xr.DataArray(pd.date_range(start='2000', periods=6).values.reshape(2, 3), dims=['x', 'y'])

In [54]: ds
Out[54]:
<xarray.Dataset>
Dimensions:  (x: 2, y: 3)
Dimensions without coordinates: x, y
Data variables:
    foo      (x, y) int64 0 1 2 3 4 5
    bar      (x, y) datetime64[ns] 2000-01-01 2000-01-02 ... 2000-01-06
```

```python
In [55]: ds.mean('x')
Out[55]:
<xarray.Dataset>
Dimensions:  (y: 3)
Dimensions without coordinates: y
Data variables:
    foo      (y) float64 1.5 2.5 3.5

In [56]: ds.bar.mean('x')
Out[56]:
<xarray.DataArray 'bar' (y: 3)>
array(['2000-01-02T12:00:00.000000000', '2000-01-03T12:00:00.000000000',
       '2000-01-04T12:00:00.000000000'], dtype='datetime64[ns]')
Dimensions without coordinates: y
```
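The behavior worth documenting is essentially "keep only the numeric variables, then reduce". A toy sketch over plain Python containers (not xarray's actual implementation, which works on dtypes and duck arrays):

```python
from numbers import Number
from statistics import mean

def mean_numeric_only(variables: dict) -> dict:
    """Reduce each variable with mean(), silently dropping non-numeric
    ones -- mirroring how Dataset.mean() drops e.g. datetime variables."""
    return {
        name: mean(values)
        for name, values in variables.items()
        if all(isinstance(v, Number) for v in values)
    }

ds_like = {"foo": [0, 1, 2, 3, 4, 5], "bar": ["2000-01-01", "2000-01-02"]}
assert mean_numeric_only(ds_like) == {"foo": 2.5}  # 'bar' is dropped silently
```

The silent drop is the surprising part; hence the request to call it out in the docstrings.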

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/5898/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
#4549 · [Proposal] Migrate general discussions from the xarray gitter room to GitHub Discussions · opened by andersy005 · state: closed · 5 comments · created 2020-10-28T21:48:29Z · updated 2020-11-25T22:28:41Z · closed 2020-11-25T22:28:41Z · MEMBER

Currently, xarray has a room on Gitter: https://gitter.im/pydata/xarray. This room works fine for discussions outside of the codebase. However, Gitter has a few disadvantages:

  • The contents are not indexed by search engines
  • Searching through existing discussions is almost impossible
  • Linking to prior conversations in the room is also complicated

A few months ago, GitHub announced GitHub Discussions, which is meant to serve as a forum for conversations outside of the codebase. I am of the opinion that GitHub Discussions is a better alternative to Gitter. I am wondering if xarray folks would be interested in enabling GitHub Discussions on this repo and migrating general discussions from Gitter to GitHub Discussions?

GitHub Discussions is still in beta, but projects can request early access here

Here is a list of a few projects with beta access:

  • https://github.com/vercel/vercel/discussions
  • https://github.com/KaTeX/KaTeX/discussions
  • https://github.com/vercel/next.js/discussions
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4549/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
#4341 · Computing averaged time produces wrong/incorrect time values · opened by andersy005 · state: closed · 3 comments · created 2020-08-14T23:15:01Z · updated 2020-08-15T20:05:23Z · closed 2020-08-15T20:05:23Z · MEMBER

What happened:

While computing averaged time using time_bounds via times = bounds.mean('d2'), I get weird results (see example below). It's my understanding that this is a bug, but I don't know yet where it's coming from. I should note that in addition to getting wrong time values, the resulting time values are not monotonically increasing even though my time bounds are.

What you expected to happen:

Correct averaged time values

Minimal Complete Verifiable Example:

```python
In [1]: import xarray as xr

In [2]: import numpy as np

In [3]: dates = xr.cftime_range(start='0400-01', end='2101-01', freq='120Y', calendar='noleap')

In [4]: bounds = xr.DataArray(np.vstack([dates[:-1], dates[1:]]).T, dims=['time', 'd2'])

In [5]: bounds
Out[5]:
<xarray.DataArray (time: 14, d2: 2)>
array([[cftime.DatetimeNoLeap(400, 12, 31, 0, 0, 0, 0),
        cftime.DatetimeNoLeap(520, 12, 31, 0, 0, 0, 0)],
       [cftime.DatetimeNoLeap(520, 12, 31, 0, 0, 0, 0),
        cftime.DatetimeNoLeap(640, 12, 31, 0, 0, 0, 0)],
       [cftime.DatetimeNoLeap(640, 12, 31, 0, 0, 0, 0),
        cftime.DatetimeNoLeap(760, 12, 31, 0, 0, 0, 0)],
       [cftime.DatetimeNoLeap(760, 12, 31, 0, 0, 0, 0),
        cftime.DatetimeNoLeap(880, 12, 31, 0, 0, 0, 0)],
       [cftime.DatetimeNoLeap(880, 12, 31, 0, 0, 0, 0),
        cftime.DatetimeNoLeap(1000, 12, 31, 0, 0, 0, 0)],
       [cftime.DatetimeNoLeap(1000, 12, 31, 0, 0, 0, 0),
        cftime.DatetimeNoLeap(1120, 12, 31, 0, 0, 0, 0)],
       [cftime.DatetimeNoLeap(1120, 12, 31, 0, 0, 0, 0),
        cftime.DatetimeNoLeap(1240, 12, 31, 0, 0, 0, 0)],
       [cftime.DatetimeNoLeap(1240, 12, 31, 0, 0, 0, 0),
        cftime.DatetimeNoLeap(1360, 12, 31, 0, 0, 0, 0)],
       [cftime.DatetimeNoLeap(1360, 12, 31, 0, 0, 0, 0),
        cftime.DatetimeNoLeap(1480, 12, 31, 0, 0, 0, 0)],
       [cftime.DatetimeNoLeap(1480, 12, 31, 0, 0, 0, 0),
        cftime.DatetimeNoLeap(1600, 12, 31, 0, 0, 0, 0)],
       [cftime.DatetimeNoLeap(1600, 12, 31, 0, 0, 0, 0),
        cftime.DatetimeNoLeap(1720, 12, 31, 0, 0, 0, 0)],
       [cftime.DatetimeNoLeap(1720, 12, 31, 0, 0, 0, 0),
        cftime.DatetimeNoLeap(1840, 12, 31, 0, 0, 0, 0)],
       [cftime.DatetimeNoLeap(1840, 12, 31, 0, 0, 0, 0),
        cftime.DatetimeNoLeap(1960, 12, 31, 0, 0, 0, 0)],
       [cftime.DatetimeNoLeap(1960, 12, 31, 0, 0, 0, 0),
        cftime.DatetimeNoLeap(2080, 12, 31, 0, 0, 0, 0)]], dtype=object)
Dimensions without coordinates: time, d2

In [6]: bounds.mean('d2')
Out[6]:
<xarray.DataArray (time: 14)>
array([cftime.DatetimeNoLeap(460, 12, 31, 0, 0, 0, 0),
       cftime.DatetimeNoLeap(580, 12, 31, 0, 0, 0, 0),
       cftime.DatetimeNoLeap(116, 1, 21, 0, 25, 26, 290448),
       cftime.DatetimeNoLeap(236, 1, 21, 0, 25, 26, 290448),
       cftime.DatetimeNoLeap(356, 1, 21, 0, 25, 26, 290448),
       cftime.DatetimeNoLeap(476, 1, 21, 0, 25, 26, 290448),
       cftime.DatetimeNoLeap(596, 1, 21, 0, 25, 26, 290448),
       cftime.DatetimeNoLeap(131, 2, 11, 0, 50, 52, 580897),
       cftime.DatetimeNoLeap(251, 2, 11, 0, 50, 52, 580897),
       cftime.DatetimeNoLeap(371, 2, 11, 0, 50, 52, 580897),
       cftime.DatetimeNoLeap(491, 2, 11, 0, 50, 52, 580897),
       cftime.DatetimeNoLeap(611, 2, 11, 0, 50, 52, 580897),
       cftime.DatetimeNoLeap(146, 3, 4, 1, 16, 18, 871345),
       cftime.DatetimeNoLeap(266, 3, 4, 1, 16, 18, 871345)], dtype=object)
Dimensions without coordinates: time
```
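A likely workaround while the bug is tracked down is to compute each midpoint as `start + (end - start) / 2` using timedelta arithmetic, which stays monotonic whenever the bounds are monotonic. Sketched here with the stdlib `datetime`; the same arithmetic should apply to cftime objects, though that is an assumption worth verifying against the noleap calendar:

```python
from datetime import datetime

def midpoint(start: datetime, end: datetime) -> datetime:
    """Average two timestamps via timedelta arithmetic rather than
    averaging raw date encodings."""
    return start + (end - start) / 2

bounds = [
    (datetime(2000, 1, 1), datetime(2000, 1, 3)),
    (datetime(2000, 1, 3), datetime(2000, 1, 5)),
]
mids = [midpoint(a, b) for a, b in bounds]
assert mids[0] == datetime(2000, 1, 2)
assert mids == sorted(mids)  # monotonic bounds give monotonic midpoints
```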

Anything else we need to know?:

Environment:

Output of `xr.show_versions()`:

```python
INSTALLED VERSIONS
------------------
commit: None
python: 3.7.8 | packaged by conda-forge | (default, Jul 23 2020, 03:54:19) [GCC 7.5.0]
python-bits: 64
OS: Linux
OS-release: 3.10.0-1127.13.1.el7.x86_64
machine: x86_64
processor: x86_64
byteorder: little
LC_ALL: en_US.UTF-8
LANG: en_US.UTF-8
LOCALE: en_US.UTF-8
libhdf5: 1.10.6
libnetcdf: 4.7.4
xarray: 0.16.0
pandas: 1.1.0
numpy: 1.19.1
scipy: 1.5.2
netCDF4: 1.5.4
pydap: None
h5netcdf: None
h5py: 2.10.0
Nio: None
zarr: 2.4.0
cftime: 1.2.1
nc_time_axis: 1.2.0
PseudoNetCDF: None
rasterio: None
cfgrib: None
iris: None
bottleneck: None
dask: 2.22.0
distributed: 2.22.0
matplotlib: 3.3.0
cartopy: 0.18.0
seaborn: 0.10.1
numbagg: None
pint: None
setuptools: 49.2.1.post20200802
pip: 20.2.1
conda: None
pytest: None
IPython: 7.17.0
sphinx: None
```
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4341/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
#3426 · `.sel()` failures when using latest cftime release (v1.0.4) · opened by andersy005 · state: closed · 3 comments · created 2019-10-21T22:19:24Z · updated 2019-10-22T18:31:34Z · closed 2019-10-22T18:31:34Z · MEMBER

I just updated to the latest cftime release, and all of a sudden `.sel()` appears to be broken:

```python
In [1]: import xarray as xr

In [2]: import cftime

In [3]: ds = xr.tutorial.load_dataset('rasm')

In [4]: ds
Out[4]:
<xarray.Dataset>
Dimensions:  (time: 36, x: 275, y: 205)
Coordinates:
  * time     (time) object 1980-09-16 12:00:00 ... 1983-08-17 00:00:00
    xc       (y, x) float64 189.2 189.4 189.6 189.7 ... 17.65 17.4 17.15 16.91
    yc       (y, x) float64 16.53 16.78 17.02 17.27 ... 28.26 28.01 27.76 27.51
Dimensions without coordinates: x, y
Data variables:
    Tair     (time, y, x) float64 nan nan nan nan nan ... 29.8 28.66 28.19 28.21
Attributes:
    title:                     /workspace/jhamman/processed/R1002RBRxaaa01a/l...
    institution:               U.W.
    source:                    RACM R1002RBRxaaa01a
    output_frequency:          daily
    output_mode:               averaged
    convention:                CF-1.4
    references:                Based on the initial model of Liang et al., 19...
    comment:                   Output from the Variable Infiltration Capacity...
    nco_openmp_thread_number:  1
    NCO:                       "4.6.0"
    history:                   Tue Dec 27 14:15:22 2016: ncatted -a dimension...

In [5]: ds.sel(time=slice("1980", "1982"))
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-5-2c26e36a673a> in <module>
----> 1 ds.sel(time=slice("1980", "1982"))

~/opt/miniconda3/envs/intake-esm-dev/lib/python3.7/site-packages/xarray/core/dataset.py in sel(self, indexers, method, tolerance, drop, **indexers_kwargs)
   1998         indexers = either_dict_or_kwargs(indexers, indexers_kwargs, "sel")
   1999         pos_indexers, new_indexes = remap_label_indexers(
-> 2000             self, indexers=indexers, method=method, tolerance=tolerance
   2001         )
   2002         result = self.isel(indexers=pos_indexers, drop=drop)

~/opt/miniconda3/envs/intake-esm-dev/lib/python3.7/site-packages/xarray/core/coordinates.py in remap_label_indexers(obj, indexers, method, tolerance, **indexers_kwargs)
    390
    391     pos_indexers, new_indexes = indexing.remap_label_indexers(
--> 392         obj, v_indexers, method=method, tolerance=tolerance
    393     )
    394     # attach indexer's coordinate to pos_indexers

~/opt/miniconda3/envs/intake-esm-dev/lib/python3.7/site-packages/xarray/core/indexing.py in remap_label_indexers(data_obj, indexers, method, tolerance)
    259             coords_dtype = data_obj.coords[dim].dtype
    260             label = maybe_cast_to_coords_dtype(label, coords_dtype)
--> 261             idxr, new_idx = convert_label_indexer(index, label, dim, method, tolerance)
    262             pos_indexers[dim] = idxr
    263             if new_idx is not None:

~/opt/miniconda3/envs/intake-esm-dev/lib/python3.7/site-packages/xarray/core/indexing.py in convert_label_indexer(index, label, index_name, method, tolerance)
    123             _sanitize_slice_element(label.start),
    124             _sanitize_slice_element(label.stop),
--> 125             _sanitize_slice_element(label.step),
    126         )
    127         if not isinstance(indexer, slice):

~/opt/miniconda3/envs/intake-esm-dev/lib/python3.7/site-packages/pandas/core/indexes/base.py in slice_indexer(self, start, end, step, kind)
   5032         slice(1, 3)
   5033         """
-> 5034         start_slice, end_slice = self.slice_locs(start, end, step=step, kind=kind)
   5035
   5036         # return a slice

~/opt/miniconda3/envs/intake-esm-dev/lib/python3.7/site-packages/pandas/core/indexes/base.py in slice_locs(self, start, end, step, kind)
   5246         start_slice = None
   5247         if start is not None:
-> 5248             start_slice = self.get_slice_bound(start, "left", kind)
   5249         if start_slice is None:
   5250             start_slice = 0

~/opt/miniconda3/envs/intake-esm-dev/lib/python3.7/site-packages/pandas/core/indexes/base.py in get_slice_bound(self, label, side, kind)
   5158         # For datetime indices label may be a string that has to be converted
   5159         # to datetime boundary according to its resolution.
-> 5160         label = self._maybe_cast_slice_bound(label, side, kind)
   5161
   5162         # we need to look up the label

~/opt/miniconda3/envs/intake-esm-dev/lib/python3.7/site-packages/xarray/coding/cftimeindex.py in _maybe_cast_slice_bound(self, label, side, kind)
    336         pandas.tseries.index.DatetimeIndex._maybe_cast_slice_bound"""
    337         if isinstance(label, str):
--> 338             parsed, resolution = _parse_iso8601_with_reso(self.date_type, label)
    339             start, end = _parsed_string_to_bounds(self.date_type, resolution, parsed)
    340             if self.is_monotonic_decreasing and len(self) > 1:

~/opt/miniconda3/envs/intake-esm-dev/lib/python3.7/site-packages/xarray/coding/cftimeindex.py in _parse_iso8601_with_reso(date_type, timestr)
    114     # 1.0.3.4.
    115     replace["dayofwk"] = -1
--> 116     return default.replace(**replace), resolution
    117
    118

cftime/_cftime.pyx in cftime._cftime.datetime.replace()

ValueError: Replacing the dayofyr or dayofwk of a datetime is not supported.
```

Output of `xr.show_versions()`:

```python
In [6]: xr.show_versions()

INSTALLED VERSIONS
------------------
commit: None
python: 3.7.3 | packaged by conda-forge | (default, Jul 1 2019, 14:38:56) [Clang 4.0.1 (tags/RELEASE_401/final)]
python-bits: 64
OS: Darwin
OS-release: 18.7.0
machine: x86_64
processor: i386
byteorder: little
LC_ALL: None
LANG: en_US.UTF-8
LOCALE: en_US.UTF-8
libhdf5: 1.10.5
libnetcdf: 4.7.1
xarray: 0.14.0
pandas: 0.25.2
numpy: 1.17.2
scipy: None
netCDF4: 1.5.1.2
pydap: None
h5netcdf: None
h5py: None
Nio: None
zarr: 2.3.2
cftime: 1.0.4
nc_time_axis: None
PseudoNetCDF: None
rasterio: None
cfgrib: None
iris: None
bottleneck: None
dask: 2.2.0
distributed: 2.5.1
matplotlib: None
cartopy: None
seaborn: None
numbagg: None
setuptools: 41.4.0
pip: 19.2.1
conda: None
pytest: 5.0.1
IPython: 7.8.0
sphinx: 2.1.2
```

Expected Output

I can confirm that everything works just fine with an older version of cftime:

```python
In [4]: ds.sel(time=slice("1980", "1982"))
Out[4]:
<xarray.Dataset>
Dimensions:  (time: 28, x: 275, y: 205)
Coordinates:
  * time     (time) object 1980-09-16 12:00:00 ... 1982-12-17 00:00:00
    xc       (y, x) float64 ...
    yc       (y, x) float64 ...
Dimensions without coordinates: x, y
Data variables:
    Tair     (time, y, x) float64 ...
Attributes:
    title:                     /workspace/jhamman/processed/R1002RBRxaaa01a/l...
    institution:               U.W.
    source:                    RACM R1002RBRxaaa01a
    output_frequency:          daily
    output_mode:               averaged
    convention:                CF-1.4
    references:                Based on the initial model of Liang et al., 19...
    comment:                   Output from the Variable Infiltration Capacity...
    nco_openmp_thread_number:  1
    NCO:                       "4.6.0"
    history:                   Tue Dec 27 14:15:22 2016: ncatted -a dimension...

In [5]: import cftime

In [6]: cftime.__version__
Out[6]: '1.0.3.4'

In [7]: xr.__version__
Out[7]: '0.14.0'
```
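For context, `_maybe_cast_slice_bound` resolves a partial date string such as "1980" to the bounds of that period before slicing. The idea, sketched with the stdlib (a simplified stand-in, not xarray's `_parsed_string_to_bounds`):

```python
from datetime import datetime

def year_bounds(label: str):
    """Resolve a 'YYYY' label to the inclusive [start, end] bounds of that
    year, roughly how partial-string indexing treats slice("1980", "1982")."""
    year = int(label)
    return datetime(year, 1, 1), datetime(year, 12, 31, 23, 59, 59)

start, _ = year_bounds("1980")
_, end = year_bounds("1982")
assert start == datetime(1980, 1, 1)
assert end == datetime(1982, 12, 31, 23, 59, 59)
```

The cftime v1.0.4 failure happens one step earlier, inside the ISO 8601 parsing that produces the resolution, before any bounds are computed.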

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3426/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}


CREATE TABLE [issues] (
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [number] INTEGER,
   [title] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [state] TEXT,
   [locked] INTEGER,
   [assignee] INTEGER REFERENCES [users]([id]),
   [milestone] INTEGER REFERENCES [milestones]([id]),
   [comments] INTEGER,
   [created_at] TEXT,
   [updated_at] TEXT,
   [closed_at] TEXT,
   [author_association] TEXT,
   [active_lock_reason] TEXT,
   [draft] INTEGER,
   [pull_request] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [state_reason] TEXT,
   [repo] INTEGER REFERENCES [repos]([id]),
   [type] TEXT
);
CREATE INDEX [idx_issues_repo]
    ON [issues] ([repo]);
CREATE INDEX [idx_issues_milestone]
    ON [issues] ([milestone]);
CREATE INDEX [idx_issues_assignee]
    ON [issues] ([assignee]);
CREATE INDEX [idx_issues_user]
    ON [issues] ([user]);
Powered by Datasette · About: xarray-datasette