
issues


9 rows where comments = 2, repo = 13221727 and user = 500246 sorted by updated_at descending

id node_id number title user state locked assignee milestone comments created_at updated_at ▲ closed_at author_association active_lock_reason draft pull_request body reactions performed_via_github_app state_reason repo type
741806260 MDU6SXNzdWU3NDE4MDYyNjA= 4579 Invisible differences between arrays using IntervalIndex gerritholl 500246 open 0     2 2020-11-12T17:54:55Z 2022-10-03T15:09:25Z   CONTRIBUTOR      

What happened:

I have two DataArrays that each have a coordinate constructed with pandas.interval_range. In one case I pass the interval_range directly, in the other case I call .to_numpy() first. The two DataArrays look identical but aren't. This can lead to hard-to-find bugs, because behaviour is not identical: the former supports indexing whereas the latter doesn't.

What you expected to happen:

I expect two arrays that appear identical to behave identically. If they don't behave identically then there should be some way to tell the difference (apart from equals, which tells me they are different but not how).

Minimal Complete Verifiable Example:

```python
import xarray
import pandas

da1 = xarray.DataArray([0, 1, 2], dims=("x",), coords={"x": pandas.interval_range(0, 2, 3)})
da2 = xarray.DataArray([0, 1, 2], dims=("x",), coords={"x": pandas.interval_range(0, 2, 3).to_numpy()})

print(repr(da1) == repr(da2))
print(repr(da1.x) == repr(da2.x))
print(da1.x.dtype == da2.x.dtype)

# identical? No:

print(da1.equals(da2))
print(da1.x.equals(da2.x))

# in particular:

da1.sel(x=1)  # works
da2.sel(x=1)  # fails
```

Results in:

```
True
True
True
False
False
Traceback (most recent call last):
  File "/data/gholl/miniconda3/envs/py38/lib/python3.8/site-packages/pandas/core/indexes/base.py", line 2895, in get_loc
    return self._engine.get_loc(casted_key)
  File "pandas/_libs/index.pyx", line 70, in pandas._libs.index.IndexEngine.get_loc
  File "pandas/_libs/index.pyx", line 101, in pandas._libs.index.IndexEngine.get_loc
  File "pandas/_libs/hashtable_class_helper.pxi", line 1675, in pandas._libs.hashtable.PyObjectHashTable.get_item
  File "pandas/_libs/hashtable_class_helper.pxi", line 1683, in pandas._libs.hashtable.PyObjectHashTable.get_item
KeyError: 1

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "mwe105.py", line 19, in <module>
    da2.sel(x=1)  # fails
  File "/data/gholl/miniconda3/envs/py38/lib/python3.8/site-packages/xarray/core/dataarray.py", line 1143, in sel
    ds = self._to_temp_dataset().sel(
  File "/data/gholl/miniconda3/envs/py38/lib/python3.8/site-packages/xarray/core/dataset.py", line 2105, in sel
    pos_indexers, new_indexes = remap_label_indexers(
  File "/data/gholl/miniconda3/envs/py38/lib/python3.8/site-packages/xarray/core/coordinates.py", line 397, in remap_label_indexers
    pos_indexers, new_indexes = indexing.remap_label_indexers(
  File "/data/gholl/miniconda3/envs/py38/lib/python3.8/site-packages/xarray/core/indexing.py", line 275, in remap_label_indexers
    idxr, new_idx = convert_label_indexer(index, label, dim, method, tolerance)
  File "/data/gholl/miniconda3/envs/py38/lib/python3.8/site-packages/xarray/core/indexing.py", line 196, in convert_label_indexer
    indexer = index.get_loc(label_value, method=method, tolerance=tolerance)
  File "/data/gholl/miniconda3/envs/py38/lib/python3.8/site-packages/pandas/core/indexes/base.py", line 2897, in get_loc
    raise KeyError(key) from err
KeyError: 1
```

Additional context

I suppose this happens because under the hood xarray does something clever to support pandas-style indexing even though the coordinate variable appears like a numpy array with an object dtype, and that this cleverness is lost if the object is already converted to a numpy array. But there is, as far as I can see, no way to tell the difference once the objects have been created.
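One way to tell two such objects apart after construction is to look at the pandas index that xarray keeps for the dimension, exposed via `.indexes`. This is my own sketch, not part of the original report, and the exact index types may vary between xarray/pandas versions; it assumes the `da1`/`da2` from the example above.

```python
# Inspect the backing pandas index for dimension "x" (sketch, not from the report).
# Expectation here: da1 is backed by a pandas IntervalIndex (so label-based .sel works),
# while da2 is backed by a plain object-dtype Index (so .sel(x=1) raises KeyError).
print(type(da1.indexes["x"]))
print(type(da2.indexes["x"]))
```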

Environment:

Output of xr.show_versions()

INSTALLED VERSIONS
------------------
commit: None
python: 3.8.6 | packaged by conda-forge | (default, Oct 7 2020, 19:08:05) [GCC 7.5.0]
python-bits: 64
OS: Linux
OS-release: 4.12.14-lp150.12.82-default
machine: x86_64
processor: x86_64
byteorder: little
LC_ALL: None
LANG: en_GB.UTF-8
LOCALE: en_GB.UTF-8
libhdf5: 1.10.6
libnetcdf: 4.7.4
xarray: 0.16.1
pandas: 1.1.4
numpy: 1.19.4
scipy: 1.5.3
netCDF4: 1.5.4
pydap: None
h5netcdf: 0.8.1
h5py: 3.1.0
Nio: None
zarr: 2.5.0
cftime: 1.2.1
nc_time_axis: None
PseudoNetCDF: None
rasterio: 1.1.7
cfgrib: None
iris: None
bottleneck: None
dask: 2.30.0
distributed: 2.30.1
matplotlib: 3.3.2
cartopy: 0.18.0
seaborn: None
numbagg: None
pint: None
setuptools: 49.6.0.post20201009
pip: 20.2.4
conda: installed
pytest: 6.1.2
IPython: 7.19.0
sphinx: 3.3.0
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4579/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 issue
686461572 MDU6SXNzdWU2ODY0NjE1NzI= 4378 Plotting when Interval coordinate is timedelta-based gerritholl 500246 open 0     2 2020-08-26T16:36:27Z 2022-04-18T21:55:15Z   CONTRIBUTOR      

Is your feature request related to a problem? Please describe.

The xarray plotting interface supports coordinates containing pandas.Interval iff those intervals contain numbers. It fails when those intervals contain pandas.Timedelta:

```python
import numpy as np
import pandas as pd
import xarray as xr

da = xr.DataArray(
    np.arange(10),
    dims=("x",),
    coords={"x": [pd.Interval(i, i + 1) for i in range(10)]})
da.plot()  # works

da = xr.DataArray(
    np.arange(10),
    dims=("x",),
    coords={"x": [pd.Interval(
        d - pd.Timestamp("2000-01-01"),
        d - pd.Timestamp("2000-01-01") + pd.Timedelta("1H"))
        for d in pd.date_range("2000-01-01", "2000-01-02", 10)]})
da.plot()  # fails
```

The latter fails with:

Traceback (most recent call last):
  File "mwe82.py", line 18, in <module>
    da.plot()  # fails
  File "/data/gholl/miniconda3/envs/py38/lib/python3.8/site-packages/xarray/plot/plot.py", line 446, in __call__
    return plot(self._da, **kwargs)
  File "/data/gholl/miniconda3/envs/py38/lib/python3.8/site-packages/xarray/plot/plot.py", line 200, in plot
    return plotfunc(darray, **kwargs)
  File "/data/gholl/miniconda3/envs/py38/lib/python3.8/site-packages/xarray/plot/plot.py", line 302, in line
    _ensure_plottable(xplt_val, yplt_val)
  File "/data/gholl/miniconda3/envs/py38/lib/python3.8/site-packages/xarray/plot/utils.py", line 551, in _ensure_plottable
    raise TypeError(
TypeError: Plotting requires coordinates to be numeric or dates of type np.datetime64, datetime.datetime, cftime.datetime or pd.Interval.

This error message is somewhat confusing, because the coordinates are "dates of type (...) pd.Interval", but perhaps a timedelta is not considered a date.

Describe the solution you'd like

I would like to be able to use the xarray plotting interface with any pandas.Interval coordinate, including intervals of pandas.Timestamp or pandas.Timedelta.

Describe alternatives you've considered

I'll "manually" calculate the midpoints and use those as a timedelta coordinate instead.
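A minimal sketch of that midpoint workaround (my own illustration, not from the report; it assumes `da` is the failing timedelta-interval DataArray above, and the caveat about timedelta plotting in the additional context below still applies):

```python
import numpy as np
import pandas as pd

# Replace the Interval coordinate by its Timedelta midpoints so the plotting
# code only sees plain timedelta64 values.
mids = pd.to_timedelta([iv.mid for iv in da.x.values])
da_mid = da.assign_coords(x=np.asarray(mids))
da_mid.plot()
```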

Additional context

It seems that regular timedeltas aren't really supported either: they don't cause an error message, but they produce incorrect results. There's probably a related issue somewhere, but I can't find it now.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4378/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 issue
618985094 MDU6SXNzdWU2MTg5ODUwOTQ= 4065 keep_attrs not respected for unary operators gerritholl 500246 closed 0     2 2020-05-15T13:55:14Z 2020-10-14T16:29:51Z 2020-10-14T16:29:51Z CONTRIBUTOR      

The xarray global option keep_attrs (introduced in #2482 ) is not respected for unary operators.

MCVE Code Sample

```python
import xarray as xr

x = xr.DataArray([1, 2, 3], attrs={"A": "B"})
with xr.set_options(keep_attrs=True):
    y = ~x
print(x.attrs, y.attrs)
```

Expected Output

I expect

{'A': 'B'} {'A': 'B'}

Problem Description

I get:

{'A': 'B'} {}

I get the same for the other unary operators +x, -x, and abs(x).
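As a stopgap, routing the unary operation through xr.apply_ufunc does keep the attributes, because apply_ufunc takes keep_attrs directly. This is my own sketch, not part of the issue:

```python
import numpy as np
import xarray as xr

x = xr.DataArray([1, 2, 3], attrs={"A": "B"})
# apply_ufunc honours keep_attrs even where the operator form (~x) does not
y = xr.apply_ufunc(np.invert, x, keep_attrs=True)
print(x.attrs, y.attrs)  # {'A': 'B'} {'A': 'B'}
```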

Versions

Tested with latest xarray master (see below for details).

Output of xr.show_versions()

INSTALLED VERSIONS
------------------
commit: None
python: 3.8.2 | packaged by conda-forge | (default, Mar 23 2020, 18:16:37) [GCC 7.3.0]
python-bits: 64
OS: Linux
OS-release: 4.12.14-lp150.12.82-default
machine: x86_64
processor: x86_64
byteorder: little
LC_ALL: None
LANG: en_GB.UTF-8
LOCALE: en_GB.UTF-8
libhdf5: 1.10.5
libnetcdf: 4.7.4
xarray: 0.15.2.dev64+g2542a63f
pandas: 1.0.3
numpy: 1.18.1
scipy: 1.4.1
netCDF4: 1.5.3
pydap: None
h5netcdf: None
h5py: 2.10.0
Nio: None
zarr: 2.4.0
cftime: 1.1.1.2
nc_time_axis: None
PseudoNetCDF: None
rasterio: 1.1.3
cfgrib: None
iris: None
bottleneck: None
dask: 2.14.0
distributed: 2.14.0
matplotlib: 3.2.1
cartopy: 0.17.0
seaborn: None
numbagg: None
pint: None
setuptools: 46.1.3.post20200325
pip: 20.0.2
conda: installed
pytest: 5.4.1
IPython: 7.13.0
sphinx: None
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4065/reactions",
    "total_count": 2,
    "+1": 2,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
528154893 MDU6SXNzdWU1MjgxNTQ4OTM= 3572 Context manager `AttributeError` when engine='h5netcdf' gerritholl 500246 closed 0     2 2019-11-25T15:19:29Z 2019-11-25T16:12:37Z 2019-11-25T16:12:37Z CONTRIBUTOR      

Opening this NetCDF file works fine with the default engine, but fails with AttributeError with the h5netcdf engine:

MCVE Code Sample

Data available from EUMETSAT: https://www.eumetsat.int/website/home/Satellites/FutureSatellites/MeteosatThirdGeneration/MTGData/MTGUserTestData/index.html --> ftp://ftp.eumetsat.int/pub/OPS/out/test-data/Test-data-for-External-Users/MTG_FCI_Test-Data/ --> uncompressed

```python
import xarray

f = "/path/to/.../W_XX-EUMETSAT-Darmstadt,IMG+SAT,MTI1+FCI-1C-RRAD-FDHSI-FD--CHK-BODY--L2P-NC4E_C_EUMT_20170410114434_GTT_DEV_20170410113925_20170410113934_N__C_0070_0067.nc"
ds = xarray.open_dataset(f, engine="h5netcdf")
```

Expected Output

No output at all.

Problem Description

Results in AttributeError:

Traceback (most recent call last):
  File "mwe4.py", line 3, in <module>
    with xarray.open_dataset(f, engine="h5netcdf") as ds:
  File "/media/nas/x21324/miniconda3/envs/py37e/lib/python3.7/site-packages/xarray/backends/api.py", line 535, in open_dataset
    ds = maybe_decode_store(store)
  File "/media/nas/x21324/miniconda3/envs/py37e/lib/python3.7/site-packages/xarray/backends/api.py", line 450, in maybe_decode_store
    use_cftime=use_cftime,
  File "/media/nas/x21324/miniconda3/envs/py37e/lib/python3.7/site-packages/xarray/conventions.py", line 570, in decode_cf
    vars, attrs = obj.load()
  File "/media/nas/x21324/miniconda3/envs/py37e/lib/python3.7/site-packages/xarray/backends/common.py", line 123, in load
    (_decode_variable_name(k), v) for k, v in self.get_variables().items()
  File "/media/nas/x21324/miniconda3/envs/py37e/lib/python3.7/site-packages/xarray/backends/h5netcdf_.py", line 156, in get_variables
    (k, self.open_store_variable(k, v)) for k, v in self.ds.variables.items()
  File "/media/nas/x21324/miniconda3/envs/py37e/lib/python3.7/site-packages/xarray/core/utils.py", line 402, in FrozenDict
    return Frozen(dict(*args, **kwargs))
  File "/media/nas/x21324/miniconda3/envs/py37e/lib/python3.7/site-packages/xarray/backends/h5netcdf_.py", line 156, in <genexpr>
    (k, self.open_store_variable(k, v)) for k, v in self.ds.variables.items()
  File "/media/nas/x21324/miniconda3/envs/py37e/lib/python3.7/site-packages/xarray/backends/h5netcdf_.py", line 120, in open_store_variable
    dimensions = var.dimensions
  File "/media/nas/x21324/miniconda3/envs/py37e/lib/python3.7/site-packages/h5netcdf/core.py", line 114, in dimensions
    self._dimensions = self._lookup_dimensions()
  File "/media/nas/x21324/miniconda3/envs/py37e/lib/python3.7/site-packages/h5netcdf/core.py", line 98, in _lookup_dimensions
    for axis, dim in enumerate(self._h5ds.dims):
AttributeError: 'Datatype' object has no attribute 'dims'
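As noted at the top of the report, the default engine handles this file, so a fallback is simply to drop the engine argument. A one-line sketch of my own, assuming the same `f` as above:

```python
ds = xarray.open_dataset(f)  # default (netCDF4) engine opens the file without error
```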

Output of xr.show_versions()

INSTALLED VERSIONS
------------------
commit: None
python: 3.7.3 | packaged by conda-forge | (default, Jul 1 2019, 21:52:21) [GCC 7.3.0]
python-bits: 64
OS: Linux
OS-release: 4.12.14-lp150.12.79-default
machine: x86_64
processor: x86_64
byteorder: little
LC_ALL: None
LANG: en_GB.UTF-8
LOCALE: en_GB.UTF-8
libhdf5: 1.10.5
libnetcdf: 4.7.1
xarray: 0.14.1
pandas: 0.25.3
numpy: 1.17.3
scipy: 1.3.2
netCDF4: 1.5.3
pydap: None
h5netcdf: 0.7.4
h5py: 2.10.0
Nio: None
zarr: 2.3.2
cftime: 1.0.4.2
nc_time_axis: None
PseudoNetCDF: None
rasterio: 1.1.0
cfgrib: None
iris: None
bottleneck: None
dask: 2.8.0
distributed: 2.8.0
matplotlib: 3.1.2
cartopy: 0.17.0
seaborn: None
numbagg: None
setuptools: 41.6.0.post20191101
pip: 19.3.1
conda: None
pytest: 5.3.0
IPython: 7.9.0
sphinx: 2.2.1
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3572/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
246093122 MDU6SXNzdWUyNDYwOTMxMjI= 1494 AssertionError when storing datetime coordinates of wrong units gerritholl 500246 closed 0     2 2017-07-27T16:11:48Z 2019-06-30T04:28:18Z 2019-06-30T04:28:17Z CONTRIBUTOR      

The following code should probably fail somewhere other than with an AssertionError raised from inside to_netcdf:

```
$ cat mwe.py
#!/usr/bin/env python3.6

import numpy
import xarray

x = xarray.DataArray(
    [1, 2, 3],
    dims=["X"],
    coords={"X": numpy.zeros(shape=3, dtype="M8[ms]")})

x.to_netcdf("/tmp/test.nc")
$ python3.6 mwe.py
Traceback (most recent call last):
  File "mwe.py", line 11, in <module>
    x.to_netcdf("/tmp/test.nc")
  File "/dev/shm/gerrit/venv/stable-3.6/lib/python3.6/site-packages/xarray/core/dataarray.py", line 1351, in to_netcdf
    dataset.to_netcdf(*args, **kwargs)
  File "/dev/shm/gerrit/venv/stable-3.6/lib/python3.6/site-packages/xarray/core/dataset.py", line 977, in to_netcdf
    unlimited_dims=unlimited_dims)
  File "/dev/shm/gerrit/venv/stable-3.6/lib/python3.6/site-packages/xarray/backends/api.py", line 573, in to_netcdf
    unlimited_dims=unlimited_dims)
  File "/dev/shm/gerrit/venv/stable-3.6/lib/python3.6/site-packages/xarray/core/dataset.py", line 916, in dump_to_store
    unlimited_dims=unlimited_dims)
  File "/dev/shm/gerrit/venv/stable-3.6/lib/python3.6/site-packages/xarray/backends/common.py", line 244, in store
    cf_variables, cf_attrs = cf_encoder(variables, attributes)
  File "/dev/shm/gerrit/venv/stable-3.6/lib/python3.6/site-packages/xarray/conventions.py", line 1089, in cf_encoder
    for k, v in iteritems(variables))
  File "/dev/shm/gerrit/venv/stable-3.6/lib/python3.6/site-packages/xarray/conventions.py", line 1089, in <genexpr>
    for k, v in iteritems(variables))
  File "/dev/shm/gerrit/venv/stable-3.6/lib/python3.6/site-packages/xarray/conventions.py", line 734, in encode_cf_variable
    var = maybe_encode_datetime(var)
  File "/dev/shm/gerrit/venv/stable-3.6/lib/python3.6/site-packages/xarray/conventions.py", line 585, in maybe_encode_datetime
    data, encoding.pop('units', None), encoding.pop('calendar', None))
  File "/dev/shm/gerrit/venv/stable-3.6/lib/python3.6/site-packages/xarray/conventions.py", line 293, in encode_cf_datetime
    assert dates.dtype == 'datetime64[ns]'
AssertionError
```
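A possible user-side workaround, not from the report: cast the coordinate to the nanosecond unit that the CF encoder asserts on before writing. A minimal sketch, assuming the `x` defined in mwe.py above:

```python
# Convert the M8[ms] coordinate to M8[ns]; the encoder assertion should then pass.
x = x.assign_coords(X=x.X.values.astype("datetime64[ns]"))
x.to_netcdf("/tmp/test.nc")
```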

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/1494/reactions",
    "total_count": 2,
    "+1": 2,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
381633612 MDExOlB1bGxSZXF1ZXN0MjMxNTU2ODM5 2557 add missing comma and article in error message gerritholl 500246 closed 0     2 2018-11-16T14:59:02Z 2018-11-16T16:40:03Z 2018-11-16T16:40:03Z CONTRIBUTOR   0 pydata/xarray/pulls/2557

Add missing comma and article in error message when attribute values have the wrong type.

I think this change is sufficiently minor that no documentation or whatsnew changes should be necessary.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2557/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
289790965 MDU6SXNzdWUyODk3OTA5NjU= 1838 DataArray.sum does not respect dtype keyword gerritholl 500246 closed 0     2 2018-01-18T22:01:07Z 2018-01-20T18:29:02Z 2018-01-20T18:29:02Z CONTRIBUTOR      

Code Sample, a copy-pastable example if possible

```python
# Your code here
import xarray
from numpy import arange

da = xarray.DataArray(arange(5, dtype="i2"))
print(da.sum(dtype="i4").dtype)
```

Problem description

The result is int64. This is a problem because I asked for int32.

Expected Output

Expected output int32.
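For comparison (my own check, not part of the report), plain numpy honours the dtype argument on the same data:

```python
import numpy as np
import xarray

da = xarray.DataArray(np.arange(5, dtype="i2"))
print(np.sum(da.values, dtype="i4").dtype)  # int32
print(da.sum(dtype="i4").dtype)             # int64 on the affected version
```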

Output of xr.show_versions()

INSTALLED VERSIONS
------------------
commit: None
python: 3.6.1.final.0
python-bits: 64
OS: Linux
OS-release: 2.6.32-696.6.3.el6.x86_64
machine: x86_64
processor: x86_64
byteorder: little
LC_ALL: None
LANG: en_GB.UTF-8
LOCALE: en_GB.UTF-8
xarray: 0.10.0+dev12.gf882a58
pandas: 0.22.0
numpy: 1.14.0
scipy: 1.0.0
netCDF4: 1.3.1
h5netcdf: None
Nio: None
bottleneck: 1.2.1
cyordereddict: None
dask: 0.16.1
matplotlib: 2.1.1
cartopy: 0.15.1
seaborn: 0.8.1
setuptools: 38.4.0
pip: 9.0.1
conda: 4.3.16
pytest: 3.1.2
IPython: 6.1.0
sphinx: 1.6.2
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/1838/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
212501628 MDU6SXNzdWUyMTI1MDE2Mjg= 1300 git version label yields version in violation of PEP 440 gerritholl 500246 closed 0     2 2017-03-07T17:23:00Z 2017-12-15T07:26:24Z 2017-12-15T07:26:24Z CONTRIBUTOR      

When an xarray installation does not match a released version, it has a version number like 0.9.1-28-g769f120.

This violates PEP 440, which leads to multiple problems:

  • pip install --upgrade will revert xarray back to 0.9.1, because it does not recognise that 0.9.1-28-g769f120 > 0.9.1
  • packages with an xarray dependency will be considered unsatisfied. Running a script through a setuptools load_entry_point fails with pkg_resources.ContextualVersionConflict: (xarray 0.9.1-28-g769f120 (/dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages), Requirement.parse('xarray>=0.8')

Instead, the version number above should be written as 0.9.1+r10345 or so, which would satisfy PEP 440 and not cause problems with pip or setuptools.
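A small illustration of the PEP 440 point, my own sketch using the packaging library that pip relies on, with the version strings from this report:

```python
from packaging.version import InvalidVersion, Version

try:
    Version("0.9.1-28-g769f120")   # git-describe style label
except InvalidVersion as exc:
    print("not PEP 440:", exc)

# A local-version label like the one suggested above parses and sorts after the release:
print(Version("0.9.1+r10345") > Version("0.9.1"))  # True
```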

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/1300/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
228023777 MDU6SXNzdWUyMjgwMjM3Nzc= 1405 Using uint64 for Dataset indexing gives ValueError gerritholl 500246 closed 0     2 2017-05-11T15:05:20Z 2017-10-23T07:50:29Z 2017-10-23T07:50:29Z CONTRIBUTOR      

Trying to index a Dataset using an index array of dtype uint64 yields a ValueError. int64 works fine. See below:

```
In [13]: import xarray

In [14]: ds = xarray.Dataset({"A": (("x", "y"), arange(5*6).reshape(5,6))})

In [15]: ds[{"x": numpy.array([0], dtype="int64")}]
Out[15]:
<xarray.Dataset>
Dimensions:  (x: 1, y: 6)
Dimensions without coordinates: x, y
Data variables:
    A        (x, y) int64 0 1 2 3 4 5

In [16]: ds[{"x": numpy.array([0], dtype="uint64")}]

ValueError Traceback (most recent call last) <ipython-input-16-4cf23af0967e> in <module>() ----> 1 ds[{"x": numpy.array([0], dtype="uint64")}]

/dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/core/dataset.py in getitem(self, key) 722 """ 723 if utils.is_dict_like(key): --> 724 return self.isel(**key) 725 726 if hashable(key):

/dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/core/dataset.py in isel(self, drop, indexers) 1147 for name, var in iteritems(self._variables): 1148 var_indexers = dict((k, v) for k, v in indexers if k in var.dims) -> 1149 new_var = var.isel(var_indexers) 1150 if not (drop and name in var_indexers): 1151 variables[name] = new_var

/dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/core/variable.py in isel(self, **indexers) 547 if dim in indexers: 548 key[i] = indexers[dim] --> 549 return self[tuple(key)] 550 551 def squeeze(self, dim=None):

/dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/core/variable.py in getitem(self, key) 377 dims = tuple(dim for k, dim in zip(key, self.dims) 378 if not isinstance(k, integer_types)) --> 379 values = self._indexable_data[key] 380 # orthogonal indexing should ensure the dimensionality is consistent 381 if hasattr(values, 'ndim'):

/dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/core/indexing.py in getitem(self, key) 467 468 def getitem(self, key): --> 469 key = self._convert_key(key) 470 return self._ensure_ndarray(self.array[key]) 471

/dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/core/indexing.py in _convert_key(self, key) 454 if any(not isinstance(k, integer_types + (slice,)) for k in key): 455 # key would trigger fancy indexing --> 456 key = orthogonal_indexer(key, self.shape) 457 return key 458

/dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/core/indexing.py in orthogonal_indexer(key, shape) 78 """ 79 # replace Ellipsis objects with slices ---> 80 key = list(canonicalize_indexer(key, len(shape))) 81 # replace 1d arrays and slices with broadcast compatible arrays 82 # note: we treat integers separately (instead of turning them into 1d

/dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/core/indexing.py in canonicalize_indexer(key, ndim) 66 return indexer 67 ---> 68 return tuple(canonicalize(k) for k in expanded_indexer(key, ndim)) 69 70

/dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/core/indexing.py in <genexpr>(.0) 66 return indexer 67 ---> 68 return tuple(canonicalize(k) for k in expanded_indexer(key, ndim)) 69 70

/dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/core/indexing.py in canonicalize(indexer) 63 'array indexing; all subkeys must be ' 64 'slices, integers or sequences of ' ---> 65 'integers or Booleans' % indexer) 66 return indexer 67

ValueError: invalid subkey array([0], dtype=uint64) for integer based array indexing; all subkeys must be slices, integers or sequences of integers or Booleans
```
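A workaround sketch of my own, not from the report: cast the indexer to a signed integer dtype before indexing, assuming the `ds` from the session above:

```python
import numpy

idx = numpy.array([0], dtype="uint64")
ds[{"x": idx.astype("int64")}]  # same result as the int64 example above
```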

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/1405/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue

CREATE TABLE [issues] (
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [number] INTEGER,
   [title] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [state] TEXT,
   [locked] INTEGER,
   [assignee] INTEGER REFERENCES [users]([id]),
   [milestone] INTEGER REFERENCES [milestones]([id]),
   [comments] INTEGER,
   [created_at] TEXT,
   [updated_at] TEXT,
   [closed_at] TEXT,
   [author_association] TEXT,
   [active_lock_reason] TEXT,
   [draft] INTEGER,
   [pull_request] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [state_reason] TEXT,
   [repo] INTEGER REFERENCES [repos]([id]),
   [type] TEXT
);
CREATE INDEX [idx_issues_repo]
    ON [issues] ([repo]);
CREATE INDEX [idx_issues_milestone]
    ON [issues] ([milestone]);
CREATE INDEX [idx_issues_assignee]
    ON [issues] ([assignee]);
CREATE INDEX [idx_issues_user]
    ON [issues] ([user]);