
issues


11 rows where state = "closed" and user = 221526 sorted by updated_at descending


id node_id number title user state locked assignee milestone comments created_at updated_at ▲ closed_at author_association active_lock_reason draft pull_request body reactions performed_via_github_app state_reason repo type
1473329967 I_kwDOAMm_X85X0Tsv 7350 Coordinate variable gains coordinate on subset dopplershift 221526 closed 0     5 2022-12-02T19:18:14Z 2022-12-05T22:56:30Z 2022-12-05T22:56:30Z CONTRIBUTOR      

What happened?

When subsetting a DataArray along a dimension down to a single item, the other coordinate variables gain this scalar coordinate.

What did you expect to happen?

Coordinate variables should not have their coordinates changed.

Minimal Complete Verifiable Example

```python
import numpy as np
import xarray as xr

lat = np.array([25, 35, 45])
lon = np.array([-105, -95, -85, -75])
time = np.array([0, 1])

data = np.arange(lat.size * lon.size * time.size)
test_data = xr.DataArray(data.reshape((time.size, lat.size, lon.size)),
                         coords=dict(lat=lat, lon=lon, time=time),
                         dims=('time', 'lat', 'lon'))

print(test_data.lat.coords)                    # Only 'lat'
print(test_data.isel(time=0).lat.coords)       # Has both 'lat' and 'time'
print(test_data.isel(time=[0, 1]).lat.coords)  # Only 'lat'
```
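A possible workaround (not mentioned in the issue itself) is to pass `drop=True` to `isel`, which discards the scalar coordinate created by the selection; a minimal sketch:

```python
import numpy as np
import xarray as xr

lat = np.array([25, 35, 45])
lon = np.array([-105, -95, -85, -75])
time = np.array([0, 1])

data = np.arange(lat.size * lon.size * time.size)
test_data = xr.DataArray(data.reshape((time.size, lat.size, lon.size)),
                         coords=dict(lat=lat, lon=lon, time=time),
                         dims=('time', 'lat', 'lon'))

# drop=True discards the scalar 'time' coordinate produced by the
# selection, so 'lat' keeps only its own coordinate.
subset = test_data.isel(time=0, drop=True)
print(subset.lat.coords)
```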

MVCE confirmation

  • [X] Minimal example — the example is as focused as reasonably possible to demonstrate the underlying issue in xarray.
  • [X] Complete example — the example is self-contained, including all data and the text of any traceback.
  • [X] Verifiable example — the example copy & pastes into an IPython prompt or Binder notebook, returning the result.
  • [X] New issue — a search of GitHub Issues suggests this is not a duplicate.

Relevant log output

No response

Anything else we need to know?

This occurs with both the latest 2022.11.0 release and current main.

Environment

```
INSTALLED VERSIONS
------------------
commit: None
python: 3.10.8 | packaged by conda-forge | (main, Nov 22 2022, 08:31:57) [Clang 14.0.6 ]
python-bits: 64
OS: Darwin
OS-release: 21.6.0
machine: x86_64
processor: i386
byteorder: little
LC_ALL: None
LANG: en_US.UTF-8
LOCALE: ('en_US', 'UTF-8')
libhdf5: 1.12.2
libnetcdf: 4.8.1
xarray: 2022.11.0
pandas: 1.5.2
numpy: 1.23.5
scipy: 1.9.3
netCDF4: 1.6.2
pydap: None
h5netcdf: None
h5py: None
Nio: None
zarr: None
cftime: 1.6.2
nc_time_axis: None
PseudoNetCDF: None
rasterio: None
cfgrib: 0.9.10.2
iris: None
bottleneck: 1.3.5
dask: 2022.6.1
distributed: 2022.6.1
matplotlib: 3.6.2
cartopy: 0.21.0
seaborn: None
numbagg: None
fsspec: 2022.11.0
cupy: None
pint: 0.20.1
sparse: None
flox: None
numpy_groupies: None
setuptools: 65.5.1
pip: 22.3.1
conda: None
pytest: 7.2.0
IPython: 8.6.0
sphinx: 5.3.0
```
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/7350/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
667550022 MDU6SXNzdWU2Njc1NTAwMjI= 4283 Selection with datetime64[ns] fails with Pandas 1.1.0 dopplershift 221526 closed 0     2 2020-07-29T05:01:14Z 2020-09-16T01:33:30Z 2020-09-16T01:33:30Z CONTRIBUTOR      

I ran into this issue with a netCDF file with the following time variable:

```
double time1(time1) ;
        time1:_FillValue = NaN ;
        time1:standard_name = "time" ;
        time1:long_name = "time" ;
        time1:udunits = "Hour since 2017-09-05T12:00:00Z" ;
        time1:units = "Hour since 2017-09-05T12:00:00+00:00" ;
        time1:calendar = "proleptic_gregorian" ;

time1 = 0, 3, 6, 9, 12, 15, 18, 21, 24 ;
```

but we can reproduce the problem with something as simple as:

```python
import numpy as np
import xarray as xr

t = np.array(['2017-09-05T12:00:00.000000000', '2017-09-05T15:00:00.000000000'],
             dtype='datetime64[ns]')
da = xr.DataArray(np.ones(t.shape), dims=('time',), coords=(t,))

da.loc[{'time': t[0]}]  # Works on pandas 1.0.5
```

On pandas 1.1.0 this produces:

```pytb
---------------------------------------------------------------------------
KeyError                                  Traceback (most recent call last)
<ipython-input-11-3e0afa0bd195> in <module>
----> 1 da.loc[{'time':t[0]}]

~/miniconda3/envs/py38/lib/python3.8/site-packages/xarray/core/dataarray.py in __getitem__(self, key)
    196             labels = indexing.expanded_indexer(key, self.data_array.ndim)
    197             key = dict(zip(self.data_array.dims, labels))
--> 198             return self.data_array.sel(**key)
    199
    200     def __setitem__(self, key, value) -> None:

~/miniconda3/envs/py38/lib/python3.8/site-packages/xarray/core/dataarray.py in sel(self, indexers, method, tolerance, drop, **indexers_kwargs)
   1147
   1148         """
-> 1149         ds = self._to_temp_dataset().sel(
   1150             indexers=indexers,
   1151             drop=drop,

~/miniconda3/envs/py38/lib/python3.8/site-packages/xarray/core/dataset.py in sel(self, indexers, method, tolerance, drop, **indexers_kwargs)
   2099         """
   2100         indexers = either_dict_or_kwargs(indexers, indexers_kwargs, "sel")
-> 2101         pos_indexers, new_indexes = remap_label_indexers(
   2102             self, indexers=indexers, method=method, tolerance=tolerance
   2103         )

~/miniconda3/envs/py38/lib/python3.8/site-packages/xarray/core/coordinates.py in remap_label_indexers(obj, indexers, method, tolerance, **indexers_kwargs)
    394     }
    395
--> 396     pos_indexers, new_indexes = indexing.remap_label_indexers(
    397         obj, v_indexers, method=method, tolerance=tolerance
    398     )

~/miniconda3/envs/py38/lib/python3.8/site-packages/xarray/core/indexing.py in remap_label_indexers(data_obj, indexers, method, tolerance)
    268         coords_dtype = data_obj.coords[dim].dtype
    269         label = maybe_cast_to_coords_dtype(label, coords_dtype)
--> 270         idxr, new_idx = convert_label_indexer(index, label, dim, method, tolerance)
    271         pos_indexers[dim] = idxr
    272         if new_idx is not None:

~/miniconda3/envs/py38/lib/python3.8/site-packages/xarray/core/indexing.py in convert_label_indexer(index, label, index_name, method, tolerance)
    187             indexer = index.get_loc(label.item())
    188         else:
--> 189             indexer = index.get_loc(
    190                 label.item(), method=method, tolerance=tolerance
    191             )

~/miniconda3/envs/py38/lib/python3.8/site-packages/pandas/core/indexes/datetimes.py in get_loc(self, key, method, tolerance)
    620         else:
    621             # unrecognized type
--> 622             raise KeyError(key)
    623
    624         try:

KeyError: 1504612800000000000
```

What's interesting is that changing the units of datetime64 to [s] works:

```python
import numpy as np
import xarray as xr

t = np.array(['2017-09-05T12:00:00.000000000', '2017-09-05T15:00:00.000000000'],
             dtype='datetime64[s]')
da = xr.DataArray(np.ones(t.shape), dims=('time',), coords=(t,))
da.loc[{'time': t[0]}]  # Works
```
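The `KeyError` above hints at the cause: for nanosecond precision, NumPy's `datetime64.item()` returns a raw integer (nanoseconds since the epoch), because `datetime.datetime` cannot represent nanoseconds, while coarser units round-trip to a `datetime.datetime`. A small sketch illustrating this (not taken from the issue):

```python
import datetime
import numpy as np

t_ns = np.datetime64('2017-09-05T12:00:00', 'ns')
t_s = np.datetime64('2017-09-05T12:00:00', 's')

# Nanosecond precision cannot be represented by datetime.datetime, so
# .item() falls back to a plain integer (ns since the Unix epoch) --
# exactly the integer seen in the KeyError above.
print(t_ns.item())   # 1504612800000000000

# Second precision converts cleanly, so a label-based lookup hands
# pandas an object it knows how to interpret.
print(t_s.item())    # datetime.datetime(2017, 9, 5, 12, 0)
```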

Environment: Python 3.8 from conda-forge on macOS 10.15.4

Output of `xr.show_versions()`:

```
INSTALLED VERSIONS
------------------
commit: None
python: 3.8.5 | packaged by conda-forge | (default, Jul 24 2020, 01:06:20) [Clang 10.0.1 ]
python-bits: 64
OS: Darwin
OS-release: 19.6.0
machine: x86_64
processor: i386
byteorder: little
LC_ALL: None
LANG: en_US.UTF-8
LOCALE: en_US.UTF-8
libhdf5: 1.10.6
libnetcdf: 4.7.4
xarray: 0.16.0
pandas: 1.1.0
numpy: 1.19.1
scipy: 1.5.2
netCDF4: 1.5.4
pydap: None
h5netcdf: None
h5py: 2.10.0
Nio: None
zarr: None
cftime: 1.2.1
nc_time_axis: None
PseudoNetCDF: None
rasterio: None
cfgrib: 0.9.8.3
iris: None
bottleneck: None
dask: 2.21.0
distributed: 2.21.0
matplotlib: 3.3.0
cartopy: 0.18.0
seaborn: None
numbagg: None
pint: 0.14
setuptools: 49.2.0.post20200712
pip: 20.1.1
conda: None
pytest: 6.0.0
IPython: 7.16.1
sphinx: 2.4.4
```
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4283/reactions",
    "total_count": 4,
    "+1": 4,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
605920781 MDExOlB1bGxSZXF1ZXN0NDA4MjM3MjUz 3998 Fix handling of abbreviated units like msec dopplershift 221526 closed 0     3 2020-04-23T22:43:51Z 2020-04-24T19:18:00Z 2020-04-24T07:16:10Z CONTRIBUTOR   0 pydata/xarray/pulls/3998

By default, xarray tries to decode times with pandas and falls back to cftime. This fixes the exception handler to fall back properly when an unhandled abbreviated unit is passed in.

An additional item here would be to add support for msec, etc. to xarray's own handling, but I wasn't sure of the best way to do that. I'm happy as long as things properly fall back to cftime.
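The try/except fallback described above can be sketched generically; the decoder functions below are hypothetical stand-ins, not xarray's actual internals:

```python
import datetime

def decode_with_pandas(num, units):
    # Hypothetical fast path: only understands full unit spellings,
    # mirroring a decoder that rejects abbreviations like 'msec'.
    known = {'seconds': 1, 'minutes': 60, 'hours': 3600, 'days': 86400}
    unit, _, since = units.partition(' since ')
    if unit not in known:
        raise ValueError(f'unrecognized units: {units!r}')
    base = datetime.datetime.fromisoformat(since)
    return base + datetime.timedelta(seconds=num * known[unit])

def decode_with_cftime(num, units):
    # Hypothetical slow path standing in for the cftime fallback,
    # which additionally understands abbreviated units.
    abbrev = {'msec': 0.001, 'sec': 1, 'min': 60, 'hr': 3600}
    unit, _, since = units.partition(' since ')
    factor = abbrev.get(unit)
    if factor is None:
        return decode_with_pandas(num, units)
    base = datetime.datetime.fromisoformat(since)
    return base + datetime.timedelta(seconds=num * factor)

def decode_time(num, units):
    # The shape of the fix: catch the decoding error and fall back,
    # instead of letting an unhandled abbreviated unit escape.
    try:
        return decode_with_pandas(num, units)
    except ValueError:
        return decode_with_cftime(num, units)

print(decode_time(500, 'msec since 2020-01-01T00:00:00'))
```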

  • [ ] Closes #xxxx
  • [x] Tests added
  • [x] Passes isort -rc . && black . && mypy . && flake8
  • [x] Fully documented, including whats-new.rst for all changes and api.rst for new API
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3998/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
323823894 MDExOlB1bGxSZXF1ZXN0MTg4NTg4ODEy 2144 Add strftime() to datetime accessor dopplershift 221526 closed 0     13 2018-05-16T23:37:34Z 2020-04-23T22:40:41Z 2019-06-01T03:22:44Z CONTRIBUTOR   0 pydata/xarray/pulls/2144

This matches pandas and makes it possible to pass a datetime DataArray to code that expects to be able to call strftime().
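For context, `strftime` formats a datetime using the standard C-style directives; the accessor added here mirrors the stdlib/pandas behavior. A stdlib-only illustration of the directives involved (the accessor call in the comment is how the new API is used):

```python
import datetime

t = datetime.datetime(2017, 9, 5, 12, 0)

# The same directives work through the accessor added by this PR, e.g.
# da.time.dt.strftime('%Y-%m-%d %H:%M') on a datetime-valued DataArray.
print(t.strftime('%Y-%m-%d %H:%M'))  # 2017-09-05 12:00
print(t.strftime('Day %j of %Y'))    # Day 248 of 2017
```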

  • [x] Closes #2090
  • [x] Tests added (for all bug fixes or enhancements)
  • [x] Tests passed (for all non-documentation changes)
  • [x] Fully documented, including whats-new.rst for all changes and api.rst for new API
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2144/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
501730864 MDExOlB1bGxSZXF1ZXN0MzIzOTQ4NTA1 3367 Remove setting of universal wheels dopplershift 221526 closed 0     4 2019-10-02T21:15:48Z 2019-10-05T20:05:58Z 2019-10-02T21:43:45Z CONTRIBUTOR   0 pydata/xarray/pulls/3367

Universal wheels indicate that one wheel supports Python 2 and 3. This is no longer the case for xarray. This causes builds to generate files with names like xarray-0.13.0-py2.py3-none-any.whl, which can cause pip to incorrectly install the wheel on Python 2 when installing from a list of wheel files.
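Concretely, the universal-wheel flag lives in `setup.cfg`; the change described here amounts to deleting a section like the following (an illustrative fragment, not the exact diff):

```ini
# setup.cfg -- this section tells bdist_wheel to build a single
# py2.py3 wheel; removing it yields a py3-only wheel filename.
[bdist_wheel]
universal = 1
```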

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3367/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
318761320 MDU6SXNzdWUzMTg3NjEzMjA= 2090 strftime and/or format support for DatetimeAccessor dopplershift 221526 closed 0     1 2018-04-29T23:55:12Z 2019-06-01T07:51:17Z 2019-06-01T07:51:17Z CONTRIBUTOR      

There's currently no easy way to control the conversion of time values/series to strings. For this purpose, pandas' own .dt attribute has an implementation of strftime.

Is there interest in adding similar functionality to xarray?

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2090/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
322019660 MDExOlB1bGxSZXF1ZXN0MTg3MjU4ODcx 2115 Fix docstring formatting for load(). dopplershift 221526 closed 0     1 2018-05-10T17:44:32Z 2018-05-10T18:24:04Z 2018-05-10T17:50:00Z CONTRIBUTOR   0 pydata/xarray/pulls/2115

Need '::' to introduce a code literal block. This was causing MetPy's doc build to warn (since we inherit AbstractDataStore).
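In reStructuredText, a literal block must be introduced by `::`; without it, Sphinx warns and renders the code as ordinary paragraphs. An illustrative fragment (not the actual docstring):

```rst
Example
-------
Load the data into memory::

    ds = xr.open_dataset("file.nc")
    ds.load()
```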

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2115/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
308768432 MDExOlB1bGxSZXF1ZXN0MTc3NTkzMzUy 2016 Allow _FillValue and missing_value to differ (Fixes #1749) dopplershift 221526 closed 0     9 2018-03-26T23:20:10Z 2018-04-20T00:35:22Z 2018-03-31T01:16:00Z CONTRIBUTOR   0 pydata/xarray/pulls/2016

The CF standard permits both attributes, and permits them to have different values, so we should not be treating this as an error--just mask out all of them.

  • [x] Closes #1749 (remove if there is no corresponding issue, which should only be the case for minor changes)
  • [x] Tests added (for all bug fixes or enhancements)
  • [x] Tests passed (for all non-documentation changes)
  • [x] Fully documented, including whats-new.rst for all changes and api.rst for new API (remove if this change should not be visible to users, e.g., if it is an internal clean-up, or if this is part of a larger project that will be documented later)
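The masking described above (treat every value listed in either attribute as missing) can be sketched with NumPy; the attribute handling here is a simplified stand-in for xarray's CF decoding, not its actual code:

```python
import numpy as np

def mask_missing(data, attrs):
    # Collect every fill/missing value from both CF attributes;
    # missing_value may be a scalar or a vector per the CF convention.
    candidates = []
    for name in ('_FillValue', 'missing_value'):
        if name in attrs:
            candidates.extend(np.atleast_1d(attrs[name]))
    # Replace any matching value with NaN rather than erroring out
    # when the two attributes disagree.
    data = data.astype(float)
    for value in candidates:
        data[data == value] = np.nan
    return data

raw = np.array([1.0, -999.0, 2.0, -99.0, 3.0])
attrs = {'_FillValue': -999.0, 'missing_value': -99.0}
masked = mask_missing(raw, attrs)
print(masked)
```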
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2016/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
303727896 MDU6SXNzdWUzMDM3Mjc4OTY= 1976 What's wrong with "conflicting" _FillValue and missing_value? dopplershift 221526 closed 0     2 2018-03-09T05:21:27Z 2018-03-09T17:45:35Z 2018-03-09T17:45:35Z CONTRIBUTOR      

So this exception:

```
ValueError: Conflicting _FillValue and missing_value attrs on a variable 'MergedBaseReflectivityQC_altitude_above_msl': -999.0 vs. -99.0

Consider opening the offending dataset using decode_cf=False, correcting the attrs and decoding explicitly using xarray.decode_cf().
```

Why is having `_FillValue` and `missing_value` differ considered an error in decoding CF? It's perfectly CF-compliant, especially since `_FillValue` is a scalar (used by the netCDF library to initialize an array), and `missing_value` can be a vector (representing one or more undefined or invalid values).

This happens in this case because the source GRIB file has one value specified for "missing" (maps to missing_value) and another for "no coverage" (which has been mapped to _FillValue).

Is this a technical limitation? Or just something that needs an implementation?

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/1976/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
250747314 MDExOlB1bGxSZXF1ZXN0MTM2MTEzMjA2 1508 ENH: Support using opened netCDF4.Dataset (Fixes #1459) dopplershift 221526 closed 0   0.10 2415632 5 2017-08-16T20:19:01Z 2017-08-31T22:24:36Z 2017-08-31T17:18:51Z CONTRIBUTOR   0 pydata/xarray/pulls/1508

Make the filename argument to NetCDF4DataStore polymorphic so that a Dataset can be passed in.

  • [x] Closes #1459
  • [x] Tests added / passed
  • [x] Passes git diff upstream/master | flake8 --diff
  • [x] Fully documented, including whats-new.rst for all changes and api.rst for new API

#1459 discussed adding an alternate constructor (i.e. a class method) to NetCDF4DataStore to allow this, which would be my preferred approach rather than making a filename polymorphic (via isinstance). Unfortunately, alternate constructors only work by taking one set of parameters (or setting defaults) and then passing them to the original constructor. Given that, there's no way to make an alternate constructor without also making the original constructor somehow aware of this functionality--or breaking backwards-compatibility. I'm open to suggestions to the contrary.
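For illustration, one way an alternate constructor can avoid routing through the primary constructor is to allocate with `object.__new__`; the class and method names here are hypothetical, not xarray's actual code:

```python
class DataStore:
    # Illustrative sketch: the primary constructor stays in charge of
    # opening by filename, while a classmethod wraps an existing handle.
    def __init__(self, filename, mode='r'):
        self._ds = self._open(filename, mode)

    @classmethod
    def from_handle(cls, handle):
        # Bypass __init__ entirely, so the original constructor does
        # not need to know about pre-opened handles (the concern
        # raised above about backwards-compatibility).
        self = object.__new__(cls)
        self._ds = handle
        return self

    @staticmethod
    def _open(filename, mode):
        # Stand-in for opening a real netCDF4.Dataset.
        return {'filename': filename, 'mode': mode}

handle = {'filename': 'in-memory', 'mode': 'r'}
store = DataStore.from_handle(handle)
print(store._ds['filename'])  # in-memory
```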

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/1508/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
236595831 MDU6SXNzdWUyMzY1OTU4MzE= 1459 xarray.Dataset from existing netCDF4.Dataset dopplershift 221526 closed 0     2 2017-06-16T21:03:21Z 2017-08-31T17:18:51Z 2017-08-31T17:18:51Z CONTRIBUTOR      

It would be really handy to be able to initialize a xarray.Dataset instance from an already opened instance of netCDF4.Dataset. I have a lot of code where I'm already returning an opened netCDF4 file and this would streamline the process of hooking xarray into that.

It seems like the quick solution here would be to make NetCDF4DataStore accept a netCDF4.Dataset instance as filename, which would bypass the creation of a new instance. Thoughts?

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/1459/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue

CREATE TABLE [issues] (
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [number] INTEGER,
   [title] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [state] TEXT,
   [locked] INTEGER,
   [assignee] INTEGER REFERENCES [users]([id]),
   [milestone] INTEGER REFERENCES [milestones]([id]),
   [comments] INTEGER,
   [created_at] TEXT,
   [updated_at] TEXT,
   [closed_at] TEXT,
   [author_association] TEXT,
   [active_lock_reason] TEXT,
   [draft] INTEGER,
   [pull_request] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [state_reason] TEXT,
   [repo] INTEGER REFERENCES [repos]([id]),
   [type] TEXT
);
CREATE INDEX [idx_issues_repo]
    ON [issues] ([repo]);
CREATE INDEX [idx_issues_milestone]
    ON [issues] ([milestone]);
CREATE INDEX [idx_issues_assignee]
    ON [issues] ([assignee]);
CREATE INDEX [idx_issues_user]
    ON [issues] ([user]);
Powered by Datasette · Queries took 23.76ms · About: xarray-datasette