issues
37 rows where user = 500246 sorted by updated_at descending
id | node_id | number | title | user | state | locked | assignee | milestone | comments | created_at | updated_at | closed_at | author_association | active_lock_reason | draft | pull_request | body | reactions | performed_via_github_app | state_reason | repo | type |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
377356113 | MDU6SXNzdWUzNzczNTYxMTM= | 2542 | full_like, ones_like, zeros_like should retain subclasses | gerritholl 500246 | closed | 0 | 4 | 2018-11-05T11:22:49Z | 2023-11-05T06:27:31Z | 2023-11-05T06:27:31Z | CONTRIBUTOR | Code Sample
```python
# Your code here
import numpy
import xarray

class MyDataArray(xarray.DataArray):
    pass

da = MyDataArray(numpy.arange(5))
da2 = xarray.zeros_like(da)
print(type(da), type(da2))
```
Problem description: I would expect that
Expected Output: I would hope as an output:
In principle changing this could break people's code, so if a change is implemented it should probably be through an optional keyword argument to the
Output of
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/2542/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
not_planned | xarray 13221727 | issue | ||||||
352999600 | MDU6SXNzdWUzNTI5OTk2MDA= | 2377 | Comparing scalar xarray with ma.masked fails with ValueError: assignment destination is read-only | gerritholl 500246 | closed | 0 | 5 | 2018-08-22T15:11:54Z | 2023-05-17T16:06:01Z | 2023-05-17T16:06:01Z | CONTRIBUTOR | Code Sample, a copy-pastable example if possible (see the sketch after this row)
Problem descriptionThis results in ```ValueError Traceback (most recent call last) <ipython-input-102-f6226708b971> in <module>() ----> 1 xarray.DataArray(0) > numpy.ma.masked /group_workspaces/cems2/fiduceo/Users/gholl/anaconda3/envs/FCDR37a/lib/python3.7/site-packages/xarray/core/dataarray.py in func(self, other) 1808 1809 variable = (f(self.variable, other_variable) -> 1810 if not reflexive 1811 else f(other_variable, self.variable)) 1812 coords = self.coords._merge_raw(other_coords) /group_workspaces/cems2/fiduceo/Users/gholl/anaconda3/envs/FCDR37a/lib/python3.7/site-packages/xarray/core/variable.py in func(self, other) 1580 if not reflexive 1581 else f(other_data, self_data)) -> 1582 result = Variable(dims, new_data) 1583 return result 1584 return func /group_workspaces/cems2/fiduceo/Users/gholl/anaconda3/envs/FCDR37a/lib/python3.7/site-packages/xarray/core/variable.py in init(self, dims, data, attrs, encoding, fastpath) 260 unrecognized encoding items. 261 """ --> 262 self._data = as_compatible_data(data, fastpath=fastpath) 263 self._dims = self._parse_dimensions(dims) 264 self._attrs = None /group_workspaces/cems2/fiduceo/Users/gholl/anaconda3/envs/FCDR37a/lib/python3.7/site-packages/xarray/core/variable.py in as_compatible_data(data, fastpath) 177 dtype, fill_value = dtypes.maybe_promote(data.dtype) 178 data = np.asarray(data, dtype=dtype) --> 179 data[mask] = fill_value 180 else: 181 data = np.asarray(data) ValueError: assignment destination is read-only ``` Expected OutputTo be consistent, the result should be identical to the result of
which would be
Output of
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/2377/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
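The copy-pastable sample is not included in this row, but the traceback points at the failing line (`xarray.DataArray(0) > numpy.ma.masked`). A minimal reproduction sketch along those lines; whether it still raises depends on the xarray version (the issue was closed in 2023):

```python
import numpy
import xarray

# Comparing a 0-d DataArray against numpy's masked constant; on the xarray
# version reported in this issue this raised
# "ValueError: assignment destination is read-only" inside as_compatible_data.
print(xarray.DataArray(0) > numpy.ma.masked)
```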
1608352581 | I_kwDOAMm_X85f3YNF | 7581 | xr.where loses attributes despite keep_attrs=True | gerritholl 500246 | closed | 0 | 1 | 2023-03-03T10:14:34Z | 2023-04-06T01:58:45Z | 2023-04-06T01:58:45Z | CONTRIBUTOR | What happened? I'm using
What did you expect to happen? I expect that if I use
Minimal Complete Verifiable Example (see the sketch after this row)
MVCE confirmation
Relevant log output
Anything else we need to know? I can make a workaround by turning the logic around, such as
The workaround works in this case, but
Environment
INSTALLED VERSIONS
------------------
commit: None
python: 3.11.0 | packaged by conda-forge | (main, Jan 14 2023, 12:27:40) [GCC 11.3.0]
python-bits: 64
OS: Linux
OS-release: 5.3.18-150300.59.76-default
machine: x86_64
processor: x86_64
byteorder: little
LC_ALL: None
LANG: en_GB.UTF-8
LOCALE: ('en_GB', 'UTF-8')
libhdf5: 1.12.2
libnetcdf: 4.9.1
xarray: 2023.2.0
pandas: 1.5.3
numpy: 1.24.2
scipy: 1.10.1
netCDF4: 1.6.2
pydap: None
h5netcdf: 1.1.0
h5py: 3.8.0
Nio: None
zarr: 2.13.6
cftime: 1.6.2
nc_time_axis: None
PseudoNetCDF: None
rasterio: 1.3.6
cfgrib: None
iris: None
bottleneck: 1.3.6
dask: 2023.2.1
distributed: 2023.2.1
matplotlib: 3.7.0
cartopy: 0.21.1
seaborn: None
numbagg: None
fsspec: 2023.1.0
cupy: None
pint: 0.20.1
sparse: None
flox: None
numpy_groupies: None
setuptools: 67.4.0
pip: 23.0.1
conda: None
pytest: 7.2.1
mypy: None
IPython: 8.7.0
sphinx: 5.3.0
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/7581/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
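The MVCE for this row is not shown above. A plausible reproduction sketch, assuming the pattern the workaround hints at (the DataArray carrying the attributes passed as the third argument of `xr.where`, a scalar as the second); attribute-propagation rules differ between xarray versions, so treat the comments as indicative only:

```python
import numpy as np
import xarray as xr

da = xr.DataArray(np.arange(5.0), dims=("x",), attrs={"units": "K"})

# keep_attrs=True takes the attributes from the second argument; when that
# argument is a plain scalar, the result can come back with empty attrs
# on the affected versions.
masked = xr.where(da > 2, np.nan, da, keep_attrs=True)
print(masked.attrs)

# Workaround described in the issue: turn the logic around so that the
# DataArray carrying the attributes is the second argument.
masked2 = xr.where(da <= 2, da, np.nan, keep_attrs=True)
print(masked2.attrs)
```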
203999231 | MDU6SXNzdWUyMDM5OTkyMzE= | 1238 | `set_index` converts string-dtype to object-dtype | gerritholl 500246 | open | 0 | 10 | 2017-01-30T12:37:05Z | 2023-03-13T14:09:21Z | CONTRIBUTOR | 'Dataset.set_index' apparently changes a ``` In [108]: ds = xarray.Dataset({"x": (("a", "b"), arange(25).reshape(5,5)+100), "y": ("b", arange(5)-100)}, {"a": arange(5), "b": arange(5)*2, "c": (("a",), list("ABCDE"))}) In [109]: print(ds) <xarray.Dataset> Dimensions: (a: 5, b: 5) Coordinates: * b (b) int64 0 2 4 6 8 c (a) <U1 'A' 'B' 'C' 'D' 'E' * a (a) int64 0 1 2 3 4 Data variables: x (a, b) int64 100 101 102 103 104 105 106 107 108 109 110 111 ... y (b) int64 -100 -99 -98 -97 -96 In [110]: print(ds.set_index(a='c')) <xarray.Dataset> Dimensions: (a: 5, b: 5) Coordinates: * b (b) int64 0 2 4 6 8 * a (a) object 'A' 'B' 'C' 'D' 'E' Data variables: x (a, b) int64 100 101 102 103 104 105 106 107 108 109 110 111 ... y (b) int64 -100 -99 -98 -97 -96 ``` |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/1238/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | issue | ||||||||
741806260 | MDU6SXNzdWU3NDE4MDYyNjA= | 4579 | Invisible differences between arrays using IntervalIndex | gerritholl 500246 | open | 0 | 2 | 2020-11-12T17:54:55Z | 2022-10-03T15:09:25Z | CONTRIBUTOR | What happened: I have two What you expected to happen: I expect two arrays that appear identical to behave identically. If they don't behave identically then there should be some way to tell the difference (apart from Minimal Complete Verifiable Example: ```python import xarray import pandas da1 = xarray.DataArray([0, 1, 2], dims=("x",), coords={"x": pandas.interval_range(0, 2, 3)}) da2 = xarray.DataArray([0, 1, 2], dims=("x",), coords={"x": pandas.interval_range(0, 2, 3).to_numpy()}) print(repr(da1) == repr(da2)) print(repr(da1.x) == repr(da2.x)) print(da1.x.dtype == da2.x.dtype) identical? No:print(da1.equals(da2)) print(da1.x.equals(da2.x)) in particular:da1.sel(x=1) # works da2.sel(x=1) # fails ``` Results in: ``` True True True False False Traceback (most recent call last): File "/data/gholl/miniconda3/envs/py38/lib/python3.8/site-packages/pandas/core/indexes/base.py", line 2895, in get_loc return self._engine.get_loc(casted_key) File "pandas/_libs/index.pyx", line 70, in pandas._libs.index.IndexEngine.get_loc File "pandas/_libs/index.pyx", line 101, in pandas._libs.index.IndexEngine.get_loc File "pandas/_libs/hashtable_class_helper.pxi", line 1675, in pandas._libs.hashtable.PyObjectHashTable.get_item File "pandas/_libs/hashtable_class_helper.pxi", line 1683, in pandas._libs.hashtable.PyObjectHashTable.get_item KeyError: 1 The above exception was the direct cause of the following exception: Traceback (most recent call last): File "mwe105.py", line 19, in <module> da2.sel(x=1) # fails File "/data/gholl/miniconda3/envs/py38/lib/python3.8/site-packages/xarray/core/dataarray.py", line 1143, in sel ds = self._to_temp_dataset().sel( File "/data/gholl/miniconda3/envs/py38/lib/python3.8/site-packages/xarray/core/dataset.py", line 2105, in sel pos_indexers, new_indexes = remap_label_indexers( File "/data/gholl/miniconda3/envs/py38/lib/python3.8/site-packages/xarray/core/coordinates.py", line 397, in remap_label_indexers pos_indexers, new_indexes = indexing.remap_label_indexers( File "/data/gholl/miniconda3/envs/py38/lib/python3.8/site-packages/xarray/core/indexing.py", line 275, in remap_label_indexers idxr, new_idx = convert_label_indexer(index, label, dim, method, tolerance) File "/data/gholl/miniconda3/envs/py38/lib/python3.8/site-packages/xarray/core/indexing.py", line 196, in convert_label_indexer indexer = index.get_loc(label_value, method=method, tolerance=tolerance) File "/data/gholl/miniconda3/envs/py38/lib/python3.8/site-packages/pandas/core/indexes/base.py", line 2897, in get_loc raise KeyError(key) from err KeyError: 1 ``` Additional context I suppose this happens because under the hood xarray does something clever to support pandas-style indexing even though the coordinate variable appears like a numpy array with an object dtype, and that this cleverness is lost if the object is already converted to a numpy array. But there is, as far as I can see, no way to tell the difference once the objects have been created. 
Environment: Output of <tt>xr.show_versions()</tt>INSTALLED VERSIONS ------------------ commit: None python: 3.8.6 | packaged by conda-forge | (default, Oct 7 2020, 19:08:05) [GCC 7.5.0] python-bits: 64 OS: Linux OS-release: 4.12.14-lp150.12.82-default machine: x86_64 processor: x86_64 byteorder: little LC_ALL: None LANG: en_GB.UTF-8 LOCALE: en_GB.UTF-8 libhdf5: 1.10.6 libnetcdf: 4.7.4 xarray: 0.16.1 pandas: 1.1.4 numpy: 1.19.4 scipy: 1.5.3 netCDF4: 1.5.4 pydap: None h5netcdf: 0.8.1 h5py: 3.1.0 Nio: None zarr: 2.5.0 cftime: 1.2.1 nc_time_axis: None PseudoNetCDF: None rasterio: 1.1.7 cfgrib: None iris: None bottleneck: None dask: 2.30.0 distributed: 2.30.1 matplotlib: 3.3.2 cartopy: 0.18.0 seaborn: None numbagg: None pint: None setuptools: 49.6.0.post20201009 pip: 20.2.4 conda: installed pytest: 6.1.2 IPython: 7.19.0 sphinx: 3.3.0 |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4579/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | issue | ||||||||
217216935 | MDU6SXNzdWUyMTcyMTY5MzU= | 1329 | Cannot open NetCDF file if dimension with time coordinate has length 0 (`ValueError` when decoding CF datetime) | gerritholl 500246 | closed | 0 | 7 | 2017-03-27T11:33:07Z | 2022-08-10T17:25:20Z | 2022-08-10T17:25:20Z | CONTRIBUTOR | If a data set has a zero-sized coordinate that is a time index, reading fails. A ``` $ cat mwe.py !/usr/bin/env pythonimport numpy import xarray ds = xarray.Dataset( {"a": ("x", [])}, coords={"x": numpy.zeros(shape=0, dtype="M8[ns]")}) ds.to_netcdf("/tmp/test.nc") xarray.open_dataset("/tmp/test.nc") $ ./mwe.py Traceback (most recent call last): File "./mwe.py", line 12, in <module> xarray.open_dataset("/tmp/test.nc") File "/dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/backends/api.py", line 302, in open_dataset return maybe_decode_store(store, lock) File "/dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/backends/api.py", line 223, in maybe_decode_store drop_variables=drop_variables) File "/dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/conventions.py", line 952, in decode_cf ds = Dataset(vars, attrs=attrs) File "/dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/core/dataset.py", line 358, in init self._set_init_vars_and_dims(data_vars, coords, compat) File "/dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/core/dataset.py", line 373, in _set_init_vars_and_dims data_vars, coords, compat=compat) File "/dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/core/merge.py", line 365, in merge_data_and_coords return merge_core(objs, compat, join, explicit_coords=explicit_coords) File "/dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/core/merge.py", line 413, in merge_core expanded = expand_variable_dicts(aligned) File "/dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/core/merge.py", line 213, in expand_variable_dicts var = as_variable(var, name=name) File "/dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/core/variable.py", line 83, in as_variable obj = obj.to_index_variable() File "/dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/core/variable.py", line 322, in to_index_variable encoding=self._encoding, fastpath=True) File "/dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/core/variable.py", line 1173, in init self._data = PandasIndexAdapter(self._data) File "/dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/core/indexing.py", line 497, in init self.array = utils.safe_cast_to_index(array) File "/dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/core/utils.py", line 57, in safe_cast_to_index index = pd.Index(np.asarray(array), **kwargs) File "/dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/numpy/core/numeric.py", line 531, in asarray return array(a, dtype, copy=False, order=order) File "/dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/core/indexing.py", line 373, in array return np.asarray(array[self.key], dtype=None) File "/dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/conventions.py", line 408, in getitem calendar=self.calendar) File "/dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/conventions.py", line 151, in decode_cf_datetime pd.to_timedelta(flat_num_dates.min(), delta) + ref_date File "/dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/numpy/core/_methods.py", line 29, in _amin return umr_minimum(a, axis, None, out, 
keepdims) ValueError: zero-size array to reduction operation minimum which has no identity $ ncdump /tmp/test.nc netcdf test { dimensions: x = UNLIMITED ; // (0 currently) variables: double a(x) ; a:_FillValue = NaN ; int64 x(x) ; x:units = "days since 1970-01-01 00:00:00" ; x:calendar = "proleptic_gregorian" ; // global attributes: :_NCProperties = "version=1|netcdflibversion=4.4.1|hdf5libversion=1.8.18" ; data: } ``` |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/1329/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
232623945 | MDU6SXNzdWUyMzI2MjM5NDU= | 1435 | xarray.plot.imshow with datetime coordinates results in blank plot | gerritholl 500246 | open | 0 | 6 | 2017-05-31T16:31:30Z | 2022-05-03T01:56:37Z | CONTRIBUTOR | ```
In [72]: da = xarray.DataArray(arange(5*6).reshape(5,6), dims=("A", "B"), coords={"A": arange(5), "B": pd.date_range("2000-01-01", periods=6)})

In [73]: da.plot.imshow()
Out[73]: <matplotlib.image.AxesImage at 0x7f699cf1acf8>
```
The resulting plot has the correct axes and colorbar, but the contents of the plot itself are blank. Upon moving the cursor over the plot, there is an exception in
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/1435/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | issue | ||||||||
686461572 | MDU6SXNzdWU2ODY0NjE1NzI= | 4378 | Plotting when Interval coordinate is timedelta-based | gerritholl 500246 | open | 0 | 2 | 2020-08-26T16:36:27Z | 2022-04-18T21:55:15Z | CONTRIBUTOR | Is your feature request related to a problem? Please describe. The xarray plotting interface supports coordinates containing
```python
import numpy as np
import pandas as pd
import xarray as xr

da = xr.DataArray(
    np.arange(10),
    dims=("x",),
    coords={"x": [pd.Interval(i, i+1) for i in range(10)]})
da.plot()  # works

da = xr.DataArray(
    np.arange(10),
    dims=("x",),
    coords={"x": [pd.Interval(
        d - pd.Timestamp("2000-01-01"),
        d - pd.Timestamp("2000-01-01") + pd.Timedelta("1H"))
        for d in pd.date_range("2000-01-01", "2000-01-02", 10)]})
da.plot()  # fails
```
The latter fails with:
This error message is somewhat confusing, because the coordinates are "dates of type (...) pd.Interval", but perhaps a timedelta is not considered a date.
Describe the solution you'd like: I would like to be able to use the xarray plotting interface for any pandas.Interval coordinate, including
Describe alternatives you've considered: I'll "manually" calculate the midpoints and use those as a timedelta coordinate instead.
Additional context: It seems that regular timedeltas aren't really supported either; they don't cause an error message but rather produce incorrect results. There's probably a related issue somewhere, but I can't find it now. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4378/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | issue | ||||||||
203630267 | MDU6SXNzdWUyMDM2MzAyNjc= | 1234 | `where` grows new dimensions for unrelated variables | gerritholl 500246 | open | 0 | 5 | 2017-01-27T13:02:34Z | 2022-04-18T16:04:16Z | CONTRIBUTOR | In the example below, the dimensionality for data variable (see the sketch after this row)
```
In [46]: ds = xarray.Dataset({"x": (("a", "b"), arange(25).reshape(5,5)+100), "y": ("b", arange(5)-100)}, {"a": arange(5), "b": arange(5)*2, "c": (("a",), list("ABCDE"))})
``` |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/1234/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | issue | ||||||||
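The transcript above stops before the `where` call itself; the sketch below (an assumed illustration, not the original follow-up cell) shows the behaviour the title describes: the condition is broadcast against every data variable, so `y`, which only has dimension `b`, acquires dimension `a` as well.

```python
import numpy as np
import xarray as xr

ds = xr.Dataset(
    {"x": (("a", "b"), np.arange(25).reshape(5, 5) + 100),
     "y": ("b", np.arange(5) - 100)},
    coords={"a": np.arange(5), "b": np.arange(5) * 2,
            "c": (("a",), list("ABCDE"))})

print(ds["y"].dims)               # ('b',)
masked = ds.where(ds["x"] > 110)  # condition has dims ('a', 'b')
print(masked["y"].dims)           # now includes 'a' as well -- the grown dimension
```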
751732952 | MDU6SXNzdWU3NTE3MzI5NTI= | 4612 | Assigning nan to int-dtype array converts nan to int | gerritholl 500246 | open | 0 | 1 | 2020-11-26T17:00:45Z | 2021-01-02T03:55:30Z | CONTRIBUTOR | (I am almost sure this already exists as an issue, but I can't find the original) What happened: When assigning nan to an integer-dtype array, the nan gets incorrectly converted to int. What you expected to happen: I expect to get a
Minimal Complete Verifiable Example (see the sketch after this row):
Gives:
Anything else we need to know?: In numpy the equivalent code raises This is related but different from #2945. In #2945, xarray behaves the same as numpy. In #4612, xarray behaves differently from numpy. Environment: Output of <tt>xr.show_versions()</tt>INSTALLED VERSIONS ------------------ commit: None python: 3.8.6 | packaged by conda-forge | (default, Oct 7 2020, 19:08:05) [GCC 7.5.0] python-bits: 64 OS: Linux OS-release: 4.12.14-lp150.12.82-default machine: x86_64 processor: x86_64 byteorder: little LC_ALL: None LANG: en_GB.UTF-8 LOCALE: en_GB.UTF-8 libhdf5: 1.10.6 libnetcdf: 4.7.4 xarray: 0.16.1 pandas: 1.1.4 numpy: 1.19.4 scipy: 1.5.3 netCDF4: 1.5.4 pydap: None h5netcdf: 0.8.1 h5py: 3.1.0 Nio: None zarr: 2.5.0 cftime: 1.2.1 nc_time_axis: None PseudoNetCDF: None rasterio: 1.1.7 cfgrib: None iris: None bottleneck: None dask: 2.30.0 distributed: 2.30.1 matplotlib: 3.3.2 cartopy: 0.18.0 seaborn: None numbagg: None pint: None setuptools: 49.6.0.post20201009 pip: 20.2.4 conda: installed pytest: 6.1.2 IPython: 7.19.0 sphinx: 3.3.0 |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4612/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | issue | ||||||||
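The MCVE and its output are not reproduced in this row; a small sketch of the reported contrast, assuming the behaviour is unchanged on the version in question:

```python
import numpy as np
import xarray as xr

da = xr.DataArray(np.arange(5))   # integer-dtype array
da[2] = np.nan                    # silently stores some integer instead of nan
print(da.dtype, da.values)

# numpy itself refuses the equivalent assignment:
arr = np.arange(5)
try:
    arr[2] = np.nan
except ValueError as err:
    print("numpy raises:", err)
```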
376104925 | MDU6SXNzdWUzNzYxMDQ5MjU= | 2529 | numpy.insert on DataArray may silently result in array inconsistent with its coordinates | gerritholl 500246 | closed | 0 | 1 | 2018-10-31T18:33:23Z | 2020-11-07T21:55:42Z | 2020-11-07T21:55:42Z | CONTRIBUTOR |
```python
import numpy
import xarray

da = xarray.DataArray(
    numpy.arange(10*3).reshape(10, 3),
    dims=("x", "y"),
    coords={"foo": (("x", "y"), numpy.arange(3*10).reshape(10, 3))})
print(da.shape == da["foo"].shape)
da2 = numpy.insert(da, 3, 0, axis=0)
print(da2.shape == da2["foo"].shape)
```
Problem description: Running the code snippet gives
and does not raise any exception. In the resulting
Expected Output: I would expect to get an exception, telling me that the insertion has failed because there are coordinates associated with the axis along which we are inserting values. It would be nice to have an
Output of
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/2529/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
714844298 | MDExOlB1bGxSZXF1ZXN0NDk3ODU3MTA0 | 4485 | Handle scale_factor and add_offset as scalar | gerritholl 500246 | closed | 0 | 3 | 2020-10-05T13:31:36Z | 2020-10-16T21:20:14Z | 2020-10-11T20:06:33Z | CONTRIBUTOR | 0 | pydata/xarray/pulls/4485 | The h5netcdf engine exposes single-valued attributes as arrays of shape (1,), which is correct according to the NetCDF standard, but may cause a problem when reading a value of shape () before the scale_factor and add_offset have been applied. This PR adds a check for the dimensionality of add_offset and scale_factor and ensures they are scalar before they are used for further processing, adds a unit test to verify that this works correctly, and a note to the documentation to warn users of this difference between the h5netcdf and netcdf4 engines.
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4485/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
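The diff itself is not part of this row; as a rough illustration of the idea the PR text describes (coercing a shape-(1,) `scale_factor`/`add_offset` attribute to a scalar before it is applied), not the actual xarray code:

```python
import numpy as np

def ensure_scalar(value):
    """Return a plain scalar for a scalar or length-1 array-valued attribute,
    as the h5netcdf engine may expose scale_factor/add_offset.
    Illustrative only, not the xarray implementation."""
    return np.asarray(value).item()

print(ensure_scalar(np.array([0.01])))  # 0.01
print(ensure_scalar(2.0))               # 2.0
```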
618985094 | MDU6SXNzdWU2MTg5ODUwOTQ= | 4065 | keep_attrs not respected for unary operators | gerritholl 500246 | closed | 0 | 2 | 2020-05-15T13:55:14Z | 2020-10-14T16:29:51Z | 2020-10-14T16:29:51Z | CONTRIBUTOR | The xarray global option
MCVE Code Sample (see the sketch after this row)
Expected Output: I expect
Problem Description: I get:
I get the same for the other unary operators VersionsTested with latest xarray master (see below for details). Output of <tt>xr.show_versions()</tt>INSTALLED VERSIONS ------------------ commit: None python: 3.8.2 | packaged by conda-forge | (default, Mar 23 2020, 18:16:37) [GCC 7.3.0] python-bits: 64 OS: Linux OS-release: 4.12.14-lp150.12.82-default machine: x86_64 processor: x86_64 byteorder: little LC_ALL: None LANG: en_GB.UTF-8 LOCALE: en_GB.UTF-8 libhdf5: 1.10.5 libnetcdf: 4.7.4 xarray: 0.15.2.dev64+g2542a63f pandas: 1.0.3 numpy: 1.18.1 scipy: 1.4.1 netCDF4: 1.5.3 pydap: None h5netcdf: None h5py: 2.10.0 Nio: None zarr: 2.4.0 cftime: 1.1.1.2 nc_time_axis: None PseudoNetCDF: None rasterio: 1.1.3 cfgrib: None iris: None bottleneck: None dask: 2.14.0 distributed: 2.14.0 matplotlib: 3.2.1 cartopy: 0.17.0 seaborn: None numbagg: None pint: None setuptools: 46.1.3.post20200325 pip: 20.0.2 conda: installed pytest: 5.4.1 IPython: 7.13.0 sphinx: None |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4065/reactions", "total_count": 2, "+1": 2, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
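The MCVE block is empty in this row; a sketch of what such an example presumably looked like (hedged: the unary case was fixed in October 2020, so on current versions both prints keep the attrs):

```python
import xarray as xr

da = xr.DataArray([0.0, 1.0, 2.0], dims=("x",), attrs={"units": "K"})

with xr.set_options(keep_attrs=True):
    print((da + 1).attrs)  # binary arithmetic honours the option
    print((-da).attrs)     # unary minus dropped the attrs on the affected versions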
710876876 | MDU6SXNzdWU3MTA4NzY4NzY= | 4471 | Numeric scalar variable attributes (including fill_value, scale_factor, add_offset) are 1-d instead of 0-d with h5netcdf engine, triggering ValueError: non-broadcastable output on application when loading single elements | gerritholl 500246 | closed | 0 | 13 | 2020-09-29T08:15:48Z | 2020-10-11T20:06:33Z | 2020-10-11T20:06:33Z | CONTRIBUTOR | What happened: When I try to open a NetCDF file using the
What you expected to happen: I expect the data access to work in the same way as when opening with other engines.
Minimal Complete Verifiable Example (see the sketch after this row):
Anything else we need to know?: An earlier version of this issue, and some comments, refer to fsspec or working on open files, but that proved to have nothing to do with the problem. Environment: I've confirmed this issue installing xarray from latest master, which means xarray 0.16.2.dev11+gf821fe20 at the time of writing, Output of <tt>xr.show_versions()</tt>INSTALLED VERSIONS ------------------ commit: None python: 3.8.5 | packaged by conda-forge | (default, Sep 24 2020, 16:55:52) [GCC 7.5.0] python-bits: 64 OS: Linux OS-release: 4.12.14-lp150.12.82-default machine: x86_64 processor: x86_64 byteorder: little LC_ALL: None LANG: en_GB.UTF-8 LOCALE: en_GB.UTF-8 libhdf5: 1.10.6 libnetcdf: 4.7.4 xarray: 0.16.2.dev11+gf821fe20 pandas: 1.1.2 numpy: 1.19.1 scipy: 1.5.2 netCDF4: 1.5.4 pydap: None h5netcdf: 0.8.1 h5py: 2.10.0 Nio: None zarr: 2.4.0 cftime: 1.2.1 nc_time_axis: None PseudoNetCDF: None rasterio: 1.1.6 cfgrib: None iris: None bottleneck: None dask: 2.27.0 distributed: 2.27.0 matplotlib: 3.3.2 cartopy: None seaborn: None numbagg: None pint: None setuptools: 49.6.0.post20200917 pip: 20.2.3 conda: None pytest: 6.0.2 IPython: None sphinx: None |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4471/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
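The MCVE is likewise not shown; a self-contained sketch of the reported pattern (an assumption, not the original example; requires h5netcdf and writes a small file; on the affected versions the last line raised `ValueError: non-broadcastable output ...`):

```python
import numpy as np
import xarray as xr

# Write a variable that gets packed with scale_factor/add_offset on disk.
ds = xr.Dataset({"v": ("x", np.arange(10.0))})
ds["v"].encoding.update(dtype="int16", scale_factor=0.1, add_offset=0.0)
ds.to_netcdf("scaled.nc")

# Read a single element back through the h5netcdf engine: there the
# attributes come back as shape-(1,) arrays, which is what tripped up
# unpacking of a 0-d selection.
with xr.open_dataset("scaled.nc", engine="h5netcdf") as ds2:
    print(ds2["v"][0].values)
```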
283345586 | MDU6SXNzdWUyODMzNDU1ODY= | 1792 | Comparison with masked array yields object-array with nans for masked values | gerritholl 500246 | open | 0 | 3 | 2017-12-19T19:37:13Z | 2020-10-11T13:34:25Z | CONTRIBUTOR | Code Sample, a copy-pastable example if possible
```
$ cat mwe.py
#!/usr/bin/env python3.6
import xarray
import numpy
da = xarray.DataArray(numpy.arange(5))
ma = numpy.ma.masked_array(numpy.arange(5), [True, False, False, False, True])
print(da>ma)
$ ./mwe.py
<xarray.DataArray (dim_0: 5)>
array([nan, False, False, False, nan], dtype=object)
Dimensions without coordinates: dim_0
```
Problem description: A comparison between a
Expected Output: I would expect the masked array to be dropped (which it is) and an array to be returned equivalent to the comparison
Output of
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/1792/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | issue | ||||||||
204071440 | MDU6SXNzdWUyMDQwNzE0NDA= | 1240 | Cannot use xarrays own times for indexing | gerritholl 500246 | closed | 0 | 9 | 2017-01-30T17:12:08Z | 2020-08-28T09:48:56Z | 2018-03-18T21:04:07Z | CONTRIBUTOR | I need to get the first Δt from the start of my dataset, i.e. ``` In [282]: time = pd.date_range('2000-01-01', freq='H', periods=365 * 24) In [283]: ds = xarray.Dataset({'foo': ('time', np.arange(365 * 24)), 'time': time}) In [284]: ds.sel(time=slice(ds["time"][0], ds["time"][10]))TypeError Traceback (most recent call last) <ipython-input-284-a101e126e1b0> in <module>() ----> 1 ds.sel(time=slice(ds["time"][0], ds["time"][10])) /dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/core/dataset.py in sel(self, method, tolerance, drop, indexers) 1180 """ 1181 pos_indexers, new_indexes = indexing.remap_label_indexers( -> 1182 self, indexers, method=method, tolerance=tolerance 1183 ) 1184 result = self.isel(drop=drop, pos_indexers) /dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/core/indexing.py in remap_label_indexers(data_obj, indexers, method, tolerance) 286 else: 287 idxr, new_idx = convert_label_indexer(index, label, --> 288 dim, method, tolerance) 289 pos_indexers[dim] = idxr 290 if new_idx is not None: /dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/core/indexing.py in convert_label_indexer(index, label, index_name, method, tolerance) 183 indexer = index.slice_indexer(_try_get_item(label.start), 184 _try_get_item(label.stop), --> 185 _try_get_item(label.step)) 186 if not isinstance(indexer, slice): 187 # unlike pandas, in xarray we never want to silently convert a slice /dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/pandas/tseries/index.py in slice_indexer(self, start, end, step, kind) 1496 1497 try: -> 1498 return Index.slice_indexer(self, start, end, step, kind=kind) 1499 except KeyError: 1500 # For historical reasons DatetimeIndex by default supports /dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/pandas/indexes/base.py in slice_indexer(self, start, end, step, kind) 2995 """ 2996 start_slice, end_slice = self.slice_locs(start, end, step=step, -> 2997 kind=kind) 2998 2999 # return a slice /dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/pandas/indexes/base.py in slice_locs(self, start, end, step, kind) 3174 start_slice = None 3175 if start is not None: -> 3176 start_slice = self.get_slice_bound(start, 'left', kind) 3177 if start_slice is None: 3178 start_slice = 0 /dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/pandas/indexes/base.py in get_slice_bound(self, label, side, kind) 3113 # For datetime indices label may be a string that has to be converted 3114 # to datetime boundary according to its resolution. 
-> 3115 label = self._maybe_cast_slice_bound(label, side, kind) 3116 3117 # we need to look up the label /dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/pandas/tseries/index.py in _maybe_cast_slice_bound(self, label, side, kind) 1444 1445 if is_float(label) or isinstance(label, time) or is_integer(label): -> 1446 self._invalid_indexer('slice', label) 1447 1448 if isinstance(label, compat.string_types): /dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/pandas/indexes/base.py in _invalid_indexer(self, form, key) 1282 "indexers [{key}] of {kind}".format( 1283 form=form, klass=type(self), key=key, -> 1284 kind=type(key))) 1285 1286 def get_duplicates(self): TypeError: cannot do slice indexing on <class 'pandas.tseries.index.DatetimeIndex'> with these indexers [946684800000000000] of <class 'int'> ``` |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/1240/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
212177054 | MDU6SXNzdWUyMTIxNzcwNTQ= | 1297 | Encoding lost upon concatenation | gerritholl 500246 | closed | 0 | 6 | 2017-03-06T16:33:40Z | 2020-04-05T19:12:41Z | 2019-02-06T22:51:19Z | CONTRIBUTOR | When using
```
In [64]: da = xarray.DataArray([1, 2, 3, 2, 1])
In [65]: da.attrs.update(foo="bar")
In [66]: da.encoding.update(complevel=5)
In [67]: da2 = xarray.concat((da, da), dim="new")
In [68]: print(da2.attrs)
OrderedDict([('foo', 'bar')])
In [69]: print(da2.encoding)
{}
```
{ "url": "https://api.github.com/repos/pydata/xarray/issues/1297/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
199188476 | MDU6SXNzdWUxOTkxODg0NzY= | 1194 | Use masked arrays while preserving int | gerritholl 500246 | open | 0 | 9 | 2017-01-06T12:40:22Z | 2020-03-29T20:37:29Z | CONTRIBUTOR | A great beauty of numpys masked arrays is that it works with any dtype, since it does not use ``` In [137]: x = arange(30, dtype="i1").reshape(3, 10) In [138]: xr.Dataset({"count": (["x", "y"], ma.masked_where(x%5>3, x))}, coords={"x": range(3), "y": ...: range(10)}) Out[138]: <xarray.Dataset> Dimensions: (x: 3, y: 10) Coordinates: * y (y) int64 0 1 2 3 4 5 6 7 8 9 * x (x) int64 0 1 2 Data variables: count (x, y) float64 0.0 1.0 2.0 3.0 nan 5.0 6.0 7.0 8.0 nan 10.0 ... ``` This happens in the function Such type “promotion” is unaffordable for me; the memory consumption of my multi-gigabyte arrays would explode by a factor 4. Secondly, many of my integer-dtype fields are bit arrays, for which floating point representation is not desirable. It would greatly benefit (See also: Stackoverflow question) |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/1194/reactions", "total_count": 2, "+1": 2, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | issue | ||||||||
528154893 | MDU6SXNzdWU1MjgxNTQ4OTM= | 3572 | Context manager `AttributeError` when engine='h5netcdf' | gerritholl 500246 | closed | 0 | 2 | 2019-11-25T15:19:29Z | 2019-11-25T16:12:37Z | 2019-11-25T16:12:37Z | CONTRIBUTOR | Opening this NetCDF file works fine with the default engine, but fails with AttributeError with the h5netcdf engine: MCVE Code SampleData available from EUMETSAT: https://www.eumetsat.int/website/home/Satellites/FutureSatellites/MeteosatThirdGeneration/MTGData/MTGUserTestData/index.html --> ftp://ftp.eumetsat.int/pub/OPS/out/test-data/Test-data-for-External-Users/MTG_FCI_Test-Data/ --> uncompressed ```python import xarray f = "/path/to/.../W_XX-EUMETSAT-Darmstadt,IMG+SAT,MTI1+FCI-1C-RRAD-FDHSI-FD--CHK-BODY--L2P-NC4E_C_EUMT_20170410114434_GTT_DEV_20170410113925_20170410113934_N__C_0070_0067.nc" ds = xarray.open_dataset(f, engine="h5netcdf") ``` Expected OutputNo output at all. Problem DescriptionResults in
Output of
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/3572/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
442617907 | MDU6SXNzdWU0NDI2MTc5MDc= | 2954 | Segmentation fault reading many groups from many files | gerritholl 500246 | closed | 0 | 14 | 2019-05-10T09:12:34Z | 2019-07-12T16:48:26Z | 2019-07-12T16:48:26Z | CONTRIBUTOR | This is probably the wrong place to report it, but I haven't been able to reproduce this without using xarray. Repeatedly opening NetCDF4/HDF5 files and reading a group from them, triggers a Segmentation Fault after about 130–150 openings. See details below. Code Sample, a copy-pastable example if possible```python from itertools import count, product import netCDF4 import glob import xarray files = sorted(glob.glob("/media/nas/x21308/2019_05_Testdata/MTG/FCI/FDHSI/uncompressed/20170410_RC70/BODY.nc")) get all groupsdef get_groups(ds, pre=""): for g in ds.groups.keys(): nm = pre + "/" + g yield from get_groups(ds[g], nm) yield nm with netCDF4.Dataset(files[0]) as ds: groups = sorted(list(get_groups(ds))) print("total groups", len(groups), "total files", len(files)) ds_all = [] ng = 20 nf = 20 print("using groups", ng, "using files", nf) for (i, (g, f)) in zip(count(), product(groups[:ng], files[:nf])): print("attempting", i, "group", g, "from", f) ds = xarray.open_dataset( f, group=g, decode_cf=False) ds_all.append(ds) ``` Problem descriptionI have 70 NetCDF-4 files with 70 groups each. When I cycle through the files and read one group from them at the time, after about 130–150 times, the next opening fails with a Segmentation Fault. If I try to read one group from one file at the time, that would require a total of 70*70=4900 openings. If I limit to 20 groups from 20 files in total, it would require 400 openings. In either case, it fails after about 130–150 times. I'm using the Python xarray interface, but the error occurs in the HDF5 library. The message belows includes the traceback in Python: ```
HDF5-DIAG: Error detected in HDF5 (1.10.4) thread 140107218855616: [9/1985]
#000: H5D.c line 485 in H5Dget_create_plist(): Can't get creation plist
major: Dataset
minor: Can't get value
#001: H5Dint.c line 3159 in H5D__get_create_plist(): can't get dataset's creation property list
major: Dataset
minor: Can't get value
#002: H5Dint.c line 3296 in H5D_get_create_plist(): datatype conversion failed
major: Dataset During handling of the above exception, another exception occurred: Traceback (most recent call last): File "/tmp/mwe9.py", line 24, in <module> f, group=g, decode_cf=False) File "/media/nas/x21324/miniconda3/envs/py37d/lib/python3.7/site-packages/xarray/backends/api.py", line 363, in open_dataset filename_or_obj, group=group, lock=lock, backend_kwargs) File "/media/nas/x21324/miniconda3/envs/py37d/lib/python3.7/site-packages/xarray/backends/netCDF4_.py", line 352, in open return cls(manager, lock=lock, autoclose=autoclose) File "/media/nas/x21324/miniconda3/envs/py37d/lib/python3.7/site-packages/xarray/backends/netCDF4_.py", line 311, in init self.format = self.ds.data_model File "/media/nas/x21324/miniconda3/envs/py37d/lib/python3.7/site-packages/xarray/backends/netCDF4_.py", line 356, in ds return self._manager.acquire().value File "/media/nas/x21324/miniconda3/envs/py37d/lib/python3.7/site-packages/xarray/backends/file_manager.py", line 173, in acquire file = self._opener(*self._args, kwargs) File "/media/nas/x21324/miniconda3/envs/py37d/lib/python3.7/site-packages/xarray/backends/netCDF4_.py", line 244, in _open_netcdf4_group ds = nc4.Dataset(filename, mode=mode, **kwargs) File "netCDF4/_netCDF4.pyx", line 2291, in netCDF4._netCDF4.Dataset.init File "netCDF4/_netCDF4.pyx", line 1855, in netCDF4._netCDF4._ensure_nc_success OSError: [Errno -101] NetCDF: HDF error: b'/media/nas/x21308/2019_05_Testdata/MTG/FCI/FDHSI/uncompressed/20170410_RC70/W_XX-EUMETSAT-Darmstadt,IMG+SAT,MTI1+FCI-1C-RRAD-FDHSI-FD--CHK-BODY--L2P-NC4E_C_EUMT_20170410114417_GTT_DEV_20170410113908_20170410113917_N__C_0070_0065.nc' ``` More usually however, it fails with a Segmentation Fault and no further information. The failure might happen in any file. The full output of my script might end with:
prior to the segmentation fault. When running with ``` Fatal Python error: Segmentation fault Current thread 0x00007ff6ab89d6c0 (most recent call first): File "/media/nas/x21324/miniconda3/envs/py37d/lib/python3.7/site-packages/xarray/backends/netCDF4_.py", line 244 in open_netcdf4_group File "/media/nas/x21324/miniconda3/envs/py37d/lib/python3.7/site-packages/xarray/backends/file_manager.py", line 173 in acquire File "/media/nas/x21324/miniconda3/envs/py37d/lib/python3.7/site-packages/xarray/backends/netCDF4.py", line 356 in ds File "/media/nas/x21324/miniconda3/envs/py37d/lib/python3.7/site-packages/xarray/backends/netCDF4_.py", line 311 in init File "/media/nas/x21324/miniconda3/envs/py37d/lib/python3.7/site-packages/xarray/backends/netCDF4_.py", line 352 in open File "/media/nas/x21324/miniconda3/envs/py37d/lib/python3.7/site-packages/xarray/backends/api.py", line 363 in open_dataset File "/tmp/mwe9.py", line 24 in <module> Segmentation fault (core dumped) ``` Expected OutputI expect no segmentation fault. Output of
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/2954/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
246093122 | MDU6SXNzdWUyNDYwOTMxMjI= | 1494 | AssertionError when storing datetime coordinates of wrong units | gerritholl 500246 | closed | 0 | 2 | 2017-07-27T16:11:48Z | 2019-06-30T04:28:18Z | 2019-06-30T04:28:17Z | CONTRIBUTOR | The following code should probably fail somewhere else than with an ``` $ cat mwe.py !/usr/bin/env python3.6import numpy import xarray x = xarray.DataArray( [1, 2, 3], dims=["X"], coords={"X": numpy.zeros(shape=3, dtype="M8[ms]")}) x.to_netcdf("/tmp/test.nc") $ python3.6 mwe.py Traceback (most recent call last): File "mwe.py", line 11, in <module> x.to_netcdf("/tmp/test.nc") File "/dev/shm/gerrit/venv/stable-3.6/lib/python3.6/site-packages/xarray/core/dataarray.py", line 1351, in to_netcdf dataset.to_netcdf(args, *kwargs) File "/dev/shm/gerrit/venv/stable-3.6/lib/python3.6/site-packages/xarray/core/dataset.py", line 977, in to_netcdf unlimited_dims=unlimited_dims) File "/dev/shm/gerrit/venv/stable-3.6/lib/python3.6/site-packages/xarray/backends/api.py", line 573, in to_netcdf unlimited_dims=unlimited_dims) File "/dev/shm/gerrit/venv/stable-3.6/lib/python3.6/site-packages/xarray/core/dataset.py", line 916, in dump_to_store unlimited_dims=unlimited_dims) File "/dev/shm/gerrit/venv/stable-3.6/lib/python3.6/site-packages/xarray/backends/common.py", line 244, in store cf_variables, cf_attrs = cf_encoder(variables, attributes) File "/dev/shm/gerrit/venv/stable-3.6/lib/python3.6/site-packages/xarray/conventions.py", line 1089, in cf_encoder for k, v in iteritems(variables)) File "/dev/shm/gerrit/venv/stable-3.6/lib/python3.6/site-packages/xarray/conventions.py", line 1089, in <genexpr> for k, v in iteritems(variables)) File "/dev/shm/gerrit/venv/stable-3.6/lib/python3.6/site-packages/xarray/conventions.py", line 734, in encode_cf_variable var = maybe_encode_datetime(var) File "/dev/shm/gerrit/venv/stable-3.6/lib/python3.6/site-packages/xarray/conventions.py", line 585, in maybe_encode_datetime data, encoding.pop('units', None), encoding.pop('calendar', None)) File "/dev/shm/gerrit/venv/stable-3.6/lib/python3.6/site-packages/xarray/conventions.py", line 293, in encode_cf_datetime assert dates.dtype == 'datetime64[ns]' AssertionError ``` |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/1494/reactions", "total_count": 2, "+1": 2, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
290572700 | MDU6SXNzdWUyOTA1NzI3MDA= | 1849 | passing unlimited_dims to to_netcdf triggers RuntimeError: NetCDF: Invalid argument | gerritholl 500246 | closed | 0 | 12 | 2018-01-22T18:43:23Z | 2019-06-04T20:41:50Z | 2019-06-04T20:41:50Z | CONTRIBUTOR | For some datafiles with properties I cannot quite reproduce, ``` $ cat mwe.py !/usr/bin/env python3.6import xarray ds = xarray.open_dataset("sample.nc") ds.to_netcdf("sample2.nc", unlimited_dims=["y"]) $ ncdump sample.nc netcdf sample { dimensions: y = 6 ; variables: float x(y) ; x:_FillValue = NaNf ; int64 y(y) ; data: x = 0, 0, 0, 0, 0, 0 ; y = 0, 1, 2, 3, 4, 5 ; } $ ./mwe.py Traceback (most recent call last): File "./mwe.py", line 5, in <module> ds.to_netcdf("sample2.nc", unlimited_dims=["y"]) File "/dev/shm/gerrit/venv/stable-3.6/lib/python3.6/site-packages/xarray/core/dataset.py", line 1133, in to_netcdf unlimited_dims=unlimited_dims) File "/dev/shm/gerrit/venv/stable-3.6/lib/python3.6/site-packages/xarray/backends/api.py", line 627, in to_netcdf unlimited_dims=unlimited_dims) File "/dev/shm/gerrit/venv/stable-3.6/lib/python3.6/site-packages/xarray/core/dataset.py", line 1070, in dump_to_store unlimited_dims=unlimited_dims) File "/dev/shm/gerrit/venv/stable-3.6/lib/python3.6/site-packages/xarray/backends/common.py", line 254, in store args, kwargs) File "/dev/shm/gerrit/venv/stable-3.6/lib/python3.6/site-packages/xarray/backends/common.py", line 221, in store unlimited_dims=unlimited_dims) File "/dev/shm/gerrit/venv/stable-3.6/lib/python3.6/site-packages/xarray/backends/netCDF4_.py", line 339, in set_variables super(NetCDF4DataStore, self).set_variables(args, **kwargs) File "/dev/shm/gerrit/venv/stable-3.6/lib/python3.6/site-packages/xarray/backends/common.py", line 233, in set_variables name, v, check, unlimited_dims=unlimited_dims) File "/dev/shm/gerrit/venv/stable-3.6/lib/python3.6/site-packages/xarray/backends/netCDF4_.py", line 385, in prepare_variable fill_value=fill_value) File "netCDF4/_netCDF4.pyx", line 2437, in netCDF4._netCDF4.Dataset.createVariable File "netCDF4/_netCDF4.pyx", line 3439, in netCDF4._netCDF4.Variable.init File "netCDF4/_netCDF4.pyx", line 1638, in netCDF4._netCDF4._ensure_nc_success RuntimeError: NetCDF: Invalid argument ``` Output of
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/1849/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
410317757 | MDU6SXNzdWU0MTAzMTc3NTc= | 2772 | Should xarray allow assigning a masked constant? | gerritholl 500246 | open | 0 | 1 | 2019-02-14T14:10:20Z | 2019-02-15T20:24:44Z | CONTRIBUTOR | Currently, |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/2772/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | issue | ||||||||
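The body of this issue is cut off after "Currently,". A tiny probe of the behaviour the title asks about (not taken from the issue; what it prints, or whether it raises, depends on the xarray version):

```python
import numpy as np
import xarray as xr

da = xr.DataArray(np.arange(5.0))

# Should assigning the masked constant raise, write NaN, or write the
# constant's underlying fill value?  Inspect what your version does:
da[2] = np.ma.masked
print(da.values)
```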
381633612 | MDExOlB1bGxSZXF1ZXN0MjMxNTU2ODM5 | 2557 | add missing comma and article in error message | gerritholl 500246 | closed | 0 | 2 | 2018-11-16T14:59:02Z | 2018-11-16T16:40:03Z | 2018-11-16T16:40:03Z | CONTRIBUTOR | 0 | pydata/xarray/pulls/2557 | Add missing comma and article in error message when attribute values have the wrong type. I think this change is sufficiently minor that no documentation or whatsnew changes should be necessary. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/2557/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
371990138 | MDU6SXNzdWUzNzE5OTAxMzg= | 2495 | Confusing error message when using a set to pass coordinates | gerritholl 500246 | closed | 0 | 4 | 2018-10-19T14:48:04Z | 2018-10-22T21:43:01Z | 2018-10-22T20:32:48Z | CONTRIBUTOR | Code Sample, a copy-pastable example if possible```python xarray.DataArray(numpy.arange(3), dims=("x",), coords={"x": {"a", "b", "c"}}) ``` Problem descriptionThis results in a ``` In [57]: xarray.DataArray(numpy.arange(3), dims=("x",), coords={"x": {"a", "b", "c"}}) MissingDimensionsError Traceback (most recent call last) <ipython-input-57-6d18e1623a15> in <module>() ----> 1 xarray.DataArray(numpy.arange(3), dims=("x",), coords={"x": {"a", "b", "c"}}) /group_workspaces/cems2/fiduceo/Users/gholl/anaconda3/envs/FCDR37a/lib/python3.7/site-packages/xarray/core/dataarray.py in init(self, data, coords, dims, name, attrs, encoding, fastpath) 225 226 data = as_compatible_data(data) --> 227 coords, dims = _infer_coords_and_dims(data.shape, coords, dims) 228 variable = Variable(dims, data, attrs, encoding, fastpath=True) 229 /group_workspaces/cems2/fiduceo/Users/gholl/anaconda3/envs/FCDR37a/lib/python3.7/site-packages/xarray/core/dataarray.py in _infer_coords_and_dims(shape, coords, dims) 62 if utils.is_dict_like(coords): 63 for k, v in coords.items(): ---> 64 new_coords[k] = as_variable(v, name=k) 65 elif coords is not None: 66 for dim, coord in zip(dims, coords): /group_workspaces/cems2/fiduceo/Users/gholl/anaconda3/envs/FCDR37a/lib/python3.7/site-packages/xarray/core/variable.py in as_variable(obj, name) 99 'cannot set variable %r with %r-dimensional data ' 100 'without explicit dimension names. Pass a tuple of ' --> 101 '(dims, data) instead.' % (name, data.ndim)) 102 obj = Variable(name, obj, fastpath=True) 103 else: MissingDimensionsError: cannot set variable 'x' with 0-dimensional data without explicit dimension names. Pass a tuple of (dims, data) instead. ``` Expected OutputIt should probably raise a Output of
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/2495/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
289790965 | MDU6SXNzdWUyODk3OTA5NjU= | 1838 | DataArray.sum does not respect dtype keyword | gerritholl 500246 | closed | 0 | 2 | 2018-01-18T22:01:07Z | 2018-01-20T18:29:02Z | 2018-01-20T18:29:02Z | CONTRIBUTOR | Code Sample, a copy-pastable example if possible
```python
# Your code here
import xarray
from numpy import arange

da = xarray.DataArray(arange(5, dtype="i2"))
print(da.sum(dtype="i4").dtype)
```
Problem description: The result is int64. This is a problem because I asked for int32.
Expected Output: Expected output
Output of
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/1838/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
281552158 | MDExOlB1bGxSZXF1ZXN0MTU3OTUwMzcy | 1777 | Respect PEP 440 | gerritholl 500246 | closed | 0 | 4 | 2017-12-12T21:59:18Z | 2017-12-15T07:26:33Z | 2017-12-15T07:26:24Z | CONTRIBUTOR | 0 | pydata/xarray/pulls/1777 | Change unreleased version numbers as to respect PEP 440. Rather than
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/1777/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
212501628 | MDU6SXNzdWUyMTI1MDE2Mjg= | 1300 | git version label yields version in violation of PEP 440 | gerritholl 500246 | closed | 0 | 2 | 2017-03-07T17:23:00Z | 2017-12-15T07:26:24Z | 2017-12-15T07:26:24Z | CONTRIBUTOR | When an This violates PEP 440, which leads to multiple problems:
Instead, the version number above should be written as |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/1300/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
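The concrete version strings are lost from the row above; the gist of the issue can be illustrated with made-up strings of the shape `git describe` produces (the hyphenated form is rejected by PEP 440 tooling, while a local-version spelling is accepted):

```python
from packaging.version import InvalidVersion, Version

for candidate in ("0.9.1-28-g6d5ed0b", "0.9.1+28.g6d5ed0b"):
    try:
        print(candidate, "->", Version(candidate))
    except InvalidVersion as err:
        print(candidate, "is rejected:", err)
```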
269182902 | MDExOlB1bGxSZXF1ZXN0MTQ5MjQ2NDQ5 | 1664 | BUG: Added new names for pandas isna/notna unary functions | gerritholl 500246 | closed | 0 | 1 | 2017-10-27T17:38:54Z | 2017-11-09T02:47:21Z | 2017-11-09T02:47:21Z | CONTRIBUTOR | 0 | pydata/xarray/pulls/1664 | In pandas commit https://github.com/pandas-dev/pandas/commit/793020293ee1e5fa023f45c12943a4ac51cc23d isna and notna were added as aliases for isnull and notnull. Those need to be added to PANDAS_UNARY_FUNCTIONS for xarray datasets notnull to work. Closes #1663.
Note: I'm not sure how to test for this, as I think existing tests should be already failing due to this. In fact, when I run I did not try the flake8 test and I think new documentation is exaggerated in this case. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/1664/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
269143043 | MDU6SXNzdWUyNjkxNDMwNDM= | 1663 | ds.notnull() fails with AttributeError on pandas 0.21.0rc1 | gerritholl 500246 | closed | 0 | 8 | 2017-10-27T15:19:33Z | 2017-11-01T05:27:18Z | 2017-11-01T05:27:18Z | CONTRIBUTOR |
``` $ cat mwe.py !/usr/bin/env python3.6import numpy print(numpy.version) import xarray print(xarray.version) import pandas print(pandas.version) xarray.Dataset({"A": ("x", numpy.arange(5))}).notnull() $ ./mwe.py 1.13.3 0.9.6 0.21.0rc1 Traceback (most recent call last): File "/dev/shm/gerrit/venv/stable-3.6/lib/python3.6/site-packages/xarray/core/ops.py", line 193, in func return getattr(self, name)(args, *kwargs) AttributeError: 'Variable' object has no attribute 'notna' During handling of the above exception, another exception occurred: Traceback (most recent call last): File "./mwe.py", line 10, in <module> xarray.Dataset({"A": ("x", numpy.arange(5))}).notnull() File "/dev/shm/gerrit/venv/stable-3.6/lib/python3.6/site-packages/xarray/core/dataset.py", line 2485, in func ds._variables[k] = f(self._variables[k], args, kwargs) File "/dev/shm/gerrit/venv/stable-3.6/lib/python3.6/site-packages/xarray/core/ops.py", line 195, in func return f(self, args, **kwargs) File "/dev/shm/gerrit/venv/stable-3.6/lib/python3.6/site-packages/pandas/core/dtypes/missing.py", line 212, in notna res = isna(obj) File "/dev/shm/gerrit/venv/stable-3.6/lib/python3.6/site-packages/pandas/core/dtypes/missing.py", line 45, in isna return _isna(obj) File "/dev/shm/gerrit/venv/stable-3.6/lib/python3.6/site-packages/pandas/core/dtypes/missing.py", line 60, in _isna_new return obj._constructor(obj._data.isna(func=isna)) AttributeError: 'Variable' object has no attribute '_constructor' ``` |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/1663/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
268487752 | MDU6SXNzdWUyNjg0ODc3NTI= | 1661 | da.plot.pcolormesh fails when there is a datetime coordinate | gerritholl 500246 | closed | 0 | 9 | 2017-10-25T17:44:38Z | 2017-10-29T17:28:55Z | 2017-10-29T17:28:55Z | CONTRIBUTOR |
``` $ cat mwe.py !/usr/bin/env python3.6import xarray import numpy da = xarray.DataArray( numpy.arange(3*4).reshape(3,4), dims=("x", "y"), coords={"x": [1,2,3], "y": [numpy.datetime64(f"2000-01-{x:02d}") for x in range(1, 5)]}) da.plot.pcolormesh() $ ./mwe.py Traceback (most recent call last): File "./mwe.py", line 13, in <module> da.plot.pcolormesh() File "/dev/shm/gerrit/venv/stable-3.6/lib/python3.6/site-packages/xarray/plot/plot.py", line 547, in plotmethod return newplotfunc(allargs) File "/dev/shm/gerrit/venv/stable-3.6/lib/python3.6/site-packages/xarray/plot/plot.py", line 500, in newplotfunc kwargs) File "/dev/shm/gerrit/venv/stable-3.6/lib/python3.6/site-packages/xarray/plot/plot.py", line 667, in pcolormesh primitive = ax.pcolormesh(x, y, z, kwargs) File "/dev/shm/gerrit/venv/stable-3.6/lib/python3.6/site-packages/matplotlib/init.py", line 1710, in inner return func(ax, *args, kwargs) File "/dev/shm/gerrit/venv/stable-3.6/lib/python3.6/site-packages/matplotlib/axes/_axes.py", line 5636, in pcolormesh coords = np.column_stack((X, Y)).astype(float, copy=False) File "/dev/shm/gerrit/venv/stable-3.6/lib/python3.6/site-packages/numpy/lib/shape_base.py", line 353, in column_stack return _nx.concatenate(arrays, 1) TypeError: invalid type promotion ``` |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/1661/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
228023777 | MDU6SXNzdWUyMjgwMjM3Nzc= | 1405 | Using uint64 for Dataset indexing gives ValueError | gerritholl 500246 | closed | 0 | 2 | 2017-05-11T15:05:20Z | 2017-10-23T07:50:29Z | 2017-10-23T07:50:29Z | CONTRIBUTOR | Trying to index a ``` In [13]: import xarray In [14]: ds = xarray.Dataset({"A": (("x", "y"), arange(5*6).reshape(5,6))}) In [15]: ds[{"x": numpy.array([0], dtype="int64")}] Out[15]: <xarray.Dataset> Dimensions: (x: 1, y: 6) Dimensions without coordinates: x, y Data variables: A (x, y) int64 0 1 2 3 4 5 In [16]: ds[{"x": numpy.array([0], dtype="uint64")}]ValueError Traceback (most recent call last) <ipython-input-16-4cf23af0967e> in <module>() ----> 1 ds[{"x": numpy.array([0], dtype="uint64")}] /dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/core/dataset.py in getitem(self, key) 722 """ 723 if utils.is_dict_like(key): --> 724 return self.isel(**key) 725 726 if hashable(key): /dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/core/dataset.py in isel(self, drop, indexers) 1147 for name, var in iteritems(self._variables): 1148 var_indexers = dict((k, v) for k, v in indexers if k in var.dims) -> 1149 new_var = var.isel(var_indexers) 1150 if not (drop and name in var_indexers): 1151 variables[name] = new_var /dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/core/variable.py in isel(self, **indexers) 547 if dim in indexers: 548 key[i] = indexers[dim] --> 549 return self[tuple(key)] 550 551 def squeeze(self, dim=None): /dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/core/variable.py in getitem(self, key) 377 dims = tuple(dim for k, dim in zip(key, self.dims) 378 if not isinstance(k, integer_types)) --> 379 values = self._indexable_data[key] 380 # orthogonal indexing should ensure the dimensionality is consistent 381 if hasattr(values, 'ndim'): /dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/core/indexing.py in getitem(self, key) 467 468 def getitem(self, key): --> 469 key = self._convert_key(key) 470 return self._ensure_ndarray(self.array[key]) 471 /dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/core/indexing.py in _convert_key(self, key) 454 if any(not isinstance(k, integer_types + (slice,)) for k in key): 455 # key would trigger fancy indexing --> 456 key = orthogonal_indexer(key, self.shape) 457 return key 458 /dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/core/indexing.py in orthogonal_indexer(key, shape) 78 """ 79 # replace Ellipsis objects with slices ---> 80 key = list(canonicalize_indexer(key, len(shape))) 81 # replace 1d arrays and slices with broadcast compatible arrays 82 # note: we treat integers separately (instead of turning them into 1d /dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/core/indexing.py in canonicalize_indexer(key, ndim) 66 return indexer 67 ---> 68 return tuple(canonicalize(k) for k in expanded_indexer(key, ndim)) 69 70 /dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/core/indexing.py in <genexpr>(.0) 66 return indexer 67 ---> 68 return tuple(canonicalize(k) for k in expanded_indexer(key, ndim)) 69 70 /dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/core/indexing.py in canonicalize(indexer) 63 'array indexing; all subkeys must be ' 64 'slices, integers or sequences of ' ---> 65 'integers or Booleans' % indexer) 66 return indexer 67 ValueError: invalid subkey array([0], dtype=uint64) for integer based array indexing; all subkeys must be slices, integers or sequences of 
integers or Booleans ``` |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/1405/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
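Note: until the fix in #1406 (listed below) was merged, a workaround was to cast the unsigned indexer to a signed integer dtype before indexing. The snippet below is a minimal sketch of that workaround, not code from the issue, and assumes the index values fit into int64.
```python
import numpy as np
import xarray as xr

# Rebuild the Dataset from the issue's reproduction script.
ds = xr.Dataset({"A": (("x", "y"), np.arange(5 * 6).reshape(5, 6))})

# Workaround sketch: cast the uint64 indexer to a signed dtype, which is accepted.
idx = np.array([0], dtype="uint64")
subset = ds[{"x": idx.astype("int64")}]  # equivalent to ds.isel(x=[0])
print(subset["A"].values)
```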
212471682 | MDExOlB1bGxSZXF1ZXN0MTA5NTA3MzQz | 1299 | BUG/TST: Retain encoding upon concatenation | gerritholl 500246 | closed | 0 | 6 | 2017-03-07T15:46:11Z | 2017-09-05T04:10:02Z | 2017-09-05T04:10:02Z | CONTRIBUTOR | 0 | pydata/xarray/pulls/1299 | Retain encoding upon concatenation of DataArray or Dataset.
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/1299/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
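Note: a quick way to see what this change is about is to put an encoding on one of the inputs and inspect the result of xarray.concat. This is an illustrative sketch, not a test from the PR; whether the encoding is actually carried over depends on the xarray version in use.
```python
import numpy as np
import xarray as xr

a = xr.DataArray(np.arange(3), dims="x")
a.encoding = {"dtype": "int16", "_FillValue": -1}  # pretend this came from a file
b = xr.DataArray(np.arange(3, 6), dims="x")

combined = xr.concat([a, b], dim="x")
# With the fix from this PR, the encoding of the first input should be retained.
print(combined.encoding)
```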
228036180 | MDExOlB1bGxSZXF1ZXN0MTIwMTM4ODQ4 | 1406 | BUG: Allow unsigned integer indexing, fixes #1405 | gerritholl 500246 | closed | 0 | 6 | 2017-05-11T15:41:50Z | 2017-09-01T15:54:41Z | 2017-09-01T15:54:41Z | CONTRIBUTOR | 0 | pydata/xarray/pulls/1406 | Permit indexing with unsigned integers. This should fix #1405.
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/1406/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
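Note: a hypothetical check of the behaviour this PR enables, not the PR's actual test: unsigned and signed integer indexers should select the same data.
```python
import numpy as np
import xarray as xr

ds = xr.Dataset({"A": (("x", "y"), np.arange(5 * 6).reshape(5, 6))})

signed = ds[{"x": np.array([0, 2], dtype="int64")}]
unsigned = ds[{"x": np.array([0, 2], dtype="uint64")}]
assert signed.identical(unsigned)
```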
204090452 | MDExOlB1bGxSZXF1ZXN0MTAzNzkyMDAz | 1241 | BUG: Add mixing dimension name to error message | gerritholl 500246 | closed | 0 | 1 | 2017-01-30T18:24:52Z | 2017-01-30T18:40:27Z | 2017-01-30T18:40:27Z | CONTRIBUTOR | 0 | pydata/xarray/pulls/1241 | Bugfix: error message for |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/1241/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
203159853 | MDU6SXNzdWUyMDMxNTk4NTM= | 1229 | opening NetCDF file fails with ValueError when time variable is multidimensional | gerritholl 500246 | closed | 0 | 3 | 2017-01-25T16:56:27Z | 2017-01-26T05:13:12Z | 2017-01-26T05:13:12Z | CONTRIBUTOR | I have a NetCDF file that includes a time field with multiple dimensions. This leads to a failure in ``` In [748]: ds = netCDF4.Dataset("test.nc", "w") In [749]: dim = ds.createDimension("dim", 5) In [750]: dim2 = ds.createDimension("dim2", 5) In [751]: time = ds.createVariable("time", "u4", ("dim", "dim2")) In [752]: time.units = "seconds since 1970-01-01" In [753]: time.calendar = "gregorian" In [754]: time[:, :] = arange(25).reshape(5, 5) In [755]: ds.close() In [757]: xarray.open_dataset("test.nc")ValueError Traceback (most recent call last) <ipython-input-757-17ad46b81538> in <module>() ----> 1 xarray.open_dataset("test.nc") /dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/backends/api.py in open_dataset(filename_or_obj, group, decode_cf, mask_and_scale, decode_times, concat_characters, decode_coords, engine, chunks, lock, cache, drop_variables) 300 lock = _default_lock(filename_or_obj, engine) 301 with close_on_error(store): --> 302 return maybe_decode_store(store, lock) 303 else: 304 if engine is not None and engine != 'scipy': /dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/backends/api.py in maybe_decode_store(store, lock) 221 store, mask_and_scale=mask_and_scale, decode_times=decode_times, 222 concat_characters=concat_characters, decode_coords=decode_coords, --> 223 drop_variables=drop_variables) 224 225 _protect_dataset_variables_inplace(ds, cache) /dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/conventions.py in decode_cf(obj, concat_characters, mask_and_scale, decode_times, decode_coords, drop_variables) 947 vars, attrs, coord_names = decode_cf_variables( 948 vars, attrs, concat_characters, mask_and_scale, decode_times, --> 949 decode_coords, drop_variables=drop_variables) 950 ds = Dataset(vars, attrs=attrs) 951 ds = ds.set_coords(coord_names.union(extra_coords).intersection(vars)) /dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/conventions.py in decode_cf_variables(variables, attributes, concat_characters, mask_and_scale, decode_times, decode_coords, drop_variables) 882 new_vars[k] = decode_cf_variable( 883 v, concat_characters=concat, mask_and_scale=mask_and_scale, --> 884 decode_times=decode_times) 885 if decode_coords: 886 var_attrs = new_vars[k].attrs /dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/conventions.py in decode_cf_variable(var, concat_characters, mask_and_scale, decode_times, decode_endianness) 819 units = pop_to(attributes, encoding, 'units') 820 calendar = pop_to(attributes, encoding, 'calendar') --> 821 data = DecodedCFDatetimeArray(data, units, calendar) 822 elif attributes['units'] in TIME_UNITS: 823 # timedelta /dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/conventions.py in init(self, array, units, calendar) 384 # Dataset.repr when users try to view their lazily decoded array. 385 example_value = np.concatenate([first_n_items(array, 1) or [0], --> 386 last_item(array) or [0]]) 387 388 try: ValueError: all the input arrays must have same number of dimensions ``` Closer look in the debugger: ``` In [758]: %debug xarray.open_dataset("test.nc") NOTE: Enter 'c' at the ipdb> prompt to continue execution.
ipdb> break /dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/conventions.py:385 Breakpoint 1 at /dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/conventions.py:385 ipdb> cont
ipdb> p first_n_items(array, 1).shape (1,) ipdb> p last_item(array).shape (1, 1) ``` |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/1229/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
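Note: a possible workaround (not from the issue) is to disable automatic time decoding so the multidimensional time variable is left untouched, then convert it by hand; `test.nc` refers to the file created in the reproduction script above.
```python
import xarray as xr

# Skip CF time decoding entirely; the raw integer values and their "units"
# attribute are preserved, so they can be converted manually afterwards.
ds = xr.open_dataset("test.nc", decode_times=False)
print(ds["time"].attrs["units"])  # "seconds since 1970-01-01"
```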
200131742 | MDExOlB1bGxSZXF1ZXN0MTAxMDkzNjEy | 1200 | DOC: fix small typo/mistake (NaN value not dtype) | gerritholl 500246 | closed | 0 | 1 | 2017-01-11T15:57:15Z | 2017-01-11T17:11:04Z | 2017-01-11T17:11:01Z | CONTRIBUTOR | 0 | pydata/xarray/pulls/1200 | Fix small mistake in documentation. It said NaN is not a valid dtype for integer dtypes; this surely should be that NaN is not a valid value for integer dtypes. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/1200/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull |
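Note: a small illustration (my own example, not from the PR) of why NaN is a question of value rather than dtype: an integer-typed DataArray is promoted to float as soon as an operation introduces missing values.
```python
import numpy as np
import xarray as xr

da = xr.DataArray(np.array([1, 2, 3], dtype="int64"), dims="x")
masked = da.where(da > 1)       # the masked element becomes NaN
print(da.dtype, masked.dtype)   # int64 float64
```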
CREATE TABLE [issues] (
    [id] INTEGER PRIMARY KEY,
    [node_id] TEXT,
    [number] INTEGER,
    [title] TEXT,
    [user] INTEGER REFERENCES [users]([id]),
    [state] TEXT,
    [locked] INTEGER,
    [assignee] INTEGER REFERENCES [users]([id]),
    [milestone] INTEGER REFERENCES [milestones]([id]),
    [comments] INTEGER,
    [created_at] TEXT,
    [updated_at] TEXT,
    [closed_at] TEXT,
    [author_association] TEXT,
    [active_lock_reason] TEXT,
    [draft] INTEGER,
    [pull_request] TEXT,
    [body] TEXT,
    [reactions] TEXT,
    [performed_via_github_app] TEXT,
    [state_reason] TEXT,
    [repo] INTEGER REFERENCES [repos]([id]),
    [type] TEXT
);
CREATE INDEX [idx_issues_repo] ON [issues] ([repo]);
CREATE INDEX [idx_issues_milestone] ON [issues] ([milestone]);
CREATE INDEX [idx_issues_assignee] ON [issues] ([assignee]);
CREATE INDEX [idx_issues_user] ON [issues] ([user]);