issues
5 rows where comments = 1, state = "closed" and user = 500246 sorted by updated_at descending
id | node_id | number | title | user | state | locked | assignee | milestone | comments | created_at | updated_at ▲ | closed_at | author_association | active_lock_reason | draft | pull_request | body | reactions | performed_via_github_app | state_reason | repo | type |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
1608352581 | I_kwDOAMm_X85f3YNF | 7581 | xr.where loses attributes despite keep_attrs=True | gerritholl 500246 | closed | 0 | 1 | 2023-03-03T10:14:34Z | 2023-04-06T01:58:45Z | 2023-04-06T01:58:45Z | CONTRIBUTOR | What happened? I'm using … What did you expect to happen? I expect that if I use … Minimal Complete Verifiable Example
MVCE confirmation
Relevant log output
Anything else we need to know? I can make a workaround by turning the logic around, such as … The workaround works in this case, but … Environment
INSTALLED VERSIONS
------------------
commit: None
python: 3.11.0 | packaged by conda-forge | (main, Jan 14 2023, 12:27:40) [GCC 11.3.0]
python-bits: 64
OS: Linux
OS-release: 5.3.18-150300.59.76-default
machine: x86_64
processor: x86_64
byteorder: little
LC_ALL: None
LANG: en_GB.UTF-8
LOCALE: ('en_GB', 'UTF-8')
libhdf5: 1.12.2
libnetcdf: 4.9.1
xarray: 2023.2.0
pandas: 1.5.3
numpy: 1.24.2
scipy: 1.10.1
netCDF4: 1.6.2
pydap: None
h5netcdf: 1.1.0
h5py: 3.8.0
Nio: None
zarr: 2.13.6
cftime: 1.6.2
nc_time_axis: None
PseudoNetCDF: None
rasterio: 1.3.6
cfgrib: None
iris: None
bottleneck: 1.3.6
dask: 2023.2.1
distributed: 2023.2.1
matplotlib: 3.7.0
cartopy: 0.21.1
seaborn: None
numbagg: None
fsspec: 2023.1.0
cupy: None
pint: 0.20.1
sparse: None
flox: None
numpy_groupies: None
setuptools: 67.4.0
pip: 23.0.1
conda: None
pytest: 7.2.1
mypy: None
IPython: 8.7.0
sphinx: 5.3.0
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/7581/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
376104925 | MDU6SXNzdWUzNzYxMDQ5MjU= | 2529 | numpy.insert on DataArray may silently result in array inconsistent with its coordinates | gerritholl 500246 | closed | 0 | 1 | 2018-10-31T18:33:23Z | 2020-11-07T21:55:42Z | 2020-11-07T21:55:42Z | CONTRIBUTOR | ```python
import numpy
import xarray

da = xarray.DataArray(
    numpy.arange(10 * 3).reshape(10, 3),
    dims=("x", "y"),
    coords={"foo": (("x", "y"), numpy.arange(3 * 10).reshape(10, 3))},
)
print(da.shape == da["foo"].shape)

da2 = numpy.insert(da, 3, 0, axis=0)
print(da2.shape == da2["foo"].shape)
```
Problem description: Running the code snippet gives … and does not raise any exception. In the resulting … Expected Output: I would expect to get an exception, telling me that the insertion has failed because there are coordinates associated with the axis along which we are inserting values. It would be nice to have an … Output of …
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/2529/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
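The shape mismatch reported in #2529 can be illustrated with plain numpy, no xarray required. This is a minimal sketch, where a separate `coord` array stands in for the DataArray's `"foo"` coordinate: `numpy.insert` returns a new, larger data array, while the coordinate is left at its old shape.

```python
import numpy as np

# Plain-ndarray analogue of the report: numpy.insert grows the data,
# but nothing updates the coordinate to match, so the shapes diverge
# the same way da2.shape != da2["foo"].shape does in the issue.
data = np.arange(10 * 3).reshape(10, 3)
coord = np.arange(3 * 10).reshape(10, 3)  # stands in for the "foo" coordinate
print(data.shape == coord.shape)          # True

data2 = np.insert(data, 3, 0, axis=0)     # insert a row of zeros before index 3
print(data2.shape == coord.shape)         # False: (11, 3) vs (10, 3)
```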
269182902 | MDExOlB1bGxSZXF1ZXN0MTQ5MjQ2NDQ5 | 1664 | BUG: Added new names for pandas isna/notna unary functions | gerritholl 500246 | closed | 0 | 1 | 2017-10-27T17:38:54Z | 2017-11-09T02:47:21Z | 2017-11-09T02:47:21Z | CONTRIBUTOR | 0 | pydata/xarray/pulls/1664 | In pandas commit https://github.com/pandas-dev/pandas/commit/793020293ee1e5fa023f45c12943a4ac51cc23d, isna and notna were added as aliases for isnull and notnull. Those need to be added to PANDAS_UNARY_FUNCTIONS for notnull on xarray Datasets to work. Closes #1663.
Note: I'm not sure how to test for this, as I think existing tests should already be failing because of it. In fact, when I run … I did not try the flake8 test, and I think new documentation is exaggerated in this case. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/1664/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
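The aliasing that PR #1664 builds on is easy to check against pandas itself. A minimal sketch, assuming a pandas version from after the commit above, in which `isna`/`notna` exist alongside `isnull`/`notnull`:

```python
import pandas as pd

# isna/notna are the newer names for isnull/notnull; both pairs
# behave identically on scalars (and on arrays, Series, etc.).
print(pd.isna(float("nan")), pd.isnull(float("nan")))  # True True
print(pd.notna(0), pd.notnull(0))                      # True True
```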
204090452 | MDExOlB1bGxSZXF1ZXN0MTAzNzkyMDAz | 1241 | BUG: Add missing dimension name to error message | gerritholl 500246 | closed | 0 | 1 | 2017-01-30T18:24:52Z | 2017-01-30T18:40:27Z | 2017-01-30T18:40:27Z | CONTRIBUTOR | 0 | pydata/xarray/pulls/1241 | Bugfix: error message for … |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/1241/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
200131742 | MDExOlB1bGxSZXF1ZXN0MTAxMDkzNjEy | 1200 | DOC: fix small typo/mistake (NaN value not dtype) | gerritholl 500246 | closed | 0 | 1 | 2017-01-11T15:57:15Z | 2017-01-11T17:11:04Z | 2017-01-11T17:11:01Z | CONTRIBUTOR | 0 | pydata/xarray/pulls/1200 | Fix small mistake in documentation. It said NaN is not a valid dtype for integer dtypes; this surely should be: NaN is not a valid value for integer dtypes. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/1200/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull |
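The documentation fix in #1200 can be demonstrated directly: NaN is a floating-point value, not a dtype, and integer dtypes cannot represent it. A small sketch with numpy (variable names are illustrative):

```python
import numpy as np

# NaN is a float value; integer arrays cannot hold it.
a = np.arange(3)           # integer dtype
try:
    a[0] = np.nan          # rejected: int dtype cannot represent NaN
except ValueError as err:
    print("ValueError:", err)

# A float array accepts NaN without complaint.
b = a.astype(float)
b[0] = np.nan
print(np.isnan(b[0]))      # True
```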
CREATE TABLE [issues] (
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [number] INTEGER,
   [title] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [state] TEXT,
   [locked] INTEGER,
   [assignee] INTEGER REFERENCES [users]([id]),
   [milestone] INTEGER REFERENCES [milestones]([id]),
   [comments] INTEGER,
   [created_at] TEXT,
   [updated_at] TEXT,
   [closed_at] TEXT,
   [author_association] TEXT,
   [active_lock_reason] TEXT,
   [draft] INTEGER,
   [pull_request] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [state_reason] TEXT,
   [repo] INTEGER REFERENCES [repos]([id]),
   [type] TEXT
);
CREATE INDEX [idx_issues_repo] ON [issues] ([repo]);
CREATE INDEX [idx_issues_milestone] ON [issues] ([milestone]);
CREATE INDEX [idx_issues_assignee] ON [issues] ([assignee]);
CREATE INDEX [idx_issues_user] ON [issues] ([user]);
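The query this page was generated from (5 rows where comments = 1, state = "closed" and user = 500246, sorted by updated_at descending) can be run against the schema above. A minimal sketch using Python's sqlite3 module, with an abridged column list and one row of the data shown on this page:

```python
import sqlite3

# In-memory database with just the columns the page's query touches.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE issues ("
    " id INTEGER PRIMARY KEY, number INTEGER, title TEXT,"
    " user INTEGER, state TEXT, comments INTEGER, updated_at TEXT)"
)
conn.execute(
    "INSERT INTO issues VALUES (1608352581, 7581,"
    " 'xr.where loses attributes despite keep_attrs=True',"
    " 500246, 'closed', 1, '2023-04-06T01:58:45Z')"
)

# The filter and sort used to produce this page.
rows = conn.execute(
    "SELECT number, title FROM issues"
    " WHERE comments = 1 AND state = 'closed' AND user = 500246"
    " ORDER BY updated_at DESC LIMIT 5"
).fetchall()
print(rows)  # [(7581, 'xr.where loses attributes despite keep_attrs=True')]
```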