issues
19 rows where comments = 4 and user = 10194086 sorted by updated_at descending
id | node_id | number | title | user | state | locked | assignee | milestone | comments | created_at | updated_at | closed_at | author_association | active_lock_reason | draft | pull_request | body | reactions | performed_via_github_app | state_reason | repo | type |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
2163675672 | PR_kwDOAMm_X85obI_8 | 8803 | missing chunkmanager: update error message | mathause 10194086 | open | 0 | 4 | 2024-03-01T15:48:00Z | 2024-03-15T11:02:45Z | MEMBER | 0 | pydata/xarray/pulls/8803 | When dask is missing we get the following error message:
This could be confusing: the error message seems geared towards a typo in the requested manager. However, I think it is much more likely that a chunk manager is simply not installed. I tried to update the error message - happy to get feedback. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8803/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | ||||||
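An aside, not taken from the PR: a minimal sketch, assuming current xarray behaviour, of where the reworded message surfaces. Chunking data looks up a chunk manager ("dask" by default) and raises when no matching one is installed.

```python
# Hypothetical illustration (not code from the PR): without dask installed,
# asking xarray to chunk an array triggers the chunk-manager lookup and
# raises the error message this PR rewords.
import xarray as xr

da = xr.DataArray([1.0, 2.0, 3.0], dims="x")
try:
    da.chunk({"x": 1})  # looks for the "dask" chunk manager by default
except (ImportError, ValueError) as err:  # exact type depends on the xarray version
    print(err)
```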
2090265314 | PR_kwDOAMm_X85kiCi8 | 8627 | unify freq strings (independent of pd version) | mathause 10194086 | closed | 0 | 4 | 2024-01-19T10:57:04Z | 2024-02-15T17:53:42Z | 2024-02-15T16:53:36Z | MEMBER | 0 | pydata/xarray/pulls/8627 |
Probably not ready for review yet. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8627/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
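An assumption based on the title, not stated in the PR body: pandas 2.2 renamed several frequency aliases (for example "M" became "ME"), so code that hard-codes one spelling breaks on some pandas versions. A small illustration:

```python
# Hypothetical illustration: the month-end alias depends on the pandas version.
import pandas as pd

try:
    idx = pd.date_range("2000-01-01", periods=3, freq="ME")  # pandas >= 2.2
except ValueError:
    idx = pd.date_range("2000-01-01", periods=3, freq="M")   # older pandas
print(idx)
```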
2025652693 | PR_kwDOAMm_X85hJh0D | 8521 | test and fix empty xindexes repr | mathause 10194086 | closed | 0 | 4 | 2023-12-05T08:54:56Z | 2024-01-08T10:58:09Z | 2023-12-06T17:06:15Z | MEMBER | 0 | pydata/xarray/pulls/8521 |
Uses |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/8521/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
144630996 | MDU6SXNzdWUxNDQ2MzA5OTY= | 810 | correct DJF mean | mathause 10194086 | closed | 0 | 4 | 2016-03-30T15:36:42Z | 2022-04-06T16:19:47Z | 2016-05-04T12:56:30Z | MEMBER | This started as a question and I add it as a reference. Maybe you have a comment. There are several ways to calculate time series of seasonal data (starting from monthly or daily data):

```python
# load libraries
import pandas as pd
import matplotlib.pyplot as plt
import numpy as np
import xarray as xr

# create example dataset
time = pd.date_range('2000.01.01', '2010.12.31', freq='M')
data = np.random.rand(*time.shape)
ds = xr.DataArray(data, coords=dict(time=time))

# (1) using resample
ds_res = ds.resample('Q-FEB', 'time')
ds_res = ds_res.sel(time=ds_res['time.month'] == 2)
ds_res = ds_res.groupby('time.year').mean('time')

# (2) this is wrong
ds_season = ds.where(ds['time.season'] == 'DJF').groupby('time.year').mean('time')

# (3) using where and rolling
# mask other months with nan
ds_DJF = ds.where(ds['time.season'] == 'DJF')
# rolling mean -> only Jan is not nan
# however, we lose Jan/Feb in the first year and Dec in the last
ds_DJF = ds_DJF.rolling(min_periods=3, center=True, time=3).mean()
# make annual mean
ds_DJF = ds_DJF.groupby('time.year').mean('time')

ds_res.plot(marker='*')
ds_season.plot()
ds_DJF.plot()
plt.show()
```

(1) The first is to use resample with 'Q-FEB' as the argument. This works fine. It does include Jan/Feb in the first year, and Dec in the last year + 1. Whether this makes sense can be debated. One case where this does not work is when you have, say, two regions in your data set: for one you want to calculate DJF and for the other you want NovDecJan. (2) Using 'time.season' is wrong as it combines Jan, Feb and Dec from the same year. (3) The third uses |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/810/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
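An aside not in the original issue: the positional resample call in approach (1) predates the current API. A rough present-day sketch of the same idea, assuming a recent xarray:

```python
# A modern variant of approach (1); the positional ds.resample('Q-FEB', 'time')
# call from 2016 has since been replaced by the keyword form.
import numpy as np
import pandas as pd
import xarray as xr

time = pd.date_range("2000-01-01", "2010-12-31", freq="MS")
ds = xr.DataArray(np.random.rand(time.size), coords={"time": time}, dims="time")

seasonal = ds.resample(time="QS-DEC").mean()            # quarters starting Dec/Mar/Jun/Sep
djf = seasonal.sel(time=seasonal["time.month"] == 12)   # keep only the DJF quarter
```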
1126086052 | PR_kwDOAMm_X84yLQ48 | 6251 | use `warnings.catch_warnings(record=True)` instead of `pytest.warns(None)` | mathause 10194086 | closed | 0 | 4 | 2022-02-07T14:42:26Z | 2022-02-18T16:51:58Z | 2022-02-18T16:51:55Z | MEMBER | 0 | pydata/xarray/pulls/6251 | pytest v7.0.0 no longer wants us to use |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/6251/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
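The body above is cut off; a minimal sketch of the replacement the title describes, with a hypothetical function under test (the real call sites are in the PR diff):

```python
import warnings

def do_something_quiet():
    """Stand-in for the code under test (hypothetical)."""
    return 1 + 1

def test_no_warning():
    # old, deprecated in pytest 7:
    #     with pytest.warns(None) as record:
    #         do_something_quiet()
    with warnings.catch_warnings(record=True) as record:
        warnings.simplefilter("always")  # make sure no warning is swallowed
        do_something_quiet()
    assert len(record) == 0
```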
752870062 | MDExOlB1bGxSZXF1ZXN0NTI5MDc4NDA0 | 4616 | don't type check __getattr__ | mathause 10194086 | closed | 0 | 4 | 2020-11-29T08:53:09Z | 2022-01-26T08:41:18Z | 2021-10-18T14:06:30Z | MEMBER | 1 | pydata/xarray/pulls/4616 |
It's not pretty as I had to define a number of empty methods... I think this should wait for 0.17 |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4616/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
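The PR body does not spell out the mechanism, so the following is only a generic sketch of one way to keep a static type checker from seeing `__getattr__` (not necessarily what this PR did):

```python
from typing import TYPE_CHECKING, Any

class Container:
    """Hypothetical class with dynamic attribute access."""

    if not TYPE_CHECKING:
        # Only defined at runtime: the type checker does not see __getattr__,
        # so accessing a misspelled attribute is still reported as an error.
        def __getattr__(self, name: str) -> Any:
            raise AttributeError(name)
```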
802400938 | MDExOlB1bGxSZXF1ZXN0NTY4NTUwNDEx | 4865 | fix da.pad example for numpy 1.20 | mathause 10194086 | closed | 0 | 4 | 2021-02-05T19:00:04Z | 2021-10-18T14:06:33Z | 2021-02-07T21:57:34Z | MEMBER | 0 | pydata/xarray/pulls/4865 |
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4865/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
800118528 | MDU6SXNzdWU4MDAxMTg1Mjg= | 4858 | doctest failure with numpy 1.20 | mathause 10194086 | closed | 0 | 4 | 2021-02-03T08:57:43Z | 2021-02-07T21:57:34Z | 2021-02-07T21:57:34Z | MEMBER | What happened: Our doctests fail since numpy 1.20 came out: https://github.com/pydata/xarray/pull/4760/checks?check_run_id=1818512841#step:8:69

What you expected to happen: They don't ;-)

Minimal Complete Verifiable Example: The following fails with numpy 1.20 while it converted

```python
import numpy as np
x = np.arange(10)
x = np.pad(x, 1, "constant", constant_values=np.nan)
```

requires numpy 1.20

Anything else we need to know?:
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4858/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
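A hedged sketch of the failure mode and one workaround (the actual doc fix lives in PR #4865 above and is not reproduced here): numpy 1.20 refuses to cast the NaN constant into an integer array, so padding a float array works while padding the integer array raises.

```python
import numpy as np

x = np.arange(10)

# numpy >= 1.20 raises here because NaN cannot be cast to the integer dtype
try:
    np.pad(x, 1, "constant", constant_values=np.nan)
except Exception as err:  # exact exception type depends on the numpy version
    print(err)

# padding a float array with NaN is fine
padded = np.pad(x.astype(float), 1, "constant", constant_values=np.nan)
```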
688115687 | MDU6SXNzdWU2ODgxMTU2ODc= | 4385 | warnings from internal use of apply_ufunc | mathause 10194086 | closed | 0 | 4 | 2020-08-28T14:28:56Z | 2020-08-30T16:37:52Z | 2020-08-30T16:37:52Z | MEMBER | Another follow up from #4060: Minimal Complete Verifiable Example:
We should probably check the warnings in the test suite - there may be others. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4385/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
683856183 | MDExOlB1bGxSZXF1ZXN0NDcxODc4NjUz | 4365 | Silence plot warnings | mathause 10194086 | closed | 0 | 4 | 2020-08-21T22:21:40Z | 2020-08-24T16:05:13Z | 2020-08-24T16:00:42Z | MEMBER | 0 | pydata/xarray/pulls/4365 |
I had a go at silencing some of the warnings from the plotting functions. I brought the count down from 67 to 5 (4 of which come from external libraries).
I cannot exclude that one of these changes has an effect on the plots that is not tested in the suite... Tests pass locally for py38 and py36-bare-minimum. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4365/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
514308007 | MDExOlB1bGxSZXF1ZXN0MzMzOTQ4MDg2 | 3463 | unpin cftime | mathause 10194086 | closed | 0 | 4 | 2019-10-30T00:05:55Z | 2020-08-19T13:11:55Z | 2019-10-30T01:08:14Z | MEMBER | 0 | pydata/xarray/pulls/3463 | I think the cftime problems should be fixed after the release of v1.0.4.2 #3434 |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/3463/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
620424728 | MDExOlB1bGxSZXF1ZXN0NDE5Njc1NDU1 | 4075 | Fix bool weights | mathause 10194086 | closed | 0 | 4 | 2020-05-18T18:42:05Z | 2020-08-19T13:11:40Z | 2020-05-23T21:06:19Z | MEMBER | 0 | pydata/xarray/pulls/4075 |
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4075/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
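The PR body is empty; assuming from the title that "bool weights" means passing a boolean mask to `DataArray.weighted`, a small sketch:

```python
import xarray as xr

da = xr.DataArray([1.0, 2.0, 3.0], dims="x")
mask = xr.DataArray([True, False, True], dims="x")

# weighted mean with a boolean mask as weights; on xarray versions without
# this fix, casting the mask to int explicitly avoids the incorrect result
mean_fixed = da.weighted(mask.astype(int)).mean()
mean_bool = da.weighted(mask).mean()
print(mean_fixed.item(), mean_bool.item())
```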
559864146 | MDU6SXNzdWU1NTk4NjQxNDY= | 3750 | isort pre-commit hook does not skip text files | mathause 10194086 | closed | 0 | 4 | 2020-02-04T17:18:31Z | 2020-05-06T01:50:29Z | 2020-03-28T20:58:15Z | MEMBER | MCVE Code Sample: Add arbitrary change to the file
The pre-commit hook will fail.

Expected Output: the pre-commit hook to pass

Problem Description: running |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/3750/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
550964139 | MDExOlB1bGxSZXF1ZXN0MzYzNzcyNzE3 | 3699 | Feature/align in dot | mathause 10194086 | closed | 0 | 4 | 2020-01-16T17:55:38Z | 2020-01-20T12:55:51Z | 2020-01-20T12:09:27Z | MEMBER | 0 | pydata/xarray/pulls/3699 |
Happy to get feedback @fujiisoup @shoyer |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/3699/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
545764524 | MDU6SXNzdWU1NDU3NjQ1MjQ= | 3665 | Cannot roundtrip time in NETCDF4_CLASSIC | mathause 10194086 | closed | 0 | 4 | 2020-01-06T14:47:48Z | 2020-01-16T18:27:15Z | 2020-01-16T18:27:14Z | MEMBER | MCVE Code Sample

```python
import numpy as np
import xarray as xr

time = xr.cftime_range("2006-01-01", periods=2, calendar="360_day")
da = xr.DataArray(time, dims=["time"])
da.encoding["dtype"] = np.float
da.to_netcdf("tst.nc", format="NETCDF4_CLASSIC")

ds = xr.open_dataset("tst.nc")
ds.to_netcdf("tst2.nc", format="NETCDF4_CLASSIC")
```

yields:

Or an example without

```python
import numpy as np
import xarray as xr

time = xr.cftime_range("2006-01-01", periods=2, calendar="360_day")
da = xr.DataArray(time, dims=["time"])
da.encoding["_FillValue"] = np.array([np.nan])
xr.backends.netcdf3.encode_nc3_variable(xr.conventions.encode_cf_variable(da))
```

Expected Output: Xarray can save the dataset/ an

Problem Description: If there is a time variable that can be encoded using integers only, but that has a

Note: if the time cannot be encoded using integers only, it works:

```python
da = xr.DataArray(time, dims=["time"])
da.encoding["_FillValue"] = np.array([np.nan])
da.encoding["units"] = "days since 2006-01-01T12:00:00"
xr.backends.netcdf3.encode_nc3_variable(xr.conventions.encode_cf_variable(da))
```

Another note: when saving with NETCDF4

```python
da = xr.DataArray(time, dims=["time"])
da.encoding["_FillValue"] = np.array([np.nan])
xr.backends.netCDF4_._encode_nc4_variable(xr.conventions.encode_cf_variable(da))
```
Output of
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/3665/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
106595746 | MDU6SXNzdWUxMDY1OTU3NDY= | 577 | wrap lon coordinates to 360 | mathause 10194086 | closed | 0 | 4 | 2015-09-15T16:36:37Z | 2019-01-17T09:34:56Z | 2019-01-15T20:15:01Z | MEMBER | Assume I have two datasets with the same lat/ lon grid. However, one has |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/577/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
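The issue body is cut off above; a sketch of the kind of wrapping the title suggests, using present-day xarray methods rather than code from the issue:

```python
import numpy as np
import xarray as xr

ds = xr.Dataset(coords={"lon": np.arange(-180, 180, 60)})

# wrap -180..180 longitudes onto 0..360 and re-sort, so that two datasets
# with different longitude conventions can be aligned on the same values
ds_wrapped = ds.assign_coords(lon=ds["lon"] % 360).sortby("lon")
print(ds_wrapped["lon"].values)
```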
310819233 | MDU6SXNzdWUzMTA4MTkyMzM= | 2036 | better error message for to_netcdf -> unlimited_dims | mathause 10194086 | closed | 0 | 4 | 2018-04-03T12:39:21Z | 2018-05-18T14:48:32Z | 2018-05-18T14:48:32Z | MEMBER | Code Sample, a copy-pastable example if possible

```python
# Your code here
import numpy as np
import xarray as xr

x = np.arange(10)
da = xr.Dataset(data_vars=dict(data=('dim1', x)), coords=dict(dim1=('dim1', x), dim2=('dim2', x)))
da.to_netcdf('tst.nc', format='NETCDF4_CLASSIC', unlimited_dims='dim1')
```

Problem description:
This creates the error
The correct syntax is
With
I only tested with netCDF4 as backend.

Expected Output
Output of
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/2036/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
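A hedged aside based on the `to_netcdf` signature (the truncated lines above elide the specifics): `unlimited_dims` expects an iterable of dimension names, so a plain string gets iterated character by character, which is what makes the original error so confusing.

```python
import numpy as np
import xarray as xr

x = np.arange(10)
ds = xr.Dataset(
    data_vars={"data": ("dim1", x)},
    coords={"dim1": ("dim1", x), "dim2": ("dim2", x)},
)

# pass an iterable of dimension names; a bare string such as
# unlimited_dims='dim1' is iterated per character ('d', 'i', 'm', '1')
ds.to_netcdf("tst.nc", format="NETCDF4_CLASSIC", unlimited_dims=["dim1"])
```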
106581329 | MDU6SXNzdWUxMDY1ODEzMjk= | 576 | define fill value for where | mathause 10194086 | closed | 0 | 4 | 2015-09-15T15:27:32Z | 2017-08-08T17:00:30Z | 2017-08-08T17:00:30Z | MEMBER | It would be nice if
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/576/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
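The request above is cut off; as a present-day note (not quoted from the issue), `where` now accepts an `other` argument that acts as the fill value:

```python
import xarray as xr

da = xr.DataArray([1.0, 2.0, 3.0], dims="x")

# keep values where the condition holds, fill the rest with -999
# instead of the default NaN
filled = da.where(da > 1.5, other=-999)
print(filled.values)
```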
67332234 | MDU6SXNzdWU2NzMzMjIzNA== | 386 | "loosing" virtual variables | mathause 10194086 | closed | 0 | 4 | 2015-04-09T10:35:31Z | 2015-04-20T03:55:44Z | 2015-04-20T03:55:44Z | MEMBER | Once I take a mean over virtual variables, they are not available any more.
Is this intended behaviour? And could I get them back somehow? |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/386/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue |
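A sketch of the behaviour described, with assumed example data (the issue itself contains no code): virtual variables such as `time.month` are derived from the `time` coordinate, so they disappear once a reduction drops that coordinate; extracting them beforehand is one way to keep them around.

```python
import numpy as np
import pandas as pd
import xarray as xr

time = pd.date_range("2000-01-01", periods=12, freq="MS")
ds = xr.Dataset({"data": ("time", np.random.rand(12))}, coords={"time": time})

month = ds["time.month"]  # virtual variable derived from the time coordinate
print(month.values)

# after reducing over time the coordinate is gone, and with it 'time.month';
# the extracted `month` above survives the reduction
mean = ds.mean("time")
```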
CREATE TABLE [issues] (
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [number] INTEGER,
   [title] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [state] TEXT,
   [locked] INTEGER,
   [assignee] INTEGER REFERENCES [users]([id]),
   [milestone] INTEGER REFERENCES [milestones]([id]),
   [comments] INTEGER,
   [created_at] TEXT,
   [updated_at] TEXT,
   [closed_at] TEXT,
   [author_association] TEXT,
   [active_lock_reason] TEXT,
   [draft] INTEGER,
   [pull_request] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [state_reason] TEXT,
   [repo] INTEGER REFERENCES [repos]([id]),
   [type] TEXT
);
CREATE INDEX [idx_issues_repo] ON [issues] ([repo]);
CREATE INDEX [idx_issues_milestone] ON [issues] ([milestone]);
CREATE INDEX [idx_issues_assignee] ON [issues] ([assignee]);
CREATE INDEX [idx_issues_user] ON [issues] ([user]);