issues
5 rows where user = 22488770 sorted by updated_at descending

id: 1636431481 | node_id: I_kwDOAMm_X85hifZ5 | number: 7661 | type: issue | repo: xarray 13221727
title: Dark mode documentation not readable
user: andrewpauling 22488770 | author_association: CONTRIBUTOR | state: closed | state_reason: completed | locked: 0 | comments: 0
created_at: 2023-03-22T20:11:06Z | updated_at: 2023-03-27T18:14:27Z | closed_at: 2023-03-27T18:14:27Z

body:
What is your issue?
When opening the xarray documentation website, it defaults to dark mode because my system uses dark mode. However, much of the text is not readable, as it remains black on the dark background (see the screenshots in the original issue comparing the homepage in dark vs. light mode; the images are not preserved in this export).

reactions: { "url": "https://api.github.com/repos/pydata/xarray/issues/7661/reactions", "total_count": 5, "+1": 5, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }

id: 707752322 | node_id: MDU6SXNzdWU3MDc3NTIzMjI= | number: 4455 | type: issue | repo: xarray 13221727
title: Document units for polyfit when dimension is time
user: andrewpauling 22488770 | author_association: CONTRIBUTOR | state: open | locked: 0 | comments: 2
created_at: 2020-09-23T23:51:02Z | updated_at: 2021-06-23T11:12:55Z

body:
Is your feature request related to a problem? Please describe.
I think (please correct me if I'm wrong) that when using polyfit with dim='time', the units of the output slope are [data_units]/ns, but this is not explained in the docstring or in the documentation on the webpage. I figured this out eventually, but it could be confusing for new users.

Describe the solution you'd like
Mention in the documentation and/or docstring that the units for time will be ns.

reactions: { "url": "https://api.github.com/repos/pydata/xarray/issues/4455/reactions", "total_count": 5, "+1": 5, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
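
A short sketch (not part of the issue; the data and variable names are illustrative) shows the behaviour #4455 describes: fitting along a datetime64 time coordinate yields coefficients per nanosecond, because xarray converts the time axis to nanoseconds before fitting.

```python
import numpy as np
import pandas as pd
import xarray as xr

# A signal that increases by exactly 1.0 per day.
time = pd.date_range("2000-01-01", periods=10, freq="D")
da = xr.DataArray(np.arange(10, dtype=float), dims="time", coords={"time": time})

fit = da.polyfit(dim="time", deg=1)
slope = fit.polyfit_coefficients.sel(degree=1).item()

ns_per_day = 86400 * 1e9
print(slope)               # ~1.157e-14, i.e. units of [data_units]/ns
print(slope * ns_per_day)  # ~1.0, recovering the per-day slope
```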

id: 707745196 | node_id: MDExOlB1bGxSZXF1ZXN0NDkyMDk1NzEw | number: 4454 | type: pull | repo: xarray 13221727
title: Raise error when datetime64 or timedelta64 values that are outside the valid range for ns precision are converted to ns precision
user: andrewpauling 22488770 | author_association: CONTRIBUTOR | state: closed | locked: 0 | comments: 11 | draft: 0 | pull_request: pydata/xarray/pulls/4454
created_at: 2020-09-23T23:36:14Z | updated_at: 2020-09-30T18:56:45Z | closed_at: 2020-09-30T00:49:35Z

body:
Use _possibly_convert_objects to raise an error when converting datetime64 or timedelta64 objects with units other than ns to ns precision (as pandas requires), instead of silently changing dates outside the valid range for ns precision to incorrect values.

reactions: { "url": "https://api.github.com/repos/pydata/xarray/issues/4454/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
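
The failure mode that PR 4454 addresses can be illustrated with a small sketch (not the PR's code; the exact console output depends on the NumPy and pandas versions): casting an out-of-range datetime64[us] value to nanosecond precision can silently overflow, whereas pandas 1.x, current when the PR was written, raised OutOfBoundsDatetime for the same value.

```python
import numpy as np
import pandas as pd

# 1250-01-01 is far outside the ~1677-2262 range representable as datetime64[ns].
t_us = np.datetime64("1250-01-01", "us")

# NumPy rescales the underlying int64 by 1000 when casting us -> ns, which can
# silently overflow and produce a nonsense date instead of raising.
print(t_us.astype("datetime64[ns]"))

# pandas checks the ns bounds; pandas 1.x raised here, newer versions may
# instead keep a coarser resolution.
try:
    print(pd.Timestamp(t_us))
except pd.errors.OutOfBoundsDatetime as err:
    print("OutOfBoundsDatetime:", err)
```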

id: 702373263 | node_id: MDU6SXNzdWU3MDIzNzMyNjM= | number: 4427 | type: issue | repo: xarray 13221727
title: assign_coords with datetime64[us] changes dtype to datetime64[ns]
user: andrewpauling 22488770 | author_association: CONTRIBUTOR | state: closed | state_reason: completed | locked: 0 | comments: 3
created_at: 2020-09-16T01:14:11Z | updated_at: 2020-09-30T00:49:35Z | closed_at: 2020-09-30T00:49:35Z

body:
What happened: When using xr.DataArray.assign_coords() to assign a new coordinate to the time dimension from an array with dtype datetime64[us], the dtype after assigning is datetime64[ns], resulting in wrong dates, since the dates I am using are outside the valid range for the [ns] units.

What you expected to happen: Preserve the dtype of the array when assigning it as a coordinate.

Minimal Complete Verifiable Example:

```python
import numpy as np
import xarray as xr
import cftime

tmp = np.random.random(12)
da = xr.DataArray(tmp, dims='time')
times = list()
for mth in np.arange(1, 13):
    times.append(cftime.DatetimeNoLeap(1250, mth, 1))

times64 = np.array([np.datetime64(t, 'us') for t in times])
da = da.assign_coords({'time': times64})
```

[The issue then showed the repr of times64 and, "for the array after assigning", the repr of da; those outputs are not preserved in this export.]

Anything else we need to know?:

Environment: output of xr.show_versions():
INSTALLED VERSIONS
------------------
commit: None
python: 3.7.8 | packaged by conda-forge | (default, Jul 31 2020, 02:37:09) [Clang 10.0.1]
python-bits: 64
OS: Darwin
OS-release: 18.7.0
machine: x86_64
processor: i386
byteorder: little
LC_ALL: en_US.UTF-8
LANG: en_US.UTF-8
LOCALE: en_US.UTF-8
libhdf5: 1.10.5
libnetcdf: 4.7.3
xarray: 0.16.0
pandas: 1.1.0
numpy: 1.19.1
scipy: 1.4.1
netCDF4: 1.5.3
pydap: installed
h5netcdf: None
h5py: None
Nio: None
zarr: None
cftime: 1.0.4.2
nc_time_axis: None
PseudoNetCDF: None
rasterio: None
cfgrib: None
iris: None
bottleneck: None
dask: 2.21.0
distributed: 2.22.0
matplotlib: 3.1.2
cartopy: 0.17.0
seaborn: None
numbagg: None
pint: None
setuptools: 49.3.1.post20200810
pip: 20.2.2
conda: None
pytest: None
IPython: 7.17.0
sphinx: 3.2.0

reactions: { "url": "https://api.github.com/repos/pydata/xarray/issues/4427/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
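
One way to sidestep the coercion in #4427 (an aside, not taken from the issue) is to keep the cftime objects themselves as the coordinate; xarray stores them in a CFTimeIndex, which is not limited to the nanosecond-precision date range.

```python
import numpy as np
import xarray as xr
import cftime

da = xr.DataArray(np.random.random(12), dims="time")
times = [cftime.DatetimeNoLeap(1250, mth, 1) for mth in range(1, 13)]

# Assigning the cftime objects directly avoids any datetime64[ns] conversion.
da = da.assign_coords({"time": times})
print(da.indexes["time"])  # CFTimeIndex with year-1250 dates preserved
```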

id: 506914634 | node_id: MDU6SXNzdWU1MDY5MTQ2MzQ= | number: 3398 | type: issue | repo: xarray 13221727
title: Mean called on groupby object adds dimensions to undesired variables
user: andrewpauling 22488770 | author_association: CONTRIBUTOR | state: closed | state_reason: completed | locked: 0 | comments: 3
created_at: 2019-10-14T23:03:04Z | updated_at: 2019-10-16T14:30:38Z | closed_at: 2019-10-16T14:30:38Z

body:
MCVE Code Sample

```python
import numpy as np
import xarray as xr
import cftime

# create time coordinate
tdays = np.arange(0, 730)
time = cftime.num2date(tdays, 'days since 0001-01-01 00:00:00', calendar='noleap')

# create spatial coordinate
lev = np.arange(100)

# Create dummy data
x = np.random.rand(time.size, lev.size)
y = np.random.rand(lev.size)

# Create sample Dataset
ds = xr.Dataset({'sample_data': (['time', 'lev'], x),
                 'independent_data': (['lev'], y)},
                coords={'time': (['time'], time),
                        'lev': (['lev'], lev)})

# Perform groupby and mean
ds2 = ds.groupby('time.month').mean(dim='time')
```

Actual Output / Expected Output: [the Dataset reprs are not preserved in this export]

Problem Description
The variable independent_data above initially has no time dimension, but after performing groupby('time.month').mean(dim='time') on the Dataset it has a month dimension that is meaningless. Preferably, the operation should leave the independent_data variable untouched.

[Output of xr.show_versions() not preserved in this export.]

reactions: { "url": "https://api.github.com/repos/pydata/xarray/issues/3398/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
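
A possible workaround for #3398 (not taken from the issue thread) is to apply the grouped reduction only to the variables that actually have a time dimension and merge the untouched variables back in; the sketch below continues from the MCVE's ds.

```python
import xarray as xr

# Split the data variables by whether they depend on 'time'.
with_time = [name for name, var in ds.data_vars.items() if "time" in var.dims]
without_time = [name for name in ds.data_vars if name not in with_time]

# Reduce only the time-dependent variables, then merge the rest back unchanged.
ds2 = xr.merge([
    ds[with_time].groupby("time.month").mean(dim="time"),
    ds[without_time],
])
print(ds2["independent_data"].dims)  # ('lev',) -- no spurious 'month' dimension
```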
CREATE TABLE [issues] (
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [number] INTEGER,
   [title] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [state] TEXT,
   [locked] INTEGER,
   [assignee] INTEGER REFERENCES [users]([id]),
   [milestone] INTEGER REFERENCES [milestones]([id]),
   [comments] INTEGER,
   [created_at] TEXT,
   [updated_at] TEXT,
   [closed_at] TEXT,
   [author_association] TEXT,
   [active_lock_reason] TEXT,
   [draft] INTEGER,
   [pull_request] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [state_reason] TEXT,
   [repo] INTEGER REFERENCES [repos]([id]),
   [type] TEXT
);
CREATE INDEX [idx_issues_repo] ON [issues] ([repo]);
CREATE INDEX [idx_issues_milestone] ON [issues] ([milestone]);
CREATE INDEX [idx_issues_assignee] ON [issues] ([assignee]);
CREATE INDEX [idx_issues_user] ON [issues] ([user]);
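
Assuming the table above lives in a SQLite file (the filename github.db below is a placeholder), the row selection shown on this page, 5 rows where user = 22488770 sorted by updated_at descending, corresponds to a query like the following sketch.

```python
import sqlite3

conn = sqlite3.connect("github.db")  # placeholder filename; adjust to your export
conn.row_factory = sqlite3.Row

rows = conn.execute(
    """
    SELECT id, number, title, state, updated_at, type
    FROM issues
    WHERE "user" = ?
    ORDER BY updated_at DESC
    """,
    (22488770,),
).fetchall()

for row in rows:
    print(row["number"], row["state"], row["updated_at"], row["title"])

conn.close()
```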