issues
3 rows where state = "closed" and user = 9655353 sorted by updated_at descending
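The listing below is Datasette's rendering of a SQL query over the `issues` table (its full schema appears at the end of this page). A minimal sketch of the equivalent query using Python's built-in `sqlite3`, with stand-in rows carrying only a few of the table's columns:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Abbreviated issues table: just the columns the filter and sort touch.
conn.execute(
    "CREATE TABLE issues ("
    " id INTEGER PRIMARY KEY, number INTEGER, title TEXT,"
    " user INTEGER, state TEXT, updated_at TEXT)"
)

# Stand-in rows mirroring the three issues listed on this page.
rows = [
    (307224717, 2002, "Unexpected decoded time in xarray >= 0.10.1",
     9655353, "closed", "2018-03-31T01:16:14Z"),
    (260569191, 1592, "groupby() fails with a stack trace when Dask 0.15.3 is used",
     9655353, "closed", "2017-10-04T21:42:52Z"),
    (216010508, 1316, "ValueError not raised when doing difference of two non-intersecting datasets",
     9655353, "closed", "2017-03-23T16:20:23Z"),
]
conn.executemany("INSERT INTO issues VALUES (?, ?, ?, ?, ?, ?)", rows)

# The filter and sort described in the page header:
# state = "closed" and user = 9655353, sorted by updated_at descending.
result = conn.execute(
    "SELECT number, title FROM issues"
    " WHERE state = 'closed' AND user = 9655353"
    " ORDER BY updated_at DESC"
).fetchall()
print([n for n, _ in result])  # most recently updated issue first
```

ISO-8601 timestamps sort correctly as plain strings, which is why `ORDER BY updated_at DESC` works on a TEXT column here.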
**#2002 · Unexpected decoded time in xarray >= 0.10.1**

id: 307224717 · node_id: MDU6SXNzdWUzMDcyMjQ3MTc= · user: JanisGailis (9655353) · state: closed · locked: 0 · milestone: 0.10.3 (3008859) · comments: 8
created_at: 2018-03-21T12:28:54Z · updated_at: 2018-03-31T01:16:14Z · closed_at: 2018-03-31T01:16:14Z · author_association: NONE · state_reason: completed · repo: xarray (13221727) · type: issue
reactions: 0 (https://api.github.com/repos/pydata/xarray/issues/2002/reactions)

Body (truncated in this export):

> Problem description
> Given the original time dimension: […]
> Expected Output […] With […] Output of […]

**#1592 · groupby() fails with a stack trace when Dask 0.15.3 is used**

id: 260569191 · node_id: MDU6SXNzdWUyNjA1NjkxOTE= · user: JanisGailis (9655353) · state: closed · locked: 0 · comments: 2
created_at: 2017-09-26T10:15:46Z · updated_at: 2017-10-04T21:42:52Z · closed_at: 2017-10-04T21:42:52Z · author_association: NONE · state_reason: completed · repo: xarray (13221727) · type: issue
reactions: 0 (https://api.github.com/repos/pydata/xarray/issues/1592/reactions)

Body:

> Hi xarray team! Our unit tests broke when Dask got updated to 0.15.3; after a quick investigation it became clear that the groupby operation on an xarray Dataset fails with this Dask version. The following example:
>
> ```python
> import xarray as xr
> import numpy as np
> import dask
>
> def plus_one(da):
>     return da + 1
>
> print(xr.__version__)
> print(dask.__version__)
>
> ds = xr.Dataset({
>     'first': (['time', 'lat', 'lon'], np.array([np.eye(4, 8), np.eye(4, 8)])),
>     'second': (['time', 'lat', 'lon'], np.array([np.eye(4, 8), np.eye(4, 8)])),
>     'lat': np.linspace(-67.5, 67.5, 4),
>     'lon': np.linspace(-157.5, 157.5, 8),
>     'time': np.array([1, 2])}).chunk(chunks={'lat': 2, 'lon': 4})
>
> ds_new = ds.groupby('time').apply(plus_one)
> ```
>
> I don't have enough understanding regarding what's really going on in Dask-land, so I leave it to you guys to open an issue in their issue tracker if needed!

**#1316 · ValueError not raised when doing difference of two non-intersecting datasets**

id: 216010508 · node_id: MDU6SXNzdWUyMTYwMTA1MDg= · user: JanisGailis (9655353) · state: closed · locked: 0 · comments: 3
created_at: 2017-03-22T10:09:43Z · updated_at: 2017-03-23T16:20:23Z · closed_at: 2017-03-23T16:20:23Z · author_association: NONE · state_reason: completed · repo: xarray (13221727) · type: issue
reactions: 0 (https://api.github.com/repos/pydata/xarray/issues/1316/reactions)

Body (example truncated in this export):

> From the documentation I infer that when doing binary arithmetic operations, a ValueError should be raised when the datasets' variables don't intersect. However, the following happily returns a dataset with empty variable arrays: […]
>
> Feel free to close right away if this is the desired behavior. EDIT: Xarray version is '0.9.1'
```sql
CREATE TABLE [issues] (
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [number] INTEGER,
   [title] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [state] TEXT,
   [locked] INTEGER,
   [assignee] INTEGER REFERENCES [users]([id]),
   [milestone] INTEGER REFERENCES [milestones]([id]),
   [comments] INTEGER,
   [created_at] TEXT,
   [updated_at] TEXT,
   [closed_at] TEXT,
   [author_association] TEXT,
   [active_lock_reason] TEXT,
   [draft] INTEGER,
   [pull_request] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [state_reason] TEXT,
   [repo] INTEGER REFERENCES [repos]([id]),
   [type] TEXT
);
CREATE INDEX [idx_issues_repo] ON [issues] ([repo]);
CREATE INDEX [idx_issues_milestone] ON [issues] ([milestone]);
CREATE INDEX [idx_issues_assignee] ON [issues] ([assignee]);
CREATE INDEX [idx_issues_user] ON [issues] ([user]);
```
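The `idx_issues_user` index above is what makes the `user = 9655353` filter on this page cheap. A small sketch (abbreviated schema, foreign keys omitted) showing SQLite's `EXPLAIN QUERY PLAN` picking that index up:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Abbreviated version of the issues schema and indexes shown above.
conn.executescript("""
CREATE TABLE issues (
  id INTEGER PRIMARY KEY, node_id TEXT, number INTEGER, title TEXT,
  user INTEGER, state TEXT, updated_at TEXT, repo INTEGER
);
CREATE INDEX idx_issues_user ON issues (user);
CREATE INDEX idx_issues_repo ON issues (repo);
""")

# An equality filter on an indexed column lets SQLite search the index
# instead of scanning the whole table.
plan = conn.execute(
    "EXPLAIN QUERY PLAN"
    " SELECT * FROM issues WHERE state = 'closed' AND user = ?",
    (9655353,),
).fetchall()
for row in plan:
    print(row)
```

The plan's detail column should mention `idx_issues_user`; the non-indexed `state` condition is then checked on each row the index search returns.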