issues
2 rows where "created_at" is on date 2018-12-11 and "user" = 2448579, sorted by "updated_at" descending
| id | node_id | number | title | user | state | locked | assignee | milestone | comments | created_at | updated_at ▲ | closed_at | author_association | active_lock_reason | draft | pull_request | body | reactions | performed_via_github_app | state_reason | repo | type |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 389952656 | MDU6SXNzdWUzODk5NTI2NTY= | 2601 | View list of cached open files | dcherian 2448579 | closed | 0 | | | 2 | 2018-12-11T21:03:36Z | 2019-01-06T18:54:49Z | 2019-01-06T18:54:49Z | MEMBER | | | | Is there a way to view the list of currently open, cached files? And possibly force-close them? I keep running into write errors because a file is open in the cache, and I can't remember all the variables that depend on it. | {"url": "https://api.github.com/repos/pydata/xarray/issues/2601/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | | completed | xarray 13221727 | issue |
| 389865283 | MDU6SXNzdWUzODk4NjUyODM= | 2600 | Tests are failing on dask-dev | dcherian 2448579 | closed | 0 | | | 1 | 2018-12-11T17:09:57Z | 2018-12-12T03:13:30Z | 2018-12-12T03:13:30Z | MEMBER | | | | Sample error from https://travis-ci.org/pydata/xarray/jobs/466431752 ``` _____ test_dataarray_with_dask_coords ______ def test_dataarray_with_dask_coords(): import toolz x = xr.Variable('x', da.arange(8, chunks=(4,))) y = xr.Variable('y', da.arange(8, chunks=(4,)) * 2) data = da.random.random((8, 8), chunks=(4, 4)) + 1 array = xr.DataArray(data, dims=['x', 'y']) array.coords['xx'] = x array.coords['yy'] = y ../../../miniconda/envs/test_env/lib/python3.6/site-packages/dask/base.py:395: in compute dsk = collections_to_dsk(collections, optimize_graph, **kwargs) ../../../miniconda/envs/test_env/lib/python3.6/site-packages/dask/base.py:187: in collections_to_dsk for opt, val in groups.items()} ../../../miniconda/envs/test_env/lib/python3.6/site-packages/dask/base.py:187: in <dictcomp> for opt, val in groups.items()} ../../../miniconda/envs/test_env/lib/python3.6/site-packages/dask/base.py:212: in _extract_graph_and_keys graph = merge(graphs) dicts = <dask.sharedict.ShareDict object at 0x7f307d29a128>, kwargs = {} factory = <class 'dict'>, rv = {} d = ('arange-36f53ab1e6153a63bbf7f4f8ff56693c', 0) def merge(*dicts, **kwargs): """ Merge a collection of dictionaries ``` | {"url": "https://api.github.com/repos/pydata/xarray/issues/2600/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | | completed | xarray 13221727 | issue |
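The test quoted in issue 2600's body is flattened inside the table cell above; here it is pulled out as a standalone, readable sketch. The two top-level imports are assumptions (the quoted traceback is truncated and does not show them); the rest is copied from the body.

```python
# Readable reconstruction of the test quoted in issue 2600 above.
# The xarray/dask imports are assumed; the traceback in the issue body is
# truncated, so whatever follows this setup in the real test is not shown.
import dask.array as da
import xarray as xr


def test_dataarray_with_dask_coords():
    import toolz  # present in the quoted body; not used in the portion shown

    x = xr.Variable('x', da.arange(8, chunks=(4,)))
    y = xr.Variable('y', da.arange(8, chunks=(4,)) * 2)
    data = da.random.random((8, 8), chunks=(4, 4)) + 1

    # Attach dask-backed Variables as coordinates; per the traceback above,
    # the failure on dask-dev happens later, when these graphs are merged
    # inside dask's compute().
    array = xr.DataArray(data, dims=['x', 'y'])
    array.coords['xx'] = x
    array.coords['yy'] = y
```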
CREATE TABLE [issues] (
[id] INTEGER PRIMARY KEY,
[node_id] TEXT,
[number] INTEGER,
[title] TEXT,
[user] INTEGER REFERENCES [users]([id]),
[state] TEXT,
[locked] INTEGER,
[assignee] INTEGER REFERENCES [users]([id]),
[milestone] INTEGER REFERENCES [milestones]([id]),
[comments] INTEGER,
[created_at] TEXT,
[updated_at] TEXT,
[closed_at] TEXT,
[author_association] TEXT,
[active_lock_reason] TEXT,
[draft] INTEGER,
[pull_request] TEXT,
[body] TEXT,
[reactions] TEXT,
[performed_via_github_app] TEXT,
[state_reason] TEXT,
[repo] INTEGER REFERENCES [repos]([id]),
[type] TEXT
);
CREATE INDEX [idx_issues_repo]
ON [issues] ([repo]);
CREATE INDEX [idx_issues_milestone]
ON [issues] ([milestone]);
CREATE INDEX [idx_issues_assignee]
ON [issues] ([assignee]);
CREATE INDEX [idx_issues_user]
ON [issues] ([user]);
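The two rows above correspond to the filter described at the top of the page (created_at on 2018-12-11, user 2448579, newest update first). A minimal sketch of that query against this schema; the date() comparison is an assumed way to express the "created_at is on date" condition:

```sql
-- Sketch of the filter shown above: issues created on 2018-12-11 by user 2448579,
-- ordered by most recent update. date() is SQLite's built-in date function; using
-- it for the "on date" condition is an assumption about how the filter is applied.
SELECT [id], [number], [title], [state], [created_at], [updated_at], [closed_at]
FROM [issues]
WHERE date([created_at]) = '2018-12-11'
  AND [user] = 2448579
ORDER BY [updated_at] DESC;
```

The idx_issues_user index covers the [user] = 2448579 lookup; the date() condition on [created_at] is then evaluated against the matching rows, since no index exists on that column.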