issues

23 rows where milestone = 987654 and repo = 13221727 sorted by updated_at descending

Facets: type: pull 17, issue 6; state: closed 23; repo: xarray 23

id node_id number title user state locked assignee milestone comments created_at updated_at ▲ closed_at author_association active_lock_reason draft pull_request body reactions performed_via_github_app state_reason repo type
46049691 MDU6SXNzdWU0NjA0OTY5MQ== 255 Add Dataset.to_pandas() method shoyer 1217238 closed 0   0.5 987654 2 2014-10-17T00:01:36Z 2021-05-04T13:56:00Z 2021-05-04T13:56:00Z MEMBER      

This would be the complement of the DataArray constructor, converting an xray.DataArray into a 1D series, 2D DataFrame or 3D panel, whichever is appropriate.

to_pandas would also make sense for Dataset, if it could convert 0d datasets to series, e.g., pd.Series({k: v.item() for k, v in ds.items()}) (there is currently no direct way to do this), and fall back to to_dataframe for higher dimensional input.

- [x] DataArray method
- [ ] Dataset method
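
A minimal sketch of the proposed Dataset behaviour, written against the modern xarray API; dataset_to_pandas is a hypothetical helper for illustration, not the eventual method:

```
# Sketch only: dataset_to_pandas illustrates the proposal above.
import pandas as pd
import xarray as xr

def dataset_to_pandas(ds):
    """Convert a Dataset to the lowest-dimensional pandas object that fits."""
    if all(v.ndim == 0 for v in ds.data_vars.values()):
        # 0d datasets become a Series keyed by variable name.
        return pd.Series({k: v.item() for k, v in ds.items()})
    # Higher-dimensional input falls back to the existing to_dataframe().
    return ds.to_dataframe()

ds = xr.Dataset({'a': 1.5, 'b': 2})
print(dataset_to_pandas(ds))  # a    1.5
                              # b    2.0
```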

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/255/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
48301141 MDU6SXNzdWU0ODMwMTE0MQ== 277 Creation of an empty DataArray andreas-h 358378 closed 0   0.5 987654 11 2014-11-10T19:07:55Z 2020-03-06T12:38:08Z 2020-03-06T12:38:07Z CONTRIBUTOR      

I'd like to create an empty DataArray, i.e., one with only NA values. The docstring of DataArray says that data=None is allowed, if a dataset argument is provided. However, the docstring doesn't say anything about a dataset argument.

1. I think there's a bug in the docstring.
2. I'd like to pass data=None and get a DataArray with the coords/dims set up properly (as defined by the coords and dims kwargs), but with a values array of NA.
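
A minimal sketch of the requested behaviour, written against the modern xarray API; the helper name empty_dataarray is made up for illustration:

```
# Hypothetical helper: build an all-NA DataArray from coords/dims alone.
import numpy as np
import xarray as xr

def empty_dataarray(coords, dims, dtype=float):
    """Return a DataArray filled with NaN for the given coords and dims."""
    shape = tuple(len(coords[d]) for d in dims)
    data = np.full(shape, np.nan, dtype=dtype)
    return xr.DataArray(data, coords=coords, dims=dims)

da = empty_dataarray({'x': [10, 20, 30], 'y': ['a', 'b']}, dims=('x', 'y'))
print(da.isnull().all())  # True: every value is NA
```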

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/277/reactions",
    "total_count": 10,
    "+1": 10,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
69216911 MDU6SXNzdWU2OTIxNjkxMQ== 394 Checklist for releasing a version of xray with dask support shoyer 1217238 closed 0   0.5 987654 3 2015-04-17T21:02:10Z 2015-06-01T18:27:49Z 2015-06-01T18:27:49Z MEMBER      

For dask:
- [x] default threadpool for dask.array
- [x] fix indexing bugs for dask.array
- [x] make a decision on (and if necessary implement) renaming "block" to "chunk"
- [x] fix repeated use of da.insert

For xray:
- [x] update xray for the updated dask (https://github.com/xray/xray/pull/395)
- [x] figure out how to handle caching with the .load() method on dask arrays
- [x] clean up the xray documentation on dask arrays
- [x] write an introductory blog post

Things we can add in an incremental release:
- make non-aggregating grouped operations more usable
- automatic lazy apply for grouped operations on xray objects

CC @mrocklin

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/394/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
83324817 MDExOlB1bGxSZXF1ZXN0MzY1OTk5MDI= 414 Doc updates 3 shoyer 1217238 closed 0   0.5 987654 0 2015-06-01T05:30:50Z 2015-06-01T05:33:24Z 2015-06-01T05:33:17Z MEMBER   0 pydata/xarray/pulls/414
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/414/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
83266863 MDExOlB1bGxSZXF1ZXN0MzY1OTQ3NTE= 413 More doc updates for 0.5 shoyer 1217238 closed 0   0.5 987654 0 2015-06-01T02:09:49Z 2015-06-01T02:12:29Z 2015-06-01T02:12:27Z MEMBER   0 pydata/xarray/pulls/413
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/413/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
83029667 MDExOlB1bGxSZXF1ZXN0MzY1Nzg0MDk= 412 Doc updates shoyer 1217238 closed 0   0.5 987654 0 2015-05-31T07:41:07Z 2015-05-31T23:45:06Z 2015-05-31T23:45:04Z MEMBER   0 pydata/xarray/pulls/412
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/412/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
75454929 MDExOlB1bGxSZXF1ZXN0MzUxOTIwNzQ= 410 ENH: Add .sel() method to Dataset and DataArray shoyer 1217238 closed 0   0.5 987654 0 2015-05-12T04:17:36Z 2015-05-14T02:27:45Z 2015-05-14T02:27:43Z MEMBER   0 pydata/xarray/pulls/410

sel() now supports the method parameter, which works like the parameter of the same name on reindex(). It provides a simple interface for doing nearest-neighbor interpolation:

```
In [12]: ds.sel(x=1.1, method='nearest')
Out[12]:
<xray.Dataset>
Dimensions:  ()
Coordinates:
    x        int64 1
Data variables:
    y        int64 2

In [13]: ds.sel(x=[1.1, 2.1], method='pad')
Out[13]:
<xray.Dataset>
Dimensions:  (x: 2)
Coordinates:
  * x        (x) int64 1 2
Data variables:
    y        (x) int64 2 3
```

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/410/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
74529471 MDExOlB1bGxSZXF1ZXN0MzUwNTUyNTE= 409 Add display_width option shoyer 1217238 closed 0   0.5 987654 0 2015-05-08T23:27:55Z 2015-05-12T04:19:28Z 2015-05-12T04:17:22Z MEMBER   0 pydata/xarray/pulls/409

Example usage:

```
In [12]: ds = xray.Dataset({'x': np.arange(1000)})

In [13]: with xray.set_options(display_width=40):
   ....:     print(ds)
   ....:
<xray.Dataset>
Dimensions:  (x: 1000)
Coordinates:
  * x        (x) int64 0 1 2 3 4 5 6 ...
Data variables:
    empty

In [14]: with xray.set_options(display_width=60):
   ....:     print(ds)
   ....:
<xray.Dataset>
Dimensions:  (x: 1000)
Coordinates:
  * x        (x) int64 0 1 2 3 4 5 6 7 8 9 10 11 12 13 ...
Data variables:
    empty
```

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/409/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
72946694 MDExOlB1bGxSZXF1ZXN0MzQ2MjAwMTQ= 408 Improved docs for dask integration shoyer 1217238 closed 0   0.5 987654 0 2015-05-04T07:40:49Z 2015-05-04T08:12:59Z 2015-05-04T08:12:57Z MEMBER   0 pydata/xarray/pulls/408
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/408/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
72227526 MDExOlB1bGxSZXF1ZXN0MzQ0ODIyMTU= 407 Support reading and writing milliseconds/microseconds shoyer 1217238 closed 0   0.5 987654 0 2015-04-30T17:28:26Z 2015-05-01T20:33:53Z 2015-05-01T20:33:10Z MEMBER   0 pydata/xarray/pulls/407

Fixes #406.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/407/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
71994342 MDExOlB1bGxSZXF1ZXN0MzQ0MDc4MDE= 405 Add robust retry logic when accessing remote datasets shoyer 1217238 closed 0   0.5 987654 3 2015-04-29T21:25:47Z 2015-05-01T20:33:46Z 2015-05-01T20:33:45Z MEMBER   0 pydata/xarray/pulls/405

Accessing data from remote datasets now has retry logic (with exponential backoff) that should make it robust to occasional bad responses from DAP servers.
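
A minimal sketch of the retry-with-exponential-backoff pattern described above; this is illustrative only, not the actual xray implementation, and the helper name and defaults are assumptions:

```
import time

def getitem_with_retries(array, key, catch=Exception,
                         max_retries=5, initial_delay=0.5):
    """Index `array`, retrying transient failures with exponential backoff."""
    delay = initial_delay
    for attempt in range(max_retries + 1):
        try:
            return array[key]
        except catch:
            if attempt == max_retries:
                raise  # give up after the final retry
            time.sleep(delay)
            delay *= 2  # wait twice as long before the next attempt
```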

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/405/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
72145600 MDU6SXNzdWU3MjE0NTYwMA== 406 millisecond and microseconds support jsignell 4806877 closed 0   0.5 987654 5 2015-04-30T12:38:27Z 2015-05-01T20:33:10Z 2015-05-01T20:33:10Z CONTRIBUTOR      

netcdf4-python supports milliseconds and microseconds:

https://github.com/Unidata/netcdf4-python/commit/22d439d6d3602171dc2c23bca0ade31d3c49ad20

Would it be possible to support this in xray as well?
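
A minimal sketch of what this support looks like in use, written against the modern xarray API rather than the 2015 fix; the file name and epoch are illustrative:

```
# Round-trip sub-second timestamps through netCDF by encoding the time
# coordinate in milliseconds.
import numpy as np
import pandas as pd
import xarray as xr

times = pd.date_range('2015-04-30', periods=4, freq='250ms')
ds = xr.Dataset({'y': ('time', np.arange(4))}, coords={'time': times})

ds.to_netcdf('subsecond.nc',
             encoding={'time': {'units': 'milliseconds since 2015-04-30'}})

roundtrip = xr.open_dataset('subsecond.nc')
print(roundtrip.time.values)  # sub-second precision survives the round trip
```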

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/406/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
71747677 MDExOlB1bGxSZXF1ZXN0MzQzMjU5MDE= 403 Fix indexing remote datasets with pydap shoyer 1217238 closed 0   0.5 987654 0 2015-04-29T00:50:32Z 2015-04-29T00:55:17Z 2015-04-29T00:55:16Z MEMBER   0 pydata/xarray/pulls/403
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/403/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
70295909 MDExOlB1bGxSZXF1ZXN0MzM5MjE1OTc= 400 H5nc cleanup shoyer 1217238 closed 0   0.5 987654 0 2015-04-23T03:33:44Z 2015-04-23T03:41:19Z 2015-04-23T03:41:15Z MEMBER   0 pydata/xarray/pulls/400

Fixes #369

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/400/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
70015421 MDExOlB1bGxSZXF1ZXN0MzM4MjQzMDc= 399 Dataset.to_array and DataArray.to_dataset methods shoyer 1217238 closed 0   0.5 987654 0 2015-04-22T03:59:56Z 2015-04-22T04:34:56Z 2015-04-22T04:34:54Z MEMBER   0 pydata/xarray/pulls/399

These methods make it easy to switch back and forth between data arrays and datasets:

```
In [4]: ds = xray.Dataset({'a': 1, 'b': ('x', [1, 2, 3])},
   ...:                   coords={'c': 42}, attrs={'Conventions': 'None'})
   ...:

In [5]: ds.to_array()
Out[5]:
<xray.DataArray (variables: 2, x: 3)>
array([[1, 1, 1],
       [1, 2, 3]])
Coordinates:
    c          int64 42
  * x          (x) int64 0 1 2
  * variables  (variables) |S1 'a' 'b'
Attributes:
    Conventions: None

In [6]: ds.to_array().to_dataset(dim='variables')
Out[6]:
<xray.Dataset>
Dimensions:  (x: 3)
Coordinates:
    c        int64 42
  * x        (x) int64 0 1 2
Data variables:
    a        (x) int64 1 1 1
    b        (x) int64 1 2 3
Attributes:
    Conventions: None
```

Fixes #132

CC @IamJeffG

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/399/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
69972951 MDExOlB1bGxSZXF1ZXN0MzM4MTIxMzQ= 398 Rename .load_data() to the more succinct .load() shoyer 1217238 closed 0   0.5 987654 0 2015-04-21T23:01:46Z 2015-04-22T00:46:09Z 2015-04-22T00:46:08Z MEMBER   0 pydata/xarray/pulls/398

Also rename .chunk_data() -> .chunk() (but nobody is using that, yet).

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/398/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
69767364 MDExOlB1bGxSZXF1ZXN0MzM3MzUyOTY= 397 Simplify load_data now that dask bugs have been fixed shoyer 1217238 closed 0   0.5 987654 0 2015-04-21T07:30:12Z 2015-04-21T07:35:49Z 2015-04-21T07:35:47Z MEMBER   0 pydata/xarray/pulls/397
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/397/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
69763046 MDExOlB1bGxSZXF1ZXN0MzM3MzQ0NTQ= 396 Add nbytes property shoyer 1217238 closed 0   0.5 987654 0 2015-04-21T07:14:00Z 2015-04-21T07:20:25Z 2015-04-21T07:20:23Z MEMBER   0 pydata/xarray/pulls/396
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/396/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
69714060 MDExOlB1bGxSZXF1ZXN0MzM3MjA0NTg= 395 Update xray to use updated dask.array and h5netcdf on pypi shoyer 1217238 closed 0   0.5 987654 0 2015-04-21T00:54:35Z 2015-04-21T01:07:03Z 2015-04-21T01:07:02Z MEMBER   0 pydata/xarray/pulls/395

This involves a big internal rename: block -> chunk

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/395/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
58310637 MDU6SXNzdWU1ODMxMDYzNw== 328 Support out-of-core computation using dask shoyer 1217238 closed 0   0.5 987654 7 2015-02-20T05:02:22Z 2015-04-17T21:03:12Z 2015-04-17T21:03:12Z MEMBER      

Dask is a library for out-of-core computation, somewhat similar to biggus in conception but with slightly grander aspirations. For examples of how Dask could be applied to weather data, see this blog post by @mrocklin: http://matthewrocklin.com/blog/work/2015/02/13/Towards-OOC-Slicing-and-Stacking/

It would be interesting to explore using dask internally in xray, so that we can implement lazy/out-of-core aggregations, concat and groupby to complement the existing lazy indexing. This functionality would be quite useful for xray, and even more so than merely supporting datasets-on-disk (#199).

A related issue is #79: we can easily imagine using Dask with groupby/apply to power out-of-core and multi-threaded computation.

Todos for xray:
- [x] refactor Variable.concat to make use of functions like concatenate and stack instead of in-place array modification (Dask arrays do not support mutation, for good reasons)
- [x] refactor reindex_variables to not make direct use of mutation (e.g., by using da.insert below)
- [x] add some sort of internal abstraction to represent "computable" arrays that are not necessarily numpy.ndarray objects (done: this is the data attribute)
- [x] expose reblock in the public API
- [x] load datasets into dask arrays from disk
- [x] load datasets from multiple files into dask
- [x] ~~some sort of API for user-controlled lazy apply on dask arrays (using groupby, most likely)~~ (not necessary for initial release)
- [x] save from dask arrays
- [x] an API for lazy ufuncs like sin and sqrt
- [x] robustly handle indexing along orthogonal dimensions if dask can't handle it directly

Todos for dask (to be clear, none of these are blockers for a proof of concept):
- [x] support for NaN-skipping aggregations
- [x] ~~support for interleaved concatenation (necessary for transformations by group, which are quite common)~~ (turns out to be a one-liner with concatenate and take, see below)
- [x] ~~support for something like take_nd from pandas: like np.take, but with -1 as a sentinel value for "missing" (necessary for many alignment operations)~~ da.insert, modeled after np.insert, would solve this problem
- [x] ~~support "orthogonal" MATLAB-like array-based indexing along multiple dimensions~~ (taking along one axis at a time is close enough)
- [x] broadcast_to: see https://github.com/numpy/numpy/pull/5371
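
A minimal sketch of the workflow this milestone enabled, written with the modern xarray import name (the package was imported as xray at the time); the file name and chunk sizes are illustrative:

```
import xarray as xr

# Open a dataset lazily as dask arrays by requesting chunks.
ds = xr.open_dataset('large_file.nc', chunks={'time': 100})

# Aggregations, concat and groupby now build a dask task graph
# instead of loading everything into memory.
monthly = ds.groupby('time.month').mean()

# Nothing is computed until .load() (or .compute()) is called.
monthly = monthly.load()
```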

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/328/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
67246147 MDExOlB1bGxSZXF1ZXN0MzI5MTM4MjA= 384 Fixes for dataset formatting shoyer 1217238 closed 0   0.5 987654 0 2015-04-08T23:53:40Z 2015-04-09T02:21:03Z 2015-04-09T02:21:00Z MEMBER   0 pydata/xarray/pulls/384

The previous tests were actually not being run because I named the test method incorrectly :(

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/384/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
64206762 MDExOlB1bGxSZXF1ZXN0MzE5MDAxOTE= 381 WIP: support dask.array in xray objects shoyer 1217238 closed 0   0.5 987654 1 2015-03-25T08:00:50Z 2015-04-08T03:44:08Z 2015-04-08T03:44:08Z MEMBER   0 pydata/xarray/pulls/381

xref #328

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/381/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
58288666 MDU6SXNzdWU1ODI4ODY2Ng== 326 DataArray.groupby.apply with a generic ndarray function IamJeffG 2002703 closed 0   0.5 987654 1 2015-02-19T23:37:34Z 2015-02-20T04:41:08Z 2015-02-20T04:41:08Z CONTRIBUTOR      

Need to apply a transformation function across one dimension of a DataArray, where that non-xray function speaks in ndarrays. Currently the only ways to do this involve wrapping the function. An example:

```
import numpy as np
import xray
from scipy.ndimage.morphology import binary_opening

da = xray.DataArray(np.random.random_integers(0, 1, (10, 10, 3)),
                    dims=['row', 'col', 'time'])

# I want to apply an operation to the 2D image at each point in time
da.groupby('time').apply(binary_opening)
# AttributeError: 'numpy.ndarray' object has no attribute 'dims'

def wrap_binary_opening(da, **kwargs):
    return xray.DataArray(binary_opening(da.values, **kwargs), da.coords)

da.groupby('time').apply(wrap_binary_opening)
da.groupby('time').apply(wrap_binary_opening, iterations=2)  # func may take custom args
```

My proposed solution is that apply would automatically coerce func's return value to a DataArray.
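
A minimal sketch of that coercion, written against the modern xarray API; apply_with_coercion is a hypothetical helper, not the fix that eventually landed:

```
import numpy as np
import xarray as xr

def apply_with_coercion(grouped, func, **kwargs):
    """Apply func per group, wrapping bare ndarray results back into DataArrays."""
    def wrapper(da):
        result = func(da.values, **kwargs)
        if isinstance(result, np.ndarray):
            # Coerce plain ndarrays back into labelled arrays, as proposed above.
            result = xr.DataArray(result, coords=da.coords, dims=da.dims)
        return result
    return grouped.map(wrapper)

# e.g. apply_with_coercion(da.groupby('time'), binary_opening, iterations=2)
```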

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/326/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue

CREATE TABLE [issues] (
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [number] INTEGER,
   [title] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [state] TEXT,
   [locked] INTEGER,
   [assignee] INTEGER REFERENCES [users]([id]),
   [milestone] INTEGER REFERENCES [milestones]([id]),
   [comments] INTEGER,
   [created_at] TEXT,
   [updated_at] TEXT,
   [closed_at] TEXT,
   [author_association] TEXT,
   [active_lock_reason] TEXT,
   [draft] INTEGER,
   [pull_request] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [state_reason] TEXT,
   [repo] INTEGER REFERENCES [repos]([id]),
   [type] TEXT
);
CREATE INDEX [idx_issues_repo]
    ON [issues] ([repo]);
CREATE INDEX [idx_issues_milestone]
    ON [issues] ([milestone]);
CREATE INDEX [idx_issues_assignee]
    ON [issues] ([assignee]);
CREATE INDEX [idx_issues_user]
    ON [issues] ([user]);