issue_comments

7 comments where author_association = "MEMBER" and issue = 58310637 (Support out-of-core computation using dask, pydata/xarray#328), sorted by updated_at descending

Commenters: shoyer (4 comments), mrocklin (3 comments)
shoyer (MEMBER) · 2015-04-17T21:03:12Z · https://github.com/pydata/xarray/issues/328#issuecomment-94074862

Basic support for dask.array is merged on master.

Continued in https://github.com/xray/xray/issues/394

shoyer (MEMBER) · 2015-03-30T01:39:02Z · https://github.com/pydata/xarray/issues/328#issuecomment-87509188

@mrocklin It occurs to me now that a much simpler version of the functionality I'm looking for with `take_nd` would be a `dask.array.insert` modeled after `np.insert`, which we could combine with array indexing. For the purposes of xray, we would only need to support `insert` with a scalar value, e.g. `da.insert(x, [1, 5, 6], np.nan, axis=1)`.
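A minimal sketch of the proposed behaviour using `np.insert`, which the hypothetical `da.insert` would mirror for dask arrays (the array and index values here are illustrative):

```python
import numpy as np

# Insert a scalar fill value (NaN) before the given column indices.
# A dask.array.insert would mirror this np.insert signature.
x = np.arange(12, dtype=float).reshape(3, 4)

# (3, 4) -> (3, 6): each row gains a NaN before original columns 1 and 3,
# so row 0 becomes [0, nan, 1, 2, nan, 3].
result = np.insert(x, [1, 3], np.nan, axis=1)
```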

shoyer (MEMBER) · 2015-02-23T00:56:51Z · https://github.com/pydata/xarray/issues/328#issuecomment-75476521

Yes, `take_nd` is very similar to fancy indexing, but only non-negative indices are valid (-1 means insert NaN).
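The -1-as-NaN semantics can be sketched on top of plain fancy indexing; the helper name here is ours, not pandas':

```python
import numpy as np

# take_nd-style semantics: like np.take, but -1 marks positions
# to fill with NaN instead of indexing from the end.
def take_with_fill(arr, indexer, fill_value=np.nan):
    indexer = np.asarray(indexer)
    # Clip so the -1 sentinel doesn't index the last element...
    out = arr.astype(float).take(indexer.clip(min=0))
    # ...then overwrite the sentinel positions with the fill value.
    out[indexer == -1] = fill_value
    return out

result = take_with_fill(np.arange(5), [0, -1, 1, -1, 2])
# result is [0., nan, 1., nan, 2.]
```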

mrocklin (MEMBER) · 2015-02-23T00:42:39Z · https://github.com/pydata/xarray/issues/328#issuecomment-75475798

Am I right in thinking that this is almost equivalent to fancy indexing with a list of indices?

shoyer (MEMBER) · 2015-02-23T00:30:02Z (edited 2015-02-23T00:31:00Z) · https://github.com/pydata/xarray/issues/328#issuecomment-75475215

> support for interleaved concatenation (necessary for transformations by group, which are quite common)

Turns out what I was thinking of here can be written as a one-liner in terms of `concatenate` and `take`:

```python
def interleaved_concatenate(arrays, indices, axis=0):
    return np.take(np.concatenate(arrays, axis), np.concatenate(indices))
```

So I've crossed that one off the list.
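A usage sketch of the one-liner above; the data and index layout are illustrative. Here `indices`, once concatenated, gives for each output slot the position to take from the concatenated group results:

```python
import numpy as np

# Interleave per-group results back into a single array via
# concatenate + take (same one-liner as in the comment above).
def interleaved_concatenate(arrays, indices, axis=0):
    return np.take(np.concatenate(arrays, axis), np.concatenate(indices))

group0 = np.array([10, 30])  # e.g. transformed values from even slots
group1 = np.array([20, 40])  # e.g. transformed values from odd slots
out = interleaved_concatenate([group0, group1], [[0, 2], [1, 3]])
# out is [10, 20, 30, 40]
```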

> support super-imposing array values inter-leaved on top of a constant array of NaN (necessary for many alignment operations)

What I need here is something similar to the private `take_nd` function that pandas defines, which works like `np.take` but uses -1 as a sentinel value for "missing":

```
In [1]: import pandas

In [2]: import numpy as np

In [3]: x = np.arange(5)

In [4]: pandas.core.common.take_nd(x, [0, -1, 1, -1, 2])
Out[4]: array([  0.,  nan,   1.,  nan,   2.])
```

(In xray, I implement this a little differently so that I can take along multiple axes simultaneously using array indexing, but this version would suffice.)
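A hedged sketch of that multi-axis variant; the helper name and the clip-then-mask approach are ours, not xray's internals:

```python
import numpy as np

# Apply a -1-sentinel indexer along several axes at once: select with
# np.ix_ on clipped indices, then blank out the sentinel positions.
def take_nd_multi(arr, indexers, fill_value=np.nan):
    clipped = [np.asarray(ix).clip(min=0) for ix in indexers]
    out = arr.astype(float)[np.ix_(*clipped)]
    for axis, ix in enumerate(indexers):
        mask = np.asarray(ix) == -1
        if mask.any():
            # Broadcast the 1-D mask along this axis to cover out's shape.
            shape = [1] * out.ndim
            shape[axis] = len(ix)
            out[np.broadcast_to(mask.reshape(shape), out.shape)] = fill_value
    return out

a = np.arange(6.0).reshape(2, 3)
mixed = take_nd_multi(a, ([0, -1], [2, 0]))
# row 0 takes a[0] at columns [2, 0]; row 1 is all NaN
```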

mrocklin (MEMBER) · 2015-02-22T03:37:22Z · https://github.com/pydata/xarray/issues/328#issuecomment-75417769

> support super-imposing array values inter-leaved on top of a constant array of NaN (necessary for many alignment operations)

@shoyer can you clarify this one? Would the np.choose interface satisfy this?

```python
In [1]: import numpy as np

In [2]: a = np.arange(4).reshape(2, 2)

In [3]: a
Out[3]:
array([[0, 1],
       [2, 3]])

In [4]: x = np.array([[True, False], [True, True]])

In [5]: np.choose(x, [-10, a])
Out[5]:
array([[  0, -10],
       [  2,   3]])
```

mrocklin (MEMBER) · 2015-02-20T17:06:41Z · https://github.com/pydata/xarray/issues/328#issuecomment-75276367
> support for NaN skipping aggregations

Presumably we could drop in numbagg here. The reductions are generally pretty straightforward to extend. I can do this relatively soon. See https://github.com/ContinuumIO/dask/blob/master/dask/array/reductions.py#L43-L111

> support for interleaved concatenation (necessary for transformations by group, which are quite common)

Do we have this already? Or rather, can you point me to how you would do this with NumPy?

> support super-imposing array values inter-leaved on top of a constant array of NaN (necessary for many alignment operations)

Would this be solved by an elementwise ifelse operation? ifelse(condition, x, y)

> support "orthogonal" MATLAB-like array-based indexing along multiple dimensions

You can do this now by repeated slicing, x[[1, 2, 3], :][:, [4, 5, 6]], and get a fully efficient solution. I can roll this into the normal syntax though. I might pause for a bit as I think about the break that this causes with NumPy, but I'll probably go ahead anyway.
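The repeated-slicing trick can be checked against `np.ix_`, NumPy's built-in way to express the same orthogonal selection in one step (array values here are illustrative):

```python
import numpy as np

# Orthogonal indexing: select rows 1-3 and columns 4-6 independently.
x = np.arange(56).reshape(7, 8)
a = x[[1, 2, 3], :][:, [4, 5, 6]]    # two passes of fancy indexing
b = x[np.ix_([1, 2, 3], [4, 5, 6])]  # single orthogonal selection
assert np.array_equal(a, b)
```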


CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [issue] INTEGER REFERENCES [issues]([id])
);
CREATE INDEX [idx_issue_comments_issue]
    ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user]
    ON [issue_comments] ([user]);
Powered by Datasette · About: xarray-datasette