issue_comments

7 rows where user = 3064397 sorted by updated_at descending

id html_url issue_url node_id user created_at updated_at ▲ author_association body reactions performed_via_github_app issue
368280791 https://github.com/pydata/xarray/issues/1938#issuecomment-368280791 https://api.github.com/repos/pydata/xarray/issues/1938 MDEyOklzc3VlQ29tbWVudDM2ODI4MDc5MQ== llllllllll 3064397 2018-02-25T03:47:41Z 2018-02-25T03:47:41Z NONE

@hameerabbasi This really doesn't work with *args due to how multiple dispatch itself works. What we have done in blaze is make top-level functions that accept *args and directly call the dispatched functions, passing the collected tuple.
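Roughly, that pattern might look like the following sketch (the names are illustrative, not blaze's actual functions), assuming multipledispatch is installed:

```python
from multipledispatch import dispatch


@dispatch(tuple)
def _stack_impl(arrays):
    # The dispatched implementation receives a single argument:
    # the tuple of collected arguments.
    print('dispatched on a tuple of %d elements' % len(arrays))


def stack(*arrays):
    # Multiple dispatch cannot see *args directly, so the public
    # function just forwards the collected tuple.
    return _stack_impl(arrays)
```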

  Hooks for XArray operations 299668148
368280749 https://github.com/pydata/xarray/issues/1938#issuecomment-368280749 https://api.github.com/repos/pydata/xarray/issues/1938 MDEyOklzc3VlQ29tbWVudDM2ODI4MDc0OQ== llllllllll 3064397 2018-02-25T03:46:19Z 2018-02-25T03:46:19Z NONE

Given the issues raised on that PR, as well as the profiling results shown here, I think that PR will need some serious work before it could be merged.

  Hooks for XArray operations 299668148
368111050 https://github.com/pydata/xarray/issues/1938#issuecomment-368111050 https://api.github.com/repos/pydata/xarray/issues/1938 MDEyOklzc3VlQ29tbWVudDM2ODExMTA1MA== llllllllll 3064397 2018-02-23T19:16:37Z 2018-02-23T19:16:37Z NONE

I wouldn't mind submitting this upstream, but I will defer to @mrocklin.

  Hooks for XArray operations 299668148
368106529 https://github.com/pydata/xarray/issues/1938#issuecomment-368106529 https://api.github.com/repos/pydata/xarray/issues/1938 MDEyOklzc3VlQ29tbWVudDM2ODEwNjUyOQ== llllllllll 3064397 2018-02-23T19:00:39Z 2018-02-23T19:00:39Z NONE

The wrapping dispatch would just look like:

```python
@dispatch(list)
def f(args):
    return f(VarArgs(args))
```

  Hooks for XArray operations 299668148
368105739 https://github.com/pydata/xarray/issues/1938#issuecomment-368105739 https://api.github.com/repos/pydata/xarray/issues/1938 MDEyOklzc3VlQ29tbWVudDM2ODEwNTczOQ== llllllllll 3064397 2018-02-23T18:57:59Z 2018-02-23T18:58:47Z NONE

We could make a particular list an instance of a particular TypedVarArgs; however, multiple dispatch uses the type() of the arguments, as well as issubclass, to do dispatching, and it depends on being able to partially order types to make dispatching more efficient. The problem is that type([1.0, 'foo']) returns just list, which erases all information about the elements. The constructor of VarArgs therefore scans the types of the elements and constructs an instance of a new (but memoized) subclass of VarArgs that encodes the element types, so that issubclass works as expected.
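As a minimal sketch (not blaze's actual code) of how a constructor can encode element types in a memoized subclass so that issubclass behaves the way the dispatcher needs:

```python
class VarArgsMeta(type):
    _cache = {}  # memoized parametrized subclasses

    def __getitem__(cls, types):
        # VarArgs[float, str] returns a memoized subclass that records
        # the element types it may hold.
        if not isinstance(types, tuple):
            types = (types,)
        key = frozenset(types)
        if key not in VarArgsMeta._cache:
            name = 'VarArgs[%s]' % ', '.join(sorted(t.__name__ for t in key))
            VarArgsMeta._cache[key] = VarArgsMeta(
                name, (cls,), {'element_types': key})
        return VarArgsMeta._cache[key]

    def __subclasscheck__(cls, other):
        # More specific parametrizations are subclasses of more general ones:
        # issubclass(VarArgs[float], VarArgs[float, str]) is True.
        if not isinstance(other, VarArgsMeta):
            return False
        if cls.element_types is None:  # bare VarArgs accepts any parametrization
            return True
        if other.element_types is None:
            return False
        return other.element_types <= cls.element_types


class VarArgs(metaclass=VarArgsMeta):
    element_types = None

    def __new__(cls, elements):
        # Scan the element types and return an instance of the memoized
        # subclass that encodes them, working around the fact that
        # type([1.0, 'foo']) is just list.
        elements = list(elements)
        subclass = VarArgs[tuple({type(e) for e in elements})]
        self = object.__new__(subclass)
        self.elements = elements
        return self


assert type(VarArgs([1.0, 'foo'])) is VarArgs[float, str]
assert issubclass(VarArgs[float], VarArgs[float, str])
assert not issubclass(VarArgs[float, str], VarArgs[float])
```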

  Hooks for XArray operations 299668148
368100305 https://github.com/pydata/xarray/issues/1938#issuecomment-368100305 https://api.github.com/repos/pydata/xarray/issues/1938 MDEyOklzc3VlQ29tbWVudDM2ODEwMDMwNQ== llllllllll 3064397 2018-02-23T18:39:50Z 2018-02-23T18:40:46Z NONE

VarArgs itself is actually a type, so you need to create instances which wrap the list argument, for example:

```python
In [1]: from blaze.compute.varargs import VarArgs

In [2]: from multipledispatch import dispatch

In [3]: @dispatch(VarArgs[float])
   ...: def f(args):
   ...:     print('floats')
   ...:

In [4]: @dispatch(VarArgs[str])
   ...: def f(args):
   ...:     print('strings')
   ...:

In [5]: @dispatch(VarArgs[str, float])
   ...: def f(args):
   ...:     print('mixed')
   ...:

In [6]: f(VarArgs(['foo']))
strings

In [7]: f(VarArgs([1.0]))
floats

In [8]: f(VarArgs([1.0, 'foo']))
mixed

In [9]: VarArgs([1.0, 'foo'])
Out[9]: VarArgs[float, str]
```

You could hide this behind a top-level function that wraps the input for the user, or register a dispatch for list that boxes the list and recursively calls itself.
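For the first option, a tiny sketch of such a wrapper, reusing the `f` defined in the session above (`f_api` is just an illustrative name):

```python
def f_api(*args):
    # Box the collected arguments so the dispatched f sees a VarArgs.
    return f(VarArgs(list(args)))


f_api(1.0, 'foo')  # prints 'mixed', same as f(VarArgs([1.0, 'foo']))
```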

  Hooks for XArray operations 299668148
368091406 https://github.com/pydata/xarray/issues/1938#issuecomment-368091406 https://api.github.com/repos/pydata/xarray/issues/1938 MDEyOklzc3VlQ29tbWVudDM2ODA5MTQwNg== llllllllll 3064397 2018-02-23T18:08:30Z 2018-02-23T18:08:30Z NONE

In blaze we have variadic sequences for multiple dispatch, and the List[Union] case is something we have run into. We have a type called VarArgs which takes a variadic sequence of type arguments and represents a sequence of unions over those arguments; for example, VarArgs[pd.Series, pd.DataFrame] is a sequence of unknown length which is known to hold either series or dataframes. With some mild metaprogramming we made it so that VarArgs[pd.Series] is a subclass of VarArgs[pd.Series, pd.DataFrame], or in general, more specific sequences are subclasses of more general sequences. This means that you can solve the ambiguity by registering a dispatch for both VarArgs[np.ndarray] and VarArgs[np.ndarray, da.Array], and you know that the second function can only be called if the sequence holds at least one dask array.
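A minimal sketch of that disambiguation, assuming blaze and multipledispatch are installed (`concat_arrays` is an illustrative name, not an actual blaze or xarray function):

```python
import dask.array as da
import numpy as np
from blaze.compute.varargs import VarArgs
from multipledispatch import dispatch


@dispatch(VarArgs[np.ndarray])
def concat_arrays(args):
    # Chosen only when every element is a plain in-memory ndarray.
    print('all numpy')


@dispatch(VarArgs[np.ndarray, da.Array])
def concat_arrays(args):
    # Chosen when at least one element is a dask array; otherwise the more
    # specific VarArgs[np.ndarray] signature above supersedes it.
    print('at least one dask array')


concat_arrays(VarArgs([np.ones(3), np.zeros(3)]))  # all numpy
concat_arrays(VarArgs([np.ones(3), da.ones(3)]))   # at least one dask array
```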

Here is an example of what that looks like for merge, which is concat(axis=1): https://github.com/blaze/blaze/blob/master/blaze/compute/pandas.py#L691 This is the definition of VarArgs: https://github.com/blaze/blaze/blob/master/blaze/compute/varargs.py

  Hooks for XArray operations 299668148

CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [issue] INTEGER REFERENCES [issues]([id])
);
CREATE INDEX [idx_issue_comments_issue]
    ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user]
    ON [issue_comments] ([user]);