issue_comments: 368084600

html_url: https://github.com/pydata/xarray/issues/1938#issuecomment-368084600
issue_url: https://api.github.com/repos/pydata/xarray/issues/1938
id: 368084600
node_id: MDEyOklzc3VlQ29tbWVudDM2ODA4NDYwMA==
user: 1217238
created_at: 2018-02-23T17:44:27Z
updated_at: 2018-02-23T18:17:28Z
author_association: MEMBER

Dispatch for stack/concatenate is definitely on the radar for NumPy development, but I don't know when it's actually going to happen. The likely interface is something like __array_ufunc__: a special method like __array_concatenate__ is called on each element in the list, until one does not return NotImplemented. This is a different style of overloading than multipledispatch, one that is slightly simpler to implement but possibly slower and with fewer guarantees of correctness.
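
A rough sketch of how such a protocol could look. To be clear, __array_concatenate__ does not exist in NumPy; the call signature, the wrapper function and the toy class below are guesses based on the __array_ufunc__-style behavior described above:

```python
import numpy as np

def duck_concatenate(arrays, axis=0):
    # Ask each element for __array_concatenate__ until one claims the operation.
    for arr in arrays:
        method = getattr(type(arr), "__array_concatenate__", None)
        if method is None:
            continue
        result = method(arr, arrays, axis=axis)
        if result is not NotImplemented:
            return result
    # Nobody claimed it: fall back to plain NumPy.
    return np.concatenate([np.asarray(a) for a in arrays], axis=axis)

class LoggedArray:
    """Toy duck array that wraps an ndarray and handles concatenation itself."""
    def __init__(self, data):
        self.data = np.asarray(data)

    def __array_concatenate__(self, arrays, axis=0):
        if not all(isinstance(a, (LoggedArray, np.ndarray)) for a in arrays):
            return NotImplemented
        raw = [a.data if isinstance(a, LoggedArray) else a for a in arrays]
        return LoggedArray(np.concatenate(raw, axis=axis))

out = duck_concatenate([LoggedArray([1, 2]), np.array([3, 4])])
print(type(out))  # LoggedArray
```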

We only need this for a couple of operations, so in any case we can probably implement our own ad-hoc dispatch system for np.stack and np.concatenate, either along the lines of multipledispatch or of NumPy's __array_ufunc__.
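
For example, a minimal in-house dispatch layer could just inspect the inputs directly. This is only a sketch of the idea, not actual xarray internals; the helper name _is_dask_collection is made up here:

```python
import numpy as np

try:
    import dask.array as da
except ImportError:
    da = None

def _is_dask_collection(x):
    return da is not None and isinstance(x, da.Array)

def concatenate(arrays, axis=0):
    # Route to dask if any input is a dask array, otherwise use NumPy.
    arrays = list(arrays)
    if any(_is_dask_collection(a) for a in arrays):
        return da.concatenate(arrays, axis=axis)
    return np.concatenate(arrays, axis=axis)

def stack(arrays, axis=0):
    arrays = list(arrays)
    if any(_is_dask_collection(a) for a in arrays):
        return da.stack(arrays, axis=axis)
    return np.stack(arrays, axis=axis)
```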

On further contemplation, overloading based on union types with a system like multipledispatch does seem tricky. It's not clear to me that there's even a well-defined input type for concatenate that should be dispatched to dask vs. numpy, for example. We want dask to handle any case where at least one input is a dask array, but a type like List[Union[np.ndarray, da.Array]] actually matches a list of all numpy arrays, too -- unless we require an exact match for the type.
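
To make the ambiguity concrete (assuming dask is installed; the two helper functions below are made up purely for illustration):

```python
import numpy as np
import dask.array as da

def matches_union(arrays):
    # Naive containment check corresponding to List[Union[np.ndarray, da.Array]].
    return all(isinstance(a, (np.ndarray, da.Array)) for a in arrays)

def should_use_dask(arrays):
    # The rule we actually want: dask handles it if *any* input is a dask array.
    return any(isinstance(a, da.Array) for a in arrays)

all_numpy = [np.zeros(3), np.ones(3)]
mixed = [np.zeros(3), da.ones(3, chunks=3)]

print(matches_union(all_numpy), should_use_dask(all_numpy))  # True False
print(matches_union(mixed), should_use_dask(mixed))          # True True
```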
