issue_comments: 457799138

html_url: https://github.com/pydata/xarray/issues/2710#issuecomment-457799138
issue_url: https://api.github.com/repos/pydata/xarray/issues/2710
id: 457799138
node_id: MDEyOklzc3VlQ29tbWVudDQ1Nzc5OTEzOA==
user: 1217238
created_at: 2019-01-26T03:55:39Z
updated_at: 2019-01-26T03:55:39Z
author_association: MEMBER

> broadcast the data across 1 or more new dimensions

Yes, this feels in scope for `expand_dims()`. But I think there are two separate features here:

1. Support inserting/broadcasting dimensions with size > 1.
2. Specify the size of the new dimension implicitly, by providing coordinate labels.

I think we would want both to be supported -- you should not be required to supply coordinate labels in order to expand to a dimension of size > 1. We can imagine the first being spelled like `da.expand_dims({'a': 3})` or `da.expand_dims(a=3)`.
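Not part of the original comment, but the broadcasting semantics of the proposed spelling can be sketched with plain NumPy (the array shape stands in for the `da` in the discussion):

```python
import numpy as np

# a toy stand-in for da's data: one existing dimension "b" of size 3
data = np.ones(3)

# inserting a new leading dimension "a" of size 3 broadcasts the values,
# analogous to the proposed da.expand_dims({'a': 3})
expanded = np.broadcast_to(data, (3, 3))
print(expanded.shape)
```

`broadcast_to` returns a read-only view, so no data is copied; the new dimension is purely a broadcasting stride, which is also how `expand_dims` behaves for size-1 insertions.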

> expand an existing dimension to include 1 or more new coordinates

This feels a little different from `expand_dims` to me. Here the fundamental operation is alignment/reindexing, not broadcasting across a new dimension. The result also looks different, because you get all the NaN values.

I would probably write this with `reindex`, e.g.,

```
In [12]: da.reindex(b=list(da.b.values) + [5, 6])
Out[12]:
<xarray.DataArray (b: 7, c: 3)>
array([[ 1.,  1.,  1.],
       [ 1.,  1.,  1.],
       [ 1.,  1.,  1.],
       [ 1.,  1.,  1.],
       [ 1.,  1.,  1.],
       [nan, nan, nan],
       [nan, nan, nan]])
Coordinates:
  * b        (b) int64 0 1 2 3 4 5 6
  * c        (c) int64 0 1 2
```
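The same fill-with-NaN alignment can be reproduced with pandas, which xarray's `reindex` mirrors (this analogue is not from the original comment; the DataFrame below is a hypothetical stand-in for `da` with rows as `b` and columns as `c`):

```python
import numpy as np
import pandas as pd

# stand-in for da: index plays the role of b (size 5), columns of c (size 3)
df = pd.DataFrame(np.ones((5, 3)))

# reindexing to labels that include 5 and 6 introduces rows of NaN,
# just as da.reindex(b=...) does for new coordinate values
out = df.reindex(list(df.index) + [5, 6])
print(out.shape)
```

The new rows are all-NaN because reindexing aligns existing labels and fills missing ones, rather than broadcasting existing data.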
