issues: 1954445639
| column | value |
|---|---|
| id | 1954445639 |
| node_id | I_kwDOAMm_X850fnlH |
| number | 8350 |
| title | optimize align for scalars at least |
| user | 2448579 |
| state | open |
| locked | 0 |
| assignee |  |
| milestone |  |
| comments | 5 |
| created_at | 2023-10-20T14:48:25Z |
| updated_at | 2023-10-20T19:17:39Z |
| closed_at |  |
| author_association | MEMBER |
| active_lock_reason |  |
| draft |  |
| pull_request |  |
| performed_via_github_app |  |
| state_reason |  |
| repo | 13221727 |
| type | issue |

body:

**What happened?**

Here's a simple rescaling calculation:

```python
import numpy as np
import xarray as xr

ds = xr.Dataset(
    {"a": (("x", "y"), np.ones((300, 400))), "b": (("x", "y"), np.ones((300, 400)))}
)
mean = ds.mean()  # scalar
std = ds.std()  # scalar
rescaled = (ds - mean) / std
```

The profile for the last line shows 30% (!!!) of the time spent in […].

This is a small example inspired by an ML pipeline where this normalization is happening very many times in a tight loop.

cc @benbovy

**What did you expect to happen?**

A fast path for when no reindexing needs to happen.

reactions:

{
  "url": "https://api.github.com/repos/pydata/xarray/issues/8350/reactions",
  "total_count": 1,
  "+1": 1,
  "-1": 0,
  "laugh": 0,
  "hooray": 0,
  "confused": 0,
  "heart": 0,
  "rocket": 0,
  "eyes": 0
}
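The profile backing the 30% figure is not reproduced in the record above. A minimal sketch of how one might re-measure it with the standard-library profiler (this snippet is an illustration, not part of the issue):

```python
# Re-run the issue's example under cProfile to see how much of the
# binary-op cost is attributable to alignment/reindexing machinery.
import cProfile

import numpy as np
import xarray as xr

ds = xr.Dataset(
    {"a": (("x", "y"), np.ones((300, 400))), "b": (("x", "y"), np.ones((300, 400)))}
)
mean = ds.mean()  # 0-d (scalar) Dataset
std = ds.std()    # 0-d (scalar) Dataset

# Sort by cumulative time so the alignment-related frames surface near the top.
cProfile.runctx("(ds - mean) / std", globals(), locals(), sort="cumulative")
```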
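As a possible interim workaround (not proposed in the issue itself), the per-variable statistics can be pulled out as plain Python floats, so each binary op is Dataset-versus-scalar and never enters the Dataset-versus-Dataset alignment path; a sketch:

```python
# Workaround sketch: extract 0-d statistics as Python scalars with .item()
# before the arithmetic, avoiding Dataset-vs-Dataset alignment entirely.
import numpy as np
import xarray as xr

ds = xr.Dataset(
    {"a": (("x", "y"), np.ones((300, 400))), "b": (("x", "y"), np.ones((300, 400)))}
)
mean = ds.mean()
std = ds.std()

rescaled = xr.Dataset(
    {name: (ds[name] - mean[name].item()) / std[name].item() for name in ds.data_vars}
)
```

This only works because `mean` and `std` are 0-d per variable; it trades xarray's general alignment semantics for speed in this specific shape of problem.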
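For the requested fast path itself, one sufficient condition is that none of the operands carries an index, so there is nothing to reindex against. The helper below is purely hypothetical (`align_if_needed` is not an xarray API, and a real fix would live inside xarray's alignment code); it only names that check:

```python
# Hypothetical illustration of the fast-path check the issue asks for:
# skip full alignment when no operand has an index to align on.
import xarray as xr


def align_if_needed(*objects, **kwargs):
    """Return the objects unchanged when no object has an index to align on."""
    if all(len(obj.xindexes) == 0 for obj in objects):
        return objects
    return xr.align(*objects, **kwargs)
```

In the example above neither `ds` nor the 0-d `mean`/`std` has an indexed coordinate, so a check like this would skip alignment entirely.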