html_url,issue_url,id,node_id,user,created_at,updated_at,author_association,body,reactions,performed_via_github_app,issue
https://github.com/pydata/xarray/issues/7721#issuecomment-1516494141,https://api.github.com/repos/pydata/xarray/issues/7721,1516494141,IC_kwDOAMm_X85aY909,98330,2023-04-20T15:04:17Z,2023-04-20T15:04:17Z,NONE,"> So really, my question is: how do we support python scalars for libraries that only implement `__array_namespace__`, given that stopping to do so would be a major breaking change?

I was considering this question for SciPy (xref [scipy#18286](https://github.com/scipy/scipy/issues/18286)) this week, and I think I'm happy with this strategy:

1. Cast all ""array-like"" inputs, such as Python scalars, lists/sequences, and generators, to `numpy.ndarray`.
2. Require ""same array type"" inputs and forbid mixing numpy-cupy, numpy-pytorch, cupy-pytorch, etc. - this will raise an exception.
3. As a result, cupy-pyscalar and pytorch-pyscalar will _also_ raise an exception.

The result is an API that's backwards-compatible for numpy and array-like usage, and much stricter when other array libraries are used. That strictness is a good thing to me, because:
- it's what CuPy, PyTorch & co themselves do, and it works well there
- it avoids the complexity of arbitrary mixing, which leads to questions like the one raised in this issue
- if you do need to use a scalar within a function inside your own library, just convert it explicitly to the desired array type with `xp.asarray(a_scalar)`, which gives you a 0-D array of the correct type (add `dtype=x.dtype` to make sure dtypes match, if that matters)","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,1655290694
https://github.com/pydata/xarray/issues/7721#issuecomment-1515600072,https://api.github.com/repos/pydata/xarray/issues/7721,1515600072,IC_kwDOAMm_X85aVjjI,5534781,2023-04-20T01:50:58Z,2023-04-20T01:50:58Z,NONE,"Thanks, Justus, for expanding on this. It sounds to me like the question is ""how do we cast dtypes when multiple array libraries are participating in the same computation?"", and I am not sure I am knowledgeable enough to comment. From the array API point of view, long ago [we decided](https://github.com/data-apis/array-api/issues/399) that this is UB (undefined behavior), meaning it's entirely up to each library to decide what to do: a library can raise, or come up with a special rule that it can make sense of. It sounds like Xarray has some machinery to deal with this situation, but you'd prefer not to keep special-casing a certain array library? Am I understanding that right?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,1655290694
https://github.com/pydata/xarray/issues/7721#issuecomment-1510016072,https://api.github.com/repos/pydata/xarray/issues/7721,1510016072,IC_kwDOAMm_X85aAQRI,5534781,2023-04-16T01:25:53Z,2023-04-16T01:25:53Z,NONE,"Sorry that I missed the ping, Jacob, but I'd need more context to make any suggestions/answers 😅 Is the question about why CuPy wouldn't return scalars?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,1655290694
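
The explicit-conversion advice from the first comment in this thread (`xp.asarray(a_scalar)`, optionally with `dtype=x.dtype`) can be sketched as follows. This is a minimal illustration, not code from any of the libraries discussed: `scale` is a hypothetical helper, and numpy stands in for a generic array-API namespace.

```python
import numpy as np

def scale(x, factor):
    # Hypothetical helper: instead of relying on array/scalar promotion
    # rules (which differ between array libraries), explicitly convert
    # the Python scalar to a 0-D array matching x's type and dtype.
    # numpy stands in here for a generic array-API namespace; with an
    # array-API-compliant array you would obtain it via
    # xp = x.__array_namespace__().
    xp = np
    factor = xp.asarray(factor, dtype=x.dtype)  # 0-D array, same dtype as x
    return x * factor

result = scale(np.array([1.0, 2.0, 3.0], dtype=np.float32), 2)
```

Without the `dtype=x.dtype` argument, `xp.asarray(2)` would produce an integer 0-D array and the result's dtype would be left to the library's promotion rules; pinning the dtype keeps the output dtype equal to the input's, which is the strict behavior the comment argues for.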