html_url,issue_url,id,node_id,user,created_at,updated_at,author_association,body,reactions,performed_via_github_app,issue
https://github.com/pydata/xarray/issues/3252#issuecomment-582987147,https://api.github.com/repos/pydata/xarray/issues/3252,582987147,MDEyOklzc3VlQ29tbWVudDU4Mjk4NzE0Nw==,1217238,2020-02-06T16:26:11Z,2020-02-06T16:26:11Z,MEMBER,"I recently wrote a version of scipy.ndimage.map_coordinates for JAX in pure
NumPy that I think could be straightforwardly ported into xarray.
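
For context, a rough 1-D sketch of what order-1 (linear) `map_coordinates` looks like in pure NumPy; this is an illustrative approximation (with a hypothetical `map_coordinates_linear_1d` helper), not the actual JAX implementation:

```python
import numpy as np

def map_coordinates_linear_1d(data, coordinates):
    # Order-1 (linear) interpolation of a 1-D array at fractional index
    # positions, in the spirit of scipy.ndimage.map_coordinates.
    # Illustrative sketch only; assumes in-bounds coordinates.
    coordinates = np.asarray(coordinates, dtype=float)
    lower = np.clip(np.floor(coordinates).astype(int), 0, data.shape[0] - 1)
    upper = np.clip(lower + 1, 0, data.shape[0] - 1)
    weight = coordinates - np.floor(coordinates)
    return (1 - weight) * data[lower] + weight * data[upper]

data = np.array([0.0, 10.0, 20.0, 30.0])
print(map_coordinates_linear_1d(data, [0.5, 2.25]))  # [ 5.  22.5]
```
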
On Thu, Feb 6, 2020 at 7:00 AM David Huard wrote:
> Just got bit by this as well. Computing monthly quantile correction
> factors, so I have an array with dimensions (month, quantile, lon, lat). I
> then want to apply these correction factors to a time series (time, lon,
> lat), so I compute the month and quantile of my time series, and want to
> interp into the quantile correction factors. This doesn't work because
> both the factors and the time series have (lat, lon) dimensions.
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,484622545
https://github.com/pydata/xarray/issues/3252#issuecomment-524572529,https://api.github.com/repos/pydata/xarray/issues/3252,524572529,MDEyOklzc3VlQ29tbWVudDUyNDU3MjUyOQ==,1217238,2019-08-24T18:45:53Z,2019-08-24T18:45:53Z,MEMBER,We could probably use this as our own version for all linear and nearest-neighbor interpolation. Then we won't need scipy installed for that.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,484622545
https://github.com/pydata/xarray/issues/3252#issuecomment-524457928,https://api.github.com/repos/pydata/xarray/issues/3252,524457928,MDEyOklzc3VlQ29tbWVudDUyNDQ1NzkyOA==,1217238,2019-08-23T20:50:29Z,2019-08-23T20:50:29Z,MEMBER,"Linear interpolation for 1d -> nd is just a matter of averaging two
indexing selections. If we leverage xarray's vectorized indexing operations
to do the hard work, it should work automatically for dask arrays, sparse
arrays and xarray's internal backend array types, all with any number of
dimensions.
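
Roughly, the idea could look like the following sketch (a hypothetical `interp1d_linear` helper built on positional vectorized indexing; it assumes a monotonically increasing index and in-range target values, and is not the actual implementation):

```python
import numpy as np
import xarray as xr

def interp1d_linear(da, dim, new_coord):
    # Linear interpolation along one dimension as a weighted average of two
    # vectorized index selections (hypothetical helper, not xarray API).
    old = da[dim].values
    new = np.asarray(new_coord, dtype=float)
    upper = np.clip(np.searchsorted(old, new), 1, old.size - 1)
    lower = upper - 1
    w = xr.DataArray((new - old[lower]) / (old[upper] - old[lower]), dims=dim)
    # Two point-wise selections; drop the old labels so arithmetic does not
    # try to align the two differently-labelled results.
    lo = da.isel({dim: xr.DataArray(lower, dims=dim)}).drop_vars(dim)
    hi = da.isel({dim: xr.DataArray(upper, dims=dim)}).drop_vars(dim)
    return ((1 - w) * lo + w * hi).assign_coords({dim: (dim, new)})

da = xr.DataArray([0.0, 10.0, 20.0], dims='x', coords={'x': [0.0, 1.0, 2.0]})
print(interp1d_linear(da, 'x', [0.5, 1.25]).values)  # [ 5.  12.5]
```
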
On Fri, Aug 23, 2019 at 2:17 PM Noah D Brenowitz wrote:
> In my experience, computing w efficiently is the tricky part. The
> function is slightly different, but metpy uses a lot of tricks to make
> this work efficiently. A manual for-loop is
> much cleaner for this kind of stencil calculation IMO. What kind of duck
> arrays were you thinking of?
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,484622545
https://github.com/pydata/xarray/issues/3252#issuecomment-524439002,https://api.github.com/repos/pydata/xarray/issues/3252,524439002,MDEyOklzc3VlQ29tbWVudDUyNDQzOTAwMg==,1217238,2019-08-23T19:43:21Z,2019-08-23T19:43:21Z,MEMBER,"We could implement linear interpolation just as `w * array.sel(..., method='ffill') + (1 - w) * array.sel(..., method='bfill')`, where `w` is calculated from the indexes. This is probably about as efficient as scipy. Numba could be slightly faster (by avoiding a few memory allocations) but it will be hard to make it work as generally, especially on duck arrays.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,484622545