html_url,issue_url,id,node_id,user,created_at,updated_at,author_association,body,reactions,performed_via_github_app,issue
https://github.com/pydata/xarray/issues/2281#issuecomment-497468930,https://api.github.com/repos/pydata/xarray/issues/2281,497468930,MDEyOklzc3VlQ29tbWVudDQ5NzQ2ODkzMA==,6213168,2019-05-30T20:12:29Z,2019-05-30T20:12:29Z,MEMBER,"@fspaolo where does that huge number come from? I thought you said you have 1500 nodes in total. Did you select a single point on the t dimension before you applied bisplrep?
Also (pardon the ignorance, I've never dealt with geographical data), what kind of information does having bidimensional lat and lon convey? Does it imply ``lat[i, j] < lat[i+1, j] and lon[i, j] < lon[i, j+1]`` for any possible (i, j)? ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,340486433
https://github.com/pydata/xarray/issues/2281#issuecomment-497254984,https://api.github.com/repos/pydata/xarray/issues/2281,497254984,MDEyOklzc3VlQ29tbWVudDQ5NzI1NDk4NA==,6213168,2019-05-30T08:45:16Z,2019-05-30T08:50:13Z,MEMBER,"I did not test it, but this looks like what you want:
```
import xarray
from scipy.interpolate import bisplrep, bisplev

# unwrap the source and target grids into flat numpy arrays
x = cube1.x.values.ravel()
y = cube1.y.values.ravel()
z = cube1.values.ravel()
x_new = cube2.x.values.ravel()
y_new = cube2.y.values.ravel()

# fit a bivariate spline to the scattered source points, evaluate it on the
# target grid, then wrap the result back into a DataArray
tck = bisplrep(x, y, z)
z_new = bisplev(x_new, y_new, tck)
z_new = z_new.reshape(cube2.shape)
cube3 = xarray.DataArray(z_new, dims=cube2.dims, coords=cube2.coords)
```
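Anticipating the [EDIT] note below, here is a rough, untested sketch of the per-time-slice variant. It assumes cube1 carries the t dimension, cube2 has dims (x, y, t) with 1-d x and y coords, and x, y, x_new, y_new are built exactly as above:
```
import numpy as np

out = []
for t in cube1.t.values:
    # fit and evaluate one bivariate spline per time step
    z = cube1.sel(t=t).values.ravel()
    tck = bisplrep(x, y, z)
    out.append(bisplev(x_new, y_new, tck))

# stack along the last axis, assuming t is the last dimension of cube2
z_new = np.stack(out, axis=-1)
cube3 = xarray.DataArray(z_new, dims=cube2.dims, coords=cube2.coords)
```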
I read above that you have concerns about performance, since the above does not understand the geometry of the input data - have you run performance tests on it already?
[EDIT] you will probably need to break your problem down into 1-point slices along the t dimension before you apply the above.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,340486433
https://github.com/pydata/xarray/issues/2281#issuecomment-497251626,https://api.github.com/repos/pydata/xarray/issues/2281,497251626,MDEyOklzc3VlQ29tbWVudDQ5NzI1MTYyNg==,6213168,2019-05-30T08:33:16Z,2019-05-30T08:33:51Z,MEMBER,"@fspaolo sorry, I should have taken more time re-reading the initial post. No, xarray_extras.interpolate does not do the kind of interpolation you want. Have you looked into scipy?
https://docs.scipy.org/doc/scipy/reference/interpolate.html#multivariate-interpolation
xarray is just a wrapper, and if scipy does what you need, it's trivial to unwrap your DataArray into a bunch of numpy arrays, feed them into scipy, and then re-wrap the output numpy arrays into a DataArray.
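For illustration only (this may well not be the right routine for your data), here is a minimal sketch of that unwrap/re-wrap pattern using scipy.interpolate.griddata; the names ``da`` (source DataArray with 2-d lat/lon coords) and ``da_target`` (target grid) are placeholders:
```
import numpy as np
import xarray
from scipy.interpolate import griddata

# unwrap: pull plain numpy arrays out of the DataArrays
points = np.column_stack([da.lon.values.ravel(), da.lat.values.ravel()])
values = da.values.ravel()
xi = np.column_stack([da_target.lon.values.ravel(), da_target.lat.values.ravel()])

# feed them into scipy
out = griddata(points, values, xi, method='linear')

# re-wrap the output into a new DataArray on the target grid
result = xarray.DataArray(out.reshape(da_target.shape),
                          dims=da_target.dims, coords=da_target.coords)
```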
On the other hand, if scipy does *not* do what you want, then I suspect the scipy tracker would be a much better place for a feature request than the xarray board. As a rule of thumb, any fancy algorithm should first exist for numpy-only data, and only then can it potentially be wrapped by the xarray library.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,340486433
https://github.com/pydata/xarray/issues/2281#issuecomment-497130177,https://api.github.com/repos/pydata/xarray/issues/2281,497130177,MDEyOklzc3VlQ29tbWVudDQ5NzEzMDE3Nw==,6213168,2019-05-29T22:22:01Z,2019-05-29T22:25:45Z,MEMBER,"@fspaolo 2d mesh interpolation and 1d interpolation with extra ""free"" dimensions are fundamentally different algorithms. Look up the scipy documentation on the various interpolation functions available.
I don't understand what you are trying to pass for x_new and y_new, and it definitely doesn't sound right. Right now you have a 3d DataArray with dimensions (x, y, t) and 3 coords, *each of which is a 1d numpy array* (e.g. ``da.coords.x.values``). If you want to rescale, you need to pass a 1d numpy array or array-like for x_new, and another separate 1d array for y_new. You are not doing that: the error message you're receiving says that your x_new is a numpy array with 2 or more dimensions, which the algorithm doesn't know what to do with. It can accept multi-dimensional DataArrays with *brand new* dimensions, but that does not sound like your case. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,340486433
https://github.com/pydata/xarray/issues/2281#issuecomment-495871201,https://api.github.com/repos/pydata/xarray/issues/2281,495871201,MDEyOklzc3VlQ29tbWVudDQ5NTg3MTIwMQ==,6213168,2019-05-25T06:55:33Z,2019-05-25T06:59:03Z,MEMBER,"@fspaolo I never tried using my algorithm to perform 2D interpolation, but this should work:
```
from xarray_extras.interpolate import splrep, splev

# interpolate along each dimension in turn (cubic splines by default)
da = splev(x_new, splrep(da, 'x'))
da = splev(y_new, splrep(da, 'y'))
da = splev(t_new, splrep(da, 't'))
```
Add k=1 to downgrade from cubic to linear interpolation and get a speed boost.
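For example, assuming ``k`` is forwarded by splrep the same way scipy's splrep accepts it:
```
da = splev(x_new, splrep(da, 'x', k=1))  # linear instead of cubic
```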
You can play around with dask to increase performance by using all your CPUs (or more with dask distributed), although you have to remember that a dimension can't be broken into multiple chunks when you apply splrep to it:
```
from xarray_extras.interpolate import splrep, splev
# chunk along t, so the x and y interpolations can run in parallel across time
da = da.chunk(t=TCHUNK)
da = splev(x_new, splrep(da, 'x'))
da = splev(y_new, splrep(da, 'y'))
# rechunk along x and y, and merge the t chunks before interpolating along t
da = da.chunk(x=SCHUNK, y=SCHUNK).chunk(t=-1)
da = splev(t_new, splrep(da, 't'))
da = da.compute()
```
where TCHUNK and SCHUNK are integers you'll have to play with. The rule of thumb is that you want your chunks to be 5~100 MB each.
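(For instance, with float64 data - 8 bytes per element - a chunk of roughly 1 to 12 million elements lands in that range.)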
If you find that chunking along an interpolation dimension is important for you, it _is_ possible to implement with dask ghosting techniques - just painfully complicated.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,340486433
https://github.com/pydata/xarray/issues/2281#issuecomment-495515463,https://api.github.com/repos/pydata/xarray/issues/2281,495515463,MDEyOklzc3VlQ29tbWVudDQ5NTUxNTQ2Mw==,6213168,2019-05-24T08:10:10Z,2019-05-24T08:10:10Z,MEMBER,"I am not aware of an N-D mesh interpolation algorithm. However, my package xarray_extras [1] offers highly optimized 1D interpolation on an N-D hypercube, on any numerical coord (not just time). You may try applying it 3 times, once per dimension in sequence, and see if you get what you want - although performance won't be optimal.
[1] https://xarray-extras.readthedocs.io/en/latest/
Alternatively, if you do find the exact algorithm you want but it only works on numpy arrays, then applying it to xarray is simple - just get DataArray.values -> apply the function -> create a new DataArray from the output.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,340486433