issue_comments: 387848141


Comment by user 10050469 (MEMBER) on pydata/xarray#2104
https://github.com/pydata/xarray/pull/2104#issuecomment-387848141
Created: 2018-05-09T19:24:27Z

@fujiisoup I gave your branch a go this evening, trying a few things on real data cases. First of all: this is great! Really looking forward to seeing this in xarray.

I noticed a small issue: currently this doesn't work with decreasing coordinates:

```python
import numpy as np
import xarray as xr

nx = 5
ny = 4
lon = np.arange(nx)
lat = np.arange(ny)[::-1]  # decreasing coordinate
data = np.arange(nx * ny).reshape((ny, nx))
da = xr.DataArray(data, dims=('lat', 'lon'),
                  coords={'lon': lon, 'lat': lat})
da.interp(lon=[1.1, 2.2, 3.3], lat=[2.3, 1.2])
```

This raises a `ValueError` from SciPy: "The points in dimension 0 must be strictly ascending". Traceback:

```
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-87-98b53090c21d> in <module>()
----> 1 da.interp(lon=[1.1, 2.2, 3.3], lat=[2.3, 1.2])

~/tmp/xarray-fuji/xarray/core/dataarray.py in interp(self, method, kwargs, **coords)
    912         """
    913         ds = self._to_temp_dataset().interp(
--> 914             method=method, kwargs=kwargs, **coords)
    915         return self._from_temp_dataset(ds)
    916

~/tmp/xarray-fuji/xarray/core/dataset.py in interp(self, method, kwargs, **coords)
   1824             if name not in [k for k, v in indexers_list]:
   1825                 variables[name] = missing.interp(
-> 1826                     var, var_indexers, method, **kwargs)
   1827
   1828         coord_names = set(variables).intersection(self._coord_names)

~/tmp/xarray-fuji/xarray/core/missing.py in interp(obj, indexes_coords, method, **kwargs)
    441         kwargs={'x': x, 'new_x': destination, 'method': method,
    442                 'kwargs': kwargs},
--> 443         keep_attrs=True)
    444
    445     if all(x1.dims == new_x1.dims for x1, new_x1 in zip(x, new_x)):

~/tmp/xarray-fuji/xarray/core/computation.py in apply_ufunc(func, *args, **kwargs)
    934                                   keep_attrs=keep_attrs)
    935     elif any(isinstance(a, Variable) for a in args):
--> 936         return variables_ufunc(*args)
    937     else:
    938         return apply_array_ufunc(func, *args, dask=dask)

~/tmp/xarray-fuji/xarray/core/computation.py in apply_variable_ufunc(func, *args, **kwargs)
    563         raise ValueError('unknown setting for dask array handling in '
    564                          'apply_ufunc: {}'.format(dask))
--> 565     result_data = func(*input_data)
    566
    567     if signature.num_outputs > 1:

~/tmp/xarray-fuji/xarray/core/missing.py in interp_func(obj, x, new_x, method, kwargs)
    499
    500     func, kwargs = _get_interpolator_nd(method, **kwargs)
--> 501     return _interpnd(obj, x, new_x, func, kwargs)
    502
    503

~/tmp/xarray-fuji/xarray/core/missing.py in _interpnd(obj, x, new_x, func, kwargs)
    518     # stack new_x to 1 vector, with reshape
    519     xi = np.stack([x1.values.ravel() for x1 in new_x], axis=-1)
--> 520     rslt = func(x, obj, xi, **kwargs)
    521     # move back the interpolation axes to the last position
    522     rslt = rslt.transpose(range(-rslt.ndim + 1, 1))

~/.pyvirtualenvs/py3/lib/python3.5/site-packages/scipy/interpolate/interpolate.py in interpn(points, values, xi, method, bounds_error, fill_value)
   2589         if not np.all(np.diff(p) > 0.):
   2590             raise ValueError("The points in dimension %d must be strictly "
-> 2591                              "ascending" % i)
   2592         if not np.asarray(p).ndim == 1:
   2593             raise ValueError("The points in dimension %d must be "

ValueError: The points in dimension 0 must be strictly ascending
```
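Until decreasing coordinates are handled internally, one workaround is to flip the decreasing coordinate (and the corresponding data axis) to ascending order before interpolating. A minimal NumPy-only sketch of the same pitfall: `np.interp` also assumes an ascending `xp`, but silently returns unreliable values instead of raising:

```python
import numpy as np

lat = np.arange(4)[::-1].astype(float)   # [3., 2., 1., 0.] -- decreasing
temp = np.array([10., 12., 14., 16.])    # one value per latitude

# np.interp assumes xp is ascending; with a decreasing xp the result
# is unreliable rather than an error:
unreliable = np.interp(2.5, lat, temp)

# Flipping both arrays restores the ascending order the routine expects:
correct = np.interp(2.5, lat[::-1], temp[::-1])
print(correct)  # 11.0 (halfway between the values at lat=2 and lat=3)
```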

Decreasing coordinates are very common in climate data (latitudes are often stored in decreasing order, don't ask me why), so I went back to see how I handled this in salem, which brings me to my second point: I think we also need an interp() equivalent to isel(), which is the one salem (and, maybe, several other georeferencing tools) could make use of.

The way salem works is that it uses the concept of a geospatial grid, where the points to interpolate are defined in a local coordinate system which is always positive and increasing. That is, if I want to use xarray's interpolation routines, I'd like to be able to tell xarray to interpolate at "indices [1.1, 2.1, 3.1]", for example, where these are understood as positional indexes in the same way da.isel() sees them.

I don't know if we really need an i_interp for this, though, or whether it could be solved with a keyword argument to the existing API. The implementation should be straightforward: you'd just have to pass np.arange(NN) to SciPy instead of the actual coordinates.
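Interpolating at positional indices can be sketched in a few lines; `interp_at_indices` below is a hypothetical helper (not part of xarray or salem) that linearly interpolates against an implicit `np.arange(len(values))` axis, so it is unaffected by whether the real coordinate ascends or descends:

```python
import numpy as np

def interp_at_indices(values, idx):
    # Hypothetical helper: linear interpolation at fractional positional
    # indices, i.e. against an implicit np.arange(len(values)) coordinate.
    idx = np.asarray(idx, dtype=float)
    lo = np.clip(np.floor(idx).astype(int), 0, len(values) - 2)
    frac = idx - lo
    return (1 - frac) * values[lo] + frac * values[lo + 1]

data = np.array([10., 20., 30., 40., 50.])
print(interp_at_indices(data, [1.1, 2.1, 3.1]))  # approximately [21., 31., 41.]
```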

Thoughts?
