
issues


3 rows where state = "closed" and user = 1005109 sorted by updated_at descending


Columns: id · node_id · number · title · user · state · locked · assignee · milestone · comments · created_at · updated_at ▲ · closed_at · author_association · active_lock_reason · draft · pull_request · body · reactions · performed_via_github_app · state_reason · repo · type
id: 1061430200 · node_id: PR_kwDOAMm_X84u6iop · number: 6019 · title: Use complex nan by default when interpolating out of bounds · user: pums974 (1005109) · state: closed · locked: 0 · comments: 0 · created_at: 2021-11-23T15:38:25Z · updated_at: 2021-11-28T04:40:06Z · closed_at: 2021-11-28T04:40:06Z · author_association: CONTRIBUTOR · draft: 0 · pull_request: pydata/xarray/pulls/6019
  • [X] Tests added
  • [X] Passes pre-commit run --all-files
  • [x] User visible changes (including notable bug fixes) are documented in whats-new.rst

When using `da.interp` to interpolate outside of the bounds, `fill_value` defaults to `np.nan`, which sets the out-of-bounds values to NaN. This is completely fine with real values, but with complex values it will in fact set the values to `np.nan + 0j`, which can be a source of confusion and bugs. This PR proposes to detect whether the values are complex and, if so, to use `np.nan + np.nan * 1j` as the default `fill_value`.
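The confusion the PR addresses can be seen by casting a real NaN to complex; a minimal sketch (not part of the PR itself):

```python
import numpy as np

c = np.complex128(np.nan)      # a real nan cast to complex
# the real part is nan, but the imaginary part silently becomes 0.0
print(c)                       # (nan+0j)
print(np.isnan(c.imag))        # False -- the imaginary part looks "valid"

# the default proposed by the PR keeps both parts nan
f = np.complex128(complex(np.nan, np.nan))
print(np.isnan(f.imag))        # True
```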

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/6019/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
repo: xarray (13221727) · type: pull
id: 638909879 · node_id: MDExOlB1bGxSZXF1ZXN0NDM0NTg2NDg5 · number: 4155 · title: Implement interp for interpolating between chunks of data (dask) · user: pums974 (1005109) · state: closed · locked: 0 · comments: 42 · created_at: 2020-06-15T14:42:32Z · updated_at: 2020-09-06T12:27:15Z · closed_at: 2020-08-11T23:15:49Z · author_association: CONTRIBUTOR · draft: 0 · pull_request: pydata/xarray/pulls/4155

In a project of mine I need to interpolate a dask-based xarray between chunks of data.

When using the current official `interp` function (xarray v0.15.1), the code:

```python
import numpy as np
import dask.array as da
import xarray as xr

datax = xr.DataArray(data=da.from_array(np.arange(0, 4), chunks=2),
                     coords={"x": np.linspace(0, 1, 4)}, dims="x")
datay = xr.DataArray(data=da.from_array(np.arange(0, 4), chunks=2),
                     coords={"y": np.linspace(0, 1, 4)}, dims="y")
data = datax * datay

# both of these interp calls fail
res = datax.interp(x=np.linspace(0, 1))
print(res.load())

res = data.interp(x=np.linspace(0, 1), y=0.5)
print(res.load())
```

fails with `NotImplementedError: Chunking along the dimension to be interpolated (0) is not yet supported.`, but succeeds with this version.

I also want to note that my version does not work with "advanced interpolation" (as shown in the xarray documentation). Also, my version cannot be used to make `interpolate_na` work with chunked data.

  • [x] Closes #4078
  • [x] Tests added
  • [x] Passes isort -rc . && black . && mypy . && flake8
  • [x] Fully documented, including whats-new.rst for all changes and api.rst for new API
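The core difficulty is that interpolating near a chunk boundary needs values from the neighbouring chunk. One way to make them available is dask's overlap (ghost cells), which the patch in #4078 below relies on; a minimal sketch, independent of xarray:

```python
import numpy as np
import dask.array as da

x = da.from_array(np.arange(8), chunks=4)          # two chunks of 4
# add one ghost cell from each neighbour at every internal chunk boundary
g = da.overlap.overlap(x, depth={0: 1}, boundary={0: "none"})
print(g.chunks)      # ((5, 5),): each chunk now carries its neighbour's edge value
print(g.compute())   # [0 1 2 3 4 3 4 5 6 7]
```

With the neighbour's edge value duplicated into each chunk, an interpolator applied per block can bridge the boundary without pulling the whole array into memory.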
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4155/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
repo: xarray (13221727) · type: pull
id: 621021621 · node_id: MDU6SXNzdWU2MjEwMjE2MjE= · number: 4078 · title: Feature request: Implement interp for interpolating between chunks of data (dask) · user: pums974 (1005109) · state: closed · locked: 0 · comments: 6 · created_at: 2020-05-19T14:26:10Z · updated_at: 2020-08-11T23:15:48Z · closed_at: 2020-08-11T23:15:48Z · author_association: CONTRIBUTOR

In a project of mine I need to interpolate a dask-based xarray between chunks of data.

I made it work using monkey patching. I'm pretty sure you can write it better, but I made it as good as I could.

I hope that what I wrote can help you implement it properly.

```python
from typing import Union, Tuple, Callable, Any, List

import dask.array as da
import numpy as np
import xarray as xr
import xarray.core.missing as m


def interp_func(var: Union[np.ndarray, da.Array],
                x: Tuple[xr.DataArray, ...],
                new_x: Tuple[xr.DataArray, ...],
                method: str, kwargs: Any) -> da.Array:
    """multi-dimensional interpolation for array-like. Interpolated axes should be
    located in the last position.

    Parameters
    ----------
    var: np.ndarray or dask.array.Array
        Array to be interpolated. The final dimension is interpolated.
    x: a list of 1d array.
        Original coordinates. Should not contain NaN.
    new_x: a list of 1d array
        New coordinates. Should not contain NaN.
    method: string
        {'linear', 'nearest', 'zero', 'slinear', 'quadratic', 'cubic'} for
        1-dimensional interpolation.
        {'linear', 'nearest'} for multidimensional interpolation
    **kwargs:
        Optional keyword arguments to be passed to scipy.interpolator

    Returns
    -------
    interpolated: array
        Interpolated array

    Note
    ----
    This requires scipy installed.

    See Also
    --------
    scipy.interpolate.interp1d
    """
    try:
        # try the official interp_func first
        res = official_interp_func(var, x, new_x, method, kwargs)
        return res
    except NotImplementedError:
        # may fail if interpolating between chunks
        pass

    if len(x) == 1:
        func, _kwargs = m._get_interpolator(method, vectorizeable_only=True,
                                            **kwargs)
    else:
        func, _kwargs = m._get_interpolator_nd(method, **kwargs)

    # reshape new_x (TODO REMOVE ?)
    current_dims = [_x.name for _x in x]
    new_x = tuple([_x.set_dims(current_dims) for _x in new_x])

    # number of non interpolated dimensions
    nconst = var.ndim - len(x)

    # duplicate the ghost cells of the array
    bnd = {i: "none" for i in range(len(var.shape))}
    depth = {i: 0 if i < nconst else 1 for i in range(len(var.shape))}
    var_with_ghost = da.overlap.overlap(var, depth=depth, boundary=bnd)

    # chunk x and duplicate the ghost cells of x
    x = tuple(da.from_array(_x, chunks=chunks)
              for _x, chunks in zip(x, var.chunks[nconst:]))
    x_with_ghost = tuple(da.overlap.overlap(_x, depth={0: 1}, boundary={0: "none"})
                         for _x in x)

    # compute final chunks
    chunks_end = [np.cumsum(sizes) - 1 for _x in x
                  for sizes in _x.chunks]
    chunks_end_with_ghost = [np.cumsum(sizes) - 1 for _x in x_with_ghost
                             for sizes in _x.chunks]
    total_chunks = []
    for dim, ce in enumerate(zip(chunks_end, chunks_end_with_ghost)):
        l_new_x_ends: List[np.ndarray] = []
        for iend, iend_with_ghost in zip(*ce):

            arr = np.moveaxis(new_x[dim].data, dim, -1)
            arr = arr[tuple([0] * (len(arr.shape) - 1))]

            n_no_ghost = (arr <= x[dim][iend]).sum()
            n_ghost = (arr <= x_with_ghost[dim][iend_with_ghost]).sum()

            equil = np.ceil(0.5 * (n_no_ghost + n_ghost)).astype(int)

            l_new_x_ends.append(equil)

        new_x_ends = np.array(l_new_x_ends)
        chunks = new_x_ends[0], *(new_x_ends[1:] - new_x_ends[:-1])
        total_chunks.append(tuple(chunks))
    final_chunks = var.chunks[:-len(x)] + tuple(total_chunks)

    # chunk new_x
    new_x = tuple(da.from_array(_x, chunks=total_chunks) for _x in new_x)

    # reshape x_with_ghost (TODO REMOVE ?)
    x_with_ghost = da.meshgrid(*x_with_ghost, indexing='ij')

    # compute on chunks (TODO use drop_axis and new_axis ?)
    res = da.map_blocks(_myinterpnd, var_with_ghost, func, _kwargs,
                        len(x_with_ghost), *x_with_ghost, *new_x,
                        dtype=var.dtype, chunks=final_chunks)

    # reshape res and remove empty chunks (TODO REMOVE ?)
    res = res.squeeze()
    new_chunks = tuple([tuple([chunk for chunk in chunks if chunk > 0])
                        for chunks in res.chunks])
    res = res.rechunk(new_chunks)
    return res


def _myinterpnd(var: da.Array, func: Callable[..., Any], kwargs: Any,
                nx: int, *arrs: da.Array) -> da.Array:
    _old_x, _new_x = arrs[:nx], arrs[nx:]

    # reshape x (TODO REMOVE ?)
    old_x = tuple([np.moveaxis(tmp, dim, -1)[tuple([0] * (len(tmp.shape) - 1))]
                   for dim, tmp in enumerate(_old_x)])

    new_x = tuple([xr.DataArray(_x) for _x in _new_x])

    return m._interpnd(var, old_x, new_x, func, kwargs)


official_interp_func = m.interp_func
m.interp_func = interp_func
```
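The patch works by replacing `m.interp_func` with a wrapper that first defers to the original function and only takes the custom path on `NotImplementedError`. The pattern in isolation, with hypothetical toy functions standing in for xarray's internals:

```python
def original(x):
    # stands in for xarray's official interp_func
    if x < 0:
        raise NotImplementedError("unsupported case")
    return 2 * x

def wrapper(x):
    try:
        return original(x)      # use the official path whenever it works
    except NotImplementedError:
        return 2 * abs(x)       # custom fallback, as in interp_func above

print(wrapper(3))    # 6  (official path)
print(wrapper(-3))   # 6  (fallback path)
```

Keeping a reference to the original (`official_interp_func` above) before reassigning the module attribute is what lets the wrapper delegate to it.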

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4078/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
state_reason: completed · repo: xarray (13221727) · type: issue


CREATE TABLE [issues] (
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [number] INTEGER,
   [title] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [state] TEXT,
   [locked] INTEGER,
   [assignee] INTEGER REFERENCES [users]([id]),
   [milestone] INTEGER REFERENCES [milestones]([id]),
   [comments] INTEGER,
   [created_at] TEXT,
   [updated_at] TEXT,
   [closed_at] TEXT,
   [author_association] TEXT,
   [active_lock_reason] TEXT,
   [draft] INTEGER,
   [pull_request] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [state_reason] TEXT,
   [repo] INTEGER REFERENCES [repos]([id]),
   [type] TEXT
);
CREATE INDEX [idx_issues_repo]
    ON [issues] ([repo]);
CREATE INDEX [idx_issues_milestone]
    ON [issues] ([milestone]);
CREATE INDEX [idx_issues_assignee]
    ON [issues] ([assignee]);
CREATE INDEX [idx_issues_user]
    ON [issues] ([user]);
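The row filter described at the top of the page maps to a simple query against this schema. A minimal sketch using Python's sqlite3 with a reduced version of the table (columns trimmed for brevity; the inserted rows mirror the three records above, plus one non-matching row):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE issues ("
    " [id] INTEGER PRIMARY KEY, [number] INTEGER, [state] TEXT,"
    " [user] INTEGER, [updated_at] TEXT)"
)
conn.executemany(
    "INSERT INTO issues VALUES (?, ?, ?, ?, ?)",
    [
        (1061430200, 6019, "closed", 1005109, "2021-11-28T04:40:06Z"),
        (638909879, 4155, "closed", 1005109, "2020-09-06T12:27:15Z"),
        (621021621, 4078, "closed", 1005109, "2020-08-11T23:15:48Z"),
        (999, 1, "open", 42, "2022-01-01T00:00:00Z"),  # filtered out
    ],
)
# "state = closed and user = 1005109 sorted by updated_at descending"
rows = conn.execute(
    "SELECT number FROM issues"
    " WHERE state = 'closed' AND [user] = 1005109"
    " ORDER BY updated_at DESC"
).fetchall()
print([n for (n,) in rows])   # [6019, 4155, 4078]
```

Sorting on the ISO 8601 `updated_at` strings works because their lexicographic order matches chronological order.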
Powered by Datasette · About: xarray-datasette