issue_comments: 902516397


- html_url: https://github.com/pydata/xarray/issues/5715#issuecomment-902516397
- issue_url: https://api.github.com/repos/pydata/xarray/issues/5715
- id: 902516397
- node_id: IC_kwDOAMm_X841y06t
- user: 9466648
- created_at: 2021-08-20T08:09:46Z
- updated_at: 2021-08-20T08:10:50Z
- author_association: CONTRIBUTOR

The code responsible for the error originally comes from the call to `da_a = da_a.map_blocks(_get_valid_values, args=[da_b])`, whose aim is to remove NaN values from both DataArrays. I am confused by this, given that the code lines below seem to accomplish something similar (despite the comment saying it should not):

```python
# 4. Compute covariance along the given dim
# N.B. `skipna=False` is required or there is a bug when computing
# auto-covariance. E.g. Try xr.cov(da, da) for
# da = xr.DataArray([[1, 2], [1, np.nan]], dims=["x", "time"])
cov = (demeaned_da_a * demeaned_da_b).sum(dim=dim, skipna=True, min_count=1) / (
    valid_count
)
```
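For context, the idea behind such a valid-values filter can be sketched with plain NumPy (the function name `get_valid_values` here is a hypothetical stand-in, not xarray's actual internal implementation): keep only positions where *both* arrays are non-NaN, masking everything else.

```python
import numpy as np

def get_valid_values(a, b):
    # A position is valid only when BOTH arrays hold a real value there;
    # invalid positions are set to NaN in each array.
    valid = ~np.isnan(a) & ~np.isnan(b)
    return np.where(valid, a, np.nan), np.where(valid, b, np.nan)

a = np.array([1.0, 2.0, np.nan])
b = np.array([np.nan, 3.0, 4.0])
va, vb = get_valid_values(a, b)
# Only index 1 is valid in both inputs; indices 0 and 2 become NaN in both.
```

With the pairwise mask applied, a `skipna=True` reduction afterwards sees the same set of valid points in both operands, which is why the two code paths look redundant.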

In any case, the `parallel` module imports dask in a try/except block to ignore the import error, so it is no surprise that using dask later fails if it was not imported. I can see two possibilities:

- encapsulate all dask calls in a similar try/except block
- set a boolean in the first place and perform the tests only if dask was correctly imported
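The second possibility is a common pattern for optional dependencies; a minimal sketch (the `HAS_DASK` flag and `safe_sum` helper are illustrative names, not xarray's API):

```python
import numpy as np

# Guarded optional import: record success in a flag, checked before any dask use.
try:
    import dask.array as dsa
    HAS_DASK = True
except ImportError:
    HAS_DASK = False

def safe_sum(arr):
    # Only touch the dask API when the import succeeded;
    # otherwise fall back to the plain NumPy path.
    if HAS_DASK and isinstance(arr, dsa.Array):
        return arr.sum().compute()
    return arr.sum()

total = safe_sum(np.array([1.0, 2.0, 3.0]))
```

The flag approach keeps the import failure in one place, so later code can branch on `HAS_DASK` instead of repeating try/except blocks around every dask call.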

Now, I do not have the big picture here, so there are probably better solutions.
