html_url,issue_url,id,node_id,user,created_at,updated_at,author_association,body,reactions,performed_via_github_app,issue
https://github.com/pydata/xarray/pull/3922#issuecomment-626181186,https://api.github.com/repos/pydata/xarray/issues/3922,626181186,MDEyOklzc3VlQ29tbWVudDYyNjE4MTE4Ng==,2448579,2020-05-09T14:04:23Z,2020-05-09T14:04:56Z,MEMBER,"The test fails for object arrays because we compute eagerly in `nanops._nan_argminmax_object` to raise an error for all-NaN slices.
To solve this, we could:
1. Fix #3884 so that `nanargmin` and `nanargmax` never raise an error for all-NaN slices.
2. Figure out some clever way to raise the error at compute time rather than at graph construction time.
For now, I bumped up `max_computes` to 1 for object arrays. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,591101988
https://github.com/pydata/xarray/pull/3922#issuecomment-615935096,https://api.github.com/repos/pydata/xarray/issues/3922,615935096,MDEyOklzc3VlQ29tbWVudDYxNTkzNTA5Ng==,2448579,2020-04-18T19:51:32Z,2020-04-18T19:55:01Z,MEMBER,"The compute error is from here:
https://github.com/pydata/xarray/blob/6a6f2c8748464c89a61dbdbc9636bce78a965369/xarray/core/nanops.py#L48-L60
I think we'll have to rethink the skipna conditions for dask arrays so that the compute doesn't happen.
Or figure out why we do this check in the first place. hmm...","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,591101988
https://github.com/pydata/xarray/pull/3922#issuecomment-615932590,https://api.github.com/repos/pydata/xarray/issues/3922,615932590,MDEyOklzc3VlQ29tbWVudDYxNTkzMjU5MA==,2448579,2020-04-18T19:32:24Z,2020-04-18T19:32:24Z,MEMBER,"Yeah, interestingly, we don't raise an error when trying to chunk IndexVariables.
I've pushed a commit where we extract the underlying numpy array, chunk that, index it, and then wrap it up in a DataArray o_O.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,591101988
https://github.com/pydata/xarray/pull/3922#issuecomment-614043968,https://api.github.com/repos/pydata/xarray/issues/3922,614043968,MDEyOklzc3VlQ29tbWVudDYxNDA0Mzk2OA==,2448579,2020-04-15T13:34:22Z,2020-04-15T13:34:22Z,MEMBER,Not a problem. Thanks for working on this!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,591101988
https://github.com/pydata/xarray/pull/3922#issuecomment-614030919,https://api.github.com/repos/pydata/xarray/issues/3922,614030919,MDEyOklzc3VlQ29tbWVudDYxNDAzMDkxOQ==,2448579,2020-04-15T13:11:00Z,2020-04-15T13:11:00Z,MEMBER,"@kmuehlbauer I've pushed a commit adding the decorator to just the 2D `test_idxmax`. The decorator does nothing for numpy, so all the numpy tests pass, but the dask tests now fail (because it is indeed computing things). Prior to this commit the 2D tests were passing, so we should go back to the `map_blocks` solution, I guess.
> because you can't index a NumPy array with a dask array?
I think @shoyer is right here.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,591101988
https://github.com/pydata/xarray/pull/3922#issuecomment-613555599,https://api.github.com/repos/pydata/xarray/issues/3922,613555599,MDEyOklzc3VlQ29tbWVudDYxMzU1NTU5OQ==,2448579,2020-04-14T16:49:26Z,2020-04-14T16:49:26Z,MEMBER,I suggest updating the tests before reverting anything. This solution may work ...,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,591101988
https://github.com/pydata/xarray/pull/3922#issuecomment-613546215,https://api.github.com/repos/pydata/xarray/issues/3922,613546215,MDEyOklzc3VlQ29tbWVudDYxMzU0NjIxNQ==,2448579,2020-04-14T16:31:24Z,2020-04-14T16:31:24Z,MEMBER,"If your tests are passing now, it's likely that they're computing eagerly to make things work. We should add the `with raise_if_dask_computes()` context (imported from `test_dask.py`) when testing with dask variables, but it looks non-trivial given how the tests are structured at the moment.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,591101988