html_url,issue_url,id,node_id,user,created_at,updated_at,author_association,body,reactions,performed_via_github_app,issue
https://github.com/pydata/xarray/issues/3350#issuecomment-560115162,https://api.github.com/repos/pydata/xarray/issues/3350,560115162,MDEyOklzc3VlQ29tbWVudDU2MDExNTE2Mg==,2448579,2019-12-01T14:33:08Z,2019-12-01T14:33:08Z,MEMBER,"> The size zero dimension is a give-away that the problem has something to do with dask's _meta propagation.

I think the size 0 results from `chunk()`. With `chunk(2)`, other weird errors come up:

```
TypeError: tuple indices must be integers or slices, not tuple
```

We were specifying a name for the chunked array in `Dataset.chunk`, but this name was independent of the chunk sizes, i.e. `ds.chunk()` and `ds.chunk(2)` ended up with the same name, which (I think) confuses dask. #3584 fixes this by providing `chunks` as an input to `tokenize`. I also needed to add `__dask_tokenize__` to `ReprObject` so that names were reproducible after going through a `to_temp_dataset` transformation.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,499477368
https://github.com/pydata/xarray/issues/3350#issuecomment-535990462,https://api.github.com/repos/pydata/xarray/issues/3350,535990462,MDEyOklzc3VlQ29tbWVudDUzNTk5MDQ2Mg==,1217238,2019-09-27T15:35:55Z,2019-09-27T15:35:55Z,MEMBER,"Interestingly, it looks like the difference comes down to whether we chunk DataArrays or Datasets. The former produces graphs with fixed (reproducible) keys; the latter doesn't:

```
In [57]: dict(ds.chunk().x.data.dask)
Out[57]: {('xarray-x-a46bb46a12a44073da484c1311d00dec', 0): array([0., 0., 0., 0., 0., 0., 0., 0., 0., 0.])}

In [58]: dict(ds.chunk().x.data.dask)
Out[58]: {('xarray-x-a46bb46a12a44073da484c1311d00dec', 0): array([0., 0., 0., 0., 0., 0., 0., 0., 0., 0.])}

In [59]: dict(ds.x.chunk().data.dask)
Out[59]: {('xarray--d75d5cc0f0ce1b56590d80702339c0f0', 0): array([0., 0., 0., 0., 0., 0., 0., 0., 0., 0.])}

In [60]: dict(ds.x.chunk().data.dask)
Out[60]: {('xarray--0f78e51941cfb0e25d41ac24ef330a50', 0): array([0., 0., 0., 0., 0., 0., 0., 0., 0., 0.])}
```

But clearly this should work either way. The size zero dimension is a give-away that the problem has something to do with dask's `_meta` propagation.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,499477368
https://github.com/pydata/xarray/issues/3350#issuecomment-535987790,https://api.github.com/repos/pydata/xarray/issues/3350,535987790,MDEyOklzc3VlQ29tbWVudDUzNTk4Nzc5MA==,1217238,2019-09-27T15:28:41Z,2019-09-27T15:28:41Z,MEMBER,"Here's a slightly simpler case:

```
In [28]: ds = xr.Dataset({'x': (('y',), np.zeros(10))})

In [29]: (ds.chunk().isnull() & ds.chunk(5).isnull()).compute()
ValueError: operands could not be broadcast together with shapes (0,) (5,)
```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,499477368
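
For context on the #3584 fix described in the first comment, here is a minimal, illustrative sketch of the tokenization idea, assuming only the public `dask.base.tokenize` API. It is not the actual xarray patch, and the `ReprLike` class is a hypothetical stand-in for xarray's `ReprObject`.

```python
import numpy as np
from dask.base import tokenize

data = np.zeros(10)

# Tokenizing the data alone cannot distinguish different chunkings,
# so a name built only from the data collides for chunks=10 and chunks=2 ...
assert tokenize(data) == tokenize(data)

# ... but feeding the chunk structure into tokenize (the idea behind #3584)
# gives names that are distinct per chunking yet reproducible across calls:
assert tokenize(data, ((10,),)) != tokenize(data, ((2, 2, 2, 2, 2),))
assert tokenize(data, ((2, 2, 2, 2, 2),)) == tokenize(data, ((2, 2, 2, 2, 2),))


class ReprLike:
    """Hypothetical stand-in for xarray's ReprObject: an object identified by
    a fixed string, made deterministically hashable for dask."""

    def __init__(self, value: str):
        self._value = value

    def __dask_tokenize__(self):
        # dask.base.tokenize uses this return value instead of falling back
        # to its generic object handling, which may not be deterministic.
        return (type(self).__name__, self._value)


# Two equal-looking instances now produce identical tokens across runs:
assert tokenize(ReprLike("<this-array>")) == tokenize(ReprLike("<this-array>"))
```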