pull_requests: 340145984
| column | value |
|---|---|
| id | 340145984 |
| node_id | MDExOlB1bGxSZXF1ZXN0MzQwMTQ1OTg0 |
| number | 3515 |
| state | closed |
| locked | 0 |
| title | Recursive tokenization |
| user | 6213168 |
| body | After misreading the dask documentation <https://docs.dask.org/en/latest/custom-collections.html#deterministic-hashing>, I was under the impression that the output of ``__dask_tokenize__`` would be recursively parsed, as happens for ``__getstate__`` or ``__reduce__``. That's not the case: the output of ``__dask_tokenize__`` is just fed into a str() call, so it has to be made explicitly recursive. |
| created_at | 2019-11-12T22:35:13Z |
| updated_at | 2019-11-13T00:54:32Z |
| closed_at | 2019-11-13T00:53:27Z |
| merged_at | 2019-11-13T00:53:27Z |
| merge_commit_sha | e70138b61033081e3bfab3aaaec5997716cd7109 |
| assignee | 6213168 |
| milestone | |
| draft | 0 |
| head | 36ad4f7d4d2f238cccb20d48c83d604ad431c49d |
| base | b74f80ca2df4920f711f9fe5762458c53ce3c2c6 |
| author_association | MEMBER |
| auto_merge | |
| repo | 13221727 |
| url | https://github.com/pydata/xarray/pull/3515 |
| merged_by | |
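The PR body above explains the core issue: dask does not normalize the value returned by ``__dask_tokenize__`` recursively, so any nested objects must be passed through ``dask.base.normalize_token`` by hand. The sketch below illustrates that pattern with a hypothetical ``Wrapper`` class; it is not xarray's actual implementation, and the class and attribute names are assumptions made only for illustration.

```python
# Minimal sketch (hypothetical ``Wrapper`` class, not xarray's code) of an
# explicitly recursive ``__dask_tokenize__``, as described in the PR body.
import numpy as np
from dask.base import normalize_token, tokenize


class Wrapper:
    """Hypothetical container holding an array plus metadata."""

    def __init__(self, data, attrs):
        self.data = data    # e.g. a numpy array
        self.attrs = attrs  # e.g. a dict of metadata

    def __dask_tokenize__(self):
        # dask does not tokenize this return value recursively; it is just
        # str()-ed into the hash.  str() of a large numpy array is truncated
        # and str() of arbitrary objects may fall back to their id(), so the
        # nested contents are normalized explicitly instead.
        return normalize_token((type(self).__name__, self.data, self.attrs))


# Two wrappers with equal contents now produce the same deterministic token.
a = Wrapper(np.arange(3), {"units": "m"})
b = Wrapper(np.arange(3), {"units": "m"})
assert tokenize(a) == tokenize(b)
```

The PR applies this same idea to xarray's own classes; the sketch only demonstrates the mechanism of calling ``normalize_token`` on the nested components.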
Links from other tables
- 0 rows from pull_requests_id in labels_pull_requests