html_url,issue_url,id,node_id,user,created_at,updated_at,author_association,body,reactions,performed_via_github_app,issue
https://github.com/pydata/xarray/issues/2389#issuecomment-417076999,https://api.github.com/repos/pydata/xarray/issues/2389,417076999,MDEyOklzc3VlQ29tbWVudDQxNzA3Njk5OQ==,306380,2018-08-29T19:32:17Z,2018-08-29T19:32:17Z,MEMBER,"I wouldn't expect this to sway things too much, but yes, there is a chance that that would happen.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,355264812
https://github.com/pydata/xarray/issues/2389#issuecomment-417072024,https://api.github.com/repos/pydata/xarray/issues/2389,417072024,MDEyOklzc3VlQ29tbWVudDQxNzA3MjAyNA==,306380,2018-08-29T19:15:10Z,2018-08-29T19:15:10Z,MEMBER,"> It would be nice if dask had a way to consolidate the serialization of these objects, rather than separately serializing them in each task.
You can make it a separate task (often done by wrapping with dask.delayed) and then use that key within other objects. This does create a data dependency, though, which can make the graph somewhat more complex.
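For concreteness, here is a minimal sketch of that pattern (the object and function names are illustrative, not from this issue):

```python
import numpy as np
import dask

big = np.random.random(1_000_000)   # some large object that many tasks need (illustrative)

# Wrap it once: it becomes a single task/key in the graph instead of
# being pickled separately into every task that uses it.
big_d = dask.delayed(big)

@dask.delayed
def use(table, i):
    return table[i % len(table)]

# Each call references big_d by key, creating a data dependency
results = [use(big_d, i) for i in range(10)]
values = dask.compute(*results)
```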
In normal use of Pickle, these things are cached and reused. Unfortunately we can't do that here, because we're sending the tasks to different machines, each of which needs to deserialize independently.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,355264812