html_url,issue_url,id,node_id,user,created_at,updated_at,author_association,body,reactions,performed_via_github_app,issue
https://github.com/pydata/xarray/issues/1464#issuecomment-311114129,https://api.github.com/repos/pydata/xarray/issues/1464,311114129,MDEyOklzc3VlQ29tbWVudDMxMTExNDEyOQ==,306380,2017-06-26T16:39:24Z,2017-06-26T16:39:24Z,MEMBER,"Presumably there is some object in the task graph that we don't know how to serialize. This can be fixed either in XArray, by not putting such an object into the graph (recreating it each time or wrapping it instead), or in Dask, by learning how to (de)serialize it.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,238284894
https://github.com/pydata/xarray/issues/1464#issuecomment-310817771,https://api.github.com/repos/pydata/xarray/issues/1464,310817771,MDEyOklzc3VlQ29tbWVudDMxMDgxNzc3MQ==,306380,2017-06-24T06:17:52Z,2017-06-24T06:17:52Z,MEMBER,"It's failing to serialize *something* in the task graph, though I'm not sure what (I'm also surprised that the except clause didn't trigger and log the input). My first guess is that there is an open netCDF file object floating around within the task graph. If so, then we should endeavor to avoid doing this (or have some file object proxy that *is* (de)serializable).
As a short-term workaround you might try starting a local cluster within the same process.
```python
from dask.distributed import Client
client = Client(processes=False)
```
This *might* help you to avoid serialization issues. Generally we should resolve the issue regardless though.
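For illustration, a minimal sketch of how that workaround might look end to end (the file glob, variable name, and chunk sizes below are placeholders, not from the original report):

```python
import xarray as xr
from dask.distributed import Client

# A threads-only cluster runs in this same process, so the task graph is
# never pickled and open netCDF file objects never need to be serialized.
client = Client(processes=False)

# Hypothetical multi-file dataset; the glob and chunking are placeholders.
ds = xr.open_mfdataset('data/*.nc', chunks={'time': 100})
result = ds.mean('time').compute()
```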
cc'ing @rabernat, who seems to have the most experience here.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,238284894