html_url,issue_url,id,node_id,user,created_at,updated_at,author_association,body,reactions,performed_via_github_app,issue
https://github.com/pydata/xarray/pull/1128#issuecomment-265966887,https://api.github.com/repos/pydata/xarray/issues/1128,265966887,MDEyOklzc3VlQ29tbWVudDI2NTk2Njg4Nw==,743508,2016-12-09T09:08:48Z,2016-12-09T09:08:48Z,CONTRIBUTOR,"@shoyer thanks, with a little testing it seems `lock=False` is fine (so the dask dev version isn't automatically needed for `lock=dask.utils.SerializableLock()`). Using a spawning pool is necessary; it simply doesn't work without one. It also looks like the dask distributed IPython backend works fine (it is similar to the spawn pool in that the worker engines aren't forked but live in their own isolated processes) - this is really nice because IPython in turn has good support for HPC systems (SGE batch scheduling + MPI for process handling).","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,189817033
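For context, a minimal sketch of the working configuration described in the comment above. The file glob, variable name, and pool size are hypothetical, and it assumes the 2016-era dask API (`dask.set_options` rather than the later `dask.config`) and the `lock` keyword that `open_mfdataset` accepted at the time:

```python
# Sketch (hypothetical paths/names) of the setup reported to work:
# multiprocessing scheduler + lock=False + a spawn-based worker pool.
import multiprocessing

import dask
import dask.multiprocessing
import xarray as xr

if __name__ == '__main__':
    # Old-style dask API: select the multiprocessing scheduler globally.
    dask.set_options(get=dask.multiprocessing.get)

    # lock=False sidesteps the unpicklable default thread lock, so the
    # dev-only dask.utils.SerializableLock() isn't required.
    ds = xr.open_mfdataset('data/*.nc', lock=False)

    # Spawned (not forked) workers avoid inheriting unpicklable state.
    pool = multiprocessing.get_context('spawn').Pool(4)
    with dask.set_options(pool=pool):
        result = ds['temperature'].mean(dim='time').compute()
    pool.close()
```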
https://github.com/pydata/xarray/pull/1128#issuecomment-265875012,https://api.github.com/repos/pydata/xarray/issues/1128,265875012,MDEyOklzc3VlQ29tbWVudDI2NTg3NTAxMg==,743508,2016-12-08T22:28:25Z,2016-12-08T22:28:25Z,CONTRIBUTOR,"I'm trying out the latest code to subset a set of netCDF4 files with dask.multiprocessing, using `set_options(get=dask.multiprocessing.get)`, but I'm still getting `TypeError: can't pickle _thread.lock objects` - is this expected, or is there something specific I need to do to make it work?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,189817033
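For reference, a sketch of the failing setup this comment describes (file paths are hypothetical): with the default per-file lock, which is a plain `_thread.lock`, the multiprocessing scheduler fails when it tries to pickle tasks for its worker processes:

```python
# Hypothetical reproduction of the reported TypeError: with the default
# thread-based lock, tasks can't be pickled for the multiprocessing scheduler.
import dask
import dask.multiprocessing
import xarray as xr

dask.set_options(get=dask.multiprocessing.get)

ds = xr.open_mfdataset('data/*.nc')   # default lock is a threading.Lock
subset = ds.isel(time=slice(0, 10))
subset.compute()  # raises: TypeError: can't pickle _thread.lock objects
```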