html_url,issue_url,id,node_id,user,created_at,updated_at,author_association,body,reactions,performed_via_github_app,issue https://github.com/pydata/xarray/issues/2862#issuecomment-478903202,https://api.github.com/repos/pydata/xarray/issues/2862,478903202,MDEyOklzc3VlQ29tbWVudDQ3ODkwMzIwMg==,24368151,2019-04-02T08:47:22Z,2019-04-02T08:47:22Z,NONE,"correct, that does the trick. thanks!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,427768540 https://github.com/pydata/xarray/issues/2862#issuecomment-478765910,https://api.github.com/repos/pydata/xarray/issues/2862,478765910,MDEyOklzc3VlQ29tbWVudDQ3ODc2NTkxMA==,1217238,2019-04-01T22:10:02Z,2019-04-01T22:10:02Z,MEMBER,"Something like this should definitely work: ``` f = xr.open_dataset('dataset.nc') n = f.compute() f.close() n.to_netcdf(path='dataset.nc') ``` Deep copying maintains dask arrays, so they are still linked to the original file on disk. If you close that file, then dask is definitely going to error when you attempt to use it. I agree that there is an opportunity for better error messages here, though.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,427768540