html_url,issue_url,id,node_id,user,created_at,updated_at,author_association,body,reactions,performed_via_github_app,issue
https://github.com/pydata/xarray/issues/2912#issuecomment-832864415,https://api.github.com/repos/pydata/xarray/issues/2912,832864415,MDEyOklzc3VlQ29tbWVudDgzMjg2NDQxNQ==,34693887,2021-05-05T17:12:19Z,2021-05-05T17:12:19Z,NONE,"I had a similar issue. I am trying to save a large xarray dataset (~2 GB) using `to_netcdf()`.

Dataset:
![image](https://user-images.githubusercontent.com/34693887/117181133-c3152600-ad89-11eb-81be-0d5c2e80a368.png)

I tried the following three approaches:
1. Save directly with `dset.to_netcdf()`
2. Load before saving with `dset.load().to_netcdf()`
3. Chunk the data and save with `dset.chunk({'time': 19968}).to_netcdf()`

All three approaches fail to write the file and cause the Python kernel to hang indefinitely or die. Any suggestions?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,435535284
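Of the three calls listed in the comment, the chunked variant is the one most likely to avoid exhausting memory. Below is a minimal sketch of that approach, extended with `compute=False` so `to_netcdf()` returns a dask delayed object and the write runs under dask's `ProgressBar`. The dataset shape, variable names, chunk size, and output path are illustrative assumptions, not taken from the issue.

```python
import numpy as np
import xarray as xr
from dask.diagnostics import ProgressBar

# Hypothetical stand-in for the ~2 GB dataset in the comment: one long
# time dimension plus a smaller secondary dimension.
ds = xr.Dataset(
    {"var": (("time", "x"), np.random.rand(200_000, 100))},
    coords={"time": np.arange(200_000), "x": np.arange(100)},
)

# Chunk along time so the write can stream block by block instead of
# materializing the whole array in memory at once.
ds = ds.chunk({"time": 20_000})

# compute=False returns a dask delayed object; the actual write only runs
# inside the ProgressBar context, so progress (or a stall) is visible.
delayed = ds.to_netcdf("out.nc", compute=False)
with ProgressBar():
    delayed.compute()
```

If the progress bar stalls rather than advancing, the hang is occurring while computing and writing the chunks rather than while setting up the file, which at least narrows down where to look.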