html_url,issue_url,id,node_id,user,created_at,updated_at,author_association,body,reactions,performed_via_github_app,issue
https://github.com/pydata/xarray/issues/6920#issuecomment-1216913006,https://api.github.com/repos/pydata/xarray/issues/6920,1216913006,IC_kwDOAMm_X85IiJ5u,13301940,2022-08-16T17:05:24Z,2022-08-16T17:05:24Z,MEMBER,"Great... keep us posted once you have a working solution. I'm going to convert this issue into a discussion instead.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,1340474484
https://github.com/pydata/xarray/issues/6920#issuecomment-1216820021,https://api.github.com/repos/pydata/xarray/issues/6920,1216820021,IC_kwDOAMm_X85IhzM1,13301940,2022-08-16T15:46:44Z,2022-08-16T15:46:44Z,MEMBER,"@lassiterdc, writing a large, chunked xarray dataset to a single netCDF file is always a challenge and quite slow, since the write is serial. However, you could take advantage of the [`xr.save_mfdataset()`](https://docs.xarray.dev/en/stable/generated/xarray.save_mfdataset.html) function to write to multiple netCDF files in parallel. Here's a good example that showcases how to achieve this: https://ncar.github.io/esds/posts/2020/writing-multiple-netcdf-files-in-parallel-with-xarray-and-dask","{""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,1340474484
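The approach recommended in the second comment boils down to splitting the dataset along one dimension and passing matching lists of datasets and paths to `xr.save_mfdataset()`. Below is a minimal sketch of that pattern, not the exact code from the linked ESDS post; the tutorial `air_temperature` dataset, the chunk size, the yearly split, and the output file names are assumptions chosen for illustration.

```python
# Minimal sketch of the xr.save_mfdataset() pattern (example data and paths are assumed).
import xarray as xr

# A dask-backed dataset; the tutorial "air_temperature" data stands in for a real large dataset.
ds = xr.tutorial.open_dataset("air_temperature", chunks={"time": 100})

# Split along time into one dataset per year; each group is written to its own file.
years, datasets = zip(*ds.groupby("time.year"))
paths = [f"air_temperature_{year}.nc" for year in years]

# compute=False returns a dask Delayed object, so all files are written together
# when compute() is called (e.g. on a dask cluster) instead of one after another.
delayed = xr.save_mfdataset(datasets, paths, compute=False)
delayed.compute()
```

With a distributed scheduler attached, the final `compute()` call lets dask schedule the per-file writes concurrently, which is the parallelism the serial single-file `to_netcdf()` write lacks.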