# writing datasets derived from netCDF4 with compression fails

pydata/xarray issue #1458 · state: closed · 4 comments · created 2017-06-16 · closed 2019-08-01

When I read a file with netCDF4 compression into a Dataset, a subsequent call to write the dataset using `to_netcdf` fails. For instance, using output from the POP model, I can convert it to netCDF4 using NCO:

```
$ ncks --netcdf4 --deflate 1 $file nc4-test.nc
```

Then, in Python:

```python
ds = xr.open_dataset('nc4-test.nc', decode_times=False, decode_coords=False)
ds.to_netcdf('test-out.nc')
```

The write fails with:

```
File "netCDF4/_netCDF4.pyx", line 2263, in netCDF4._netCDF4.Dataset.createVariable (netCDF4/_netCDF4.c:18764)
File "netCDF4/_netCDF4.pyx", line 3235, in netCDF4._netCDF4.Variable.__init__ (netCDF4/_netCDF4.c:31564)
RuntimeError: NetCDF: Bad chunk sizes.
```

If I instead pass `format='NETCDF3_64BIT'` to `to_netcdf`, the write completes. This seems like a bug.

Example dataset: ftp://ftp.ucar.edu/pub/cgd/mclong/misc/nc4-test.nc