issue_comments: 437633544
html_url: https://github.com/pydata/xarray/issues/2554#issuecomment-437633544
issue_url: https://api.github.com/repos/pydata/xarray/issues/2554
id: 437633544
node_id: MDEyOklzc3VlQ29tbWVudDQzNzYzMzU0NA==
user: 40218891
created_at: 2018-11-11T00:38:03Z
updated_at: 2018-11-11T00:38:03Z
author_association: NONE
body:

Another puzzle; I don't know whether it is related to the crashes. Trying to localize the issue, I added a print line.
This prints:

```
======= hlcy (1, 85)
======= cdbp (1, 85)
======= hovi (1, 85)
======= itim (1024,)

RuntimeError                              Traceback (most recent call last)
<ipython-input-5-aeb92962e874> in <module>()
      1 ds0 = xr.open_dataset('/tmp/nam/bufr.701940/bufr.701940.2010123112.nc')
----> 2 ds0.to_netcdf('/tmp/d0.nc')

/usr/local/Python-3.6.5/lib/python3.6/site-packages/xarray/core/dataset.py in to_netcdf(self, path, mode, format, group, engine, encoding, unlimited_dims, compute)
   1220                          engine=engine, encoding=encoding,
   1221                          unlimited_dims=unlimited_dims,
-> 1222                          compute=compute)
   1223
   1224     def to_zarr(self, store=None, mode='w-', synchronizer=None, group=None,

/usr/local/Python-3.6.5/lib/python3.6/site-packages/xarray/backends/api.py in to_netcdf(dataset, path_or_file, mode, format, group, engine, encoding, unlimited_dims, compute, multifile)
    718     # to be parallelized with dask
    719     dump_to_store(dataset, store, writer, encoding=encoding,
--> 720                   unlimited_dims=unlimited_dims)
    721     if autoclose:
    722         store.close()

/usr/local/Python-3.6.5/lib/python3.6/site-packages/xarray/backends/api.py in dump_to_store(dataset, store, writer, encoder, encoding, unlimited_dims)
    761
    762     store.store(variables, attrs, check_encoding, writer,
--> 763                 unlimited_dims=unlimited_dims)
    764
    765

/usr/local/Python-3.6.5/lib/python3.6/site-packages/xarray/backends/common.py in store(self, variables, attributes, check_encoding_set, writer, unlimited_dims)
    264         self.set_dimensions(variables, unlimited_dims=unlimited_dims)
    265         self.set_variables(variables, check_encoding_set, writer,
--> 266                            unlimited_dims=unlimited_dims)
    267
    268     def set_attributes(self, attributes):

/usr/local/Python-3.6.5/lib/python3.6/site-packages/xarray/backends/common.py in set_variables(self, variables, check_encoding_set, writer, unlimited_dims)
    302             check = vn in check_encoding_set
    303             target, source = self.prepare_variable(
--> 304                 name, v, check, unlimited_dims=unlimited_dims)
    305
    306             writer.add(source, target)

/usr/local/Python-3.6.5/lib/python3.6/site-packages/xarray/backends/netCDF4_.py in prepare_variable(self, name, variable, check_encoding, unlimited_dims)
    466                 least_significant_digit=encoding.get(
    467                     'least_significant_digit'),
--> 468                 fill_value=fill_value)
    469             _disable_auto_decode_variable(nc4_var)
    470

netCDF4/_netCDF4.pyx in netCDF4._netCDF4.Dataset.createVariable()

netCDF4/_netCDF4.pyx in netCDF4._netCDF4.Variable.__init__()

netCDF4/_netCDF4.pyx in netCDF4._netCDF4._ensure_nc_success()

RuntimeError: NetCDF: Bad chunk sizes.
```

The dataset is:
reactions: { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
performed_via_github_app: (none)
issue: 379472634
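For context, "NetCDF: Bad chunk sizes" is raised by netCDF4's createVariable() when the chunk sizes it is given are invalid for the variable's shape. A minimal sketch of a possible workaround, not from the original comment: it assumes the cause is a stale `chunksizes` entry carried over in each variable's encoding from the source file, and reuses the paths from the traceback above.

```python
# Sketch (editor's assumption, not the reporter's code): drop stored
# chunk-size hints so netCDF4 chooses valid chunking on its own.
import xarray as xr

ds0 = xr.open_dataset('/tmp/nam/bufr.701940/bufr.701940.2010123112.nc')
for name, var in ds0.variables.items():
    # A 'chunksizes' value saved from the source file can be invalid for
    # the variable's current shape, triggering "NetCDF: Bad chunk sizes".
    var.encoding.pop('chunksizes', None)
    # 'contiguous' can conflict with chunked storage, so clear it too.
    var.encoding.pop('contiguous', None)
ds0.to_netcdf('/tmp/d0.nc')
```

An alternative would be to pass an explicit, shape-compatible `chunksizes` per variable via the `encoding` argument of `to_netcdf`.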