issues: 1004873981


  • id: 1004873981
  • node_id: I_kwDOAMm_X8475Sj9
  • number: 5809
  • title: DataArray to_netcdf returns invalid argument
  • user: 26401994
  • state: closed
  • locked: 0
  • comments: 1
  • created_at: 2021-09-22T23:57:56Z
  • updated_at: 2021-09-24T22:23:28Z
  • closed_at: 2021-09-24T22:23:28Z
  • author_association: NONE

What happened: When I save a dataset with `to_netcdf`, it shows the following error message:

```
RuntimeError                              Traceback (most recent call last)
/tmp/ipykernel_16908/4157932485.py in <module>
----> 1 pr.to_netcdf('test.nc')

/global/homes/d/duan0000/.conda/envs/duan/lib/python3.8/site-packages/xarray/core/dataarray.py in to_netcdf(self, *args, **kwargs)
   2820         dataset = self.to_dataset()
   2821
-> 2822         return dataset.to_netcdf(*args, **kwargs)
   2823
   2824     def to_dict(self, data: bool = True) -> dict:

/global/homes/d/duan0000/.conda/envs/duan/lib/python3.8/site-packages/xarray/core/dataset.py in to_netcdf(self, path, mode, format, group, engine, encoding, unlimited_dims, compute, invalid_netcdf)
   1898         from ..backends.api import to_netcdf
   1899
-> 1900         return to_netcdf(
   1901             self,
   1902             path,

/global/homes/d/duan0000/.conda/envs/duan/lib/python3.8/site-packages/xarray/backends/api.py in to_netcdf(dataset, path_or_file, mode, format, group, engine, encoding, unlimited_dims, compute, multifile, invalid_netcdf)
   1075     # TODO: allow this work (setting up the file for writing array data)
   1076     # to be parallelized with dask
-> 1077     dump_to_store(
   1078         dataset, store, writer, encoding=encoding, unlimited_dims=unlimited_dims
   1079     )

/global/homes/d/duan0000/.conda/envs/duan/lib/python3.8/site-packages/xarray/backends/api.py in dump_to_store(dataset, store, writer, encoder, encoding, unlimited_dims)
   1122     variables, attrs = encoder(variables, attrs)
   1123
-> 1124     store.store(variables, attrs, check_encoding, writer, unlimited_dims=unlimited_dims)
   1125
   1126

/global/homes/d/duan0000/.conda/envs/duan/lib/python3.8/site-packages/xarray/backends/common.py in store(self, variables, attributes, check_encoding_set, writer, unlimited_dims)
    264         self.set_attributes(attributes)
    265         self.set_dimensions(variables, unlimited_dims=unlimited_dims)
--> 266         self.set_variables(
    267             variables, check_encoding_set, writer, unlimited_dims=unlimited_dims
    268         )

/global/homes/d/duan0000/.conda/envs/duan/lib/python3.8/site-packages/xarray/backends/common.py in set_variables(self, variables, check_encoding_set, writer, unlimited_dims)
    302             name = _encode_variable_name(vn)
    303             check = vn in check_encoding_set
--> 304             target, source = self.prepare_variable(
    305                 name, v, check, unlimited_dims=unlimited_dims
    306             )

/global/homes/d/duan0000/.conda/envs/duan/lib/python3.8/site-packages/xarray/backends/netCDF4_.py in prepare_variable(self, name, variable, check_encoding, unlimited_dims)
    484             nc4_var = self.ds.variables[name]
    485         else:
--> 486             nc4_var = self.ds.createVariable(
    487                 varname=name,
    488                 datatype=datatype,

src/netCDF4/_netCDF4.pyx in netCDF4._netCDF4.Dataset.createVariable()

src/netCDF4/_netCDF4.pyx in netCDF4._netCDF4.Variable.__init__()

src/netCDF4/_netCDF4.pyx in netCDF4._netCDF4._ensure_nc_success()

RuntimeError: NetCDF: Invalid argument
```

Minimal Complete Verifiable Example: The code is pretty simple:

```python
pr.to_netcdf('test.nc')
```
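The report does not show how `pr` was built, so here is a self-contained stand-in, assuming `pr` is a small float DataArray (the `DataArray.to_netcdf` frame in the traceback above confirms it is a DataArray; its dims and values are made up for illustration):

```python
# Hedged sketch: `pr` stands in for the reporter's DataArray, whose real
# contents are not given in the issue. Requires xarray plus a netCDF backend.
import numpy as np
import xarray as xr

pr = xr.DataArray(
    np.zeros((4, 3)),
    dims=("time", "station"),
    name="pr",
)
# DataArray.to_netcdf first wraps the array in a Dataset, then writes it --
# the same call chain the traceback above walks through.
pr.to_netcdf("test.nc")
```

With a healthy xarray/netCDF4 install this writes `test.nc` without error, which is why the version mismatch below is the main suspect.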

Anything else we need to know?: I checked my xarray version: `conda list` shows 0.20.0, but from my Jupyter notebook `xr.__version__` reports 0.19.0. Environment: `netCDF4.__netcdf4libversion__` shows 4.8.1.
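A mismatch like this usually means the notebook kernel is running a different interpreter than the shell's conda env. A stdlib-only way to check, run inside the notebook (the package name `xarray` is from the report; the helper `locate` is hypothetical, everything else is generic):

```python
import sys
import importlib.util

def locate(pkg):
    """Return the file a package would be imported from, or None if absent."""
    spec = importlib.util.find_spec(pkg)
    return spec.origin if spec is not None else None

# The kernel's interpreter: if this path is not inside the expected conda
# env, `conda list` and `xr.__version__` will disagree, as in the report.
print(sys.executable)
# Where `xarray` would actually be imported from (None if not installed).
print(locate("xarray"))
```

If the printed path points at a stale install, registering the right env's kernel (or upgrading the package that path belongs to) should bring the two version reports back in agreement.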

Thanks!

reactions:

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/5809/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  • state_reason: completed
  • repo: 13221727
  • type: issue

Links from other tables

  • 0 rows from issues_id in issues_labels
  • 1 row from issue in issue_comments