

issues


2 rows where user = 17178478 sorted by updated_at descending




id: 357729048 (node_id: MDU6SXNzdWUzNTc3MjkwNDg=)
number: 2404
title: Volatile error: `unsupported dtype for netCDF4 variable: object`
user: brynpickering (17178478)
state: open
locked: 0
comments: 3
created_at: 2018-09-06T16:16:05Z
updated_at: 2019-12-27T21:01:21Z
author_association: NONE

Problem description

Hi, sometimes we have object dtypes in the coords (which are actually string dtypes that are not correctly captured). This doesn't usually cause problems when saving to NetCDF, but for some Datasets it leads to the error `unsupported dtype for netCDF4 variable: object`. Once such a Dataset is set up, the error is raised without fail; other models, also with a mix of string, float, int, and object dtypes in the coords, save without issue.

Example of one of the coords that causes the issue:

```python
<xarray.DataArray 'loc_techs_export' (loc_techs_export: 1)>
array(['foo::bar'], dtype=object)
Coordinates:
  * loc_techs_export  (loc_techs_export) object 'foo::bar'
```
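For coords like this one that really hold plain strings, a possible workaround (a sketch, not an official fix; the Dataset here is a hypothetical stand-in) is to cast any object-dtype coordinate to a string dtype before calling `to_netcdf`:

```python
import numpy as np
import xarray as xr

# hypothetical Dataset with a string-valued coord that ended up as object dtype
ds = xr.Dataset(coords={"loc_techs_export": np.array(["foo::bar"], dtype=object)})

# cast object-dtype coords to a unicode string dtype, which netCDF4 can encode
for name in list(ds.coords):
    if ds.coords[name].dtype == object:
        ds.coords[name] = ds.coords[name].astype(str)
```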

Full error traceback:

```
~\Miniconda3\envs\calliope\lib\site-packages\xarray\core\dataset.py in to_netcdf(self, path, mode, format, group, engine, encoding, unlimited_dims, compute)
   1148                 engine=engine, encoding=encoding,
   1149                 unlimited_dims=unlimited_dims,
-> 1150                 compute=compute)
   1151
   1152     def to_zarr(self, store=None, mode='w-', synchronizer=None, group=None,

~\Miniconda3\envs\calliope\lib\site-packages\xarray\backends\api.py in to_netcdf(dataset, path_or_file, mode, format, group, engine, writer, encoding, unlimited_dims, compute)
    722     try:
    723         dataset.dump_to_store(store, sync=sync, encoding=encoding,
--> 724                               unlimited_dims=unlimited_dims, compute=compute)
    725         if path_or_file is None:
    726             return target.getvalue()

~\Miniconda3\envs\calliope\lib\site-packages\xarray\core\dataset.py in dump_to_store(self, store, encoder, sync, encoding, unlimited_dims, compute)
   1073
   1074         store.store(variables, attrs, check_encoding,
-> 1075                     unlimited_dims=unlimited_dims)
   1076         if sync:
   1077             store.sync(compute=compute)

~\Miniconda3\envs\calliope\lib\site-packages\xarray\backends\common.py in store(self, variables, attributes, check_encoding_set, unlimited_dims)
    372         self.set_dimensions(variables, unlimited_dims=unlimited_dims)
    373         self.set_variables(variables, check_encoding_set,
--> 374                            unlimited_dims=unlimited_dims)
    375
    376     def set_attributes(self, attributes):

~\Miniconda3\envs\calliope\lib\site-packages\xarray\backends\netCDF4_.py in set_variables(self, *args, **kwargs)
    405     def set_variables(self, *args, **kwargs):
    406         with self.ensure_open(autoclose=False):
--> 407             super(NetCDF4DataStore, self).set_variables(*args, **kwargs)
    408
    409     def encode_variable(self, variable):

~\Miniconda3\envs\calliope\lib\site-packages\xarray\backends\common.py in set_variables(self, variables, check_encoding_set, unlimited_dims)
    409             check = vn in check_encoding_set
    410             target, source = self.prepare_variable(
--> 411                 name, v, check, unlimited_dims=unlimited_dims)
    412
    413             self.writer.add(source, target)

~\Miniconda3\envs\calliope\lib\site-packages\xarray\backends\netCDF4_.py in prepare_variable(self, name, variable, check_encoding, unlimited_dims)
    418                          unlimited_dims=None):
    419         datatype = _get_datatype(variable, self.format,
--> 420                                  raise_on_invalid_encoding=check_encoding)
    421         attrs = variable.attrs.copy()
    422

~\Miniconda3\envs\calliope\lib\site-packages\xarray\backends\netCDF4_.py in _get_datatype(var, nc_format, raise_on_invalid_encoding)
     99 def _get_datatype(var, nc_format='NETCDF4', raise_on_invalid_encoding=False):
    100     if nc_format == 'NETCDF4':
--> 101         datatype = _nc4_dtype(var)
    102     else:
    103         if 'dtype' in var.encoding:

~\Miniconda3\envs\calliope\lib\site-packages\xarray\backends\netCDF4_.py in _nc4_dtype(var)
    122     else:
    123         raise ValueError('unsupported dtype for netCDF4 variable: {}'
--> 124                          .format(var.dtype))
    125     return dtype
    126

ValueError: unsupported dtype for netCDF4 variable: object
```

When debugging in ~\Miniconda3\envs\calliope\lib\site-packages\xarray\backends\common.py, inside set_variables(self, variables, check_encoding_set, unlimited_dims), the OrderedDict being looped over handles other object-dtype variables successfully before reaching the variable that raises the error.
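To narrow this down without stepping through the backend, a small helper (hypothetical, not part of xarray) can list every variable or coordinate that still has object dtype before the write is attempted:

```python
import numpy as np
import xarray as xr

def object_dtype_vars(ds):
    """Names of variables/coords with object dtype (hypothetical helper)."""
    return sorted(name for name, var in ds.variables.items() if var.dtype == object)

# example: one clean float variable plus an object-dtype coord
ds = xr.Dataset(
    {"cost": ("loc_techs_export", [1.0])},
    coords={"loc_techs_export": np.array(["foo::bar"], dtype=object)},
)
```

Calling `object_dtype_vars(ds)` here returns only the offending coord name, so each reported variable can be inspected or cast before `to_netcdf` is called.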

Output of xr.show_versions()

```
INSTALLED VERSIONS
------------------
commit: None
python: 3.6.4.final.0
python-bits: 64
OS: Windows
OS-release: 10
machine: AMD64
processor: Intel64 Family 6 Model 78 Stepping 3, GenuineIntel
byteorder: little
LC_ALL: None
LANG: None
LOCALE: None.None

xarray: 0.10.8
pandas: 0.22.0
numpy: 1.14.2
scipy: 1.0.0
netCDF4: 1.3.1
h5netcdf: 0.5.0
h5py: 2.7.1
Nio: None
zarr: None
bottleneck: 1.2.1
cyordereddict: None
dask: 0.17.1
distributed: 1.21.3
matplotlib: 2.2.2
cartopy: None
seaborn: None
setuptools: 38.5.2
pip: 18.0
conda: None
pytest: 3.4.2
IPython: 6.2.1
sphinx: 1.7.1
```
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2404/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
repo: xarray (13221727), type: issue
id: 301031693 (node_id: MDU6SXNzdWUzMDEwMzE2OTM=)
number: 1949
title: Removing dimensions from Dataset objects
user: brynpickering (17178478)
state: closed
locked: 0
comments: 9
created_at: 2018-02-28T13:53:23Z
updated_at: 2019-03-03T19:39:40Z
closed_at: 2019-03-03T19:39:40Z
author_association: NONE

```python
test_dataset = xr.Dataset(dict(
    empty_array=xr.DataArray([], dims='a'),
    populated_array=xr.DataArray([1], {'b': ['1']}, 'b')
))
```

I have a dataset that is produced programmatically and can at times end up producing an empty DataArray. This isn't an issue per se, because I can remove those empty DataArrays later. However, I cannot find any way to remove the unused, empty dimension! I've tried deleting, dropping, resetting indices, etc., and have had no luck purging this empty dimension. It causes issues down the line, because the existence of entries in the dimensions list triggers certain events.

Is there a way to remove a dimension (and possibly then all data variables which depend on it)?
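For readers finding this later: later xarray releases added `Dataset.drop_dims`, which does exactly this, removing a dimension together with every variable that uses it. A minimal sketch using the Dataset from above:

```python
import xarray as xr

test_dataset = xr.Dataset(dict(
    empty_array=xr.DataArray([], dims='a'),
    populated_array=xr.DataArray([1], {'b': ['1']}, 'b'),
))

# drop_dims removes dimension 'a' together with every variable that uses it
trimmed = test_dataset.drop_dims('a')
```

After the call, `trimmed` no longer lists `a` in its dimensions, and `empty_array` is gone while `populated_array` survives.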

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/1949/reactions",
    "total_count": 3,
    "+1": 3,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
state_reason: completed, repo: xarray (13221727), type: issue


CREATE TABLE [issues] (
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [number] INTEGER,
   [title] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [state] TEXT,
   [locked] INTEGER,
   [assignee] INTEGER REFERENCES [users]([id]),
   [milestone] INTEGER REFERENCES [milestones]([id]),
   [comments] INTEGER,
   [created_at] TEXT,
   [updated_at] TEXT,
   [closed_at] TEXT,
   [author_association] TEXT,
   [active_lock_reason] TEXT,
   [draft] INTEGER,
   [pull_request] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [state_reason] TEXT,
   [repo] INTEGER REFERENCES [repos]([id]),
   [type] TEXT
);
CREATE INDEX [idx_issues_repo]
    ON [issues] ([repo]);
CREATE INDEX [idx_issues_milestone]
    ON [issues] ([milestone]);
CREATE INDEX [idx_issues_assignee]
    ON [issues] ([assignee]);
CREATE INDEX [idx_issues_user]
    ON [issues] ([user]);
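The `user` index above is what makes the "rows where user = …" query on this page cheap. A sketch of the same lookup using Python's built-in sqlite3 against a trimmed in-memory copy of the schema (the column subset and row values are illustrative only):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE issues (
    id INTEGER PRIMARY KEY,
    number INTEGER,
    title TEXT,
    user INTEGER,
    state TEXT,
    repo INTEGER
);
CREATE INDEX idx_issues_user ON issues (user);
""")
conn.execute(
    "INSERT INTO issues VALUES (?, ?, ?, ?, ?, ?)",
    (357729048, 2404, "Volatile error: unsupported dtype", 17178478, "open", 13221727),
)

# the idx_issues_user index turns this filter into an index lookup
rows = conn.execute(
    "SELECT number, state FROM issues WHERE user = ?", (17178478,)
).fetchall()
```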
Powered by Datasette · Queries took 6078.629ms · About: xarray-datasette