issues
2 rows where user = 17178478 sorted by updated_at descending
id: 357729048
node_id: MDU6SXNzdWUzNTc3MjkwNDg=
number: 2404
title: Volatile error: `unsupported dtype for netCDF4 variable: object`
user: brynpickering 17178478
state: open
locked: 0
comments: 3
created_at: 2018-09-06T16:16:05Z
updated_at: 2019-12-27T21:01:21Z
author_association: NONE
body:

Problem description

Hi, sometimes we have object dtypes in the coords (which are actually string dtypes that were not correctly captured). This doesn't usually cause issues when saving to NetCDF, but for some Datasets it leads to the error below. Example of one of the coords that causes the issue:
Specific error stream:

```
~\Miniconda3\envs\calliope\lib\site-packages\xarray\core\dataset.py in to_netcdf(self, path, mode, format, group, engine, encoding, unlimited_dims, compute)
   1148                            engine=engine, encoding=encoding,
   1149                            unlimited_dims=unlimited_dims,
-> 1150                            compute=compute)
   1151
   1152     def to_zarr(self, store=None, mode='w-', synchronizer=None, group=None,

~\Miniconda3\envs\calliope\lib\site-packages\xarray\backends\api.py in to_netcdf(dataset, path_or_file, mode, format, group, engine, writer, encoding, unlimited_dims, compute)
    722     try:
    723         dataset.dump_to_store(store, sync=sync, encoding=encoding,
--> 724                               unlimited_dims=unlimited_dims, compute=compute)
    725         if path_or_file is None:
    726             return target.getvalue()

~\Miniconda3\envs\calliope\lib\site-packages\xarray\core\dataset.py in dump_to_store(self, store, encoder, sync, encoding, unlimited_dims, compute)
   1073
   1074         store.store(variables, attrs, check_encoding,
-> 1075                     unlimited_dims=unlimited_dims)
   1076         if sync:
   1077             store.sync(compute=compute)

~\Miniconda3\envs\calliope\lib\site-packages\xarray\backends\common.py in store(self, variables, attributes, check_encoding_set, unlimited_dims)
    372         self.set_dimensions(variables, unlimited_dims=unlimited_dims)
    373         self.set_variables(variables, check_encoding_set,
--> 374                            unlimited_dims=unlimited_dims)
    375
    376     def set_attributes(self, attributes):

~\Miniconda3\envs\calliope\lib\site-packages\xarray\backends\netCDF4_.py in set_variables(self, *args, **kwargs)
    405     def set_variables(self, *args, **kwargs):
    406         with self.ensure_open(autoclose=False):
--> 407             super(NetCDF4DataStore, self).set_variables(*args, **kwargs)
    408
    409     def encode_variable(self, variable):

~\Miniconda3\envs\calliope\lib\site-packages\xarray\backends\common.py in set_variables(self, variables, check_encoding_set, unlimited_dims)
    409             check = vn in check_encoding_set
    410             target, source = self.prepare_variable(
--> 411                 name, v, check, unlimited_dims=unlimited_dims)
    412
    413             self.writer.add(source, target)

~\Miniconda3\envs\calliope\lib\site-packages\xarray\backends\netCDF4_.py in prepare_variable(self, name, variable, check_encoding, unlimited_dims)
    418                          unlimited_dims=None):
    419         datatype = _get_datatype(variable, self.format,
--> 420                                  raise_on_invalid_encoding=check_encoding)
    421         attrs = variable.attrs.copy()
    422

~\Miniconda3\envs\calliope\lib\site-packages\xarray\backends\netCDF4_.py in _get_datatype(var, nc_format, raise_on_invalid_encoding)
     99 def _get_datatype(var, nc_format='NETCDF4', raise_on_invalid_encoding=False):
    100     if nc_format == 'NETCDF4':
--> 101         datatype = _nc4_dtype(var)
    102     else:
    103         if 'dtype' in var.encoding:

~\Miniconda3\envs\calliope\lib\site-packages\xarray\backends\netCDF4_.py in _nc4_dtype(var)
    122     else:
    123         raise ValueError('unsupported dtype for netCDF4 variable: {}'
--> 124                          .format(var.dtype))
    125     return dtype
    126

ValueError: unsupported dtype for netCDF4 variable: object
```

When debugging at Output of
{ "url": "https://api.github.com/repos/pydata/xarray/issues/2404/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
repo: xarray 13221727
type: issue
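For context, a minimal sketch of the usual workaround for the error reported above, with a hypothetical Dataset and output path: cast any object-dtype coords or variables to a real string dtype before writing, so the netCDF4 backend never has to map `dtype=object` to a NetCDF type.

```python
import numpy as np
import xarray as xr

# Hypothetical Dataset: the "techs" coord ends up with dtype=object even
# though every element is really a string (the situation in the report).
ds = xr.Dataset(
    {"cap": ("techs", [1.0, 2.0])},
    coords={"techs": np.array(["tech_a", "tech_b"], dtype=object)},
)

# Cast object-dtype coords/variables to a proper string dtype before saving.
object_vars = [name for name, var in ds.variables.items() if var.dtype == object]
for name in object_vars:
    ds[name] = ds[name].astype(str)

ds.to_netcdf("model_output.nc")  # hypothetical output path
```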
id: 301031693
node_id: MDU6SXNzdWUzMDEwMzE2OTM=
number: 1949
title: Removing dimensions from Dataset objects
user: brynpickering 17178478
state: closed
locked: 0
comments: 9
created_at: 2018-02-28T13:53:23Z
updated_at: 2019-03-03T19:39:40Z
closed_at: 2019-03-03T19:39:40Z
author_association: NONE
body:

I have a dataset that is produced programmatically and can at times end up producing an empty DataArray. This isn't an issue per se, because I can remove those empty DataArrays later. However, I cannot find any way to remove the unused, empty dimension! I've tried deleting, dropping, resetting indices, etc., and have had no luck purging this empty dimension. It causes issues down the line, because the existence of entries in the dimensions list triggers certain events. Is there a way to remove a dimension (and possibly then all data variables which depend on it)?
{ "url": "https://api.github.com/repos/pydata/xarray/issues/1949/reactions", "total_count": 3, "+1": 3, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
state_reason: completed
repo: xarray 13221727
type: issue
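xarray later gained `Dataset.drop_dims` (xarray >= 0.12), which removes a dimension together with every variable that uses it; before that, the same effect needed a manual pass over the variables. A short sketch with a hypothetical Dataset (`drop_vars` is the modern name for `drop`):

```python
import xarray as xr

# Hypothetical Dataset where programmatic construction left an empty
# "placeholder" dimension behind (the "empty" variable has shape (3, 0)).
ds = xr.Dataset(
    {
        "good": ("x", [1.0, 2.0, 3.0]),
        "empty": (("x", "placeholder"), [[], [], []]),
    }
)

# Manual purge: drop every variable that uses the unwanted dimension; the
# dimension itself disappears once nothing references it.
stale = [name for name, var in ds.variables.items() if "placeholder" in var.dims]
trimmed = ds.drop_vars(stale)

# Equivalent one-liner in xarray >= 0.12:
trimmed = ds.drop_dims("placeholder")
assert "placeholder" not in trimmed.dims  # only "x" remains
```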
CREATE TABLE [issues] (
    [id] INTEGER PRIMARY KEY,
    [node_id] TEXT,
    [number] INTEGER,
    [title] TEXT,
    [user] INTEGER REFERENCES [users]([id]),
    [state] TEXT,
    [locked] INTEGER,
    [assignee] INTEGER REFERENCES [users]([id]),
    [milestone] INTEGER REFERENCES [milestones]([id]),
    [comments] INTEGER,
    [created_at] TEXT,
    [updated_at] TEXT,
    [closed_at] TEXT,
    [author_association] TEXT,
    [active_lock_reason] TEXT,
    [draft] INTEGER,
    [pull_request] TEXT,
    [body] TEXT,
    [reactions] TEXT,
    [performed_via_github_app] TEXT,
    [state_reason] TEXT,
    [repo] INTEGER REFERENCES [repos]([id]),
    [type] TEXT
);
CREATE INDEX [idx_issues_repo] ON [issues] ([repo]);
CREATE INDEX [idx_issues_milestone] ON [issues] ([milestone]);
CREATE INDEX [idx_issues_assignee] ON [issues] ([assignee]);
CREATE INDEX [idx_issues_user] ON [issues] ([user]);
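Given that schema, the view at the top of this page ("2 rows where user = 17178478 sorted by updated_at descending") corresponds to a simple query. A sketch using Python's sqlite3 module, with a hypothetical database file name:

```python
import sqlite3

# Hypothetical path to the SQLite file behind this Datasette instance.
conn = sqlite3.connect("github.db")

# The filter shown at the top of the page: issues filed by user 17178478,
# most recently updated first.
rows = conn.execute(
    """
    SELECT number, title, state, updated_at
    FROM issues
    WHERE "user" = 17178478
    ORDER BY updated_at DESC
    """
).fetchall()

for number, title, state, updated_at in rows:
    print(f"#{number} [{state}] {title} (updated {updated_at})")
```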