issues

1 row where state = "closed" and user = 6360066 sorted by updated_at descending

id: 396063731
node_id: MDU6SXNzdWUzOTYwNjM3MzE=
number: 2649
title: "Timestamp subtraction must have the same timezones or no timezones" when saving a NetCDF
user: matteodefelice (6360066)
state: closed
locked: 0
assignee: (none)
milestone: (none)
comments: 6
created_at: 2019-01-04T20:52:50Z
updated_at: 2019-01-15T20:02:18Z
closed_at: 2019-01-05T19:06:54Z
author_association: NONE
active_lock_reason: (none)
draft: (none)
pull_request: (none)
performed_via_github_app: (none)
state_reason: completed
repo: xarray (13221727)
type: issue

body:

I have an issue when saving a Dataset to NetCDF. This is the example NetCDF I am using.

```python
import xarray as xr
d = xr.open_dataset('example.nc')
d.to_netcdf('out.nc')
```

Then I get:

```python
TypeError                                 Traceback (most recent call last)
<ipython-input-4-baf698f1bf45> in <module>
----> 1 d.to_netcdf('out.nc')

~/miniconda2/envs/cds/lib/python3.6/site-packages/xarray/core/dataset.py in to_netcdf(self, path, mode, format, group, engine, encoding, unlimited_dims, compute)
   1241                            engine=engine, encoding=encoding,
   1242                            unlimited_dims=unlimited_dims,
-> 1243                            compute=compute)
   1244
   1245     def to_zarr(self, store=None, mode='w-', synchronizer=None, group=None,

~/miniconda2/envs/cds/lib/python3.6/site-packages/xarray/backends/api.py in to_netcdf(dataset, path_or_file, mode, format, group, engine, encoding, unlimited_dims, compute, multifile)
    747         # to be parallelized with dask
    748         dump_to_store(dataset, store, writer, encoding=encoding,
--> 749                       unlimited_dims=unlimited_dims)
    750         if autoclose:
    751             store.close()

~/miniconda2/envs/cds/lib/python3.6/site-packages/xarray/backends/api.py in dump_to_store(dataset, store, writer, encoder, encoding, unlimited_dims)
    790
    791     store.store(variables, attrs, check_encoding, writer,
--> 792                 unlimited_dims=unlimited_dims)
    793
    794

~/miniconda2/envs/cds/lib/python3.6/site-packages/xarray/backends/common.py in store(self, variables, attributes, check_encoding_set, writer, unlimited_dims)
    259             writer = ArrayWriter()
    260
--> 261         variables, attributes = self.encode(variables, attributes)
    262
    263         self.set_attributes(attributes)

~/miniconda2/envs/cds/lib/python3.6/site-packages/xarray/backends/common.py in encode(self, variables, attributes)
    345         # All NetCDF files get CF encoded by default, without this attempting
    346         # to write times, for example, would fail.
--> 347         variables, attributes = cf_encoder(variables, attributes)
    348         variables = OrderedDict([(k, self.encode_variable(v))
    349                                  for k, v in variables.items()])

~/miniconda2/envs/cds/lib/python3.6/site-packages/xarray/conventions.py in cf_encoder(variables, attributes)
    603     """
    604     new_vars = OrderedDict((k, encode_cf_variable(v, name=k))
--> 605                            for k, v in iteritems(variables))
    606     return new_vars, attributes

~/miniconda2/envs/cds/lib/python3.6/site-packages/xarray/conventions.py in <genexpr>(.0)
    603     """
    604     new_vars = OrderedDict((k, encode_cf_variable(v, name=k))
--> 605                            for k, v in iteritems(variables))
    606     return new_vars, attributes

~/miniconda2/envs/cds/lib/python3.6/site-packages/xarray/conventions.py in encode_cf_variable(var, needs_copy, name)
    233                   variables.CFMaskCoder(),
    234                   variables.UnsignedIntegerCoder()]:
--> 235         var = coder.encode(var, name=name)
    236
    237     # TODO(shoyer): convert all of these to use coders, too:

~/miniconda2/envs/cds/lib/python3.6/site-packages/xarray/coding/times.py in encode(self, variable, name)
    393             data,
    394             encoding.pop('units', None),
--> 395             encoding.pop('calendar', None))
    396         safe_setitem(attrs, 'units', units, name=name)
    397         safe_setitem(attrs, 'calendar', calendar, name=name)

~/miniconda2/envs/cds/lib/python3.6/site-packages/xarray/coding/times.py in encode_cf_datetime(dates, units, calendar)
    363     # an OverflowError is raised if the ref_date is too far away from
    364     # dates to be encoded (GH 2272).
--> 365     num = (pd.DatetimeIndex(dates.ravel()) - ref_date) / time_delta
    366     num = num.values.reshape(dates.shape)
    367

~/miniconda2/envs/cds/lib/python3.6/site-packages/pandas/core/indexes/datetimelike.py in __sub__(self, other)
    898             result = self._add_offset(-other)
    899         elif isinstance(other, (datetime, np.datetime64)):
--> 900             result = self._sub_datelike(other)
    901         elif is_integer(other):
    902             # This check must come after the check for np.timedelta64

~/miniconda2/envs/cds/lib/python3.6/site-packages/pandas/core/indexes/datetimes.py in _sub_datelike(self, other)
    876         # require tz compat
    877         elif not self._has_same_tz(other):
--> 878             raise TypeError("Timestamp subtraction must have the same "
    879                             "timezones or no timezones")
    880         else:

TypeError: Timestamp subtraction must have the same timezones or no timezones
```
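The last frames boil down to pandas refusing to subtract a timezone-aware reference timestamp from a timezone-naive DatetimeIndex (or vice versa). A minimal sketch of that pandas-level failure, assuming (as the traceback suggests) that one side carries a timezone and the other does not; the exact message below matches the pandas 0.23 in this environment, newer versions word it differently:

```python
import pandas as pd

dates = pd.DatetimeIndex(['2019-01-04', '2019-01-05'])    # tz-naive, like pd.DatetimeIndex(dates.ravel())
ref_date = pd.Timestamp('2019-01-01 00:00:00', tz='UTC')  # tz-aware reference date (illustrative)

# Raises TypeError: Timestamp subtraction must have the same timezones or no timezones
dates - ref_date
```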

I have tried with Python 3.7 and 3.6. I have also installed the latest version of xarray, hoping that this issue was linked to #2630. With other, similar NetCDF files I apparently don't get the error, but this is not supposed to happen: the exact same code was working a couple of months ago.
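For readers hitting the same error, a possible workaround sketch, not a confirmed fix. It assumes the cause is a reference date with a UTC offset in the time variable's inherited units encoding (so re-encoding subtracts a tz-aware timestamp from tz-naive values); the coordinate name `time` is also an assumption about `example.nc`:

```python
import xarray as xr

d = xr.open_dataset('example.nc')

# Inspect the units xarray inherited from the file; a trailing UTC offset
# (e.g. "hours since 2019-01-01 00:00:00+01:00") would make the reference
# date timezone-aware.
print(d['time'].encoding.get('units'))

# Dropping the inherited units lets xarray choose fresh, timezone-naive
# units when writing, sidestepping the aware/naive subtraction.
d['time'].encoding.pop('units', None)
d.to_netcdf('out.nc')
```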

INSTALLED VERSIONS
------------------
commit: None
python: 3.6.7 | packaged by conda-forge | (default, Nov 20 2018, 18:20:05) [GCC 4.2.1 Compatible Apple LLVM 9.0.0 (clang-900.0.37)]
python-bits: 64
OS: Darwin
OS-release: 18.2.0
machine: x86_64
processor: i386
byteorder: little
LC_ALL: en_US.UTF-8
LANG: en_US.UTF-8
LOCALE: en_US.UTF-8
libhdf5: 1.10.3
libnetcdf: 4.6.1

xarray: 0.11.1+9.g06244df
pandas: 0.23.4
numpy: 1.15.4
scipy: 1.1.0
netCDF4: 1.4.2
pydap: None
h5netcdf: None
h5py: None
Nio: None
zarr: None
cftime: 1.0.3.4
PseudonetCDF: None
rasterio: None
cfgrib: 0.9.5.1
iris: None
bottleneck: None
cyordereddict: None
dask: None
distributed: None
matplotlib: 3.0.2
cartopy: 0.17.0
seaborn: None
setuptools: 40.6.3
pip: 18.1
conda: None
pytest: None
IPython: 7.2.0
sphinx: None
reactions:
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2649/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}

CREATE TABLE [issues] (
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [number] INTEGER,
   [title] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [state] TEXT,
   [locked] INTEGER,
   [assignee] INTEGER REFERENCES [users]([id]),
   [milestone] INTEGER REFERENCES [milestones]([id]),
   [comments] INTEGER,
   [created_at] TEXT,
   [updated_at] TEXT,
   [closed_at] TEXT,
   [author_association] TEXT,
   [active_lock_reason] TEXT,
   [draft] INTEGER,
   [pull_request] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [state_reason] TEXT,
   [repo] INTEGER REFERENCES [repos]([id]),
   [type] TEXT
);
CREATE INDEX [idx_issues_repo]
    ON [issues] ([repo]);
CREATE INDEX [idx_issues_milestone]
    ON [issues] ([milestone]);
CREATE INDEX [idx_issues_assignee]
    ON [issues] ([assignee]);
CREATE INDEX [idx_issues_user]
    ON [issues] ([user]);
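A minimal sketch of running this page's filter ("state = 'closed' and user = 6360066 sorted by updated_at descending") directly against the schema above with Python's sqlite3 module; the database filename `github.db` is an assumption about the file behind this Datasette instance:

```python
import sqlite3

# Hypothetical local copy of the SQLite database served by Datasette.
conn = sqlite3.connect("github.db")

rows = conn.execute(
    """
    SELECT id, number, title, state, updated_at
    FROM issues
    WHERE state = 'closed' AND "user" = 6360066
    ORDER BY updated_at DESC
    """
).fetchall()

for row in rows:
    print(row)

conn.close()
```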