issue #8858 (pydata/xarray): impossible to save in netcdf because of the time dimension
state: closed (state_reason: completed) · comments: 8 · author_association: NONE
created: 2024-03-20T11:22:05Z · updated: 2024-03-20T15:50:26Z · closed: 2024-03-20T15:37:49Z

What happened?

I have a dataset `ds` with a variable `TEMP(time, level, ni, nj)`. The time dimension is of type datetime64, but I get an error when I try to save a temporal selection to netCDF.
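
The original dataset is not attached, so the sketch below builds a hypothetical stand-in with the same layout: a `TEMP(time, level, ni, nj)` variable with an hourly datetime64[us] time coordinate for 2015. It only illustrates the setup described above; whether it triggers the same error depends on whether the microsecond precision survives dataset construction in a given xarray version.

```Python
# Hypothetical stand-in for the real dataset (values invented for illustration).
import numpy as np
import xarray as xr

time = np.arange("2015-01-01", "2016-01-01", dtype="datetime64[h]").astype("datetime64[us]")
temp = np.zeros((time.size, 2, 3, 4), dtype="float32")

ds = xr.Dataset(
    {"TEMP": (("time", "level", "ni", "nj"), temp)},
    coords={"time": ("time", time)},
)

# The failing step from the report: saving a temporal selection to netCDF.
ds["TEMP"].isel(time=slice(0, 1)).to_netcdf("./extract.nc")
```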

What did you expect to happen?

Time dimension management should be completely transparent to the user when a datetime64 type is used.
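
Not from the original report, but a hedged workaround sketch: casting the time coordinate to nanosecond precision before writing should keep CF time encoding on the pandas path instead of the cftime formatting path shown in the traceback below. Whether the cast is acceptable depends on the data's time range.

```Python
# Hypothetical workaround (not from the issue): force nanosecond precision
# on the time coordinate before writing.
ds = ds.assign_coords(time=ds["time"].astype("datetime64[ns]"))
ds["TEMP"].isel(time=slice(0, 1)).to_netcdf("./extract.nc")
```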

Minimal Complete Verifiable Example

```Python
ds.time

<xarray.DataArray 'time' (time: 8760)>
array(['2015-01-01T00:00:00.000000', '2015-01-01T01:00:00.000000',
       '2015-01-01T02:00:00.000000', ..., '2015-12-31T21:00:00.000000',
       '2015-12-31T22:00:00.000000', '2015-12-31T23:00:00.000000'],
      dtype='datetime64[us]')
Coordinates:
  * time     (time) datetime64[us] 2015-01-01 ... 2015-12-31T23:00:00
Attributes:
    axis:           T
    conventions:    relative number of seconds with no decimal part
    long_name:      time in seconds (UT)
    standard_name:  time
    time_origin:    01-JAN-1900 00:00:00
    _FillValue:     nan
    units:          seconds since 1900-01-01

ds['TEMP'].isel(time=slice(0,1)).to_netcdf(f"./extract.nc")

AttributeError                            Traceback (most recent call last)
Cell In[81], line 1
----> 1 ds['TEMP'].isel(time=slice(0,1)).to_netcdf(f"./extract__2.nc")

File /home/datawork-marc/ENVS/pangeo2024/lib/python3.12/site-packages/xarray/core/dataarray.py:4081, in DataArray.to_netcdf(self, path, mode, format, group, engine, encoding, unlimited_dims, compute, invalid_netcdf)
   4077 else:
   4078     # No problems with the name - so we're fine!
   4079     dataset = self.to_dataset()
-> 4081 return to_netcdf(  # type: ignore  # mypy cannot resolve the overloads:(
   4082     dataset,
   4083     path,
   4084     mode=mode,
   4085     format=format,
   4086     group=group,
   4087     engine=engine,
   4088     encoding=encoding,
   4089     unlimited_dims=unlimited_dims,
   4090     compute=compute,
   4091     multifile=False,
   4092     invalid_netcdf=invalid_netcdf,
   4093 )

File /home/datawork-marc/ENVS/pangeo2024/lib/python3.12/site-packages/xarray/backends/api.py:1339, in to_netcdf(dataset, path_or_file, mode, format, group, engine, encoding, unlimited_dims, compute, multifile, invalid_netcdf)
   1334 # TODO: figure out how to refactor this logic (here and in save_mfdataset)
   1335 # to avoid this mess of conditionals
   1336 try:
   1337     # TODO: allow this work (setting up the file for writing array data)
   1338     # to be parallelized with dask
-> 1339     dump_to_store(
   1340         dataset, store, writer, encoding=encoding, unlimited_dims=unlimited_dims
   1341     )
   1342     if autoclose:
   1343         store.close()

File /home/datawork-marc/ENVS/pangeo2024/lib/python3.12/site-packages/xarray/backends/api.py:1386, in dump_to_store(dataset, store, writer, encoder, encoding, unlimited_dims)
   1383 if encoder:
   1384     variables, attrs = encoder(variables, attrs)
-> 1386 store.store(variables, attrs, check_encoding, writer, unlimited_dims=unlimited_dims)

File /home/datawork-marc/ENVS/pangeo2024/lib/python3.12/site-packages/xarray/backends/common.py:393, in AbstractWritableDataStore.store(self, variables, attributes, check_encoding_set, writer, unlimited_dims)
    390 if writer is None:
    391     writer = ArrayWriter()
--> 393 variables, attributes = self.encode(variables, attributes)
    395 self.set_attributes(attributes)
    396 self.set_dimensions(variables, unlimited_dims=unlimited_dims)

File /home/datawork-marc/ENVS/pangeo2024/lib/python3.12/site-packages/xarray/backends/common.py:482, in WritableCFDataStore.encode(self, variables, attributes)
    479 def encode(self, variables, attributes):
    480     # All NetCDF files get CF encoded by default, without this attempting
    481     # to write times, for example, would fail.
--> 482     variables, attributes = cf_encoder(variables, attributes)
    483     variables = {k: self.encode_variable(v) for k, v in variables.items()}
    484     attributes = {k: self.encode_attribute(v) for k, v in attributes.items()}

File /home/datawork-marc/ENVS/pangeo2024/lib/python3.12/site-packages/xarray/conventions.py:795, in cf_encoder(variables, attributes)
    792 # add encoding for time bounds variables if present.
    793 _update_bounds_encoding(variables)
--> 795 new_vars = {k: encode_cf_variable(v, name=k) for k, v in variables.items()}
    797 # Remove attrs from bounds variables (issue #2921)
    798 for var in new_vars.values():

File /home/datawork-marc/ENVS/pangeo2024/lib/python3.12/site-packages/xarray/conventions.py:196, in encode_cf_variable(var, needs_copy, name)
    183 ensure_not_multiindex(var, name=name)
    185 for coder in [
    186     times.CFDatetimeCoder(),
    187     times.CFTimedeltaCoder(),
    (...)
    194     variables.BooleanCoder(),
    195 ]:
--> 196     var = coder.encode(var, name=name)
    198 # TODO(kmuehlbauer): check if ensure_dtype_not_object can be moved to backends:
    199 var = ensure_dtype_not_object(var, name=name)

File /home/datawork-marc/ENVS/pangeo2024/lib/python3.12/site-packages/xarray/coding/times.py:979, in CFDatetimeCoder.encode(self, variable, name)
    977 calendar = encoding.pop("calendar", None)
    978 dtype = encoding.get("dtype", None)
--> 979 (data, units, calendar) = encode_cf_datetime(data, units, calendar, dtype)
    981 safe_setitem(attrs, "units", units, name=name)
    982 safe_setitem(attrs, "calendar", calendar, name=name)

File /home/datawork-marc/ENVS/pangeo2024/lib/python3.12/site-packages/xarray/coding/times.py:728, in encode_cf_datetime(dates, units, calendar, dtype)
    726     return _lazily_encode_cf_datetime(dates, units, calendar, dtype)
    727 else:
--> 728     return _eagerly_encode_cf_datetime(dates, units, calendar, dtype)

File /home/datawork-marc/ENVS/pangeo2024/lib/python3.12/site-packages/xarray/coding/times.py:740, in _eagerly_encode_cf_datetime(dates, units, calendar, dtype, allow_units_modification)
    731 def _eagerly_encode_cf_datetime(
    732     dates: T_DuckArray,  # type: ignore
    733     units: str | None = None,
    (...)
    736     allow_units_modification: bool = True,
    737 ) -> tuple[T_DuckArray, str, str]:
    738     dates = asarray(dates)
--> 740     data_units = infer_datetime_units(dates)
    742     if units is None:
    743         units = data_units

File /home/datawork-marc/ENVS/pangeo2024/lib/python3.12/site-packages/xarray/coding/times.py:439, in infer_datetime_units(dates)
    437 else:
    438     reference_date = dates[0] if len(dates) > 0 else "1970-01-01"
--> 439     reference_date = format_cftime_datetime(reference_date)
    440 unique_timedeltas = np.unique(np.diff(dates))
    441 units = _infer_time_units_from_diff(unique_timedeltas)

File /home/datawork-marc/ENVS/pangeo2024/lib/python3.12/site-packages/xarray/coding/times.py:450, in format_cftime_datetime(date)
    445 def format_cftime_datetime(date) -> str:
    446     """Converts a cftime.datetime object to a string with the format:
    447     YYYY-MM-DD HH:MM:SS.UUUUUU
    448     """
    449     return "{:04d}-{:02d}-{:02d} {:02d}:{:02d}:{:02d}.{:06d}".format(
--> 450         date.year,
    451         date.month,
    452         date.day,
    453         date.hour,
    454         date.minute,
    455         date.second,
    456         date.microsecond,
    457     )

AttributeError: 'numpy.datetime64' object has no attribute 'year'
```
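
The last frame points at the immediate cause: `format_cftime_datetime` formats `date.year`, `date.month`, and so on, which exist on `cftime.datetime` (and `datetime.datetime`) objects but not on `numpy.datetime64` scalars. A small standalone check, independent of xarray, illustrates this:

```Python
import numpy as np

# numpy.datetime64 scalars expose no .year/.month/... attributes, so the
# formatting call in format_cftime_datetime fails for this input.
d = np.datetime64("2015-01-01T00:00:00.000000")
print(hasattr(d, "year"))  # False
try:
    "{:04d}".format(d.year)
except AttributeError as err:
    print(err)  # 'numpy.datetime64' object has no attribute 'year'
```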

MVCE confirmation

  • [ ] Minimal example — the example is as focused as reasonably possible to demonstrate the underlying issue in xarray.
  • [ ] Complete example — the example is self-contained, including all data and the text of any traceback.
  • [ ] Verifiable example — the example copy & pastes into an IPython prompt or Binder notebook, returning the result.
  • [ ] New issue — a search of GitHub Issues suggests this is not a duplicate.
  • [ ] Recent environment — the issue occurs with the latest version of xarray and its dependencies.

Relevant log output

No response

Anything else we need to know?

No response

Environment

```
INSTALLED VERSIONS
------------------
commit: None
python: 3.12.2 | packaged by conda-forge | (main, Feb 16 2024, 20:50:58) [GCC 12.3.0]
python-bits: 64
OS: Linux
OS-release: 3.12.53-60.30-default
machine: x86_64
processor: x86_64
byteorder: little
LC_ALL: None
LANG: en_US.UTF-8
LOCALE: ('en_US', 'UTF-8')
libhdf5: 1.14.3
libnetcdf: 4.9.2

xarray: 2024.2.0
pandas: 2.2.1
numpy: 1.26.4
scipy: 1.12.0
netCDF4: 1.6.5
pydap: None
h5netcdf: None
h5py: None
Nio: None
zarr: 2.17.1
cftime: 1.6.3
nc_time_axis: None
iris: None
bottleneck: None
dask: 2024.3.1
distributed: 2024.3.1
matplotlib: 3.8.3
cartopy: 0.22.0
seaborn: None
numbagg: None
fsspec: 2024.3.0
cupy: None
pint: None
sparse: None
flox: None
numpy_groupies: None
setuptools: 69.2.0
pip: 24.0
conda: None
pytest: None
mypy: None
IPython: 8.22.2
sphinx: None
```