
issue_comments


2 rows where author_association = "NONE", issue = 558293655 and user = 33062222 sorted by updated_at descending

Comment 590115864 · https://github.com/pydata/xarray/issues/3739#issuecomment-590115864 · avatar101 (33062222) · created 2020-02-23T21:03:53Z · author_association: NONE

@mathause

Thanks for your suggestions. Your first solution works fine for correcting the time data stored in the array. I still don't understand why `ds_test.time.encoding` is empty and yet is the reason for an error while saving. Maybe it's a bug?

@Chan-Jer Another workaround I used was to set the correct time value with CDO's `settime` operator.
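For what it's worth, the empty `encoding` is expected with `decode_cf=False`: CF decoding is what normally moves keys such as `units` and `calendar` from `attrs` into `encoding`, so skipping it leaves them in `attrs`. A minimal plain-Python sketch of that bookkeeping (the `decode` helper here is hypothetical, not xarray's actual code):

```python
# Hypothetical sketch (not xarray internals): CF decoding moves metadata
# keys from attrs into encoding; with decode_cf=False nothing moves.
raw_attrs = {'units': 'hours since 1988-01-01 00:00:00',
             'calendar': 'proleptic_gregorian'}

def decode(attrs, decode_cf):
    attrs = dict(attrs)
    encoding = {}
    if decode_cf:
        for key in ('units', 'calendar'):
            encoding[key] = attrs.pop(key)
    return attrs, encoding

attrs, encoding = decode(raw_attrs, decode_cf=False)
print(encoding)  # {} -- which is why ds_test.time.encoding looks empty
print(attrs)     # the CF keys are still sitting in attrs
```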

Issue: ValueError when trying to encode time variable in a NetCDF file with CF convensions (558293655)
Comment 580935355 · https://github.com/pydata/xarray/issues/3739#issuecomment-580935355 · avatar101 (33062222) · created 2020-01-31T22:17:34Z · author_association: NONE

Hi Ryan, thanks for your reply. Apologies for not providing a reproducible example earlier; the files weren't created by an xarray routine. Please find my attempt at reproducing the problem below:

Minimal steps to reproduce the error

```python
import numpy as np
import xarray as xr
import pandas as pd

data1 = np.ones(shape=(1, 181, 360))
lats = np.arange(-90, 91, 1)
lons = np.arange(-180, 180, 1)
time1 = np.array([0])

# creating the first dataset
da_1 = xr.DataArray(data1, coords=[time1, lats, lons], dims=['time', 'lats', 'lons'])
da_1.time.attrs['units'] = "hours since 1988-01-01 00:00:00"
da_1.time.attrs['calendar'] = "proleptic_gregorian"
da_1.time.attrs['standard_name'] = "time"
ds_1 = xr.Dataset({'V': da_1})
ds_1.attrs['Conventions'] = 'CF'
ds_1.to_netcdf('ds_1.nc', encoding=None)

# creating the second test dataset
time2 = np.array([6])  # wrong time value
da_2 = xr.DataArray(data1, coords=[time2, lats, lons], dims=['time', 'lats', 'lons'])
da_2.time.attrs['units'] = "hours since 1988-01-01 06:00:00"
da_2.time.attrs['calendar'] = "proleptic_gregorian"
da_2.time.attrs['standard_name'] = "time"

ds_2 = xr.Dataset({'V': da_2})
ds_2.attrs['Conventions'] = 'CF'

# saving it with the wrong time value
ds_2.to_netcdf('ds_2.nc', encoding=None)

# reading the two files and concatenating them
files = ['/path/to/ds_1.nc', '/path/to/ds_2.nc']

ds_test = xr.open_mfdataset(files, combine='nested', concat_dim='time', decode_cf=False)
yr = 1988  # year
dates = pd.date_range(start=str(yr), end=str(yr + 1), freq='6H', closed='left')
ds_test.time.values = dates[:2]  # fixing the time values
ds_test.time.attrs['units'] = "Seconds since 1970-01-01 00:00:00"  # required encoding
ds_test.to_netcdf('ds_1_2.nc')  # gives the same error
```

ValueError: failed to prevent overwriting existing key units in attrs on variable 'time'. This is probably an encoding field used by xarray to describe how a variable is serialized. To proceed, remove this key from the variable's attributes manually.
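The guard behind this message can be sketched in plain Python: while serializing, xarray regenerates keys like `units` for the time variable and refuses to overwrite a key already present in `attrs`. The `safe_set` helper below is a hypothetical stand-in for that check, not xarray's actual implementation:

```python
def safe_set(attrs, key, value):
    """Refuse to overwrite a key already present in attrs -- a rough
    sketch of the guard xarray applies during serialization."""
    if key in attrs and attrs[key] != value:
        raise ValueError(
            f"failed to prevent overwriting existing key {key} in attrs "
            f"on variable 'time'")
    attrs[key] = value

# decode_cf=False left the original CF metadata in attrs...
attrs = {'units': 'hours since 1988-01-01 00:00:00'}
# ...so when serialization tries to write the regenerated units, the guard fires:
try:
    safe_set(attrs, 'units', 'Seconds since 1970-01-01 00:00:00')
except ValueError as err:
    print(err)
```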

I also mentioned your suggestion in the original post; it gives the same error message. Please find the following reproducible steps incorporating your suggestion.

Trying the time encoding solution

```python
# reading the files
files = ['/path/to/ds_1.nc', '/path/to/ds_2.nc']

ds_test = xr.open_mfdataset(files, combine='nested', concat_dim='time', decode_cf=False)
yr = 1988  # year
dates = pd.date_range(start=str(yr), end=str(yr + 1), freq='6H', closed='left')
ds_test.time.values = dates[:2]  # fixing the time values

# encoding try
ds_test.time.encoding['units'] = "Seconds since 1970-01-01 00:00:00"
ds_test.to_netcdf('ds_1_2.nc')  # gives the same error
```

ValueError: failed to prevent overwriting existing key calendar in attrs on variable 'time'. This is probably an encoding field used by xarray to describe how a variable is serialized. To proceed, remove this key from the variable's attributes manually.
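The fix the message asks for is to remove the stale CF keys from `attrs` before saving (in xarray terms, something like `ds_test.time.attrs.pop('units', None)`), keeping the desired on-disk metadata only in `encoding`. A plain-dict sketch of that cleanup, with the dicts standing in for `ds_test.time.attrs` and `ds_test.time.encoding` (an illustration, not a tested xarray call):

```python
# These dicts stand in for ds_test.time.attrs and ds_test.time.encoding.
attrs = {'units': 'hours since 1988-01-01 00:00:00',
         'calendar': 'proleptic_gregorian',
         'standard_name': 'time'}
encoding = {}

# Move the conflicting CF keys out of attrs and into encoding...
for key in ('units', 'calendar'):
    if key in attrs:
        encoding[key] = attrs.pop(key)

# ...then set the units actually wanted on disk.
encoding['units'] = 'seconds since 1970-01-01 00:00:00'

print(attrs)     # only non-CF metadata remains, e.g. standard_name
print(encoding)  # units/calendar now live where the writer expects them
```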



CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [issue] INTEGER REFERENCES [issues]([id])
);
CREATE INDEX [idx_issue_comments_issue]
    ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user]
    ON [issue_comments] ([user]);