
issue_comments


8 rows where user = 33062222 sorted by updated_at descending


issue 5

  • Dataset.to_netcdf() fails to interpret 'encoding' option 2
  • add info in doc on how to facet with cartopy 2
  • ValueError when trying to encode time variable in a NetCDF file with CF convensions 2
  • How to broadcast along dayofyear 1
  • Dataset.to_netcdf() results in unexpected encoding parameters for 'netCDF4' backend 1

user 1

  • avatar101 · 8

author_association 1

  • NONE 8
id html_url issue_url node_id user created_at updated_at ▲ author_association body reactions performed_via_github_app issue
590115864 https://github.com/pydata/xarray/issues/3739#issuecomment-590115864 https://api.github.com/repos/pydata/xarray/issues/3739 MDEyOklzc3VlQ29tbWVudDU5MDExNTg2NA== avatar101 33062222 2020-02-23T21:03:53Z 2020-02-23T21:03:53Z NONE

@mathause

Thanks for your suggestions. Your first solution works fine for correcting the time data stored in the array. I still don't understand why ds_test.time.encoding is empty and yet it's the reason for an error while saving. Maybe it's a bug?

@Chan-Jer Another workaround I used was to set the correct time value with CDO's settime function.
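
A minimal sketch of that CDO-based workaround, assuming the `cdo` command-line tool is installed and on PATH; the file names and the 06:00:00 timestamp are placeholders, not values from this thread:

```python
# Hedged sketch of the CDO workaround mentioned above (illustrative only).
import subprocess

# cdo settime,HH:MM:SS sets the time of every timestep to the given value.
subprocess.run(["cdo", "settime,06:00:00", "ds_2.nc", "ds_2_fixed.nc"], check=True)
```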

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  ValueError when trying to encode time variable in a NetCDF file with CF convensions 558293655
580935355 https://github.com/pydata/xarray/issues/3739#issuecomment-580935355 https://api.github.com/repos/pydata/xarray/issues/3739 MDEyOklzc3VlQ29tbWVudDU4MDkzNTM1NQ== avatar101 33062222 2020-01-31T22:17:34Z 2020-01-31T22:17:34Z NONE

Hi Ryan, thanks for your reply. Apologies for not creating a reproducible example earlier; the files weren't created by an xarray routine. Please find my attempt at reproducing the problem below:

Minimal steps to reproduce the error:

```python
import numpy as np
import xarray as xr
import pandas as pd

data1 = np.ones(shape=(1, 181, 360))
lats = np.arange(-90, 91, 1)
lons = np.arange(-180, 180, 1)
time1 = np.array([0])

# creating the first dataset
da_1 = xr.DataArray(data1, coords=[time1, lats, lons], dims=['time', 'lats', 'lons'])
da_1.time.attrs['units'] = "hours since 1988-01-01 00:00:00"
da_1.time.attrs['calendar'] = "proleptic_gregorian"
da_1.time.attrs['standard_name'] = "time"
ds_1 = xr.Dataset({'V': da_1})
ds_1.attrs['Conventions'] = 'CF'
ds_1.to_netcdf('ds_1.nc', encoding=None)

# creating the second test dataset
time2 = np.array([6])  # wrong time value
da_2 = xr.DataArray(data1, coords=[time2, lats, lons], dims=['time', 'lats', 'lons'])
da_2.time.attrs['units'] = "hours since 1988-01-01 06:00:00"
da_2.time.attrs['calendar'] = "proleptic_gregorian"
da_2.time.attrs['standard_name'] = "time"

ds_2 = xr.Dataset({'V': da_2})
ds_2.attrs['Conventions'] = 'CF'

# saving it with the wrong time value
ds_2.to_netcdf('ds_2.nc', encoding=None)

# reading the two files and concatenating them
files = ['/path/to/ds_1.nc', '/path/to/ds_2.nc']

ds_test = xr.open_mfdataset(files, combine='nested', concat_dim='time', decode_cf=False)
yr = 1988  # year
dates = pd.date_range(start=str(yr), end=str(yr + 1), freq='6H', closed='left')
ds_test.time.values = dates[:2]  # fixing the time values
ds_test.time.attrs['units'] = "Seconds since 1970-01-01 00:00:00"  # required encoding
ds_test.to_netcdf('ds_1_2.nc')  # gives the same error
```

ValueError: failed to prevent overwriting existing key units in attrs on variable 'time'. This is probably an encoding field used by xarray to describe how a variable is serialized. To proceed, remove this key from the variable's attributes manually.

I've also mentioned your suggestion earlier in the original post; it gives the same error message. Please find the following reproducible steps incorporating your suggestion.

Trying the time encoding solution:

```python
# reading the files
files = ['/path/to/ds_1.nc', '/path/to/ds_2.nc']

ds_test = xr.open_mfdataset(files, combine='nested', concat_dim='time', decode_cf=False)
yr = 1988  # year
dates = pd.date_range(start=str(yr), end=str(yr + 1), freq='6H', closed='left')
ds_test.time.values = dates[:2]  # fixing the time values

# encoding try
ds_test.time.encoding['units'] = "Seconds since 1970-01-01 00:00:00"
ds_test.to_netcdf('ds_1_2.nc')  # gives the same error
```

ValueError: failed to prevent overwriting existing key calendar in attrs on variable 'time'. This is probably an encoding field used by xarray to describe how a variable is serialized. To proceed, remove this key from the variable's attributes manually.
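
For readers hitting the same ValueError, a minimal sketch of one way past it, following the hint in the message itself (this is an illustration, not a fix posted in the thread): since ds_test.time now holds datetime64 values, the units/calendar keys left over in attrs from the undecoded files are stale, so drop them from attrs and declare the desired on-disk encoding instead.

```python
# Hedged sketch (not from the thread): remove the stale CF keys from attrs,
# as the ValueError suggests, and set them via encoding so xarray's CF time
# encoder writes them when serializing the datetime64 values.
for key in ('units', 'calendar'):
    ds_test.time.attrs.pop(key, None)

ds_test.time.encoding['units'] = "seconds since 1970-01-01 00:00:00"
ds_test.time.encoding['calendar'] = "proleptic_gregorian"
ds_test.to_netcdf('ds_1_2.nc')
```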

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  ValueError when trying to encode time variable in a NetCDF file with CF convensions 558293655
477580493 https://github.com/pydata/xarray/pull/1203#issuecomment-477580493 https://api.github.com/repos/pydata/xarray/issues/1203 MDEyOklzc3VlQ29tbWVudDQ3NzU4MDQ5Mw== avatar101 33062222 2019-03-28T12:48:09Z 2019-03-28T12:48:09Z NONE

@dcherian Thanks a lot for providing an example with another approach.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  add info in doc on how to facet with cartopy 200376941
476769164 https://github.com/pydata/xarray/pull/1203#issuecomment-476769164 https://api.github.com/repos/pydata/xarray/issues/1203 MDEyOklzc3VlQ29tbWVudDQ3Njc2OTE2NA== avatar101 33062222 2019-03-26T17:46:57Z 2019-03-28T12:46:14Z NONE

@vnoel I'm trying to plot coastlines on a facet plot. I tried passing ax.coastlines() as a subplot_kws argument, but that didn't work:

```python
# ds_v_test is my Dataset containing V as a variable
p = ds_v_test.sel(time=slice('2012-12-01', '2012-12-02 18:00'),
                  lat=slice(-90, 0)).V.squeeze().plot.pcolormesh(
    figsize=(16, 12), col='time', col_wrap=2, levels=16,
    cbar_kwargs={'label': 'meridional v (m/s)'},
    subplot_kws={'projection': ccrs.PlateCarree(), 'ax': ax.coastlines()})
```

It gives me:

AttributeError: Unknown property ax

At the moment, I can get around it by using this approach:

```python
for ax in p.axes.flat:
    ax.coastlines()
```
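
For context, a self-contained sketch of that facet-plus-coastlines pattern; the dataset here is synthetic and the variable name "V" is only illustrative, not taken from this comment:

```python
# Hedged sketch: facet an xarray DataArray with a cartopy projection and
# add coastlines to each facet axis afterwards.
import numpy as np
import xarray as xr
import cartopy.crs as ccrs
import matplotlib.pyplot as plt

ds = xr.Dataset(
    {"V": (("time", "lat", "lon"), np.random.rand(4, 91, 180))},
    coords={"time": np.arange(4),
            "lat": np.linspace(-90, 0, 91),
            "lon": np.linspace(-180, 179, 180)},
)

p = ds.V.plot.pcolormesh(
    col="time", col_wrap=2,
    transform=ccrs.PlateCarree(),                    # data coords are lon/lat
    subplot_kws={"projection": ccrs.PlateCarree()},  # map projection per facet
)
for ax in p.axes.flat:
    ax.coastlines()
plt.show()
```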

```
INSTALLED VERSIONS
commit: None
python: 3.6.8 |Anaconda, Inc.| (default, Dec 30 2018, 01:22:34) [GCC 7.3.0]
python-bits: 64
OS: Linux
OS-release: 4.15.0-45-generic
machine: x86_64
processor: x86_64
byteorder: little
LC_ALL: None
LANG: en_US.UTF-8
LOCALE: en_US.UTF-8
libhdf5: 1.10.4
libnetcdf: 4.6.2

xarray: 0.12.0
pandas: 0.24.2
numpy: 1.16.2
scipy: 1.2.1
netCDF4: 1.4.2
pydap: None
h5netcdf: None
h5py: None
Nio: 1.5.5
zarr: None
cftime: 1.0.3.4
nc_time_axis: None
PseudonetCDF: None
rasterio: None
cfgrib: 0.9.6.1.post1
iris: None
bottleneck: None
dask: None
distributed: None
matplotlib: 3.0.3
cartopy: 0.17.0
seaborn: 0.9.0
setuptools: 40.8.0
pip: 19.0.3
conda: None
pytest: None
IPython: 7.3.0
sphinx: None
```


{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  add info in doc on how to facet with cartopy 200376941
462271456 https://github.com/pydata/xarray/issues/2758#issuecomment-462271456 https://api.github.com/repos/pydata/xarray/issues/2758 MDEyOklzc3VlQ29tbWVudDQ2MjI3MTQ1Ng== avatar101 33062222 2019-02-11T09:59:06Z 2019-02-11T09:59:06Z NONE

@shoyer Ah! thanks. Certainly, a better error message would have helped me in this case. I agree that the easiest way is to just let xarray handle the datetime conversion.
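
To illustrate what "letting xarray handle the datetime conversion" means in practice (a sketch of my own, not code from the thread): assign datetime64 values and let to_netcdf pick the CF units and calendar itself.

```python
# Hedged illustration: with datetime64 time values, xarray writes suitable
# CF 'units' and 'calendar' attributes on its own; no manual encoding needed.
import numpy as np
import xarray as xr

ds = xr.Dataset(coords={"time": np.array(["2019-02-11", "2019-02-12"],
                                         dtype="datetime64[ns]")})
ds.to_netcdf("out.nc")
```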

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Dataset.to_netcdf() results in unexpected encoding parameters for 'netCDF4' backend 408426920
461833981 https://github.com/pydata/xarray/issues/709#issuecomment-461833981 https://api.github.com/repos/pydata/xarray/issues/709 MDEyOklzc3VlQ29tbWVudDQ2MTgzMzk4MQ== avatar101 33062222 2019-02-08T15:11:41Z 2019-02-08T15:11:41Z NONE

@shoyer Sure, I found a workaround using CDO, but I can revisit it and provide more details.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Dataset.to_netcdf() fails to interpret 'encoding' option 125386091
461546376 https://github.com/pydata/xarray/issues/709#issuecomment-461546376 https://api.github.com/repos/pydata/xarray/issues/709 MDEyOklzc3VlQ29tbWVudDQ2MTU0NjM3Ng== avatar101 33062222 2019-02-07T18:38:58Z 2019-02-07T18:38:58Z NONE

I'm facing the same problem with the unexpected encoding parameter:

```python
big_ds.to_netcdf(out_dir + 'V250hov_N2000_2018_v4' + '.nc',
                 encoding={'V': {'_FillValue': -999.0},
                           'time': {'units': "seconds since 1970-01-01 00:00:00"}})
```

The error is as follows:

```
File "./R_metric.py", line 61, in <module>
    big_ds.to_netcdf(out_dir + 'V250hov_N2000_2018_v4' + '.nc', encoding={'V': {'_FillValue': -999.0}, 'time': {'units': "seconds since 1970-01-01 00:00:00"}})
File "/usr/local/anaconda3/envs/work_env/lib/python3.6/site-packages/xarray/core/dataset.py", line 1222, in to_netcdf
    compute=compute)
File "/usr/local/anaconda3/envs/work_env/lib/python3.6/site-packages/xarray/backends/api.py", line 718, in to_netcdf
    unlimited_dims=unlimited_dims)
File "/usr/local/anaconda3/envs/work_env/lib/python3.6/site-packages/xarray/backends/api.py", line 761, in dump_to_store
    unlimited_dims=unlimited_dims)
File "/usr/local/anaconda3/envs/work_env/lib/python3.6/site-packages/xarray/backends/common.py", line 266, in store
    unlimited_dims=unlimited_dims)
File "/usr/local/anaconda3/envs/work_env/lib/python3.6/site-packages/xarray/backends/common.py", line 304, in set_variables
    name, v, check, unlimited_dims=unlimited_dims)
File "/usr/local/anaconda3/envs/work_env/lib/python3.6/site-packages/xarray/backends/netCDF4_.py", line 450, in prepare_variable
    unlimited_dims=unlimited_dims)
File "/usr/local/anaconda3/envs/work_env/lib/python3.6/site-packages/xarray/backends/netCDF4_.py", line 225, in _extract_nc4_variable_encoding
    ' %r' % (backend, invalid))
ValueError: unexpected encoding parameters for 'netCDF4' backend: ['units']
```

However, the same syntax works when I try writing out a different file. I'm using xarray version 0.11.0.
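
For readers who hit the same message, a hedged sketch of one common cause (my illustration, not a diagnosis from this thread): 'units' in the encoding dict is only consumed when the time variable actually holds decoded datetime values; if the dataset was opened with decode_times=False, the key falls through to the netCDF4 backend, which rejects it. File names below are placeholders.

```python
# Hedged illustration: with decoded datetimes, the 'units' entry is handled by
# xarray's CF time encoder instead of being passed (and rejected) as a raw
# netCDF4 encoding parameter.
import xarray as xr

ds = xr.open_dataset("input.nc", decode_times=True)  # time -> datetime64
ds.to_netcdf("output.nc",
             encoding={"time": {"units": "seconds since 1970-01-01 00:00:00"}})
```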

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Dataset.to_netcdf() fails to interpret 'encoding' option 125386091
441034802 https://github.com/pydata/xarray/issues/1844#issuecomment-441034802 https://api.github.com/repos/pydata/xarray/issues/1844 MDEyOklzc3VlQ29tbWVudDQ0MTAzNDgwMg== avatar101 33062222 2018-11-22T13:43:23Z 2018-11-22T13:44:48Z NONE

For anyone stumbling upon this thread in the future, I would like to mention that I used the grouping approach suggested above by @spencerkclark to calculate a calendar-day climatology for my dataset, and it works smoothly. The only thing to be careful about is that you can't directly plot the data using:

```
In [1]: da.groupby(month_day_str).mean('time').plot()
Out[1]: TypeError: Plotting requires coordinates to be numeric or dates of type np.datetime64 or datetime.datetime.
```

To get around it, either group by the `modified_ordinal_day`, or convert the grouped coordinate `month_day_str` back to numeric. However, after doing all this I found out that the CDO function also calculates the climatology by the ordinal day of the year, so to be consistent I will stick to that method. It's still good to know that there is a way to group by day and month in xarray if required.
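
As a concrete, hedged sketch of that month-day grouping (synthetic data; the name `month_day_str` mirrors the thread, everything else is illustrative):

```python
# Hedged sketch: group a daily time series by an "MM-DD" label so the same
# calendar day lines up across years, then make the coordinate numeric to plot.
import numpy as np
import pandas as pd
import xarray as xr

time = pd.date_range("2000-01-01", "2001-12-31", freq="D")
da = xr.DataArray(np.random.rand(time.size), coords={"time": time}, dims="time")

month_day_str = xr.DataArray(time.strftime("%m-%d"),
                             coords={"time": da["time"]},
                             dims="time", name="month_day_str")
clim = da.groupby(month_day_str).mean("time")

# For plotting, replace the string labels with a numeric coordinate.
clim = clim.assign_coords(month_day_str=np.arange(clim.sizes["month_day_str"]))
clim.plot()
```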

{
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  How to broadcast along dayofyear 290023410


CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [issue] INTEGER REFERENCES [issues]([id])
);
CREATE INDEX [idx_issue_comments_issue]
    ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user]
    ON [issue_comments] ([user]);