issue_comments
15 rows where issue = 363299007 sorted by updated_at descending
id | html_url | issue_url | node_id | user | created_at | updated_at | author_association | body | reactions | performed_via_github_app | issue
---|---|---|---|---|---|---|---|---|---|---|---
888204512 | https://github.com/pydata/xarray/issues/2436#issuecomment-888204512 | https://api.github.com/repos/pydata/xarray/issues/2436 | IC_kwDOAMm_X8408Ozg | corentincarton 15659891 | 2021-07-28T10:35:42Z | 2021-07-28T10:35:42Z | NONE | Any update on this issue? I'm working on code where I want to make sure I have consistent calendars for all my inputs. Couldn't we add an option to use the encoding from the first file in the list, or something like that? |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
save "encoding" when using open_mfdataset 363299007 | |
661721510 | https://github.com/pydata/xarray/issues/2436#issuecomment-661721510 | https://api.github.com/repos/pydata/xarray/issues/2436 | MDEyOklzc3VlQ29tbWVudDY2MTcyMTUxMA== | mickaellalande 20254164 | 2020-07-21T08:43:04Z | 2020-07-21T08:43:04Z | CONTRIBUTOR | **Description**

Any news about this issue? I am facing the same problem and had to get the calendars by hand... I tried to update xarray, but the calendar is still missing from `time.encoding`.

**Step to reproduce**

Here is a simple example to illustrate:

that gives:

Let's split this dataset and try to read it back with `open_mfdataset`; adding some arguments does not help.

**Xarray version**

Output of `xr.show_versions()`:

```
INSTALLED VERSIONS
------------------
commit: None
python: 3.8.4 | packaged by conda-forge | (default, Jul 17 2020, 15:16:46) [GCC 7.5.0]
python-bits: 64
OS: Linux
OS-release: 4.19.0-9-amd64
machine: x86_64
processor:
byteorder: little
LC_ALL: None
LANG: fr_FR.UTF-8
LOCALE: fr_FR.UTF-8
libhdf5: 1.10.6
libnetcdf: 4.7.4

xarray: 0.16.0
pandas: 1.0.5
numpy: 1.19.0
scipy: None
netCDF4: 1.5.3
pydap: None
h5netcdf: None
h5py: None
Nio: None
zarr: None
cftime: 1.2.1
nc_time_axis: None
PseudoNetCDF: None
rasterio: None
cfgrib: None
iris: None
bottleneck: None
dask: 2.21.0
distributed: 2.21.0
matplotlib: None
cartopy: None
seaborn: None
numbagg: None
pint: None
setuptools: 49.2.0.post20200712
pip: 20.1.1
conda: None
pytest: None
IPython: 7.16.1
sphinx: None
```
|
{ "total_count": 2, "+1": 2, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
save "encoding" when using open_mfdataset 363299007 | |
610466323 | https://github.com/pydata/xarray/issues/2436#issuecomment-610466323 | https://api.github.com/repos/pydata/xarray/issues/2436 | MDEyOklzc3VlQ29tbWVudDYxMDQ2NjMyMw== | sbiner 16655388 | 2020-04-07T15:49:03Z | 2020-04-07T15:49:03Z | NONE |
Yes, here is the output:

```
In [2]: xr.show_versions()

INSTALLED VERSIONS
------------------
commit: None
python: 3.6.9 |Anaconda, Inc.| (default, Jul 30 2019, 19:07:31) [GCC 7.3.0]
python-bits: 64
OS: Linux
OS-release: 3.10.0-514.2.2.el7.x86_64
machine: x86_64
processor: x86_64
byteorder: little
LC_ALL: None
LANG: fr_CA.UTF-8
LOCALE: fr_CA.UTF-8
libhdf5: 1.10.4
libnetcdf: 4.6.1

xarray: 0.15.2.dev29+g6048356
pandas: 1.0.1
numpy: 1.18.1
scipy: 1.4.1
netCDF4: 1.4.2
pydap: None
h5netcdf: None
h5py: 2.9.0
Nio: None
zarr: None
cftime: 1.0.4.2
nc_time_axis: None
PseudoNetCDF: None
rasterio: None
cfgrib: None
iris: None
bottleneck: 1.3.1
dask: 2.10.1
distributed: 2.10.0
matplotlib: 3.0.2
cartopy: 0.16.0
seaborn: 0.9.0
numbagg: None
pint: 0.9
setuptools: 45.2.0.post20200210
pip: 20.0.2
conda: None
pytest: 5.3.4
IPython: 7.8.0
sphinx: 2.4.0
```

Here is the output of `ncdump -hs` on one of the files:

```
11:41 neree ~/travail/xarray_open_mfdataset_perd_time_attributes :ncdump -hs /expl6/climato/arch/bbw/series/200001/snw_bbw_200001_se.nc
netcdf snw_bbw_200001_se {
dimensions:
    height = 1 ;
    rlat = 300 ;
    rlon = 340 ;
    time = UNLIMITED ; // (248 currently)
variables:
    double height(height) ;
        height:units = "m" ;
        height:long_name = "height" ;
        height:standard_name = "height" ;
        height:axis = "Z" ;
        height:positive = "up" ;
        height:coordinate_defines = "point" ;
        height:actual_range = 0., 0. ;
        height:_Storage = "chunked" ;
        height:_ChunkSizes = 1 ;
        height:_DeflateLevel = 6 ;
        height:_Endianness = "little" ;
    double lat(rlat, rlon) ;
        lat:units = "degrees_north" ;
        lat:long_name = "latitude" ;
        lat:standard_name = "latitude" ;
        lat:actual_range = 7.83627367019653, 82.5695037841797 ;
        lat:_Storage = "chunked" ;
        lat:_ChunkSizes = 50, 50 ;
        lat:_DeflateLevel = 6 ;
        lat:_Endianness = "little" ;
    double lon(rlat, rlon) ;
        lon:units = "degrees_east" ;
        lon:long_name = "longitude" ;
        lon:standard_name = "longitude" ;
        lon:actual_range = -179.972747802734, 179.975296020508 ;
        lon:_Storage = "chunked" ;
        lon:_ChunkSizes = 50, 50 ;
        lon:_DeflateLevel = 6 ;
        lon:_Endianness = "little" ;
    double rlat(rlat) ;
        rlat:long_name = "latitude in rotated pole grid" ;
        rlat:units = "degrees" ;
        rlat:standard_name = "grid_latitude" ;
        rlat:axis = "Y" ;
        rlat:coordinate_defines = "point" ;
        rlat:actual_range = -30.7100009918213, 35.0699996948242 ;
        rlat:_Storage = "chunked" ;
        rlat:_ChunkSizes = 50 ;
        rlat:_DeflateLevel = 6 ;
        rlat:_Endianness = "little" ;
    double rlon(rlon) ;
        rlon:long_name = "longitude in rotated pole grid" ;
        rlon:units = "degrees" ;
        rlon:standard_name = "grid_longitude" ;
        rlon:axis = "X" ;
        rlon:coordinate_defines = "point" ;
        rlon:actual_range = -33.9900054931641, 40.5899810791016 ;
        rlon:_Storage = "chunked" ;
        rlon:_ChunkSizes = 50 ;
        rlon:_DeflateLevel = 6 ;
        rlon:_Endianness = "little" ;
    char rotated_pole ;
        rotated_pole:grid_mapping_name = "rotated_latitude_longitude" ;
        rotated_pole:grid_north_pole_latitude = 42.5f ;
        rotated_pole:grid_north_pole_longitude = 83.f ;
        rotated_pole:north_pole_grid_longitude = 0.f ;
    float snw(time, rlat, rlon) ;
        snw:units = "kg m-2" ;
        snw:long_name = "Surface Snow Amount" ;
        snw:standard_name = "surface_snow_amount" ;
        snw:realm = "landIce land" ;
        snw:cell_measures = "area: areacella" ;
        snw:coordinates = "lon lat" ;
        snw:grid_mapping = "rotated_pole" ;
        snw:level_desc = "Height" ;
        snw:cell_methods = "time: point" ;
        snw:_Storage = "chunked" ;
        snw:_ChunkSizes = 250, 50, 50 ;
        snw:_DeflateLevel = 6 ;
        snw:_Endianness = "little" ;
    double time(time) ;
        time:long_name = "time" ;
        time:standard_name = "time" ;
        time:axis = "T" ;
        time:calendar = "gregorian" ;
        time:units = "days since 2000-01-01 00:00:00" ;
        time:coordinate_defines = "point" ;
        time:_Storage = "chunked" ;
        time:_ChunkSizes = 250 ;
        time:_DeflateLevel = 6 ;
        time:_Endianness = "little" ;

// global attributes:
        :Conventions = "CF-1.6" ;
        :contact = "paquin.dominique@ouranos.ca" ;
        :comment = "CRCM5 v3331 0.22 deg AMNO22d2 L56 S17-15m ERA-INTERIM 0,75d PILSPEC PS3" ;
        :creation_date = "2016-08-15 " ;
        :experiment = "simulation de reference " ;
        :experiment_id = "bbw" ;
        :driving_experiment = "ERA-INTERIM " ;
        :driving_model_id = "ECMWF-ERAINT " ;
        :driving_model_ensemble_member = "r1i1p1 " ;
        :driving_experiment_name = "evaluation " ;
        :institution = "Ouranos " ;
        :institute_id = "Our. " ;
        :model_id = "OURANOS-CRCM5" ;
        :rcm_version_id = "v3331" ;
        :project_id = "" ;
        :ouranos_domain_name = "AMNO22d2 " ;
        :ouranos_run_id = "bbw OURALIB 1.3" ;
        :product = "output" ;
        :reference = "http://www.ouranos.ca" ;
        :history = "Mon Nov 7 10:13:55 2016: ncks -O --chunk_policy g3d --cnk_dmn plev,1 --cnk_dmn rlon,50 --cnk_dmn rlat,50 --cnk_dmn time,250 /localscratch/72194520.gm-1r16-n04.guillimin.clumeq.ca/bbw/bbw/200001/nc4c_snw_bbw_200001_se.nc /localscratch/72194520.gm-1r16-n04.guillimin.clumeq.ca/bbw/bbw/200001/snw_bbw_200001_se.nc\n",
            "Mon Nov 7 10:13:50 2016: ncks -O --fl_fmt=netcdf4_classic -L 6 /localscratch/72194520.gm-1r16-n04.guillimin.clumeq.ca/bbw/bbw/200001/trim_snw_bbw_200001_se.nc /localscratch/72194520.gm-1r16-n04.guillimin.clumeq.ca/bbw/bbw/200001/nc4c_snw_bbw_200001_se.nc\n",
            "Mon Nov 7 10:13:48 2016: ncks -d time,2000-01-01 00:00:00,2000-01-31 23:59:59 /home/dpaquin1/postprod/bbw/transit2/200001/snw_bbw_200001_se.nc /localscratch/72194520.gm-1r16-n04.guillimin.clumeq.ca/bbw/bbw/200001/trim_snw_bbw_200001_se.nc\n",
            "Fri Nov 4 12:49:33 2016: ncks -4 -L 1 --no_tmp_fl -u -d time,2000-01-01 00:00,2000-02-01 00:00 /localscratch/72001487.gm-1r16-n04.guillimin.clumeq.ca/I5/snw_bbw_2000_se.nc /home/dpaquin1/postprod/bbw/work/200001/snw_bbw_200001_se.nc\n",
            "Fri Nov 4 12:48:52 2016: ncks -4 -L 1 /localscratch/72001487.gm-1r16-n04.guillimin.clumeq.ca/I5/snw_bbw_2000_se.nc /home/dpaquin1/postprod/bbw/work/2000/snw_bbw_2000_se.nc\n",
            "Fri Nov 4 12:48:44 2016: ncatted -O -a cell_measures,snw,o,c,area: areacella /localscratch/72001487.gm-1r16-n04.guillimin.clumeq.ca/I5/snw_bbw_2000_se.nc 25554_bbb" ;
        :NCO = "4.4.4" ;
        :_SuperblockVersion = 2 ;
        :_IsNetcdf4 = 1 ;
        :_Format = "netCDF-4 classic model" ;
}
```

I guess the next option could be to go into the xarray code to try to find what the problem is, but I would need some direction for doing this.
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
save "encoding" when using open_mfdataset 363299007 | |
610073334 | https://github.com/pydata/xarray/issues/2436#issuecomment-610073334 | https://api.github.com/repos/pydata/xarray/issues/2436 | MDEyOklzc3VlQ29tbWVudDYxMDA3MzMzNA== | keewis 14808389 | 2020-04-06T22:38:30Z | 2020-04-06T22:38:30Z | MEMBER | I removed it since it doesn't change anything. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
save "encoding" when using open_mfdataset 363299007 | |
610053703 | https://github.com/pydata/xarray/issues/2436#issuecomment-610053703 | https://api.github.com/repos/pydata/xarray/issues/2436 | MDEyOklzc3VlQ29tbWVudDYxMDA1MzcwMw== | keewis 14808389 | 2020-04-06T21:45:47Z | 2020-04-06T22:03:08Z | MEMBER | unfortunately,

In #3498, the original proposal was to name the new kwarg

Before trying to help with debugging your issue: could you post the output of `xr.show_versions()`? Also, could you try to demonstrate your issue using a synthetic example? I've been trying to reproduce it with:
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
save "encoding" when using open_mfdataset 363299007 | |
610060062 | https://github.com/pydata/xarray/issues/2436#issuecomment-610060062 | https://api.github.com/repos/pydata/xarray/issues/2436 | MDEyOklzc3VlQ29tbWVudDYxMDA2MDA2Mg== | dcherian 2448579 | 2020-04-06T22:01:13Z | 2020-04-06T22:01:13Z | MEMBER | This example works without |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
save "encoding" when using open_mfdataset 363299007 | |
610020749 | https://github.com/pydata/xarray/issues/2436#issuecomment-610020749 | https://api.github.com/repos/pydata/xarray/issues/2436 | MDEyOklzc3VlQ29tbWVudDYxMDAyMDc0OQ== | sbiner 16655388 | 2020-04-06T20:31:10Z | 2020-04-06T20:31:10Z | NONE |
#3498 says something about a
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
save "encoding" when using open_mfdataset 363299007 | |
610008589 | https://github.com/pydata/xarray/issues/2436#issuecomment-610008589 | https://api.github.com/repos/pydata/xarray/issues/2436 | MDEyOklzc3VlQ29tbWVudDYxMDAwODU4OQ== | TomNicholas 35968931 | 2020-04-06T20:05:10Z | 2020-04-06T20:05:10Z | MEMBER |
No worries!
#3498 added a new keyword argument to `open_mfdataset`
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
save "encoding" when using open_mfdataset 363299007 | |
609998713 | https://github.com/pydata/xarray/issues/2436#issuecomment-609998713 | https://api.github.com/repos/pydata/xarray/issues/2436 | MDEyOklzc3VlQ29tbWVudDYwOTk5ODcxMw== | sbiner 16655388 | 2020-04-06T19:43:55Z | 2020-04-06T19:43:55Z | NONE | @TomNicholas I forgot about this, sorry. I just made a quick check with the latest xarray master and I still have the problem ... see code. A related question, but maybe out of line: is there any way to know that the snw.time type is cftime.DatetimeNoLeap (as it is visible in the overview of
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
save "encoding" when using open_mfdataset 363299007 | |
609479629 | https://github.com/pydata/xarray/issues/2436#issuecomment-609479629 | https://api.github.com/repos/pydata/xarray/issues/2436 | MDEyOklzc3VlQ29tbWVudDYwOTQ3OTYyOQ== | TomNicholas 35968931 | 2020-04-05T20:44:00Z | 2020-04-05T20:44:00Z | MEMBER | @sbiner I know it's been a while, but I expect that #3498 and #3877 probably resolve your issue? |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
save "encoding" when using open_mfdataset 363299007 | |
449737841 | https://github.com/pydata/xarray/issues/2436#issuecomment-449737841 | https://api.github.com/repos/pydata/xarray/issues/2436 | MDEyOklzc3VlQ29tbWVudDQ0OTczNzg0MQ== | TomNicholas 35968931 | 2018-12-24T14:02:57Z | 2018-12-24T14:02:57Z | MEMBER | If So I think to fix this then either |
{ "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
save "encoding" when using open_mfdataset 363299007 | |
424436617 | https://github.com/pydata/xarray/issues/2436#issuecomment-424436617 | https://api.github.com/repos/pydata/xarray/issues/2436 | MDEyOklzc3VlQ29tbWVudDQyNDQzNjYxNw== | sbiner 16655388 | 2018-09-25T17:43:35Z | 2018-09-25T17:43:35Z | NONE | @spencerkclark Yes, I was looking at time.encoding. Following your example I did some tests, and the problem is related to the fact that I am opening multiple netCDF files with open_mfdataset. Doing so, time.encoding is empty, while it is as expected when opening any of the files with open_dataset instead. |
{ "total_count": 2, "+1": 2, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
save "encoding" when using open_mfdataset 363299007 | |
424399958 | https://github.com/pydata/xarray/issues/2436#issuecomment-424399958 | https://api.github.com/repos/pydata/xarray/issues/2436 | MDEyOklzc3VlQ29tbWVudDQyNDM5OTk1OA== | spencerkclark 6628425 | 2018-09-25T15:54:28Z | 2018-09-25T15:54:28Z | MEMBER | @sbiner are you looking at the encoding attribute of the full Dataset or the time variable? The time variable should retain the calendar encoding (the Dataset will not). E.g.:

```
In [1]: import cftime

In [2]: import numpy as np

In [3]: import xarray as xr

In [4]: units = 'days since 2000-02-25'

In [5]: times = cftime.num2date(np.arange(7), units=units, calendar='365_day')

In [6]: da = xr.DataArray(np.arange(7), coords=[times], dims=['time'], name='a')

In [7]: da.to_netcdf('data-noleap.nc')

In [8]: ds = xr.open_dataset('data-noleap.nc')

In [9]: ds.encoding['calendar']
KeyError                                  Traceback (most recent call last)
<ipython-input-38-677c245c7bb8> in <module>()
----> 1 default.encoding['calendar']

KeyError: 'calendar'

In [10]: ds.time.encoding['calendar']
Out[10]: u'noleap'
```
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
save "encoding" when using open_mfdataset 363299007 | |
424117789 | https://github.com/pydata/xarray/issues/2436#issuecomment-424117789 | https://api.github.com/repos/pydata/xarray/issues/2436 | MDEyOklzc3VlQ29tbWVudDQyNDExNzc4OQ== | sbiner 16655388 | 2018-09-24T20:42:20Z | 2018-09-24T20:42:20Z | NONE | It would be ok but it is (or looks) empty when I use open_dataset() |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
save "encoding" when using open_mfdataset 363299007 | |
424111082 | https://github.com/pydata/xarray/issues/2436#issuecomment-424111082 | https://api.github.com/repos/pydata/xarray/issues/2436 | MDEyOklzc3VlQ29tbWVudDQyNDExMTA4Mg== | rabernat 1197350 | 2018-09-24T20:20:04Z | 2018-09-24T20:20:04Z | MEMBER | Do you know you can access them in the `.encoding` attribute? |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
save "encoding" when using open_mfdataset 363299007 |