id,node_id,number,title,user,state,locked,assignee,milestone,comments,created_at,updated_at,closed_at,author_association,active_lock_reason,draft,pull_request,body,reactions,performed_via_github_app,state_reason,repo,type
29008494,MDU6SXNzdWUyOTAwODQ5NA==,55,Allow datetime.timedelta coordinates.,514053,closed,0,,,4,2014-03-07T23:56:39Z,2014-12-12T09:41:01Z,2014-12-12T09:41:01Z,CONTRIBUTOR,,,,"This would allow you to have coordinates which are offsets from a time coordinate, which comes in handy when dealing with forecast data, where the 'time' coordinate might be the forecast run time and you then want a 'lead' coordinate which is an offset from the run time.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/55/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue
28600785,MDU6SXNzdWUyODYwMDc4NQ==,39,OpenDAP loaded Dataset has lon/lats with type 'object'.,514053,closed,0,,,4,2014-03-03T06:07:17Z,2014-03-24T07:21:02Z,2014-03-24T07:21:02Z,CONTRIBUTOR,,,,"```
ds = xray.open_dataset('http://motherlode.ucar.edu/thredds/dodsC/grib/NCEP/GFS/Global_0p5deg/files/GFS_Global_0p5deg_20140303_0000.grib2', decode_cf=False)

In [4]: ds['lat'].dtype
Out[4]: dtype('O')
```

This makes serialization fail.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/39/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue
28445412,MDU6SXNzdWUyODQ0NTQxMg==,26,Allow the ability to add/persist details of how a dataset is stored.,514053,closed,0,,,4,2014-02-27T19:10:38Z,2014-03-03T02:54:16Z,2014-03-03T02:54:16Z,CONTRIBUTOR,,,,"Both issues https://github.com/akleeman/xray/pull/20 and https://github.com/akleeman/xray/pull/21 deal with similar conceptual issues.

Namely, sometimes the user may want fine control over how a dataset is stored (integer packing, time units and calendars, ...). Taking time as an example, the current model interprets the units and calendar in order to create a DatetimeIndex, but then throws out those attributes, so if the dataset were re-serialized the units may not be preserved. One proposed solution to this issue is to include a distinct set of encoding attributes that would hold things like 'scale_factor' and 'add_offset', allowing something like this:

```
ds['time'] = ('time', pd.date_range('1999-01-05', periods=10))
ds['time'].encoding['units'] = 'days since 1989-08-19'
ds.dump('netcdf.nc')

> ncdump -h
...
int time(time) ;
time:units = ""days since 1989-08-19"" ;
...
```

The encoding attributes could also handle masking, scaling, compression, etc.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/26/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue