
issue_comments


2 rows where issue = 279832457 and user = 1554921 sorted by updated_at descending



Comment 350292555

  • html_url: https://github.com/pydata/xarray/issues/1763#issuecomment-350292555
  • issue_url: https://api.github.com/repos/pydata/xarray/issues/1763
  • node_id: MDEyOklzc3VlQ29tbWVudDM1MDI5MjU1NQ==
  • user: neishm (1554921)
  • created_at: 2017-12-08T15:34:01Z
  • updated_at: 2017-12-08T15:34:01Z
  • author_association: CONTRIBUTOR

I think I've duplicated the logic from `_construct_dataarray` into `_encode_coordinates`. Test cases are passing, and my actual files are writing out properly. Hopefully nothing else got broken along the way.
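For reference, the rule in `_construct_dataarray` amounts to attaching a coordinate to a variable only when every one of the coordinate's dimensions is also a dimension of that variable. A minimal standalone sketch of that subset rule (function and variable names are hypothetical, not xarray's actual code), using the coordinates from the test case below:

```python
# Hypothetical sketch of the dimension-subset rule: a coordinate belongs to a
# variable only if all of the coordinate's dims are also dims of the variable.
def coords_for(var_dims, coord_dims_by_name):
    """Return coordinate names whose dims are a subset of var_dims."""
    return sorted(
        name for name, dims in coord_dims_by_name.items()
        if set(dims) <= set(var_dims)
    )

coords = {
    'lon1': ('x1', 'y1'), 'lon2': ('x2', 'y1'), 'lon3': ('x1', 'y2'),
    'lat1': ('x1', 'y1'), 'lat2': ('x2', 'y1'), 'lat3': ('x1', 'y2'),
}

# foo1 has dims (time, x1, y1), so only the (x1, y1) coordinates match.
print(coords_for(('time', 'x1', 'y1'), coords))  # ['lat1', 'lon1']
```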

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  • issue: Multi-dimensional coordinate mixup when writing to netCDF (279832457)
Comment 350015214

  • html_url: https://github.com/pydata/xarray/issues/1763#issuecomment-350015214
  • issue_url: https://api.github.com/repos/pydata/xarray/issues/1763
  • node_id: MDEyOklzc3VlQ29tbWVudDM1MDAxNTIxNA==
  • user: neishm (1554921)
  • created_at: 2017-12-07T16:11:55Z
  • updated_at: 2017-12-07T16:11:55Z
  • author_association: CONTRIBUTOR

I can try putting together a pull request, hopefully without breaking any existing use cases. I just tested switching the `any` condition to `all` in the above code, and it does fix my one test case...

...However, it breaks other cases, such as when there's another axis in the data (such as a time axis). I think the `all` condition would require `time` to be one of the dimensions of the coordinates.
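The difference can be shown with a toy check (function and variable names hypothetical, not the actual `_encode_coordinates` code): `any` attaches a coordinate as soon as it shares one dimension with the variable, while `all` demands that every dimension of the variable, including `time`, appear in the coordinate:

```python
# Toy illustration of why neither condition works once the data variable
# has a time axis that the 2-D lon/lat coordinates lack.
foo1_dims = ('time', 'x1', 'y1')   # data variable dims
lon1_dims = ('x1', 'y1')           # the coordinate that *should* match foo1
lon2_dims = ('x2', 'y1')           # a coordinate belonging to another variable

def any_match(var_dims, coord_dims):
    # `any`: one shared dimension is enough
    return any(d in coord_dims for d in var_dims)

def all_match(var_dims, coord_dims):
    # `all`: every dim of the variable must appear in the coordinate
    return all(d in coord_dims for d in var_dims)

print(any_match(foo1_dims, lon2_dims))  # True  -- too permissive: wrong coord attached
print(all_match(foo1_dims, lon1_dims))  # False -- too strict: 'time' missing from lon1

# The subset test in the other direction behaves as desired:
def subset_match(var_dims, coord_dims):
    # every dim of the *coordinate* must be a dim of the variable
    return set(coord_dims) <= set(var_dims)

print(subset_match(foo1_dims, lon1_dims))  # True
print(subset_match(foo1_dims, lon2_dims))  # False
```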

Here's an updated test case:

```python
import xarray as xr
import numpy as np

zeros1 = np.zeros((1,5,3))
zeros2 = np.zeros((1,6,3))
zeros3 = np.zeros((1,5,4))
d = xr.Dataset({
    'lon1': (['x1','y1'], zeros1.squeeze(0), {}),
    'lon2': (['x2','y1'], zeros2.squeeze(0), {}),
    'lon3': (['x1','y2'], zeros3.squeeze(0), {}),
    'lat1': (['x1','y1'], zeros1.squeeze(0), {}),
    'lat2': (['x2','y1'], zeros2.squeeze(0), {}),
    'lat3': (['x1','y2'], zeros3.squeeze(0), {}),
    'foo1': (['time','x1','y1'], zeros1, {'coordinates': 'lon1 lat1'}),
    'foo2': (['time','x2','y1'], zeros2, {'coordinates': 'lon2 lat2'}),
    'foo3': (['time','x1','y2'], zeros3, {'coordinates': 'lon3 lat3'}),
    'time': ('time', [0.], {'units': 'hours since 2017-01-01'}),
})
d = xr.conventions.decode_cf(d)
```

The resulting Dataset:

```
<xarray.Dataset>
Dimensions:  (time: 1, x1: 5, x2: 6, y1: 3, y2: 4)
Coordinates:
    lat1     (x1, y1) float64 ...
  * time     (time) datetime64[ns] 2017-01-01
    lat3     (x1, y2) float64 ...
    lat2     (x2, y1) float64 ...
    lon1     (x1, y1) float64 ...
    lon3     (x1, y2) float64 ...
    lon2     (x2, y1) float64 ...
Dimensions without coordinates: x1, x2, y1, y2
Data variables:
    foo1     (time, x1, y1) float64 ...
    foo2     (time, x2, y1) float64 ...
    foo3     (time, x1, y2) float64 ...
```

saved to netCDF using

```python
d.to_netcdf("test.nc")
```

With the `any` condition, I have too many coordinates:

```
~$ ncdump -h test.nc
netcdf test {
dimensions:
    x1 = 5 ;
    y1 = 3 ;
    time = 1 ;
    y2 = 4 ;
    x2 = 6 ;
variables:
    ...
    double foo1(time, x1, y1) ;
        foo1:_FillValue = NaN ;
        foo1:coordinates = "lat1 lat3 lat2 lon1 lon3 lon2" ;
    double foo2(time, x2, y1) ;
        foo2:_FillValue = NaN ;
        foo2:coordinates = "lon1 lon2 lat1 lat2" ;
    double foo3(time, x1, y2) ;
        foo3:_FillValue = NaN ;
        foo3:coordinates = "lon1 lon3 lat1 lat3" ;
    ...
}
```

With the `all` condition, I don't get any variable coordinates (they're dumped into the global attributes):

```
~$ ncdump -h test.nc
netcdf test {
dimensions:
    x1 = 5 ;
    y1 = 3 ;
    time = 1 ;
    y2 = 4 ;
    x2 = 6 ;
variables:
    ...
    double foo1(time, x1, y1) ;
        foo1:_FillValue = NaN ;
    double foo2(time, x2, y1) ;
        foo2:_FillValue = NaN ;
    double foo3(time, x1, y2) ;
        foo3:_FillValue = NaN ;

// global attributes:
        :_NCProperties = "version=1|netcdflibversion=4.4.1.1|hdf5libversion=1.8.18" ;
        :coordinates = "lat1 lat3 lat2 lon1 lon3 lon2" ;
}
```

So the update may be a bit trickier to get right. I know the DataArray objects (`foo1`, `foo2`, `foo3`) already have the right coordinates associated with them before writing to netCDF, so maybe the logic in `_encode_coordinates` could be changed to utilize `v.coords` somehow? I'll see if I can get something working for my test cases...
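One way to act on that idea (a sketch of the approach, not the actual `_encode_coordinates` implementation) is to build each variable's CF `coordinates` attribute directly from the names xarray already exposes via `.coords`, using a stripped-down version of the dataset above:

```python
# Sketch: derive each data variable's CF "coordinates" attribute from the
# coordinates xarray already associates with it via .coords, instead of
# re-deriving them from dimension overlap.
import numpy as np
import xarray as xr

zeros1 = np.zeros((1, 5, 3))
d = xr.Dataset({
    'lon1': (['x1', 'y1'], zeros1.squeeze(0)),
    'lat1': (['x1', 'y1'], zeros1.squeeze(0)),
    'foo1': (['time', 'x1', 'y1'], zeros1),
    'time': ('time', [0.]),
})
d = d.set_coords(['lon1', 'lat1'])

for name, v in d.data_vars.items():
    # skip index coordinates like 'time'; only multi-dimensional
    # auxiliary coordinates belong in the attribute
    aux = [c for c in v.coords if c not in v.dims]
    if aux:
        print(name, '->', ' '.join(sorted(aux)))  # foo1 -> lat1 lon1
```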

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  • issue: Multi-dimensional coordinate mixup when writing to netCDF (279832457)

CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [issue] INTEGER REFERENCES [issues]([id])
);
CREATE INDEX [idx_issue_comments_issue]
    ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user]
    ON [issue_comments] ([user]);
Powered by Datasette · Queries took 9.989ms · About: xarray-datasette