issue_comments

10 rows where issue = 262930380 sorted by updated_at descending


user 3

  • jhamman 5
  • fmaussion 3
  • shoyer 2

issue 1

  • fix to_netcdf append bug (GH1215) · 10 ✖

author_association 1

  • MEMBER 10
id html_url issue_url node_id user created_at updated_at ▲ author_association body reactions performed_via_github_app issue
339216097 https://github.com/pydata/xarray/pull/1609#issuecomment-339216097 https://api.github.com/repos/pydata/xarray/issues/1609 MDEyOklzc3VlQ29tbWVudDMzOTIxNjA5Nw== shoyer 1217238 2017-10-25T05:09:01Z 2017-10-25T05:09:01Z MEMBER

@jhamman This looks great, thank you!

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  fix to_netcdf append bug (GH1215) 262930380
339198772 https://github.com/pydata/xarray/pull/1609#issuecomment-339198772 https://api.github.com/repos/pydata/xarray/issues/1609 MDEyOklzc3VlQ29tbWVudDMzOTE5ODc3Mg== jhamman 2443309 2017-10-25T02:56:55Z 2017-10-25T02:56:55Z MEMBER

@shoyer - take another look. I have basically merged our two ideas and refactored the roundtrip tests. Tests are still failing, but not for py2.7 on appveyor or for py3.6 locally.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  fix to_netcdf append bug (GH1215) 262930380
339005604 https://github.com/pydata/xarray/pull/1609#issuecomment-339005604 https://api.github.com/repos/pydata/xarray/issues/1609 MDEyOklzc3VlQ29tbWVudDMzOTAwNTYwNA== jhamman 2443309 2017-10-24T14:17:27Z 2017-10-24T14:17:27Z MEMBER

@shoyer - ready for final review.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  fix to_netcdf append bug (GH1215) 262930380
338367037 https://github.com/pydata/xarray/pull/1609#issuecomment-338367037 https://api.github.com/repos/pydata/xarray/issues/1609 MDEyOklzc3VlQ29tbWVudDMzODM2NzAzNw== shoyer 1217238 2017-10-21T06:06:16Z 2017-10-21T06:06:16Z MEMBER

@jhamman I think this has something to do with string/bytes. The data gets written as Unicode strings but then read back in as bytes, which is obviously not ideal.

If you want to ignore this, like the current tests, you can use assert_allclose() which has a flag for ignoring string/bytes issues (yeah, I know it's a nasty hack). In the long run we need to figure out how to solve issues like https://github.com/pydata/xarray/issues/1638. It seems like https://github.com/Unidata/netcdf-c/issues/402 is likely relevant.
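
For context, a minimal sketch of the workaround described here, assuming the public `xarray.testing.assert_allclose` and its `decode_bytes` flag (the helper shoyer refers to is the internal test-suite variant, which may differ):

```python
import numpy as np
import xarray as xr
from xarray.testing import assert_allclose

# Round-trip a dataset containing a string variable through netCDF.
original = xr.Dataset({'name': ('x', np.array(['foo', 'bar']))})
original.to_netcdf('roundtrip.nc')

with xr.open_dataset('roundtrip.nc') as restored:
    # decode_bytes=True compares byte strings and unicode strings as equal,
    # papering over the bytes-vs-str mismatch described above.
    assert_allclose(original, restored, decode_bytes=True)
```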

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  fix to_netcdf append bug (GH1215) 262930380
338363428 https://github.com/pydata/xarray/pull/1609#issuecomment-338363428 https://api.github.com/repos/pydata/xarray/issues/1609 MDEyOklzc3VlQ29tbWVudDMzODM2MzQyOA== jhamman 2443309 2017-10-21T04:35:55Z 2017-10-21T04:35:55Z MEMBER

Question for those who are familiar with the scipy backend. I have a few failing tests here on the scipy backend and I'm not really sure what's going on. It seems like it could be related to the mmap feature in scipy.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  fix to_netcdf append bug (GH1215) 262930380
337930499 https://github.com/pydata/xarray/pull/1609#issuecomment-337930499 https://api.github.com/repos/pydata/xarray/issues/1609 MDEyOklzc3VlQ29tbWVudDMzNzkzMDQ5OQ== fmaussion 10050469 2017-10-19T14:43:48Z 2017-10-19T14:43:48Z MEMBER

Thanks! I like it: simple and in accordance with netcdf4's silent overwriting. It would be cool to describe this behavior in the documentation somewhere, and maybe add a test that the data is correctly overwritten?
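
For illustration, a minimal pytest-style sketch of the kind of overwrite test suggested here, with hypothetical names (whether the merged PR overwrites or raises is still being decided in this thread):

```python
import numpy as np
import xarray as xr


def test_append_overwrites_existing_variable(tmp_path):
    # Hypothetical test: appending a dataset that redefines 'lat' should
    # replace the stored values if the overwrite-on-append behavior is kept.
    path = tmp_path / 'overwrite.nc'
    xr.Dataset({'lat': ('lat', [1.0])}).to_netcdf(path, mode='w')
    xr.Dataset({'lat': ('lat', [2.0])}).to_netcdf(path, mode='a')

    with xr.open_dataset(path) as ds:
        np.testing.assert_array_equal(ds['lat'].values, [2.0])
```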

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  fix to_netcdf append bug (GH1215) 262930380
337754946 https://github.com/pydata/xarray/pull/1609#issuecomment-337754946 https://api.github.com/repos/pydata/xarray/issues/1609 MDEyOklzc3VlQ29tbWVudDMzNzc1NDk0Ng== jhamman 2443309 2017-10-18T23:17:48Z 2017-10-18T23:17:48Z MEMBER

@fmaussion - I've updated the append logic slightly. I'm wondering what you think? This version more aggressively overwrites existing variables (data_vars and coords).

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  fix to_netcdf append bug (GH1215) 262930380
335711382 https://github.com/pydata/xarray/pull/1609#issuecomment-335711382 https://api.github.com/repos/pydata/xarray/issues/1609 MDEyOklzc3VlQ29tbWVudDMzNTcxMTM4Mg== fmaussion 10050469 2017-10-11T07:31:53Z 2017-10-11T07:31:53Z MEMBER

netCDF4 would overwrite in this situation, and I am also in favor of overwriting, as this could be quite a useful use case:

```python
from netCDF4 import Dataset

with Dataset('test.nc', 'w', format='NETCDF4') as nc:
    nc.createDimension('lat', 1)
    nc.createVariable('lat', 'f4', ('lat',))
    nc['lat'][:] = 1

with Dataset('test.nc', 'a') as nc:
    nc['lat'][:] = 2
```

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  fix to_netcdf append bug (GH1215) 262930380
335518624 https://github.com/pydata/xarray/pull/1609#issuecomment-335518624 https://api.github.com/repos/pydata/xarray/issues/1609 MDEyOklzc3VlQ29tbWVudDMzNTUxODYyNA== jhamman 2443309 2017-10-10T15:50:09Z 2017-10-10T15:50:09Z MEMBER

@fmaussion - you bring up a good point. There are two scenarios here.

1) appending to a file with existing data variables
2) appending to a file with existing coordinate variables

I'm wondering if we should disallow 1, in favor of being more explicit. So:

```python
list_of_vars_to_append = ['var1', 'var2']
ds[list_of_vars_to_append].to_netcdf(filename, mode='a')

# if either var1 or var2 are in filename, raise an error?
```

This is as opposed to the current behavior, which would silently skip all vars already in `filename` and not in `list_of_vars_to_append`.
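
A hypothetical sketch of the explicit check proposed above; the names are illustrative and this is not necessarily the logic the PR adopts:

```python
def check_append_conflicts(new_vars, existing_vars):
    # Illustrative helper: refuse to append variables that already exist
    # in the target file, instead of silently skipping or overwriting them.
    conflicts = sorted(set(new_vars) & set(existing_vars))
    if conflicts:
        raise ValueError(
            f"Variables {conflicts} already exist in the target file; "
            "drop them from the dataset or open the file in write mode."
        )
```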

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  fix to_netcdf append bug (GH1215) 262930380
335411346 https://github.com/pydata/xarray/pull/1609#issuecomment-335411346 https://api.github.com/repos/pydata/xarray/issues/1609 MDEyOklzc3VlQ29tbWVudDMzNTQxMTM0Ng== fmaussion 10050469 2017-10-10T09:12:57Z 2017-10-10T09:12:57Z MEMBER

Thanks @jhamman !

With this fix, existing variables will silently be ignored and won't be written, right? Maybe the expected behaviour (or spec) of the "append" option should be written somewhere in the docs and tested, to avoid future regressions like the one we had in https://github.com/pydata/xarray/issues/1215.
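
A sketch of the kind of regression test suggested here, pytest-style with hypothetical names (the tests actually added by the PR may look different):

```python
import numpy as np
import xarray as xr


def test_append_adds_new_variable_and_keeps_existing_data(tmp_path):
    # Hypothetical regression test for to_netcdf(mode='a'):
    # appending a new variable must not clobber what is already in the file.
    path = tmp_path / 'append.nc'
    xr.Dataset({'a': ('x', [1, 2, 3])}).to_netcdf(path, mode='w')
    xr.Dataset({'b': ('x', [4, 5, 6])}).to_netcdf(path, mode='a')

    with xr.open_dataset(path) as ds:
        np.testing.assert_array_equal(ds['a'].values, [1, 2, 3])
        np.testing.assert_array_equal(ds['b'].values, [4, 5, 6])
```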

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  fix to_netcdf append bug (GH1215) 262930380

CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [issue] INTEGER REFERENCES [issues]([id])
);
CREATE INDEX [idx_issue_comments_issue]
    ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user]
    ON [issue_comments] ([user]);