issue_comments

2 rows where issue = 33112594 and user = 2443309 sorted by updated_at descending

id: 42785413 · user: jhamman (2443309) · author_association: MEMBER
created_at: 2014-05-11T22:26:54Z · updated_at: 2014-05-11T22:26:54Z
html_url: https://github.com/pydata/xarray/issues/118#issuecomment-42785413
issue_url: https://api.github.com/repos/pydata/xarray/issues/118
node_id: MDEyOklzc3VlQ29tbWVudDQyNzg1NDEz

@shoyer - my experience is that the dummy netCDF4.datetime objects don't play nicely with setting up a pandas time index, so an intermediate conversion step is necessary. I haven't looked into exactly why this is.

I just tried the new decoding and it seems to work.

reactions:
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
issue: Problems parsing time variable using open_dataset (33112594)
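
As a rough illustration of the intermediate conversion step described in the comment above, the sketch below converts the decoded netCDF4 times to real datetime.datetime objects before building a pandas index. The file and variable names ('sample_for_xray.nc', 'time') are taken from the snippet in the next comment; the use of pandas.DatetimeIndex is an assumption about how the index would be built, not code from the thread.

```python
import datetime

import netCDF4
import pandas as pd

# Decode the raw time values; depending on the calendar, num2date may return
# "dummy" netCDF4/cftime datetime objects rather than standard-library datetimes.
f = netCDF4.Dataset('sample_for_xray.nc')
time_var = f.variables['time']
decoded = netCDF4.num2date(time_var[:], time_var.units, time_var.calendar)

# Intermediate conversion: coerce each decoded value to a true datetime.datetime
# so that pandas can build a proper time index from it.
converted = [datetime.datetime(*t.timetuple()[:6]) for t in decoded]
index = pd.DatetimeIndex(converted)
```
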
id: 42598104 · user: jhamman (2443309) · author_association: MEMBER
created_at: 2014-05-08T19:54:54Z · updated_at: 2014-05-08T19:54:54Z
html_url: https://github.com/pydata/xarray/issues/118#issuecomment-42598104
issue_url: https://api.github.com/repos/pydata/xarray/issues/118
node_id: MDEyOklzc3VlQ29tbWVudDQyNTk4MTA0

Thanks, the decode_cf keyword should get me around the problem for now.

I've made a habit of always converting my netCDF4.datetime objects to true datetime.datetime objects right away, since netCDF4 only returns real datetime objects for Gregorian calendars.

```python
import datetime

import netCDF4

f = netCDF4.Dataset('sample_for_xray.nc')
decoded_times = netCDF4.num2date(f.variables['time'][:],
                                 f.variables['time'].units,
                                 f.variables['time'].calendar)
# Convert the decoded times to standard-library datetimes in place.
for i, t in enumerate(decoded_times):
    decoded_times[i] = datetime.datetime(*t.timetuple()[:6])
```

The important thing to remember if you do this is that you have to be careful about how you calculate timedeltas between these dates, since they now think they are on the Gregorian calendar. I usually keep an ordinal-based time array around for that reason.

reactions:
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
issue: Problems parsing time variable using open_dataset (33112594)
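
The timedelta pitfall mentioned in the comment above can be sidestepped by computing deltas from the raw numeric time values in the file rather than from the converted datetime.datetime objects. This is a minimal sketch of that ordinal-style workaround; it reuses the file and variable names from the comment's snippet and is an interpretation of "keep an ordinal based time array around", not code from the thread.

```python
import netCDF4
import numpy as np

f = netCDF4.Dataset('sample_for_xray.nc')
time_var = f.variables['time']

# The raw values (e.g. "days since ...") follow the file's own calendar, so
# differences taken here stay correct even for non-Gregorian calendars such as
# 'noleap'. Subtracting converted datetime.datetime objects instead would
# silently assume the Gregorian calendar.
raw_times = np.asarray(time_var[:], dtype=float)
deltas = np.diff(raw_times)  # spacing between steps, in the units of time_var.units
```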

CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [issue] INTEGER REFERENCES [issues]([id])
);
CREATE INDEX [idx_issue_comments_issue]
    ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user]
    ON [issue_comments] ([user]);
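
For reference, the filtered view described at the top of the page (2 rows where issue = 33112594 and user = 2443309, sorted by updated_at descending) corresponds roughly to the query in this sketch, run against a hypothetical local copy of the database; the filename "github.db" is an assumption.

```python
import sqlite3

# Hypothetical local copy of the database behind this page.
conn = sqlite3.connect("github.db")
rows = conn.execute(
    """
    SELECT id, user, created_at, updated_at, author_association, body
    FROM issue_comments
    WHERE issue = ? AND user = ?
    ORDER BY updated_at DESC
    """,
    (33112594, 2443309),
).fetchall()

for comment_id, user_id, created, updated, association, body in rows:
    print(comment_id, user_id, created, updated, association)
```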