issue_comments

5 rows where author_association = "MEMBER", issue = 99836561 (time decoding error with "days since") and user = 1197350 (rabernat), sorted by updated_at descending

474898722 · rabernat (1197350) · MEMBER · 2019-03-20T15:55:15Z
https://github.com/pydata/xarray/issues/521#issuecomment-474898722

@klindsay28 -- thanks for the clarification. You're clearly right about 2, and I was misinformed. The problem is that 3 makes it impossible to follow the CF convention rules to overcome 2 (which xarray would try to do).

474840796 · rabernat (1197350) · MEMBER · created 2019-03-20T13:58:09Z · updated 2019-03-20T13:58:46Z
https://github.com/pydata/xarray/issues/521#issuecomment-474840796

It's important to be clear that the issues 2 and 3 that @spencerkclark pointed out are objectively errors in the metadata. We have worked very hard over many years to enable xarray to correctly parse CF-compliant dates with non-standard calendars. But xarray cannot and should not be expected to magically fix metadata that is inconsistent or incomplete.

You really need to bring these issues to the attention of whoever generated some_CESM_output_file.nc.

129189753 · rabernat (1197350) · MEMBER · 2015-08-09T14:00:37Z
https://github.com/pydata/xarray/issues/521#issuecomment-129189753

@jhamman Thanks for the clear explanation! One of the main uses for non-standard calendars would be climate model "control runs", which don't occur at any specific point in historical time but still have seasonal cycles, well-defined months, etc. It would be nice to have "group by" functionality for these datasets. But I do see how this is impossible with the current numpy datetime64 datatype. Perhaps the long-term fix is to implement non-standard calendars within numpy itself.
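
[Editorial aside: later xarray versions added cftime-backed time indexes that support exactly this kind of grouping. A minimal sketch, assuming a recent xarray with the optional cftime dependency installed -- not something available when this comment was written:]

import numpy as np
import xarray as xr

# A "noleap"-calendar monthly time axis, as in a climate-model control run
# that is not anchored to any real historical date.
times = xr.cftime_range("0001-01-01", periods=24, freq="MS", calendar="noleap")
ds = xr.Dataset(
    {"tas": ("time", 280 + 10 * np.sin(2 * np.pi * np.arange(24) / 12))},
    coords={"time": times},
)

# groupby on the virtual "time.month" coordinate works even though these
# dates cannot be represented as numpy.datetime64 values.
monthly_climatology = ds.groupby("time.month").mean()
print(monthly_climatology["tas"])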

129069915 · rabernat (1197350) · MEMBER · 2015-08-08T23:30:38Z
https://github.com/pydata/xarray/issues/521#issuecomment-129069915

The PR above fixes this issue. However, since my model years are in the range 100-200, I am still getting the warning

RuntimeWarning: Unable to decode time axis into full numpy.datetime64 objects, continuing using dummy netCDF4.datetime objects instead, reason: dates out of range

and eventually when I try to access the time data, an error with a very long stack trace ending with

pandas/tslib.pyx in pandas.tslib.Timestamp.__new__ (pandas/tslib.c:7638)()
pandas/tslib.pyx in pandas.tslib.convert_to_tsobject (pandas/tslib.c:21232)()
pandas/tslib.pyx in pandas.tslib._check_dts_bounds (pandas/tslib.c:23332)()
OutOfBoundsDatetime: Out of bounds nanosecond timestamp: 100-02-01 00:00:00

I see there is a check in conventions.py that the year has to lie between 1678 and 2262. What is the reason for this?
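
[Editorial aside: a minimal sketch, assuming only pandas, showing where that 1678-2262 window comes from -- pandas stores timestamps at nanosecond resolution in a signed 64-bit integer, which spans only about 584 years:]

import pandas as pd

# The representable range of a nanosecond-resolution datetime64[ns] value.
print(pd.Timestamp.min)  # 1677-09-21 00:12:43.145224193
print(pd.Timestamp.max)  # 2262-04-11 23:47:16.854775807

# On nanosecond-only pandas (the versions relevant to this comment), a date
# such as 0100-02-01 falls outside this window and raises the same
# OutOfBoundsDatetime seen in the traceback above; newer pandas releases
# can widen the range by using coarser resolutions.
try:
    pd.Timestamp("0100-02-01")
except pd.errors.OutOfBoundsDatetime as err:
    print("OutOfBoundsDatetime:", err)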

129054598 · rabernat (1197350) · MEMBER · 2015-08-08T21:59:54Z
https://github.com/pydata/xarray/issues/521#issuecomment-129054598

In fact I just found a netCDF issue on this topic! Apparently they don't think it should be supported. Unidata/netcdf4-python#442

issue_comments table schema:

CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [issue] INTEGER REFERENCES [issues]([id])
);
CREATE INDEX [idx_issue_comments_issue]
    ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user]
    ON [issue_comments] ([user]);
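
For reference, a minimal sketch of the query behind this page, run against the schema above with Python's standard-library sqlite3 module; the database file name github.db is an assumption, not something given by the page.

import sqlite3

# Hypothetical database file; adjust the path to wherever the Datasette
# SQLite database actually lives.
conn = sqlite3.connect("github.db")

rows = conn.execute(
    """
    SELECT id, user, created_at, updated_at, author_association, body
    FROM issue_comments
    WHERE author_association = 'MEMBER'
      AND issue = 99836561
      AND user = 1197350
    ORDER BY updated_at DESC
    """
).fetchall()

for comment_id, user_id, created_at, updated_at, association, body in rows:
    print(comment_id, user_id, updated_at, body[:60])

conn.close()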