
issue_comments


1 row where issue = 437418525 and user = 2448579, sorted by updated_at descending


id: 490143871
html_url: https://github.com/pydata/xarray/issues/2921#issuecomment-490143871
issue_url: https://api.github.com/repos/pydata/xarray/issues/2921
node_id: MDEyOklzc3VlQ29tbWVudDQ5MDE0Mzg3MQ==
user: dcherian (2448579)
created_at: 2019-05-07T16:04:58Z
updated_at: 2019-05-07T16:06:39Z
author_association: MEMBER
body:

Thanks for the great report @klindsay28.

Looks like CF recommends that time_bounds (a boundary variable) not have the units or calendar attributes:

Since a boundary variable is considered to be part of a coordinate variable’s metadata, it is not necessary to provide it with attributes such as long_name and units.

Boundary variable attributes which determine the coordinate type (units, standard_name, axis and positive) or those which affect the interpretation of the array values (units, calendar, leap_month, leap_year and month_lengths) must always agree exactly with the same attributes of its associated coordinate, scalar coordinate or auxiliary coordinate variable. To avoid duplication, however, it is recommended that these are not provided to a boundary variable.

We already have special treatment for time_bounds in the decode step, so it makes sense to treat it specially in the encode step as well. In fact, it looks like @spencerkclark identified this issue in https://github.com/pydata/xarray/pull/2571 :clap:

@klindsay28 Any interest in putting together a PR that would avoid setting these attributes on time_bounds during the encode step?

Ping @fmaussion for feedback.

reactions:
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
issue: to_netcdf with decoded time can create file with inconsistent time:units and time_bounds:units (437418525)
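The encode-step fix the comment proposes can be sketched in plain Python. This is an illustrative mock-up, not xarray's actual implementation: `strip_bounds_attrs` and its dict-of-attrs input format are hypothetical, and only the CF rule itself (drop the coordinate-mirroring attributes from boundary variables) is taken from the quoted convention text.

```python
# Attributes CF says must agree exactly with the parent coordinate, and which
# it therefore recommends omitting from a boundary variable on encode.
CF_BOUNDS_REDUNDANT_ATTRS = {"units", "calendar", "leap_month", "leap_year", "month_lengths"}

def strip_bounds_attrs(variables):
    """Given a mapping of variable name -> attrs dict, drop the redundant
    attributes from every variable named by some coordinate's 'bounds' attr.
    Hypothetical helper for illustration; not xarray's real encode path."""
    bounds_names = {
        attrs["bounds"] for attrs in variables.values() if "bounds" in attrs
    }
    return {
        name: {
            k: v
            for k, v in attrs.items()
            if not (name in bounds_names and k in CF_BOUNDS_REDUNDANT_ATTRS)
        }
        for name, attrs in variables.items()
    }

# The inconsistency from the report: time and time_bounds carry different units.
example = {
    "time": {"units": "days since 0001-01-01", "calendar": "noleap", "bounds": "time_bounds"},
    "time_bounds": {"units": "days since 2000-01-01", "calendar": "noleap"},
}
encoded = strip_bounds_attrs(example)
# units/calendar are removed from time_bounds but kept on time, so a reader
# falls back to the coordinate's attributes and no mismatch can be written.
```

Dropping the attributes (rather than copying them from the coordinate) follows the quoted CF recommendation directly and makes the inconsistency unrepresentable in the output file.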


CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [issue] INTEGER REFERENCES [issues]([id])
);
CREATE INDEX [idx_issue_comments_issue]
    ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user]
    ON [issue_comments] ([user]);
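The schema above can be exercised with Python's built-in sqlite3 to reproduce the filter this page describes (issue = 437418525 and user = 2448579, sorted by updated_at descending). A minimal sketch; the REFERENCES clauses are omitted since the users and issues tables are not recreated here, and only the one row shown on this page is inserted.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Schema as above, minus foreign-key clauses to the absent users/issues tables.
conn.executescript("""
CREATE TABLE [issue_comments] (
   [html_url] TEXT, [issue_url] TEXT, [id] INTEGER PRIMARY KEY,
   [node_id] TEXT, [user] INTEGER, [created_at] TEXT, [updated_at] TEXT,
   [author_association] TEXT, [body] TEXT, [reactions] TEXT,
   [performed_via_github_app] TEXT, [issue] INTEGER
);
CREATE INDEX [idx_issue_comments_issue] ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user] ON [issue_comments] ([user]);
""")

# The single row this page displays.
conn.execute(
    "INSERT INTO issue_comments (id, [user], issue, updated_at) VALUES (?, ?, ?, ?)",
    (490143871, 2448579, 437418525, "2019-05-07T16:06:39Z"),
)

# The page's filter; both indexes defined above can serve the WHERE clause.
rows = conn.execute(
    "SELECT id FROM issue_comments WHERE issue = ? AND [user] = ? "
    "ORDER BY updated_at DESC",
    (437418525, 2448579),
).fetchall()
# rows == [(490143871,)]
```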
Powered by Datasette · Queries took 302.897ms · About: xarray-datasette