issue_comments

Comment by kmuehlbauer (MEMBER) on pydata/xarray#5739, created 2023-04-28T12:00:15Z
https://github.com/pydata/xarray/issues/5739#issuecomment-1527461082

@dougrichardson Sorry for the delay. If you are still interested in the source of this issue, here is what I found:

The root cause is that the source files have different scale_factor and add_offset values.

When merging, only the .encoding of the first dataset survives. This leads to a wrongly encoded file for the May dates. But why is this so?

The issue is the packed dtype ("int16") combined with the particular values of scale_factor/add_offset.

For February the dynamic range is (228.96394336525748, 309.9690856933594) K, whereas for May it is (205.7644192729947, 311.7797088623047) K.
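The dynamic range follows directly from the int16 limits, since unpacking is `packed * scale_factor + add_offset` with packed in [-32768, 32767]. A minimal sketch (the scale/offset values here are illustrative placeholders, not the actual ones from the files):

```python
import numpy as np

def packed_range(scale_factor, add_offset, dtype=np.int16):
    """Unpacked (min, max) reachable via `packed * scale_factor + add_offset`."""
    info = np.iinfo(dtype)
    return (info.min * scale_factor + add_offset,
            info.max * scale_factor + add_offset)

# Illustrative values only -- the comment does not quote the actual
# scale_factor/add_offset stored in the files.
lo, hi = packed_range(scale_factor=0.001, add_offset=270.0)
# int16 spans [-32768, 32767], so this encoding covers roughly 270 +/- 32.77 K
```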

Now we can clearly see that all values above 309.969 K will be folded (wrapped around) to the lower end, landing just above 229 K.
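The folding is plain integer wrap-around: a value above the encoding's ceiling packs to an integer larger than 32767, which wraps when cast to int16. A sketch with hypothetical encoding values (chosen only to roughly match the February range quoted above):

```python
import numpy as np

# Hypothetical encoding values for illustration, not the actual file values.
scale_factor, add_offset = 0.001236, 269.47
value = 311.5  # above the ~309.97 K ceiling of this encoding

packed = np.round(np.array([(value - add_offset) / scale_factor])).astype(np.int64)
wrapped = packed.astype(np.int16)               # silently wraps past 32767
decoded = wrapped * scale_factor + add_offset   # ends up near the lower end
print(int(packed[0]), int(wrapped[0]), round(float(decoded[0]), 2))
# -> 34005 -31531 230.5
```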

To circumvent that, you have at least two options:

  • change the scale_factor and add_offset values in the variable's .encoding to values that cover your whole dynamic range before writing
  • drop scale_factor/add_offset (and other CF-related attributes) from .encoding to write floating-point values
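Both options can be sketched as follows. The variable name `t2m` and the data are placeholders, and the CF-style recomputation of scale_factor/add_offset is one common recipe, not the only valid one:

```python
import numpy as np
import xarray as xr

# Placeholder dataset standing in for the merged Feb+May data.
merged = xr.Dataset({"t2m": ("time", np.array([205.76, 311.78]))})

# Option 1: recompute scale_factor/add_offset so the int16 packing covers
# the full dynamic range of the merged data, reserving -32768 for _FillValue.
vmin, vmax = float(merged["t2m"].min()), float(merged["t2m"].max())
merged["t2m"].encoding.update({
    "dtype": "int16",
    "scale_factor": (vmax - vmin) / (2**16 - 2),
    "add_offset": (vmax + vmin) / 2,
    "_FillValue": -32768,
})
# merged.to_netcdf("packed.nc")

# Option 2: drop the packing-related encoding and write plain floats.
for key in ("dtype", "scale_factor", "add_offset", "_FillValue"):
    merged["t2m"].encoding.pop(key, None)
# merged.to_netcdf("unpacked.nc")
```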

It might be nice to have checks for this in the encoding step, to prevent writing erroneous values. So this is not really a bug, but it would be less of a problem if encoding were dropped on operations (see the discussion in #6323).

Issue: Writing and reopening introduces bad values
