
issue_comments


4 rows where user = 6514690 sorted by updated_at descending




issue 3

  • Problem decoding times in data from OpenDAP server 2
  • Feature request: only allow nearest-neighbor .sel for valid data (not NaN positions) 1
  • Issue with GFS time reference 1

user 1

  • albertotb · 4

author_association 1

  • NONE 4
Columns: id, html_url, issue_url, node_id, user, created_at, updated_at (sort key, descending), author_association, body, reactions, performed_via_github_app, issue
id: 1016260786
html_url: https://github.com/pydata/xarray/issues/644#issuecomment-1016260786
issue_url: https://api.github.com/repos/pydata/xarray/issues/644
node_id: IC_kwDOAMm_X848kuiy
user: albertotb (6514690)
created_at: 2022-01-19T09:47:15Z
updated_at: 2022-01-19T09:47:15Z
author_association: NONE
body:
    I just want to +1 this issue since I'm having the exact same problem. It would be great if .sel(method="nearest") could ignore NaNs as an option.
reactions: {"total_count": 15, "+1": 15, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0}
issue: Feature request: only allow nearest-neighbor .sel for valid data (not NaN positions) (114773593)
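The comment above asks for a variant of xarray's .sel(method="nearest") that skips positions holding NaN. As a rough illustration of the requested behavior, here is a pure-Python sketch (sel_nearest_valid is a hypothetical helper written for this note, not part of xarray's API):

```python
import math

def sel_nearest_valid(coords, values, target):
    """Return the value at the coordinate nearest to `target`,
    skipping positions whose value is NaN.
    Hypothetical helper sketching the requested .sel behavior."""
    candidates = [(abs(c - target), v)
                  for c, v in zip(coords, values)
                  if not math.isnan(v)]
    if not candidates:
        raise KeyError("no valid (non-NaN) data to select from")
    # min over (distance, value) pairs picks the smallest distance
    return min(candidates)[1]

# The value at coordinate 2.0 is NaN: a plain nearest lookup for
# target 2.1 would land there, but the valid-only lookup falls
# through to the value at coordinate 3.0.
coords = [0.0, 1.0, 2.0, 3.0]
values = [10.0, 11.0, math.nan, 13.0]
print(sel_nearest_valid(coords, values, 2.1))  # → 13.0
```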
id: 1011120549
html_url: https://github.com/pydata/xarray/issues/4422#issuecomment-1011120549
issue_url: https://api.github.com/repos/pydata/xarray/issues/4422
node_id: IC_kwDOAMm_X848RHml
user: albertotb (6514690)
created_at: 2022-01-12T14:47:46Z
updated_at: 2022-01-12T14:48:49Z
author_association: NONE
body:
    I just want to add here for reference that this issue was posted earlier but closed at the time as stale. I will leave this here just to link them both and to note that this is fixed: https://github.com/pydata/xarray/issues/827
reactions: {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0}
issue: Problem decoding times in data from OpenDAP server (701062999)
id: 1011121158
html_url: https://github.com/pydata/xarray/issues/827#issuecomment-1011121158
issue_url: https://api.github.com/repos/pydata/xarray/issues/827
node_id: IC_kwDOAMm_X848RHwG
user: albertotb (6514690)
created_at: 2022-01-12T14:48:24Z
updated_at: 2022-01-12T14:48:24Z
author_association: NONE
body:
    Also mentioned in #4422 and fixed in #4506
reactions: {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0}
issue: Issue with GFS time reference (148876551)
id: 695978867
html_url: https://github.com/pydata/xarray/issues/4422#issuecomment-695978867
issue_url: https://api.github.com/repos/pydata/xarray/issues/4422
node_id: MDEyOklzc3VlQ29tbWVudDY5NTk3ODg2Nw==
user: albertotb (6514690)
created_at: 2020-09-21T08:34:23Z
updated_at: 2020-09-21T08:38:07Z
author_association: NONE
body:
    Thank you very much for the detailed explanation and for taking the time to look into this. I had a feeling it had to do with time decoding, but did not know exactly what was going on. IMHO the confusing thing is that the file parses without problems, so maybe a warning indicating that parsing with pandas failed could help.

    Also, would this be solved by using use_cftime=True when reading back the file?

    If I understood the code you quote correctly, in that case you are overwriting the units attribute. Maybe the same could be done in this case.
reactions: {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0}
issue: Problem decoding times in data from OpenDAP server (701062999)
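The exchange above concerns decoding CF-style time coordinates, where times are stored as numbers plus a units string such as "hours since 2020-09-21 00:00:00". As a minimal stdlib sketch of what that decoding amounts to (decode_cf_times is a hypothetical helper; xarray's real implementation lives in xarray.coding.times and additionally handles non-standard calendars, cftime objects, and out-of-range dates):

```python
from datetime import datetime, timedelta

def decode_cf_times(raw, units):
    """Decode numeric times given CF-style units,
    e.g. units='hours since 2020-09-21 00:00:00'.
    Hypothetical, simplified stand-in for xarray's time decoding."""
    unit, _, origin = units.partition(" since ")
    base = datetime.fromisoformat(origin.strip())
    # timedelta accepts keyword units like hours=, days=, seconds=
    return [base + timedelta(**{unit: float(t)}) for t in raw]

# Three forecast steps, 6 hours apart, relative to the reference time
print(decode_cf_times([0, 6, 12], "hours since 2020-09-21 00:00:00"))
```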

Table schema

CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [issue] INTEGER REFERENCES [issues]([id])
);
CREATE INDEX [idx_issue_comments_issue]
    ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user]
    ON [issue_comments] ([user]);
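Given the schema above, the listing on this page ("4 rows where user = 6514690 sorted by updated_at descending") corresponds to a simple indexed query. A minimal sqlite3 sketch, assuming an in-memory database with the REFERENCES clauses dropped (the users and issues tables are not part of this page) and two illustrative rows taken from the listing:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE [issue_comments] (
   [html_url] TEXT, [issue_url] TEXT, [id] INTEGER PRIMARY KEY,
   [node_id] TEXT, [user] INTEGER, [created_at] TEXT,
   [updated_at] TEXT, [author_association] TEXT, [body] TEXT,
   [reactions] TEXT, [performed_via_github_app] TEXT, [issue] INTEGER
);
CREATE INDEX [idx_issue_comments_user] ON [issue_comments] ([user]);
""")

# Two rows with ids/timestamps from the listing above
conn.executemany(
    "INSERT INTO issue_comments (id, user, updated_at) VALUES (?, ?, ?)",
    [(1016260786, 6514690, "2022-01-19T09:47:15Z"),
     (1011120549, 6514690, "2022-01-12T14:48:49Z")],
)

# The query behind "rows where user = 6514690 sorted by updated_at descending";
# ISO 8601 timestamps sort correctly as plain text.
rows = conn.execute(
    "SELECT id, updated_at FROM issue_comments "
    "WHERE user = 6514690 ORDER BY updated_at DESC"
).fetchall()
print(rows)  # newest comment first
```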
Powered by Datasette · About: xarray-datasette