issue_comments

3 rows where author_association = "CONTRIBUTOR", issue = 1611288905 and user = 39069044 sorted by updated_at descending

id: 1457345587
html_url: https://github.com/pydata/xarray/issues/7587#issuecomment-1457345587
issue_url: https://api.github.com/repos/pydata/xarray/issues/7587
node_id: IC_kwDOAMm_X85W3VQz
user: slevang (39069044)
created_at: 2023-03-07T01:34:12Z
updated_at: 2023-03-07T01:34:12Z
author_association: CONTRIBUTOR

Your m0tot variable is also being broadcast in the fami dimension. So, an additional 10 × 384 × 1233 × 8 bytes / 1e6 ≈ 37 MB.

reactions: none (all counts 0)
issue: xr.where increase the bytes of the dataset (1611288905)
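
A quick back-of-the-envelope check of that figure (a sketch in Python; the dimension sizes and the 8-byte float64 itemsize come from the comment above):

extra_bytes = 10 * 384 * 1233 * 8        # broadcast elements x 8 bytes per float64
print(f"{extra_bytes / 1e6:.1f} MB")     # 37.9 MB, i.e. the ~37 MB quoted above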

id: 1457080267
html_url: https://github.com/pydata/xarray/issues/7587#issuecomment-1457080267
issue_url: https://api.github.com/repos/pydata/xarray/issues/7587
node_id: IC_kwDOAMm_X85W2UfL
user: slevang (39069044)
created_at: 2023-03-06T22:06:11Z
updated_at: 2023-03-06T22:06:11Z
author_association: CONTRIBUTOR

Same issue as #1234. This has tripped me up before as well. A kwarg to control this behavior would be a nice enhancement to .where().

reactions: none (all counts 0)
issue: xr.where increase the bytes of the dataset (1611288905)
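
Until a kwarg like that exists, one possible workaround (a sketch, not from the thread; the dataset layout and variable names below are invented to mirror the issue) is to mask only the variables that share dimensions with the condition, rather than calling .where() on the whole Dataset:

import numpy as np
import xarray as xr

# Hypothetical dataset loosely mirroring the issue: tp carries the
# condition's dimensions, wshedOut lives on unrelated dimensions.
ds = xr.Dataset(
    {
        "tp": (("fami", "time", "site"), np.random.rand(3, 4, 5)),
        "wshedOut": (("lat", "lon"), np.random.rand(6, 7)),
    }
)
cond = ds["tp"] > 0.5

# Calling ds.where(cond) would broadcast cond's dims onto every variable
# (including wshedOut); masking only tp avoids that.
masked = ds.assign(tp=ds["tp"].where(cond))
print(masked["wshedOut"].dims)   # still just ('lat', 'lon')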

id: 1457061064
html_url: https://github.com/pydata/xarray/issues/7587#issuecomment-1457061064
issue_url: https://api.github.com/repos/pydata/xarray/issues/7587
node_id: IC_kwDOAMm_X85W2PzI
user: slevang (39069044)
created_at: 2023-03-06T21:55:14Z
updated_at: 2023-03-06T21:55:14Z
author_association: CONTRIBUTOR

Since you're using tp (dims fami, time, site) as the condition, these dimensions are broadcast across all other variables in the dataset. The problem looks to be your variable wshedOut, which is now broadcast across all 5 dimensions in the dataset, hence greatly increased memory usage.

reactions: none (all counts 0)
issue: xr.where increase the bytes of the dataset (1611288905)
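
A minimal reproduction of this broadcasting behaviour (a sketch; the variable and dimension names follow the issue, the sizes are invented):

import numpy as np
import xarray as xr

ds = xr.Dataset(
    {
        "tp": (("fami", "time", "site"), np.random.rand(10, 20, 30)),
        "wshedOut": (("lat", "lon"), np.random.rand(40, 50)),
    }
)
print(f"before: {ds.nbytes / 1e6:.2f} MB")      # ~0.06 MB

# Masking with a (fami, time, site) condition broadcasts those dims onto
# wshedOut, which then spans all 5 dimensions of the dataset.
masked = ds.where(ds["tp"] > 0.5)
print(masked["wshedOut"].ndim)                  # 5
print(f"after:  {masked.nbytes / 1e6:.2f} MB")  # ~96 MB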

CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [issue] INTEGER REFERENCES [issues]([id])
);
CREATE INDEX [idx_issue_comments_issue]
    ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user]
    ON [issue_comments] ([user]);