
issue_comments


4 rows where user = 19285200 sorted by updated_at descending


id: 1561358915 (IC_kwDOAMm_X85dEHJD) · user: ghiggi (19285200) · author_association: NONE
created_at: 2023-05-24T15:20:00Z · updated_at: 2023-05-24T15:20:00Z
html_url: https://github.com/pydata/xarray/issues/7868#issuecomment-1561358915
issue_url: https://api.github.com/repos/pydata/xarray/issues/7868

A dask array with dtype object can contain any Python object (e.g. I have seen geometries and matplotlib collections inside dask arrays with object dtype). As a consequence, dask does not attempt a conversion to, say, str to estimate the array size, since AFAIK there is no clean way to attach an attribute to the dtype indicating that the object is actually a string.

With your PR, the dtype is no longer object when the dask array is created, and I guess this solves the issue. Did I overlook something?

reactions: {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0}
issue: `open_dataset` with `chunks="auto"` fails when a netCDF4 variables/coordinates is encoded as `NC_STRING` (1722417436)
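
A minimal sketch of the failure mode described in the comment above, with made-up data (the NotImplementedError for object dtype under auto-chunking is dask's behavior; everything else here is hypothetical):

# Sketch: auto-chunking fails on object dtype, works on fixed-width unicode.
import dask.array as da
import numpy as np

strings = np.array(["foo", "bar", "baz"], dtype=object)

# dask cannot estimate the size in bytes of arbitrary Python objects,
# so chunks="auto" raises NotImplementedError for object dtype.
try:
    da.from_array(strings, chunks="auto")
except NotImplementedError as err:
    print(err)

# With a fixed-width unicode dtype the itemsize is known and "auto" works.
da.from_array(strings.astype("U3"), chunks="auto")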
id: 1560651807 (IC_kwDOAMm_X85dBagf) · user: ghiggi (19285200) · author_association: NONE
created_at: 2023-05-24T08:12:18Z · updated_at: 2023-05-24T08:12:18Z
html_url: https://github.com/pydata/xarray/issues/7868#issuecomment-1560651807
issue_url: https://api.github.com/repos/pydata/xarray/issues/7868

Thanks @kmuehlbauer! https://github.com/pydata/xarray/pull/7869 solves the issue!

Summarizing:
  • With #7869, netCDF4 NC_STRING variable arrays are now read into xarray with a Unicode dtype (instead of object).
  • As a consequence, dask can estimate the array's size, and xr.open_dataset(fpath, chunks="auto") no longer raises the NotImplementedError.
  • NC_CHAR variable arrays continue to be read into xarray with a fixed-length byte-string dtype. Maybe something more could be done to also deserialize NC_CHAR to Unicode; however, this might cause some backward incompatibilities, so it may be better to address it in a separate PR.

Thanks again @kmuehlbauer for resolving the problem in less than 2 hours :1st_place_medal:

reactions: {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0}
issue: `open_dataset` with `chunks="auto"` fails when a netCDF4 variables/coordinates is encoded as `NC_STRING` (1722417436)
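
A sketch of the round trip being summarized above, assuming the netCDF4 Python library and a hypothetical scratch file strings.nc (neither is taken from the issue):

# Hypothetical round trip for the NC_STRING case summarized above.
import netCDF4
import numpy as np
import xarray as xr

with netCDF4.Dataset("strings.nc", "w") as nc:
    nc.createDimension("x", 3)
    # Passing `str` as the datatype creates a variable-length NC_STRING variable.
    var = nc.createVariable("name", str, ("x",))
    var[:] = np.array(["a", "bb", "ccc"], dtype=object)

# Before #7869 this call raised NotImplementedError; with the fix the variable
# comes back with a Unicode dtype, so auto-chunking can estimate sizes.
ds = xr.open_dataset("strings.nc", chunks="auto")
print(ds["name"].dtype)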
id: 1448588687 (IC_kwDOAMm_X85WV7WP) · user: ghiggi (19285200) · author_association: NONE
created_at: 2023-02-28T17:33:46Z · updated_at: 2023-02-28T17:34:24Z
html_url: https://github.com/pydata/xarray/issues/7014#issuecomment-1448588687
issue_url: https://api.github.com/repos/pydata/xarray/issues/7014

@veenstrajelmer I am not sure I understand what you are saying. In the example I pass only norm to the plotting function ... As suggested by @kmuehlbauer, the solution to this issue is to specify the extend argument in both the plotting call and cbar_kwargs. In the example, extend was only defined in cbar_kwargs (since it is also an argument of mpl.Figure.colorbar). We should likely align the arguments somewhere in the code with something like this:

if extend is None and cbar_kwargs.get("extend") is not None:
    extend = cbar_kwargs.get("extend")
if extend is not None and "extend" not in cbar_kwargs:
    cbar_kwargs["extend"] = extend

reactions: {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0}
issue: xarray imshow and pcolormesh behave badly when the array does not contain values larger the BoundaryNorm vmax (1368027148)
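
The workaround the comment describes (specifying extend in both places) can be sketched as follows, with made-up data; everything apart from the extend/cbar_kwargs interplay is hypothetical:

# Hypothetical sketch of the extend mismatch discussed above.
import matplotlib as mpl
import numpy as np
import xarray as xr

da1 = xr.DataArray(np.linspace(0, 5, 16).reshape(4, 4))
norm = mpl.colors.BoundaryNorm(boundaries=[0, 1, 2, 3, 4], ncolors=256)

# extend given only in cbar_kwargs: the colorbar is drawn extended, but the
# plotting call itself never sees the argument (the mismatch in this issue).
da1.plot.pcolormesh(norm=norm, cbar_kwargs={"extend": "max"})

# Workaround suggested by @kmuehlbauer: specify extend in both places.
da1.plot.pcolormesh(norm=norm, extend="max", cbar_kwargs={"extend": "max"})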
id: 1446791387 (IC_kwDOAMm_X85WPEjb) · user: ghiggi (19285200) · author_association: NONE
created_at: 2023-02-27T17:58:15Z · updated_at: 2023-02-27T22:22:44Z
html_url: https://github.com/pydata/xarray/issues/7014#issuecomment-1446791387
issue_url: https://api.github.com/repos/pydata/xarray/issues/7014

Thanks to all the people above who have started digging into the problem! @veenstrajelmer: Adding the levels=levels argument (together with norm, ... or dropping norm) does not correct/change the output figure. Of course, commenting out da1.data[da1.data>=norm.vmax] = norm.vmax - 1 "solves" the issue, but this line of code is what exposes the bug, which occurs when the array does not contain any value equal to or higher than norm.vmax.

reactions: {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0}
issue: xarray imshow and pcolormesh behave badly when the array does not contain values larger the BoundaryNorm vmax (1368027148)
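
For reference, the trigger the comment refers to looks roughly like this (a reconstruction with made-up data, not copied from the issue's example):

# Reconstructed sketch of the bug trigger described above.
import matplotlib as mpl
import numpy as np
import xarray as xr

norm = mpl.colors.BoundaryNorm(boundaries=[0, 1, 2, 3, 4], ncolors=256)
da1 = xr.DataArray(np.linspace(0, 5, 16).reshape(4, 4))

# The clipping line from the issue's example: after it runs, no value is equal
# to or higher than norm.vmax, which is exactly the condition exposing the bug.
da1.data[da1.data >= norm.vmax] = norm.vmax - 1

da1.plot.pcolormesh(norm=norm)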


CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [issue] INTEGER REFERENCES [issues]([id])
);
CREATE INDEX [idx_issue_comments_issue]
    ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user]
    ON [issue_comments] ([user]);