issue_comments: 1340482904


html_url: https://github.com/pydata/xarray/issues/7363#issuecomment-1340482904
issue_url: https://api.github.com/repos/pydata/xarray/issues/7363
id: 1340482904
node_id: IC_kwDOAMm_X85P5iVY
user: 5821660
created_at: 2022-12-07T06:59:11Z
updated_at: 2022-12-07T06:59:11Z
author_association: MEMBER
body:

@jerabaul29 Does your Dataset with the 3 million time points fit into your machine's memory? Are the arrays dask-backed? Unfortunately that isn't visible in the screenshots. Calculating from the sizes, this is 106 x 3_208_464 single measurements -> 340_097_184 values. Assuming float64 (8 bytes per value), that comes to 2_720_777_472 bytes, roughly 2.7GB, which should fit in most setups. I'm not certain, but there is a good chance that reindex creates a completely new Dataset, which means the machine has to hold the original as well as the new Dataset (which is roughly 3.2GB). That adds up to almost 6GB of RAM. Depending on your machine and other running tasks, this might run into RAM issues. But the xarray devs will know better.
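A quick way to check both questions (in-memory size and dask backing) is a sketch like the one below; the file name and chunk size are assumptions for illustration:

```python
import xarray as xr

# Hypothetical file name and chunking; opening with chunks keeps the data dask-backed and lazy
ds = xr.open_dataset("measurements.nc", chunks={"time": 100_000})

# Rough in-memory footprint of the Dataset if it were fully loaded
print(f"{ds.nbytes / 1e9:.1f} GB")

# Dask-backed variables report their chunk sizes; NumPy-backed ones return None
print({name: var.chunks for name, var in ds.variables.items()})
```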

@keewis's suggestion of creating and concatenating a new, file-backed array with predefined values could resolve the issues you are currently facing.
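A minimal sketch of that idea, assuming the original Dataset is dask-backed with a `sensor` dimension of length 106 and a `signal` variable (names, timestamps, and chunk sizes are placeholders): build the block of new time steps lazily with dask, then concatenate along `time`, so the fill values never have to be materialised in memory all at once.

```python
import numpy as np
import pandas as pd
import xarray as xr
import dask.array as da

# Hypothetical original dataset, opened lazily (dask-backed)
ds = xr.open_dataset("measurements.nc", chunks={"time": 100_000})

# Hypothetical block of new 1 Hz timestamps to append
new_times = pd.date_range("2022-01-01", periods=1_000_000, freq="1s")

# Build the padding lazily with dask, filled with NaN (the "predefined value")
padding = xr.Dataset(
    {
        "signal": (
            ("sensor", "time"),
            da.full((ds.sizes["sensor"], len(new_times)), np.nan, chunks=(106, 100_000)),
        )
    },
    coords={"sensor": ds["sensor"], "time": new_times},
)

# Concatenate along time; the result stays dask-backed and can be written straight to disk
combined = xr.concat([ds, padding], dim="time")
combined.to_netcdf("combined.nc")
```

Writing the concatenated result back to disk keeps peak memory bounded by the chunk size rather than by the full size of the old plus new Dataset.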

reactions:
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
issue: 1479121713