issue_comments: 1046665303

html_url: https://github.com/pydata/xarray/issues/2186#issuecomment-1046665303
issue_url: https://api.github.com/repos/pydata/xarray/issues/2186
id: 1046665303
node_id: IC_kwDOAMm_X84-YthX
user: 691772
created_at: 2022-02-21T09:41:00Z
updated_at: 2022-02-21T09:41:00Z
author_association: CONTRIBUTOR

I just stumbled across the same issue and created a minimal example similar to @lkilcher's. I am using `xr.open_dataarray()` with chunks and doing some simple computation. Afterwards, 800 MB of RAM remains in use, no matter whether I close the file explicitly, delete the xarray objects, or invoke the Python garbage collector.
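A sketch of what that minimal example might look like (file name, array shape, and chunk size are my assumptions here; the array is scaled down from the ~800 MB case):

```python
import gc
import numpy as np
import xarray as xr

# Write a small test file (the original report used an ~800 MB array).
xr.DataArray(np.zeros((10, 100, 100))).to_netcdf("example.nc")

# Open it chunked and do a simple computation; by default this runs on
# the threaded dask scheduler.
da = xr.open_dataarray("example.nc", chunks={"dim_0": 2})
result = da.mean().compute()

# In the problematic case, none of the following releases the memory:
da.close()       # close the file explicitly
del da, result   # drop all references to the xarray objects
gc.collect()     # force a garbage collection
```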

What does seem to work: do not use the threaded Dask scheduler. The issue does not occur with the single-threaded or processes scheduler. Setting `MALLOC_MMAP_MAX_=40960` also seems to resolve it, as suggested above (disclaimer: I don't fully understand the details here).
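Switching schedulers can be done per computation via `dask.config.set`; a small self-contained sketch (the array here is just a stand-in for the chunked xarray data):

```python
import dask
import dask.array as darr

# Stand-in for the chunked data; shape and chunks are illustrative only.
x = darr.zeros((1000, 1000), chunks=(100, 100))

# Workaround: use a non-threaded scheduler for the computation,
# e.g. "synchronous" (single-threaded) or "processes".
with dask.config.set(scheduler="synchronous"):
    result = x.mean().compute()

# The glibc tuning mentioned above is set before Python starts, e.g.:
#   MALLOC_MMAP_MAX_=40960 python my_script.py
```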

If I understand things correctly, this indicates that the issue is a consequence of dask/dask#3530. I am not sure whether there is anything to fix on the xarray side or what the best workaround would be; I will try the processes scheduler.

I can create a new (xarray) ticket with full details of the minimal example if anyone thinks that would be helpful (to collect workarounds or discuss fixes on the xarray side).

reactions: total_count 0
issue: 326533369