issues: 663235664
id | node_id | number | title | user | state | locked | assignee | milestone | comments | created_at | updated_at | closed_at | author_association | active_lock_reason | draft | pull_request | body | reactions | performed_via_github_app | state_reason | repo | type |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
663235664 | MDU6SXNzdWU2NjMyMzU2NjQ= | 4243 | Manually drop DataArray from memory? | 35968931 | closed | 0 | | | 3 | 2020-07-21T18:54:40Z | 2023-09-12T16:17:12Z | 2023-09-12T16:17:12Z | MEMBER | | | | Is it possible to deliberately drop the data associated with a particular DataArray from memory? Also, does calling Python's built-in garbage collector (i.e. `gc.collect()`) help here? The context of this question is that I'm trying to re-save some massive variables (~65GB each), which were loaded from thousands of files, into just a few files per variable. I would love to use @rabernat's new rechunker package, but I'm not sure how easily I can convert my current netCDF data to Zarr, and I'm interested in this question no matter how I end up solving the problem. I don't currently have a particularly good understanding of file I/O and memory management in xarray, but would like to improve it. Can anyone recommend a tool I could use to answer this kind of question myself on my own machine? I suppose it would need to be able to tell me the current memory usage of specific objects, not just the total memory usage. (@johnomotani you might be interested) | { "url": "https://api.github.com/repos/pydata/xarray/issues/4243/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } | | completed | 13221727 | issue |
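The underlying question in the issue body is really about Python's reference semantics rather than anything xarray-specific. Below is a minimal, pure-Python sketch (the `BigBuffer` class is a hypothetical stand-in for a DataArray's loaded payload, not part of xarray) showing how one can verify with `weakref` that `del` plus `gc.collect()` actually releases an object once no strong references remain:

```python
import gc
import weakref

class BigBuffer:
    """Hypothetical stand-in for a DataArray's in-memory data."""
    def __init__(self, n):
        self.data = bytearray(n)  # allocate n bytes

buf = BigBuffer(1_000_000)
ref = weakref.ref(buf)  # track the object without keeping it alive
del buf                 # drop the last strong reference
gc.collect()            # sweep up reference cycles, if any
print(ref() is None)    # True: the buffer has been reclaimed
```

In CPython the deallocation here happens immediately on `del` via reference counting; `gc.collect()` only matters when the object participates in a reference cycle. Whether an xarray object holds the last strong reference to its data (or whether a file cache or dask graph keeps it alive) is exactly what makes the original question non-trivial.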