issue_comments: 1363988341
html_url | issue_url | id | node_id | user | created_at | updated_at | author_association | body | reactions | performed_via_github_app | issue |
---|---|---|---|---|---|---|---|---|---|---|---|
https://github.com/pydata/xarray/issues/7397#issuecomment-1363988341 | https://api.github.com/repos/pydata/xarray/issues/7397 | 1363988341 | IC_kwDOAMm_X85RTM91 | 720460 | 2022-12-23T14:15:25Z | 2022-12-23T14:15:53Z | NONE | Because I want to have a worry-free holiday, I wrote a bit of code that basically creates a new NetCDF file from scratch. I load the data with xarray, convert the data to NumPy arrays, and use the netCDF4 library to write the files (which does what I want). In the process, I also slice the data and drop unwanted variables to keep just the bits I need (unlike my original post). If I call .load() or .compute() on my xarray variable, the memory usage goes crazy (even though I am dropping unwanted variables, which I would expect to release memory). The same happens for slicing followed by .compute(). Unfortunately, the MCVE will have to wait until I am back from my holidays. Happy holidays to all! | { "total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 1, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } | | 1506437087 |
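The comment above describes a workaround rather than showing code, so here is a minimal sketch of that workflow, assuming a lazily opened dataset: slice it and drop unwanted variables with xarray, then write a fresh NetCDF file with the netCDF4 library, materializing one variable at a time as a NumPy array. This is not the commenter's actual code; the file names, the "temperature" variable, and the time slice are hypothetical placeholders, and coordinate/encoding handling is omitted for brevity.

```python
import numpy as np
import xarray as xr
import netCDF4

# Lazy open: nothing is loaded into memory yet.
src = xr.open_dataset("input.nc")

# Keep only the wanted variable(s) and slice the data (hypothetical names/slice).
subset = src[["temperature"]].isel(time=slice(0, 100))

with netCDF4.Dataset("output.nc", "w") as dst:
    # Recreate the dimensions of the sliced subset.
    for name, size in subset.sizes.items():
        dst.createDimension(name, size)

    # Copy each data variable as a plain NumPy array.
    for name, da in subset.data_vars.items():
        out = dst.createVariable(name, da.dtype, da.dims)
        out[:] = np.asarray(da.values)  # materializes one variable at a time
```

Writing variable by variable like this avoids calling .load() or .compute() on the whole dataset, which is presumably why the commenter saw lower memory usage with this approach than with the calls that "go crazy" above.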