html_url,issue_url,id,node_id,user,created_at,updated_at,author_association,body,reactions,performed_via_github_app,issue
https://github.com/pydata/xarray/issues/5567#issuecomment-874205134,https://api.github.com/repos/pydata/xarray/issues/5567,874205134,MDEyOklzc3VlQ29tbWVudDg3NDIwNTEzNA==,25382032,2021-07-05T15:48:50Z,2021-07-05T15:48:50Z,NONE,"oh I get it now. Thanks. Indeed it works now when chunking lat and lon from the start.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,935818279