html_url,issue_url,id,node_id,user,created_at,updated_at,author_association,body,reactions,performed_via_github_app,issue
https://github.com/pydata/xarray/issues/3961#issuecomment-778841149,https://api.github.com/repos/pydata/xarray/issues/3961,778841149,MDEyOklzc3VlQ29tbWVudDc3ODg0MTE0OQ==,2560426,2021-02-14T21:01:21Z,2021-02-14T21:01:21Z,NONE,"> Or alternatively you can try to set sleep between openings.
To clarify, do you mean adding a sleep of e.g. 1 second prior to your `preprocess` function (and setting `preprocess` to just sleep then `return ds` if you're not doing any preprocessing)? Or, are you instead sleeping before the entire `open_mfdataset` call?
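For concreteness, here is a minimal sketch of the first interpretation, assuming a hypothetical 1-second pause inside a no-op `preprocess` (the glob path and delay are illustrative, not from this thread):

```python
import time

def preprocess(ds):
    # hypothetical no-op preprocess: pause 1 s per file, then return the dataset unchanged
    time.sleep(1)
    return ds

# usage (assumes xarray is installed):
# combined = xr.open_mfdataset('files/*.nc', preprocess=preprocess)
```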
Does this solution only address the issue of opening the same ds multiple times within a Python process, or would it also address multiple processes opening the same ds?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,597657663
https://github.com/pydata/xarray/issues/3961#issuecomment-778838527,https://api.github.com/repos/pydata/xarray/issues/3961,778838527,MDEyOklzc3VlQ29tbWVudDc3ODgzODUyNw==,2560426,2021-02-14T20:40:38Z,2021-02-14T20:40:38Z,NONE,"Also seeing this as of version 0.16.1.
In some cases, I need `lock=False`; otherwise I'll run into hung processes a certain percentage of the time. Calling `ds.load()` prior to `to_netcdf()` does not solve the problem.
In other cases, I need `lock=None`; otherwise I'll consistently get `RuntimeError: NetCDF: Not a valid ID`.
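To make the retry-until-success idea concrete, here is a hypothetical generic helper (not an xarray API; the exception type and usage are assumptions):

```python
import time

def retry(fn, attempts=5, delay=0.1):
    # call fn() until it succeeds, retrying on RuntimeError up to `attempts` times
    for i in range(attempts):
        try:
            return fn()
        except RuntimeError:
            if i == attempts - 1:
                raise
            time.sleep(delay)

# usage (assumes xarray is installed):
# ds = retry(lambda: xr.open_mfdataset('files/*.nc', lock=False))
```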
Is the current recommended solution to set `lock=False` and retry until success? Or, is it to keep `lock=None` and use `zarr` instead? @dcherian ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,597657663