issues: 1976092931
id | node_id | number | title | user | state | locked | assignee | milestone | comments | created_at | updated_at | closed_at | author_association | active_lock_reason | draft | pull_request | body | reactions | performed_via_github_app | state_reason | repo | type |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
1976092931 | I_kwDOAMm_X851yMkD | 8411 | FileNotFoundError when accessing same file from multiple processes | 14296832 | closed | 0 | | | 9 | 2023-11-03T12:28:52Z | 2023-12-12T21:26:41Z | 2023-12-12T21:26:41Z | NONE | | | | What is your issue? I am trying to access the same file using xr.load_dataset() from multiple processes. They all try to read it at the same time (or within 0.1s of each other), but only the first process can access it; the other processes cannot read it and get a generic "FileNotFoundError" even though the file is there. The file is written about 2-3s before it is read by the different processes. Is this expected? Earlier I suspected xr.open_dataset() was the culprit, but replacing it with load_dataset() also did not solve the issue. The issue is sporadic and cannot be reproduced easily, but it happens in our production process. Any suggestions please? | { "url": "https://api.github.com/repos/pydata/xarray/issues/8411/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } | | not_planned | 13221727 | issue |
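A common mitigation for the sporadic-read pattern described in the issue body is to retry the load with a short backoff, riding out the window in which a freshly written file is not yet visible to other processes (e.g. filesystem attribute-cache lag). The helper below is a minimal sketch of that idea, not a fix from the issue thread; the `load_with_retry` name is hypothetical, and plain `open` stands in for `xr.load_dataset` to keep the example self-contained (a real caller would pass `xr.load_dataset` as the `loader`):

```python
import os
import time


def load_with_retry(path, loader, retries=5, delay=0.2):
    """Call loader(path), retrying on FileNotFoundError.

    Makes up to `retries` attempts, sleeping `delay` seconds
    between them, so a reader that races a writer (or a slow
    network filesystem) gets a few chances before failing.
    """
    last_err = None
    for _ in range(retries):
        try:
            return loader(path)
        except FileNotFoundError as err:
            last_err = err
            time.sleep(delay)
    raise last_err


if __name__ == "__main__":
    # Simulate one worker reading a file another process just wrote.
    with open("demo.txt", "w") as f:
        f.write("ok")
    text = load_with_retry("demo.txt", lambda p: open(p).read())
    print(text)  # -> ok
    os.remove("demo.txt")
```

For xarray specifically, each process would call something like `load_with_retry(path, xr.load_dataset)`; since `load_dataset` reads the data eagerly and closes the file, the processes do not hold competing open handles afterwards.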