html_url,issue_url,id,node_id,user,created_at,updated_at,author_association,body,reactions,performed_via_github_app,issue
https://github.com/pydata/xarray/issues/5511#issuecomment-1546951468,https://api.github.com/repos/pydata/xarray/issues/5511,1546951468,IC_kwDOAMm_X85cNJss,1217238,2023-05-14T17:17:56Z,2023-05-14T17:17:56Z,MEMBER,"If we can find cases where we know concurrent writes are unsafe, we can definitely start raising errors. Better to be safe than to suffer from silent data corruption!","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,927617256
https://github.com/pydata/xarray/issues/5511#issuecomment-866334987,https://api.github.com/repos/pydata/xarray/issues/5511,866334987,MDEyOklzc3VlQ29tbWVudDg2NjMzNDk4Nw==,1217238,2021-06-22T21:08:35Z,2021-06-22T21:08:35Z,MEMBER,"Thanks for the report! This does look broken, which I was able to verify by running your code. My guess is that something in Xarray's logic for appending datasets implicitly assumes that the existing dataset has been written in complete ""chunks"", which is not the case here.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,927617256