html_url,issue_url,id,node_id,user,created_at,updated_at,author_association,body,reactions,performed_via_github_app,issue
https://github.com/pydata/xarray/issues/1225#issuecomment-343332976,https://api.github.com/repos/pydata/xarray/issues/1225,343332976,MDEyOklzc3VlQ29tbWVudDM0MzMzMjk3Ng==,13906519,2017-11-10T00:07:24Z,2017-11-10T00:07:24Z,NONE,"Thanks for that, Stephan.
The workaround looks good for the moment ;-)...
Detecting a mismatch (and maybe even correcting it) automatically would be very useful.
cheers,
C","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,202964277
https://github.com/pydata/xarray/issues/1225#issuecomment-343325842,https://api.github.com/repos/pydata/xarray/issues/1225,343325842,MDEyOklzc3VlQ29tbWVudDM0MzMyNTg0Mg==,13906519,2017-11-09T23:28:28Z,2017-11-09T23:28:28Z,NONE,"Is there any news on this? I have the same problem. A reset_chunksizes() method would be very helpful. Also, what is the cleanest way to remove all chunk size info? I have a very long computation and it fails at the very end with the mentioned error message. My file is patched together from many sources...
cheers","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,202964277