issue_comments: 343325842
| html_url | issue_url | id | node_id | user | created_at | updated_at | author_association | body | reactions | performed_via_github_app | issue |
|---|---|---|---|---|---|---|---|---|---|---|---|
| https://github.com/pydata/xarray/issues/1225#issuecomment-343325842 | https://api.github.com/repos/pydata/xarray/issues/1225 | 343325842 | MDEyOklzc3VlQ29tbWVudDM0MzMyNTg0Mg== | 13906519 | 2017-11-09T23:28:28Z | 2017-11-09T23:28:28Z | NONE | Is there any news on this? I have the same problem. A reset_chunksizes() method would be very helpful. Also, what is the cleanest way to remove all chunk size info? I have a very long computation and it fails at the very end with the mentioned error message. My file is patched together from many sources... cheers | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} |  | 202964277 |
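
The comment asks how to strip all chunk-size information before writing. A minimal sketch of one common workaround, assuming the failure comes from stale `chunksizes` (and related `original_shape`) entries left in each variable's `.encoding` by the source files; the file names here are hypothetical, and xarray has no built-in `reset_chunksizes()`:

```python
import xarray as xr

# Hypothetical input assembled from many sources with inconsistent
# chunk-size encodings.
ds = xr.open_dataset("patched.nc")

# Each variable carries an .encoding dict. Stale "chunksizes" and
# "original_shape" entries inherited from the source files can make
# to_netcdf() fail, so drop them and let the backend pick fresh values.
for var in ds.variables.values():
    var.encoding.pop("chunksizes", None)
    var.encoding.pop("original_shape", None)

ds.to_netcdf("clean.nc")
```

Alternatively, the encoding can be overridden per variable at write time via the `encoding` argument of `to_netcdf()`, which sidesteps whatever the variables currently carry.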