issue_comments: 326138431
html_url | issue_url | id | node_id | user | created_at | updated_at | author_association | body | reactions | performed_via_github_app | issue |
---|---|---|---|---|---|---|---|---|---|---|---|
https://github.com/pydata/xarray/issues/1225#issuecomment-326138431 | https://api.github.com/repos/pydata/xarray/issues/1225 | 326138431 | MDEyOklzc3VlQ29tbWVudDMyNjEzODQzMQ== | 2443309 | 2017-08-30T22:36:14Z | 2017-08-30T22:36:14Z | MEMBER | @tbohn - What is happening here is that xarray is storing the netCDF4 chunk sizes from the input file in the `LAI` variable's encoding. Those integers correspond to the dimensions of LAI. When you slice your dataset, you end up with lat/lon dimensions that are smaller than the stored chunk sizes, hence the error. The logical fix is to validate this encoding attribute and either 1) throw an informative error if something isn't going to work, or 2) change the chunk sizes in the encoding so that they will work. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |  | 202964277 |
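
A minimal sketch of the second option described in the comment above (adjusting the stored chunk-size encoding so the write succeeds). This is not code from the issue; the file names, dimension names, and slice bounds are hypothetical, and it assumes a dataset whose variables carry a `chunksizes` entry in their encoding.

```python
import xarray as xr

# Hypothetical input file containing variables with stored netCDF4 chunk sizes.
ds = xr.open_dataset("veg_hist.nc")

# Slicing can leave lat/lon dimensions smaller than the stored chunk sizes.
subset = ds.isel(lat=slice(0, 5), lon=slice(0, 5))

for var in subset.variables.values():
    chunks = var.encoding.get("chunksizes")
    if chunks is not None:
        # netCDF4 rejects chunk sizes larger than the dimension length,
        # so clamp each stored chunk size to the new (possibly smaller) shape.
        var.encoding["chunksizes"] = tuple(
            min(c, s) for c, s in zip(chunks, var.shape)
        )

subset.to_netcdf("subset.nc")
```

Dropping the `chunksizes` key entirely (letting the backend pick default chunking) would also avoid the error; clamping is shown here because it preserves as much of the original chunking as the sliced shape allows.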