issue_comments: 124002256
html_url | issue_url | id | node_id | user | created_at | updated_at | author_association | body | reactions | performed_via_github_app | issue |
---|---|---|---|---|---|---|---|---|---|---|---|
https://github.com/pydata/xarray/issues/489#issuecomment-124002256 | https://api.github.com/repos/pydata/xarray/issues/489 | 124002256 | MDEyOklzc3VlQ29tbWVudDEyNDAwMjI1Ng== | 1217238 | 2015-07-23T07:13:26Z | 2015-07-23T07:13:26Z | MEMBER | Do you have bottleneck installed? I've seen error messages from summing big endian arrays before, but never silently wrong results. We resolved many of these issues for netcdf3 files by coercing arrays to little endian upon reading them from disk. We might even extend this to all arrays loaded into xray. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |  | 96732359 |
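The fix the comment describes, coercing big-endian arrays to native (little-endian) byte order when reading them from disk, can be sketched with NumPy. This is an illustrative sketch, not xarray's actual implementation; the variable names are made up here.

```python
import numpy as np

# A big-endian float64 array, as might come from a netCDF3 file
# (netCDF3 stores data big-endian on disk).
big = np.arange(5, dtype=">f8")

# Fast reduction libraries such as bottleneck can misbehave on
# non-native byte order, so coerce to the machine's native order
# before doing any aggregation. "=" means native byte order.
native = big.astype(big.dtype.newbyteorder("="))

# The values are unchanged; only the in-memory byte order differs.
assert native.dtype.isnative
assert float(native.sum()) == float(big.sum())
```

Coercing once at load time, rather than at every operation, keeps the rest of the pipeline free of byte-order special cases, which is the extension to "all arrays loaded into xray" that the comment suggests.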