issue_comments: 553832745
html_url: https://github.com/pydata/xarray/issues/3514#issuecomment-553832745
issue_url: https://api.github.com/repos/pydata/xarray/issues/3514
id: 553832745
node_id: MDEyOklzc3VlQ29tbWVudDU1MzgzMjc0NQ==
user: 6213168
created_at: 2019-11-14T10:47:43Z
updated_at: 2019-11-14T10:47:43Z
author_association: MEMBER
performed_via_github_app: (none)
issue: 521754870

body:

max-sixty, could you post your benchmark where you measure 150us? I tried caching that property with `@cache_readonly` and I only get a boost of 7us.

```python
import xarray

ds = xarray.Dataset({'d': ('x', [1, 2]), 'x': [10, 20]})
ds.to_netcdf("foo.nc")
ds.close()

ds = xarray.open_dataset("foo.nc")
%timeit ds.isel(x=[0])
```

reactions:

```json
{
  "total_count": 0,
  "+1": 0,
  "-1": 0,
  "laugh": 0,
  "hooray": 0,
  "confused": 0,
  "heart": 0,
  "rocket": 0,
  "eyes": 0
}
```
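For context, the `@cache_readonly` decorator mentioned in the comment memoizes a property on first access so repeated lookups skip recomputation. A minimal sketch of the same idea using the standard library's `functools.cached_property` — the `Demo` class, its `indexes` property, and the `n_calls` counter are illustrative stand-ins, not xarray's actual implementation:

```python
from functools import cached_property


class Demo:
    """Toy object whose `indexes` property is expensive to build once."""

    def __init__(self):
        self.n_calls = 0  # counts how many times the property body runs

    @cached_property
    def indexes(self):
        # Runs only on first access; the result is then stored on the
        # instance and returned directly on subsequent accesses.
        self.n_calls += 1
        return {"x": [10, 20]}


demo = Demo()
demo.indexes
demo.indexes
print(demo.n_calls)  # → 1: the body executed only once
```

Whether this wins anything in practice depends on how expensive the property body is relative to the rest of the operation, which is exactly what the benchmark dispute above (150us vs. 7us) is about.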