pull_requests: 700860368
id: 700860368
node_id: MDExOlB1bGxSZXF1ZXN0NzAwODYwMzY4
number: 5661
state: closed
locked: 0
title: Speed up _mapping_repr
user: 14371165
created_at: 2021-08-01T08:44:17Z
updated_at: 2022-08-12T09:07:44Z
closed_at: 2021-08-02T19:45:16Z
merged_at: 2021-08-02T19:45:16Z
merge_commit_sha: 8f5b4a185b304ab68723c390b1ad88e57f9a60d6
assignee:
milestone:
draft: 0
head: 704d14e731d4d8d633aeacb29b27e142766badd1
base: c44b816bf4a858af1bb621d96e1d3482db3976da
author_association: MEMBER
auto_merge:
repo: 13221727
url: https://github.com/pydata/xarray/pull/5661
merged_by:

body:

Creating an ordered list for filtering purposes using `.items()` turns out to be rather slow. Use `.keys()` instead, as that doesn't trigger a bunch of DataArray initializations.

- [x] Passes `pre-commit run --all-files`

Test case:

```python
import numpy as np
import xarray as xr

a = np.arange(0, 2000)
data_vars = dict()
for i in a:
    data_vars[f"long_variable_name_{i}"] = xr.DataArray(
        name=f"long_variable_name_{i}",
        data=np.arange(0, 20),
        dims=[f"long_coord_name_{i}_x"],
        coords={f"long_coord_name_{i}_x": np.arange(0, 20) * 2},
    )

ds0 = xr.Dataset(data_vars)
ds0.attrs = {f"attr_{k}": 2 for k in a}
```

Before:

```python
%timeit print(ds0)
14.6 s ± 215 ms per loop (mean ± std. dev. of 7 runs, 1 loop each)
```

After:

```python
%timeit print(ds0)
120 ms ± 2.06 ms per loop (mean ± std. dev. of 7 runs, 10 loops each)
```
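The speedup described in the PR body comes from not constructing values while building the list of names to display. Below is a minimal, hypothetical sketch of that idea, assuming a lazy mapping whose values are expensive to build (the way `Dataset.data_vars` builds `DataArray` objects on access); the `LazyMapping` class and the `first_names_*` helpers are invented for illustration and are not xarray's actual `_mapping_repr` code.

```python
# Hypothetical illustration (not xarray's implementation) of why iterating
# .keys() is cheaper than .items() when a mapping builds its values lazily.
class LazyMapping:
    def __init__(self, names):
        self._names = list(names)

    def __getitem__(self, name):
        # Stand-in for an expensive construction, e.g. building a DataArray.
        return {"name": name, "data": list(range(20))}

    def keys(self):
        # Cheap: yields names only; no values are constructed.
        return iter(self._names)

    def items(self):
        # Expensive: constructs a value for every single entry.
        return ((name, self[name]) for name in self._names)


def first_names_slow(mapping, max_rows=10):
    # .items()-style approach: materializes every value, then discards most.
    return [name for name, _value in list(mapping.items())[:max_rows]]


def first_names_fast(mapping, max_rows=10):
    # .keys()-style approach: slice the names; only displayed entries are touched.
    return list(mapping.keys())[:max_rows]


if __name__ == "__main__":
    mapping = LazyMapping(f"long_variable_name_{i}" for i in range(2000))
    assert first_names_slow(mapping) == first_names_fast(mapping)
```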
Links from other tables
- 1 row from pull_requests_id in labels_pull_requests