issues: 187790480
| field | value |
| --- | --- |
| id | 187790480 |
| node_id | MDU6SXNzdWUxODc3OTA0ODA= |
| number | 1091 |
| title | Reindex twice efficiently |
| user | 5253102 |
| state | closed |
| locked | 0 |
| comments | 4 |
| created_at | 2016-11-07T18:30:07Z |
| updated_at | 2016-11-07T23:00:06Z |
| closed_at | 2016-11-07T23:00:06Z |
| author_association | NONE |
| state_reason | completed |
| repo | 13221727 |
| type | issue |

body:

The goal is to re-index using a 10-year series of daily dates. A brute-force Python loop method is outlined below. Is there a better way?

Let the DataArray have three coordinates:

- location: 4000 locations
- sample: 10 samples (0, 1, 2, ..., 9)
- attribute: 2 attributes (date, temp)

Each location has a different set of 'date' values. Using brute force, one could loop through the 4000 locations in Python:

1. reindex each location to its 'date' attribute, then drop 'date';
2. reindex this new DataArray using the 10-year series, with padding;
3. append the resulting array to a {location: DataArray} dictionary.

Then construct a Dataset from the dictionary, and finally a DataArray from the Dataset.

Thank you, Doug

reactions:

{ "url": "https://api.github.com/repos/pydata/xarray/issues/1091/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
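The brute-force loop described in the issue body can be sketched as follows. This is a minimal sketch, not the poster's actual code: the location names, date values, and temperatures are entirely hypothetical, and it uses 3 locations instead of 4000 to stay small. Each location is indexed by its own dates, reindexed onto a common 10-year daily series with forward-fill padding, collected into a dict, and assembled into a Dataset and then a single DataArray:

```python
import numpy as np
import pandas as pd
import xarray as xr

rng = np.random.default_rng(0)
n_samples = 10
locations = ["loc_a", "loc_b", "loc_c"]  # hypothetical; the issue has 4000

# The common 10-year series of daily dates to reindex against.
full_dates = pd.date_range("2000-01-01", "2009-12-31", freq="D")

per_location = {}
for loc in locations:
    # Each location has its own (sorted) set of 'date' values and temps.
    dates = pd.to_datetime(
        rng.choice(full_dates, size=n_samples, replace=False)
    ).sort_values()
    temps = rng.normal(15.0, 5.0, size=n_samples)

    # Step 1: index this location's samples by their own 'date' values.
    da = xr.DataArray(temps, coords={"date": dates}, dims="date")

    # Step 2: reindex onto the common daily series, padding forward
    # (dates before the first sample remain NaN).
    per_location[loc] = da.reindex(date=full_dates, method="pad")

# Step 3: build a Dataset from the {location: DataArray} dict,
# then stack it back into one DataArray with a 'location' dimension.
ds = xr.Dataset(per_location)
result = ds.to_array(dim="location")
print(result.shape)  # (3, 3653): 3 locations x 10 years of days
```

With 4000 locations this per-location loop is what the poster calls "brute force"; the question of whether xarray offers a vectorized alternative is left open in the issue.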