pull_requests: 1541626039
id: 1541626039
node_id: PR_kwDOAMm_X85b41i3
number: 8272
state: closed
locked: 0
title: Fix datetime encoding precision loss regression for units requiring floating point values
user: 6628425
body:
  This PR proposes a fix for #8271. I think the basic issue is that the only time we need to update the `needed_units` is when the `data_delta` does not evenly divide the `ref_delta`. If it does evenly divide it, as it does in the example in #8271, and we update the `needed_units` solely according to the value of the `ref_delta`, we run the risk of resetting them to something coarser than the data requires. If it does not evenly divide it, we are safe to reset the `needed_units`, because they are guaranteed to be finer-grained than the data requires. I modified `test_roundtrip_float_times` to reflect the example given by @larsbuntemeyer in #8271. @kmuehlbauer, let me know if this fix makes sense to you.
  - [x] Closes #8271
  - [x] Tests added
  - [x] User visible changes (including notable bug fixes) are documented in `whats-new.rst`
created_at: 2023-10-04T11:12:59Z
updated_at: 2023-10-06T14:09:34Z
closed_at: 2023-10-06T14:08:51Z
merged_at: 2023-10-06T14:08:51Z
merge_commit_sha: 1b0012a44aa45c67858489bc815928e1712dbd00
assignee:
milestone:
draft: 0
head: 8f271a3548e9de650b8a8d2ef4ad2646788ab7e9
base: d5f17858e5739c986bfb52e7f2ad106bb4489364
author_association: MEMBER
auto_merge:
repo: 13221727
url: https://github.com/pydata/xarray/pull/8272
merged_by:
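The divisibility rule described in the PR body can be sketched with NumPy timedeltas. This is a minimal illustration, not xarray's actual code; `needs_units_update`, the example deltas, and the nanosecond zero are all hypothetical choices for the sketch:

```python
import numpy as np

def needs_units_update(data_delta: np.timedelta64,
                       ref_delta: np.timedelta64) -> bool:
    """Return True only when data_delta does NOT evenly divide ref_delta.

    Per the fix: if ref_delta is an exact multiple of data_delta, the units
    already inferred from the data must be kept, because recomputing them
    from ref_delta alone could coarsen them below what the data requires.
    Otherwise, resetting is safe: the units implied by ref_delta are then
    at least as fine-grained as the data needs.
    """
    # NumPy converts both operands to a common time unit before the modulo.
    return bool(ref_delta % data_delta != np.timedelta64(0, "ns"))

# ref_delta is an exact multiple of data_delta: keep needed_units as-is.
print(needs_units_update(np.timedelta64(1, "h"), np.timedelta64(2, "h")))   # False
# ref_delta is not a multiple: safe (and necessary) to reset needed_units.
print(needs_units_update(np.timedelta64(1, "h"), np.timedelta64(30, "m")))  # True
```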
Links from other tables
- 1 row from pull_requests_id in labels_pull_requests