issues: 1925977158

Pull request #8272: Fix datetime encoding precision loss regression for units requiring floating point values

  • node_id: PR_kwDOAMm_X85b41i3
  • user: 6628425 · author_association: MEMBER
  • state: closed · locked: no · draft: no · comments: 1
  • created_at: 2023-10-04T11:12:59Z · updated_at: 2023-10-06T14:09:34Z · closed_at: 2023-10-06T14:08:51Z
  • pull_request: pydata/xarray/pulls/8272

This PR proposes a fix for #8271. I think the basic issue is that we only need to update the needed_units if the data_delta does not evenly divide the ref_delta. If it does divide evenly, as it does in the example in #8271, and we update the needed_units based solely on the value of the ref_delta, we risk resetting them to something coarser than the data requires. If it does not divide evenly, we are safe to reset the needed_units, because they are then guaranteed to be at least as fine-grained as the data requires.
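A minimal sketch of the divisibility rule described above. This is not xarray's actual implementation: the names `choose_units`, `coarsest_exact_units`, and the unit ladder are hypothetical, and units are modeled as plain `timedelta` granularities purely for illustration.

```python
from datetime import timedelta

# Hypothetical unit ladder, finest to coarsest (illustration only).
_UNITS = {
    "seconds": timedelta(seconds=1),
    "minutes": timedelta(minutes=1),
    "hours": timedelta(hours=1),
    "days": timedelta(days=1),
}

def coarsest_exact_units(delta: timedelta) -> str:
    """Return the coarsest unit that represents `delta` without remainder."""
    best = "seconds"
    for name, step in _UNITS.items():  # insertion order: finest -> coarsest
        if delta % step == timedelta(0):
            best = name
    return best

def choose_units(needed_units: str, data_delta: timedelta,
                 ref_delta: timedelta) -> str:
    # Reset needed_units from ref_delta only when data_delta does NOT
    # evenly divide ref_delta; in that case the units implied by
    # ref_delta are at least as fine as the data requires.
    if ref_delta % data_delta != timedelta(0):
        return coarsest_exact_units(ref_delta)
    # Evenly divisible: keep the data-derived units, since units derived
    # from ref_delta alone could be coarser than the data requires
    # (the precision loss reported in #8271).
    return needed_units
```

For example, with hourly data (`data_delta` of one hour) and a `ref_delta` of one day, the delta divides evenly and the hourly units are kept; with a `ref_delta` of 90 minutes, the remainder is nonzero and the units are safely reset to minutes.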

I modified test_roundtrip_float_times to reflect the example given by @larsbuntemeyer in #8271. @kmuehlbauer, let me know if this fix makes sense to you.

  • [x] Closes #8271
  • [x] Tests added
  • [x] User visible changes (including notable bug fixes) are documented in whats-new.rst
  • reactions: 0
  • repo: 13221727 · type: pull
