issue_comments: 853900455

html_url: https://github.com/pydata/xarray/pull/5402#issuecomment-853900455
issue_url: https://api.github.com/repos/pydata/xarray/issues/5402
id: 853900455
node_id: MDEyOklzc3VlQ29tbWVudDg1MzkwMDQ1NQ==
user: 20629530
created_at: 2021-06-03T14:11:16Z
updated_at: 2021-06-03T14:11:16Z
author_association: CONTRIBUTOR

Agreed that this is limited and there's a need for a more general solution!

However, this problem is quite tricky... The best option I see is what I think you suggested in the other thread: a special workaround that re-casts when the operation involves "<m8[ns]" and "O" dtypes (and the "O" array actually holds cftime datetimes). That clears up the instability problem of my solution. It loads a value, so dask compatibility is not optimal, but that's already what dt does.
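
Just to illustrate what I mean, here is a rough sketch of that re-cast (the helper name and structure are mine, not anything that exists in xarray, and it assumes cftime is installed):

```python
# Rough sketch of the re-casting workaround described above.
# `add_timedelta_to_cftime` is a hypothetical helper, not xarray API.
import numpy as np
import cftime

def add_timedelta_to_cftime(obj_arr, td_arr):
    """Add a timedelta64[ns] array to an object array of cftime datetimes."""
    sample = obj_arr.ravel()[0]  # loads one value, hence the dask-compat cost
    if isinstance(sample, cftime.datetime) and td_arr.dtype == np.dtype("<m8[ns]"):
        # timedelta64[us] converts cleanly to datetime.timedelta objects,
        # which cftime datetimes know how to add
        td_obj = td_arr.astype("timedelta64[us]").astype(object)
        return obj_arr + td_obj
    return obj_arr + td_arr

times = np.array(
    [cftime.DatetimeNoLeap(2000, 1, 1), cftime.DatetimeNoLeap(2000, 1, 2)],
    dtype=object,
)
deltas = np.array([12, 24], dtype="timedelta64[h]").astype("timedelta64[ns]")
print(add_timedelta_to_cftime(times, deltas))
```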

The encoding problem remains, but I ran some tests and realized that numpy automatically casts timedeltas to the smallest (finest) unit involved, while still accepting operations between timedeltas of different units. Could we accept timedelta64 arrays with units other than "ns"? Then there could be a check when converting Python timedeltas: if nanosecond information exists, go with [ns]; if not, go with [ms]. This bumps the limit from 292 years to 292471 years. And this relaxation of the dtype strictness would allow a user to use even coarser frequencies if need be.
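
For example (plain numpy; pick_unit is a hypothetical name, and since Python timedeltas only resolve microseconds, the "nanosecond information" check is approximated as sub-millisecond information):

```python
# Mixed-unit behaviour: numpy accepts the operation and casts the result
# to the smallest (finest) unit involved.
import datetime
import numpy as np

a = np.array([1], dtype="timedelta64[ms]")
b = np.array([1], dtype="timedelta64[ns]")
print((a + b).dtype)  # timedelta64[ns]

# Hypothetical unit-selection rule when converting Python timedeltas:
# keep [ns] only when finer-than-millisecond information is present,
# otherwise fall back to [ms] to extend the representable range.
def pick_unit(deltas):
    if any(td.microseconds % 1000 for td in deltas):
        return "timedelta64[ns]"
    return "timedelta64[ms]"

deltas = [datetime.timedelta(days=400 * 365), datetime.timedelta(days=1)]
print(np.array(deltas, dtype=pick_unit(deltas)))  # fits in [ms], would overflow [ns]
```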

Just throwing out ideas. There might be other issues that I did not see!

reactions:
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
issue: 906175200