issues: 481250429
id | node_id | number | title | user | state | locked | assignee | milestone | comments | created_at | updated_at | closed_at | author_association | active_lock_reason | draft | pull_request | body | reactions | performed_via_github_app | state_reason | repo | type |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
481250429 | MDU6SXNzdWU0ODEyNTA0Mjk= | 3222 | Minimum versions for optional libraries | 6213168 | closed | 0 | 12 | 2019-08-15T17:18:16Z | 2019-10-08T21:23:47Z | 2019-10-08T21:23:47Z | MEMBER | In CI there are:
There are no tests for legacy versions of the optional libraries. Today I tried downgrading dask in the py37 environment to dask=1.1.2, which is 6 months old... ...it's a bloodbath: 383 errors of the most diverse kinds. In the codebase I found mentions of much older minimum versions: installing.rst mentions dask >=0.16.1, and Dataset.chunk() even asks for dask>=0.9. I think we should add CI tests for old versions of the optional dependencies. What policy should we adopt when we find an incompatibility? How old should a library be before we stop fixing bugs and just require a newer version? I personally would go for an aggressive 6 months' worth of backwards compatibility; less if the time it takes to fix the issues is excessive. The tests should run on py36, because py35 builds are becoming very scarce in anaconda. This has the outlook of being an exercise in extreme frustration. I'm afraid I personally hold zero interest in packages older than the latest available in the anaconda official repo, so I'm not volunteering for this one (sorry). I'd like to hear other people's opinions and/or offers of self-immolation... :) |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/3222/reactions", "total_count": 2, "+1": 2, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | 13221727 | issue |
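The issue body proposes enforcing declared minimum versions for optional dependencies. A minimal sketch of such a check is below; the names `MINIMUM_VERSIONS`, `parse_version`, and `check_minimum` are hypothetical illustrations, not xarray's actual API, and the dask pin merely echoes the version mentioned in the issue.

```python
# Hypothetical sketch: verify that installed optional dependencies meet a
# declared minimum version, as discussed in the issue. Not xarray's real code.
from importlib import import_module

# Illustrative minimums; the issue proposes supporting roughly six months
# of releases (dask=1.1.2 was ~6 months old at the time of writing).
MINIMUM_VERSIONS = {
    "dask": (1, 1, 2),
}


def parse_version(version: str) -> tuple:
    """Turn a dotted version string like '1.1.2' into a comparable tuple."""
    return tuple(int(part) for part in version.split(".")[:3] if part.isdigit())


def check_minimum(name: str, minimum: tuple) -> bool:
    """Return True if the optional package is absent or new enough."""
    try:
        module = import_module(name)
    except ImportError:
        return True  # optional dependency not installed: nothing to enforce
    return parse_version(module.__version__) >= minimum
```

A CI job for legacy versions could pin each optional dependency at its declared minimum and run the test suite, so that any accidental use of a newer API fails loudly rather than silently raising the effective minimum.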