• SuspiciousScript@fediverser.communick.dev · 1 year ago

    > It’s not about the speed, which is trivial

    Tell that to Anaconda/Poetry. If I remember correctly, there’s some deficiency in Python packaging that makes dependency solving harder than it is for other languages.

    • muntoo@fediverser.communick.dev · 1 year ago

      I believe it’s because when the metadata is missing, you need to download the entire package and try installing it to check compatibility.
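
      A rough sketch of the gap (not Poetry’s actual code, just the standard library plus PyPI’s JSON API; requests==2.31.0 is only a convenient real example): when `requires_dist` is populated, a resolver can read the dependencies without downloading anything; when it’s null, typically for sdist-only releases, the only way to learn them is to fetch and build the package.

      ```python
      import json
      import urllib.request

      def requires_dist(name: str, version: str):
          """Read one release's declared dependencies from PyPI's JSON API."""
          url = f"https://pypi.org/pypi/{name}/{version}/json"
          with urllib.request.urlopen(url) as resp:
              info = json.load(resp)["info"]
          # None means the index has no static metadata for this release; a resolver
          # then has to download the sdist and build it just to discover the deps.
          return info.get("requires_dist")

      deps = requires_dist("requests", "2.31.0")
      if deps is None:
          print("no static metadata -- would have to download and build the package")
      else:
          print(f"read {len(deps)} dependency specifiers without downloading anything")
      ```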

      Of course, this could still be mitigated if the poetry devs were sufficiently motivated: by generating metadata / precomputing dependencies, by creating and hosting “mini” package proxies (used only for dependency resolution) for the big packages, or by many other engineering solutions.
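
      One hedged sketch of what “precomputing dependencies” could look like (made-up package names, not anything Poetry actually ships): a metadata-only table built once offline, so the resolver reads a few kilobytes of JSON instead of downloading the full archives.

      ```python
      import json
      import pathlib

      CACHE = pathlib.Path("dep-metadata.json")

      def build_cache(entries):
          """entries: {(name, version): [dependency specifiers]}, computed once offline."""
          CACHE.write_text(json.dumps(
              {f"{name}=={ver}": deps for (name, ver), deps in entries.items()},
              indent=2,
          ))

      def lookup(name, version):
          """Resolver-side lookup: metadata only, no package download."""
          return json.loads(CACHE.read_text()).get(f"{name}=={version}")

      # Hypothetical entries, for illustration only.
      build_cache({("bigpkg", "1.0"): ["numpy>=1.24", "scipy>=1.10"]})
      print(lookup("bigpkg", "1.0"))
      ```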

      No one’s saying that we have to solve the Halting Problem (or similar) to get massive speedups in many cases.

      • SuspiciousScript@fediverser.communick.dev · 1 year ago

        > I believe it’s because when the metadata is missing, you need to download each version of an entire massive package and try installing it to check compatibility. Repeated for every possible version, backwards until one works.

        I think you’re right; that definitely seems to be Poetry’s behaviour, so that part is clearly IO-bound. On the other hand, Anaconda got massive speedups by switching to a better dependency solver (libmamba). Dependency resolution is basically SAT, after all.
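
        To make the “basically SAT” point concrete, here is a toy sketch (made-up package names) of version selection as a constraint problem; the brute force below blows up exponentially with the number of packages, which is exactly why swapping in a real SAT-style solver pays off.

        ```python
        from itertools import product

        # Toy instance of version selection as constraint satisfaction.
        versions = {
            "app":  ["1.0"],
            "libA": ["1.0", "2.0"],
            "libB": ["1.0", "2.0"],
        }
        # (package, version) -> constraints it imposes on other packages
        requires = {
            ("app", "1.0"):  {"libA": {"2.0"}, "libB": {"1.0", "2.0"}},
            ("libA", "2.0"): {"libB": {"2.0"}},
        }

        def consistent(assignment):
            """Every chosen version's requirements must be met by the other choices."""
            return all(
                assignment[dep] in allowed
                for (pkg, ver), deps in requires.items()
                if assignment.get(pkg) == ver
                for dep, allowed in deps.items()
            )

        names = list(versions)
        for combo in product(*(versions[n] for n in names)):  # exponential search
            assignment = dict(zip(names, combo))
            if consistent(assignment):
                print("solution:", assignment)
                break
        ```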