• fatbob42@fediverser.communick.dev · 1 year ago

    I don’t think that’s significantly CPU-bound, although you can certainly slow it down with a bad dependency-resolution algorithm (as pipenv did).

      • epage@fediverser.communick.dev · 1 year ago

        The question isn’t a matter of creating a unified tool but of figuring out why past attempts (e.g. poetry) haven’t taken off more, and whether those problems can be avoided.

          • epage@fediverser.communick.dev · 1 year ago

            For me, a big problem with poetry is the author’s insistence on being as strict as cargo on dependency resolution when the Python ecosystem doesn’t have the culture to go with it. You need to be able to override bad transitive dependencies.
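
            For what it’s worth, plain pip at least has a partial escape hatch here: a constraints file lets you pin a transitive dependency to a release you trust, although it can’t relax bounds the upstream actually declares, which is closer to the real gap. A rough sketch (requests/urllib3 are just a familiar example pair, not a claim that either is actually bad):

            ```python
            # Rough sketch: pin a transitive dependency with a pip constraints file.
            # requests/urllib3 are only an example pair; the "bad release" scenario is hypothetical.
            import subprocess
            import sys
            from pathlib import Path

            # Force the resolver to pick this urllib3 release wherever urllib3 is needed.
            Path("constraints.txt").write_text("urllib3==1.26.18\n")

            subprocess.run(
                [sys.executable, "-m", "pip", "install", "requests",
                 "--constraint", "constraints.txt"],
                check=True,
            )
            ```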

          • theAndrewWiggins@fediverser.communick.dev · 1 year ago

            I see type checking as the next natural extension to this. It won’t be easy (it’s probably the trickiest thing to do well), but it would be the ultimate Python tool if it pulled that off. It’s also a natural extension point, since they already have the parsing down.

      • SuspiciousScript@fediverser.communick.dev · 1 year ago

        > It’s not about the speed, which is trivial

        Tell that to Anaconda/Poetry. If I remember correctly, there’s some deficiency in Python packaging that makes dependency solving harder than it is for other languages.

        • muntoo@fediverser.communick.dev · 1 year ago

          I believe it’s because when the metadata is missing, you need to download the entire package and try installing it to check compatibility.
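
          For a concrete sense of what “missing metadata” means: PyPI’s JSON API only reports requires_dist when the uploaded distribution actually declared it, and when that comes back null the resolver has no real choice but to download and build the package to learn its dependencies. Rough sketch (the package name is only an example):

          ```python
          # Rough sketch: ask PyPI's JSON API what a release declares as dependencies.
          # A null "requires_dist" is the expensive download-and-build case.
          import json
          from urllib.request import urlopen

          def declared_deps(package: str, version: str):
              url = f"https://pypi.org/pypi/{package}/{version}/json"
              with urlopen(url) as resp:
                  return json.load(resp)["info"].get("requires_dist")

          # "requests" is only used as an example package here.
          print(declared_deps("requests", "2.31.0"))
          ```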

          Of course, this could still be mitigated, if the poetry devs were sufficiently motivated: generate metadata / precompute dependencies ahead of time, create and host “mini” package proxies (used only for dependency resolution) for the big packages, or any number of other engineering solutions.

          No one’s saying that we have to solve the Halting Problem (or similar) to get massive speedups in many cases.

          • SuspiciousScript@fediverser.communick.dev · 1 year ago

            > I believe it’s because when the metadata is missing, you need to download each version of an entire massive package and try installing it to check compatibility. Repeated for every possible version, backwards until one works.

            I think you’re right; that definitely seems to be poetry’s behaviour. So that’s clearly IO-bound. On the other hand, Anaconda got massive speedups by switching to a better dependency solver. Dependency resolution is basically SAT, after all.
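
            As a toy illustration of the SAT framing: treat each (package, version) pair as a boolean variable and each declared requirement as a clause, then look for a consistent assignment. Everything below is made up, and real resolvers use proper SAT/CDCL-style solvers rather than brute force:

            ```python
            # Toy illustration of "dependency resolution is basically SAT".
            # Each (package, version) pair is a variable; declared requirements
            # are the clauses. All package names and versions here are made up.
            from itertools import product

            requires = {
                ("app", "1.0"): [("libfoo", {"1.0", "2.0"}), ("libbar", {"1.0"})],
                ("libfoo", "1.0"): [("libbar", {"1.0"})],
                ("libfoo", "2.0"): [("libbar", {"2.0"})],  # clashes with app's libbar pin
            }
            versions = {"libfoo": ["1.0", "2.0"], "libbar": ["1.0", "2.0"]}

            def consistent(chosen):
                """True if every requirement of every chosen package is satisfied."""
                for (pkg, ver), reqs in requires.items():
                    if chosen.get(pkg) != ver:
                        continue
                    if any(chosen.get(dep) not in allowed for dep, allowed in reqs):
                        return False
                return True

            # Brute force over all assignments; only {libfoo 1.0, libbar 1.0} survives.
            for combo in product(*(versions[p] for p in versions)):
                chosen = dict(zip(versions, combo), app="1.0")
                if consistent(chosen):
                    print("solution:", chosen)
            ```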