This release includes the Beta version of the Ruff formatter — an extremely fast, Black-compatible Python formatter.
Try it today with ruff format.
Changes
Preview features
[pylint] Implement non-...
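For anyone who hasn’t tried it: the command is just ruff format, as mentioned above. Here’s a rough Python before/after sketch of what “Black-compatible” means in practice; the output shown is only my expectation of Black-style behaviour, not taken from the release notes.

    # Before running `ruff format .` (quotes and spacing are deliberately messy):
    def greet(name,greeting = 'hello'):
        return f'{greeting}, {name}'

    # Roughly what Black-style formatting produces: double quotes, a space after
    # the comma, and no spaces around the `=` of a plain keyword default.
    def greet(name, greeting="hello"):
        return f"{greeting}, {name}"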
It’s not about the speed, which is trivial. It’s about having a unified tool like cargo.
Tell that to Anaconda/Poetry. If I remember correctly, there’s some deficiency with Python packaging that makes dependency solving harder than it is for other languages.
I believe it’s because when the metadata is missing, you need to download each version of an entire massive package and try installing it to check compatibility. Repeated for every possible version, backwards until one works.
Of course, this could still be mitigated by generating metadata / precomputing dependencies, by creating and hosting “mini” package proxies (for dependency resolution only) for the big packages, or by many other engineering solutions, if the poetry devs were sufficiently motivated.
No one’s saying that we have to solve the Halting Problem (or similar) to get massive speedups in many cases.
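As a rough sketch of the “precompute / serve the metadata” idea (my own illustration, not something Poetry actually does): PyPI’s JSON API already exposes a release’s declared dependencies as requires_dist, and it can be null in exactly the problem case, when no usable metadata was uploaded.

    # Sketch only: read declared dependencies from PyPI's JSON API instead of
    # downloading and building the whole distribution. Uses the third-party
    # "requests" library; the package and version below are just examples.
    import requests

    def declared_dependencies(name: str, version: str):
        """Return requires_dist for a release, or None if no metadata was uploaded."""
        url = f"https://pypi.org/pypi/{name}/{version}/json"
        resp = requests.get(url, timeout=10)
        resp.raise_for_status()
        return resp.json()["info"].get("requires_dist")

    # A None result is the slow path described above: fetch the package, build it,
    # and inspect what it actually installs.
    print(declared_dependencies("numpy", "1.26.0"))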
I think you’re right; that definitely seems to be poetry’s behaviour. So that’s clearly IO-bound. On the other hand, Anaconda got massive speedups by switching to a better dependency solver. Dependency resolution is basically SAT, after all.
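To make the “it’s basically SAT” point concrete, here’s a toy brute-force resolver (the index and packages are made up, purely for illustration): pick one version per package so that every requirement is satisfied. Real solvers do this search cleverly rather than exhaustively, which is where the Anaconda-style speedups come from.

    # Toy illustration: resolution means picking one version per package such
    # that every declared requirement is satisfied. This brute force is
    # deliberately naive and exponential; real solvers prune the search.
    from itertools import product

    # Made-up index: package -> {version: {dependency: allowed versions}}
    INDEX = {
        "app":  {"1.0": {"lib": {"2.0", "2.1"}, "util": {"1.1"}}},
        "lib":  {"2.0": {"util": {"1.0"}}, "2.1": {"util": {"1.1"}}},
        "util": {"1.0": {}, "1.1": {}},
    }

    def resolve():
        packages = list(INDEX)
        for choice in product(*(INDEX[p] for p in packages)):
            picked = dict(zip(packages, choice))
            if all(
                picked[dep] in allowed
                for pkg, ver in picked.items()
                for dep, allowed in INDEX[pkg][ver].items()
            ):
                return picked
        return None

    print(resolve())  # {'app': '1.0', 'lib': '2.1', 'util': '1.1'}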
The question isn’t a matter of creating a unified tool but of figuring out why past attempts (e.g. poetry) haven’t taken off more, and whether those problems can be avoided.
The only thing holding poetry back is that it still doesn’t support the standardized pyproject.toml metadata. It’s the closest thing to cargo that we have. That, and issues related to various ML libraries.
For me, a big problem with poetry is the author’s insistence on being as strict as cargo on dependency resolution when the Python ecosystem doesn’t have the culture to go with it. You need to be able to override bad transitive dependencies.
I see typechecking as the next natural extension to this. It won’t be easy; it’s probably the trickiest thing to do well, but it would be the ultimate Python tool if it did that. It’s a natural extension point as well, since they already have the parsing down.
There’s pylyzer for a Rust-based type checking LSP alternative to Pyright, but I couldn’t really get it to work with Neovim.