For that matter, in my experience much of what people praise uv for is simply stuff that pip (and venv) can now do, but couldn't back when many of those people gave up on pip. That, in turn, has become possible because of several ecosystem standards (defined across many PEPs) and growing awareness and adoption of those standards.
The "install things that have complex non-Python dependencies using pip" story is much better than several years ago, because of things like pip gaining a new resolver in 2020, but in large part simply because it's now much more likely that the package you want offers a pre-built wheel (and that its dependencies also do). A decade ago, it was common enough that you'd be stuck with source packages even for pure-Python projects, which forced pip to build a wheel locally first (https://pradyunsg.me/blog/2022/12/31/wheels-are-faster-pure-...).
Another important change is that for wheels on PyPI the installer can now obtain separate .metadata files, so it can learn the transitive dependencies of a given version of a given project from a small plain-text file, rather than speculatively downloading the entire wheel and unpacking the METADATA file from it. (This is also possible for source distributions that include PKG-INFO, but they aren't required to do so, and a source distribution's metadata is allowed to declare "dynamic" dependencies that aren't known until the wheel is built (worst case) or until a special metadata-only build hook is run, which requires additional effort for the build system to support and the developer to implement.)
Things uv does better than pip by default:

- really hard to install a package globally by accident (pip: forgetting to activate the venv)
- really easy to distinguish dev and main dependencies (pip: create different files for different groups and set up their relationship yourself)
- distinguishes direct dependencies from indirect ones, making it easy to tell when a package is no longer needed (pip: I bet most devs either aren't tracking sub-dependencies or are mixing them all together with pip freeze)
- easily use different Python versions for different projects (pip: not really)
With uv it just works. With pip, technically you can make it work, and I bet you'll screw something up along the way.
> - really hard to install a package globally by accident (pip: forgetting to activate venv)
This is different as of Python 3.11. Please see https://peps.python.org/pep-0668/ for details. Nowadays, to install a package globally, you first have to have a global copy of pip (Debian makes you install that separately), then you have to intentionally bypass a security marker using --break-system-packages.
Also, you don't have to activate the venv to use it. You can specify the path to the venv's pip explicitly; or you can use a different copy of pip (e.g. a globally-installed one) passing it the `--python` argument (you have been able to do this for about 3 years now).
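A quick sketch of both options (`demo-venv` is an illustrative name; the `--python` form needs pip 22.3 or newer):

```shell
# Create a venv; nothing below requires activating it.
python3 -m venv demo-venv

# Option 1: call the venv's own pip by its path.
demo-venv/bin/pip list

# Option 2: point an outside pip at the venv's interpreter.
python3 -m pip --python demo-venv/bin/python list
```

Either way, packages land in `demo-venv`, not the system environment.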
(Pedantically, yes, you could use a venv-installed copy of pip to install into the system environment, passing both --python and --break-system-packages. I can't prove that anyone has ever done this, and I can't fathom a reason beyond bragging rights.)
> - really easy to distinguish [dev] and main dependencies
As of 25.1, pip can install from dependency groups described in pyproject.toml, which is the standard way to group your dependencies in metadata.
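For example (group names here are illustrative), you declare groups in pyproject.toml per PEP 735:

```toml
# pyproject.toml — PEP 735 dependency groups (group names are illustrative)
[dependency-groups]
dev = ["pytest", "ruff"]
docs = ["sphinx"]
```

Then `pip install --group dev` (pip 25.1+, run where the pyproject.toml lives) installs just that group, and as I understand it uv reads the same table, so the file stays tool-agnostic.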
> - distinguish direct dependencies from indirect dependencies, making it easy to find when a package is not needed anymore
As of 25.1, pip can create PEP 751 standard lockfiles.
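Concretely, `pip lock` (experimental as of 25.1, invoked with something like `pip lock -r requirements.txt`) resolves your requirements and writes a `pylock.toml`. A rough sketch of the result — field names follow PEP 751, but the contents here are illustrative, not real pip output:

```toml
# pylock.toml — illustrative PEP 751 fragment, not actual pip output
lock-version = "1.0"
created-by = "pip"

[[packages]]
name = "requests"
version = "2.32.3"
# each entry pins exact artifacts (wheel/sdist URLs and hashes)

[[packages]]
name = "urllib3"
version = "2.2.3"
```

Commit the lockfile and the same resolve can be reproduced on another machine.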
> - easily use different python versions for different projects
If you want something to install Python for you, yes, that was never in pip's purview, by design.
If you want to use an environment based off an existing Python, that's what venv is for.
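Sketch (assumes the interpreters are already installed; `python3` below stands in for whichever specific version you want, e.g. `python3.12`):

```shell
# One venv per project; each is tied to the interpreter that created it.
python3 -m venv .venv            # e.g. python3.12 -m venv .venv
.venv/bin/python --version       # the project now runs on that version
```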
For sure, we see the same thing in the JS ecosystem. New tooling adds some feature, other tools implement that feature, and everything converges toward a larger common set.
I'm still mostly on poetry
The things you list may be a reason for some, but in all discussions I’ve had and read about on uv, the reason is that it behaves as a package manager should. It can just install dependencies from an automatically generated lockfile. It can update outdated minor versions. It can tell me about outdated versions of my dependencies. It can reproduce a build on another machine. The lock file can be put into version control. A coworker can run a single command to install everything. It abstracts the stupidity that is virtual environments away so much you don’t even have to touch them anymore. And also, it’s fast.
Wake me up when pip can do any of that.
> the reason is that it behaves as a package manager should.
This is a matter of opinion. Pip exists to install the packages and their dependencies. It does not, by design, exist to manage a project for you.
The overwhelming majority of developers seem to agree with me though.
If anything, pip is a dependency installer, while working with even trivial projects requires a dependency manager. Parent's point was that pip is actually good enough that you don’t even need uv anymore, but as long as pip doesn’t satisfy 80% of the requirements, that’s just plain false.
I'm not sure an overwhelming majority of Python developers care one way or the other. Like, I'm sure uv is nice, but I've somehow never had an issue with pip or conda, so there's just no reason to futz with uv. Same deal with Jujutsu. It's probably great, but git isn't a problem, so jj isn't a priority.
A majority of HN users might agree with you, but I'd guess that a majority of developers, to paraphrase Don Draper, don't think about it at all.
"anymore" makes no sense, since pip long predates uv.
Some people don't have, or don't care about, the additional requirements you have in mind.