Your implication is that pyenv, virtualenv, and pip should be 3 different tools. But for the average developer, these tools are all related to managing the Python environment and versions, which in my head sounds like one thing. Other languages don't have 3 different tools for this.

pip and virtualenv also add a ton of complexity, and when they break (which happens quite often), debugging is even harder despite them being "battle-tested" tools.

Yeah, I agree. In particular it seems insane to me that virtualenv should have to exist. I can't see any valid use case for a machine-global pool of dependencies. Why would anyone think it should be a separate tool rather than just the obvious thing that a dependency manager does? I say this as someone with nearly 20 years of Python experience.

It's the same sort of deal with pyenv--the Python version is itself a dependency of most libraries, so it's a little silly to have a dependency manager that only manages some dependencies.

And in practice it usually ends up being 6 different machine-global pools that all weirdly intersect, and some are python2.

I started using NodeJS more after lots of Python experience. Packages make so much more sense there. Even imports. You know how hard it is to do the equivalent of `require('../foo.js')` in Python?
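
For reference, the closest standard-library equivalent I know of is loading a module by file path with `importlib`. A minimal sketch (the `../foo.py` layout and names are hypothetical):

```python
# Rough equivalent of require('../foo.js'): load a module by file path.
# Assumes a file foo.py one directory above this script (hypothetical layout).
import importlib.util
from pathlib import Path

foo_path = Path(__file__).resolve().parent.parent / "foo.py"
spec = importlib.util.spec_from_file_location("foo", foo_path)
foo = importlib.util.module_from_spec(spec)
spec.loader.exec_module(foo)  # after this, foo's attributes are usable
```

Compare that to one line of `require`.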

I, too, have ~20 years of Python experience.

`virtualenv` is a heavy-duty third-party library that adds functionality to the standard library's `venv`. Or rather, `venv` was created as a subset of `virtualenv` in Python 3.3, and the projects have diverged since.

The standard library `venv` provides "obvious thing that a dependency manager does" functionality, so that every dependency manager has the opportunity to use it, and so that developers can also choose to work at a lower level. And the virtual-environment standard needs to exist so that Python can know about the pool of dependencies thus stored. Otherwise you would be forced to... depend on the dependency manager to start Python and tell it where its dependency pool is.

Fundamentally, the only things a venv needs are the `pyvenv.cfg` config file, the appropriate folder hierarchy, and some symlinks to Python (stub executables on Windows). All it's doing is providing a place for that "pool of dependencies" to exist, and providing configuration info so that Python can understand the dependency path at startup. The venvs created by the standard library module — and by uv — also provide "activation" scripts to manipulate some environment variables for ease of use; but these are completely unnecessary to making the system work.
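
To make that concrete, a freshly created venv on a POSIX system looks roughly like this (layout and values are illustrative; details vary by platform and Python version):

```
myenv/
├── pyvenv.cfg
├── bin/
│   ├── python -> /usr/bin/python3.12   # symlink (stub .exe on Windows)
│   └── activate                        # optional convenience script
└── lib/
    └── python3.12/
        └── site-packages/              # the "pool of dependencies"
```

And `pyvenv.cfg` itself is just a few key = value lines, e.g.:

```
home = /usr/bin
include-system-site-packages = false
version = 3.12.0
```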

Fundamentally, tools like uv create the same kind of virtual environment that the standard library does — because there is only one kind. Uv doesn't bootstrap pip into its environments (since that's slow and would be pointless), but you can equally well disable that with the standard library: `python -m venv --without-pip`.
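
Concretely (assuming uv is installed; the directory name is arbitrary):

```sh
python -m venv --without-pip .venv   # stdlib venv, skipping the pip bootstrap
uv venv .venv                        # uv's equivalent; also no pip inside
```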

> the Python version is itself a dependency of most libraries

This is a strange way of thinking about it IMO. If you're trying to obtain Python libraries, it's normally because you already have Python, and want to obtain libraries that are compatible with the Python you already have, so that you can write Python code that uses the libraries and works under that Python.

If you're trying to solve the problem of deploying an application to people who don't have Python (or to people who don't understand what Python is), you need another layer of wrapping anyway. You aren't going to get end users to install uv first.

I don’t think people consider things from a first-principles perspective these days.

“…I can't see any valid use case for a machine-global pool of dependencies…” - Rhetorical question for OP, but how do you run an operating system without having said operating system's dependencies available to everything else?

That quote is mine, so I think you’re meaning to address me?

> how do you run an operating system without having said operating systems dependencies available to everything else?

I’m not sure if I understand your question, but I’ll answer based on what I think you mean. The OS gets compiled into an artifact, so the dependencies aren’t available to the system itself unless they are explicitly added.

You asked what’s the point of a machine-based global pool of dependencies - I answered: it’s an OS.

An OS isn’t a machine-global pool of dependencies. Strange claim.

I agree with all of that context about virtualenv and venv, but it all seems orthogonal to my point. I still can’t see a case where you would want the default Python behavior (global dependencies).

> This is a strange way of thinking about it IMO. If you're trying to obtain Python libraries, it's normally because you already have Python, and want to obtain libraries that are compatible with the Python you already have, so that you can write Python code that uses the libraries and works under that Python.

“normally” is biased by what the tooling supports. If Python tooling supported pinning to an interpreter by default then perhaps it would seem more normal?

I write a lot of Go these days, and the libs pin to a version of Go. When you build a project, the toolchain will resolve and (if necessary) install the necessary Go dependency just like all of the other dependencies. It’s a very natural and pleasant workflow.
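
For anyone who hasn't seen it, the pinning lives in `go.mod` (module path here is hypothetical; since Go 1.21, the `toolchain` directive lets the go command fetch the exact toolchain on demand):

```
module example.com/myapp

go 1.22            // minimum Go language version this module requires
toolchain go1.22.3 // exact toolchain; fetched automatically if missing
```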

> I still can’t see a case where you would want the default Python behavior (global dependencies).

It's easier to work with if you're new to Python, lazy, or just not generally familiar with the concept of a "project". Tons of people use Python through Jupyter notebooks and install libraries to play with them in a notebook, and have no real reason to think about which installations are required for the current notebook until there's a conflict, or until they want to share their work (which might never happen).

Also as you're well aware, Python existed for a long time before the virtual environment concept.

> It's easier to work with if you're new to Python, lazy, or just not generally familiar with the concept of a "project".

If you have automatic virtual environments, then it doesn't matter to these personas (including the Jupyter user) unless they need multiple projects or something, in which case automatic virtual environments are strictly better.
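
To give one concrete shape of "automatic" (package names are just examples): with uv, the notebook persona never has to think about environments at all:

```sh
# A disposable, automatically managed environment; nothing global is touched
uv run --with jupyter --with pandas jupyter lab
```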

> Also as you're well aware, Python existed for a long time before the virtual environment concept.

Correct, but I don't think that supports the idea that global environments have a use case that isn't better served by project-scoped environments.

I think OP's complaint is rather that using `uv` is leaky: now you need to learn all the underlying stuff AND uv as well.

The alternative, of course, is having Python natively support a combined tool. Which you can support while also not liking `uv` for the above reason.

I don't think that's true: most projects using uv don't rely on those tools at all, and you don't need to understand them. You just `uv sync` and do your work.
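
The day-to-day surface really is about this small (assuming a project with a `pyproject.toml` and lockfile):

```sh
uv sync         # create .venv and install the locked dependencies
uv run pytest   # run a command inside the project's environment
```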

I'm just clarifying OP in response to the parent comment that ignored the most important parts of OP's message. Your comment also seems to ignore the meat of OP's complaint. But maybe I'm interpreting OP's comment as "my use cases for these tools are complicated" because I, too, have complicated Python builds with exotic compiled dependencies.

Python versions and environments can be solved at a more reliable abstraction level as well, e.g. if you are a heavy Nix user.

On the other hand, Nix and Bazel and friends are a lot of pain. I'm sure the tradeoff makes sense in a lot of situations, but not needing to bring in Nix or Bazel just to manage dependencies is a pretty big boon. It would be great to see some of the all-in-one build tools become more usable though. Maybe one day it will seem insane that every language ecosystem has its own build tool because there's some all-in-one tool that is just as easy to use as `(car)go build`!

Well, Nix is the only sane way I know to manage fully reproducible envs that incorporate programs/scripts spanning multiple ecosystems. That's a very common situation in applied data analysis.
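
As a minimal sketch of what that looks like, a `shell.nix` that pins tools from several ecosystems into one reproducible env (attribute names are from nixpkgs; in practice you'd also pin the nixpkgs revision):

```nix
# One dev shell spanning Python, Go, and generic CLI tooling.
{ pkgs ? import <nixpkgs> { } }:
pkgs.mkShell {
  packages = [
    (pkgs.python312.withPackages (ps: [ ps.pandas ]))  # Python + libs
    pkgs.go                                            # Go toolchain
    pkgs.jq                                            # CLI tooling
  ];
}
```

`nix-shell` then drops you into the same environment on any machine with Nix.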

Nix is a 10x force multiplier for managing Linux systems. The fact that I can write Python, Go, Bash, jq, any tool that is right for the job of managing and configuring the system, is amazing. And on top of that, I can patch any part of the entire system with just that: a patch, from my fork on GitHub or anywhere else.

Top that off with first-class programming capabilities and modularization, and I can share common configuration and packages across systems. Add that those same customized packages can be directly included in a dev shell, making all of the amazing software out there available for tooling and support. It has really changed my outlook, and I have so much fun now not EVER dealing with tooling issues except when I explicitly upgrade my shell and nixpkgs version.

I just rebuilt our CI infrastructure with Nix and was able to configure multiple isolated dockerd daemons per host, calculate the subnet spread for all the networks, and write scripts configuring the env so you can run docker1 and hit daemon 1. Now we can saturate our CI machines with more parallel work without them fighting over Docker system resources like ports. I never would have attempted this without Nix: being able to generate the entire system config tree and inspect systemd service configs before even applying to a host reduced my iteration loop to an all-time low in infrastructure land, where 10-15 minute lead times of building images just to find out I misspelled Kafka as kakfa somewhere meant another 15-minute rebuild. Now I get almost instant feedback for most of these types of errors.

> Maybe one day it will seem insane that every language ecosystem has its own build tool because there's some all-in-one tool that is just as easy to use as `(car)go build`!

Yep: Nix

Unless you’re packaging anything or consuming packages or teasing out the conflicting advice from the community on which nix-related tooling to use or literally anything else of interest.

oh man, don't even bother with Bazel... hermetic Python builds are such a mess.

Yeah, I burn my face on that particular stove once every 3 years or so.

"other languages don't have 3 different tools for this." But other languages DO have 3 different tools so we should do that too!