Yeah, I agree. In particular it seems insane to me that virtualenv should have to exist. I can't see any valid use case for a machine-global pool of dependencies. Why would anyone think it should be a separate tool rather than just the obvious thing that a dependency manager does? I say this as someone with nearly 20 years of Python experience.
It's the same sort of deal with pyenv--the Python version is itself a dependency of most libraries, so it's a little silly to have a dependency manager that only manages some dependencies.
And in practice it usually ends up being 6 different machine-global pools that all weirdly intersect, and some of them are Python 2.
I started using NodeJS more after lots of Python experience. Packages make so much more sense there. Even imports. You know how hard it is to do the equivalent of `require('../foo.js')` in Python?
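The closest stdlib approximation I know of is something like this (a sketch; `../foo.py` is just an illustrative path, and note that a bare relative path resolves against the current working directory, not the importing file):

```python
import importlib.util
from pathlib import Path

def load_module_from_path(path):
    """Load a module straight from a file path, bypassing the package system."""
    path = Path(path).resolve()
    spec = importlib.util.spec_from_file_location(path.stem, str(path))
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)
    return module

# Anchor on __file__ so the path is relative to this file, like Node's require.
foo = load_module_from_path(Path(__file__).parent / "../foo.py")
```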
I, too, have ~20 years of Python experience.
`virtualenv` is a heavy-duty third-party library that adds functionality to the standard library `venv`. Or rather, `venv` was created as a subset of `virtualenv` in Python 3.3, and the projects have diverged since.
The standard library `venv` provides "obvious thing that a dependency manager does" functionality, so that every dependency manager has the opportunity to use it, and so that developers can also choose to work at a lower level. And the virtual-environment standard needs to exist so that Python can know about the pool of dependencies thus stored. Otherwise you would be forced to... depend on the dependency manager to start Python and tell it where its dependency pool is.
Fundamentally, the only things a venv needs are the `pyvenv.cfg` config file, the appropriate folder hierarchy, and some symlinks to Python (stub executables on Windows). All it's doing is providing a place for that "pool of dependencies" to exist, and providing configuration info so that Python can understand the dependency path at startup. The venvs created by the standard library module — and by uv — also provide "activation" scripts to manipulate some environment variables for ease of use; but these are completely unnecessary to making the system work.
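To make that concrete, here's roughly what hand-rolling one looks like on POSIX (an illustrative sketch only; the names are the conventional ones, and the stdlib `venv` module does all of this properly, including Windows):

```python
import os
import sys
from pathlib import Path

env = Path("myenv")
ver = f"{sys.version_info.major}.{sys.version_info.minor}"

(env / "bin").mkdir(parents=True, exist_ok=True)
(env / "lib" / f"python{ver}" / "site-packages").mkdir(parents=True, exist_ok=True)

# pyvenv.cfg is what tells the interpreter "you are in a virtual environment"
# and where the base ("home") installation lives.
(env / "pyvenv.cfg").write_text(
    f"home = {Path(sys.executable).resolve().parent}\n"
    "include-system-site-packages = false\n"
    f"version = {ver}\n"
)

# A symlink back to the base interpreter; Windows uses stub executables instead.
link = env / "bin" / "python"
if not link.exists():
    os.symlink(Path(sys.executable).resolve(), link)
```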
Fundamentally, tools like uv create the same kind of virtual environment that the standard library does — because there is only one kind. Uv doesn't bootstrap pip into its environments (since that's slow and would be pointless), but you can equally well disable that with the standard library: `python -m venv --without-pip`.
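Or programmatically, the stdlib equivalent is a one-liner (same kind of environment, just without pip bootstrapped in):

```python
import venv
venv.create("myenv", with_pip=False)
```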
> the Python version is itself a dependency of most libraries
This is a strange way of thinking about it IMO. If you're trying to obtain Python libraries, it's normally because you already have Python, and want to obtain libraries that are compatible with the Python you already have, so that you can write Python code that uses the libraries and works under that Python.
If you're trying to solve the problem of deploying an application to people who don't have Python (or to people who don't understand what Python is), you need another layer of wrapping anyway. You aren't going to get end users to install uv first.
I don’t think people consider things from a first principles perspective these days.
“…I can't see any valid use case for a machine-global pool of dependencies…” - Rhetorical question for OP, but how do you run an operating system without having said operating system's dependencies available to everything else?
That quote is mine, so I think you’re meaning to address me?
> how do you run an operating system without having said operating system's dependencies available to everything else?
I’m not sure if I understand your question, but I’ll answer based on what I think you mean. The OS gets compiled into an artifact, so the dependencies aren’t available to the system itself unless they are explicitly added.
You asked what’s the point of a machine-global pool of dependencies - I answered: it’s an OS.
An OS isn’t a machine-global pool of dependencies. Strange claim.
I agree with all of that context about virtualenv and venv, but it all seems orthogonal to my point. I still can’t see a case where you would want the default Python behavior (global dependencies).
> This is a strange way of thinking about it IMO. If you're trying to obtain Python libraries, it's normally because you already have Python, and want to obtain libraries that are compatible with the Python you already have, so that you can write Python code that uses the libraries and works under that Python.
“normally” is biased by what the tooling supports. If Python tooling supported pinning to an interpreter by default then perhaps it would seem more normal?
I write a lot of Go these days, and the libs pin to a version of Go. When you build a project, the toolchain will resolve and (if necessary) download the required Go version just like all of the other dependencies. It's a very natural and pleasant workflow.
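For what it's worth, the Python analogue these days is a `requires-python` bound in `pyproject.toml` (and/or a `.python-version` file); as I understand it, uv will fetch a matching interpreter if one isn't already installed. Roughly (names and versions are just illustrative):

```toml
# pyproject.toml (illustrative)
[project]
name = "example"
version = "0.1.0"
requires-python = ">=3.12"
dependencies = ["requests>=2.31"]
```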
> I still can’t see a case where you would want the default Python behavior (global dependencies).
It's easier to work with if you're new to Python, lazy, or just not generally familiar with the concept of a "project". Tons of people use Python through Jupyter notebooks and install libraries to play with them in a notebook, and have no real reason to think about which installed packages the current notebook actually depends on until there's a conflict, or until they want to share their work (which might never happen).
Also as you're well aware, Python existed for a long time before the virtual environment concept.
> It's easier to work with if you're new to Python, lazy, or just not generally familiar with the concept of a "project".
If you have automatic virtual environments, then it doesn't matter to these personas (including the Jupyter user) unless they need multiple projects or something, in which case automatic virtual environments are strictly better.
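As a concrete example of what I mean by automatic environments: with inline script metadata (PEP 723), something like `uv run example.py` creates or reuses an isolated environment for the script on the fly. A sketch (the dependency list is just an example):

```python
# example.py -- requirements declared inline (PEP 723); `uv run example.py`
# resolves them into an automatically managed, isolated environment.
# /// script
# requires-python = ">=3.12"
# dependencies = ["requests"]
# ///
import requests

print(requests.get("https://example.org").status_code)
```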
> Also as you're well aware, Python existed for a long time before the virtual environment concept.
Correct, but I don't think that supports the idea that global environments have a use case that isn't better served by project-scoped environments.