I must be the odd man out but I am not a fan of uv.
1. It tries to do too many things. Please just do one thing and do it well. It's simultaneously trying to replace pip, pyenv, virtualenv, and ruff in one command.
2. You end up needing to use `uv pip` so it's not even a full replacement for pip.
3. It does not play well with Docker.
4. It adds more complexity. You end up needing to understand all of these new environment variables: `UV_TOOL_BIN_DIR`, `UV_SYSTEM_PYTHON`, `UV_LINK_MODE`, etc.
Your implication is that pyenv, virtualenv, and pip should be 3 different tools. But for the average developer, these tools are all related to managing the Python environment and versions, which in my head sounds like one thing. Other languages don't have 3 different tools for this.
pip and virtualenv also add a ton of complexity and when they break (which happens quite often) debugging it is even harder despite them being "battle tested" tools.
Yeah, I agree. In particular it seems insane to me that virtualenv should have to exist. I can't see any valid use case for a machine-global pool of dependencies. Why would anyone think it should be a separate tool rather than just the obvious thing that a dependency manager does? I say this as someone with nearly 20 years of Python experience.
It's the same sort of deal with pyenv--the Python version is itself a dependency of most libraries, so it's a little silly to have a dependency manager that only manages some dependencies.
And in practice it usually ends up being 6 different machine-global pools that all weirdly intersect, and some are Python 2.
I started using NodeJS more after lots of Python experience. Packages make so much more sense there. Even imports. You know how hard it is to do the equivalent of "require '../foo.js'" in Python?
I, too, have ~20 years of Python experience.
`virtualenv` is a heavy-duty third-party library that adds functionality to the standard library's `venv`. Or rather, `venv` was created as a subset of virtualenv in Python 3.3, and the projects have diverged since.
The standard library `venv` provides "obvious thing that a dependency manager does" functionality, so that every dependency manager has the opportunity to use it, and so that developers can also choose to work at a lower level. And the virtual-environment standard needs to exist so that Python can know about the pool of dependencies thus stored. Otherwise you would be forced to... depend on the dependency manager to start Python and tell it where its dependency pool is.
Fundamentally, the only things a venv needs are the `pyvenv.cfg` config file, the appropriate folder hierarchy, and some symlinks to Python (stub executables on Windows). All it's doing is providing a place for that "pool of dependencies" to exist, and providing configuration info so that Python can understand the dependency path at startup. The venvs created by the standard library module — and by uv — also provide "activation" scripts to manipulate some environment variables for ease of use; but these are completely unnecessary to making the system work.
Fundamentally, tools like uv create the same kind of virtual environment that the standard library does — because there is only one kind. Uv doesn't bootstrap pip into its environments (since that's slow and would be pointless), but you can equally well disable that with the standard library: `python -m venv --without-pip`.
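The point that the standard library alone is enough is easy to check; a minimal sketch using only stdlib `venv`:

```python
import tempfile
import venv
from pathlib import Path

# Create a bare venv with no pip bootstrapped into it
# (the stdlib equivalent of what uv does by default).
target = Path(tempfile.mkdtemp()) / "demo-env"
venv.EnvBuilder(with_pip=False).create(target)

# The whole mechanism rests on one small config file that the
# interpreter reads at startup to find its dependency path.
print((target / "pyvenv.cfg").read_text())
```

The printed config is just a few `key = value` lines (including `home`, which points back at the base interpreter); no activation script is involved.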
> the Python version is itself a dependency of most libraries
This is a strange way of thinking about it IMO. If you're trying to obtain Python libraries, it's normally because you already have Python, and want to obtain libraries that are compatible with the Python you already have, so that you can write Python code that uses the libraries and works under that Python.
If you're trying to solve the problem of deploying an application to people who don't have Python (or to people who don't understand what Python is), you need another layer of wrapping anyway. You aren't going to get end users to install uv first.
I don’t think people consider things from a first principles perspective these days.
“…I can't see any valid use case for a machine-global pool of dependencies…” - Rhetorical question for OP, but how do you run an operating system without having said operating system's dependencies available to everything else?
That quote is mine, so I think you’re meaning to address me?
> how do you run an operating system without having said operating system's dependencies available to everything else?
I’m not sure if I understand your question, but I’ll answer based on what I think you mean. The OS gets compiled into an artifact, so the dependencies aren’t available to the system itself unless they are explicitly added.
You asked what’s the point of a machine-based global pool of dependencies - I answered: it’s an OS.
An OS isn’t a machine-global pool of dependencies. Strange claim.
I agree with all of that context about virtualenv and venv, but it all seems orthogonal to my point. I still can’t see a case where you would want the default Python behavior (global dependencies).
> This is a strange way of thinking about it IMO. If you're trying to obtain Python libraries, it's normally because you already have Python, and want to obtain libraries that are compatible with the Python you already have, so that you can write Python code that uses the libraries and works under that Python.
“normally” is biased by what the tooling supports. If Python tooling supported pinning to an interpreter by default then perhaps it would seem more normal?
I write a lot of Go these days, and the libs pin to a version of Go. When you build a project, the toolchain will resolve and (if necessary) install the required Go version just like all of the other dependencies. It’s a very natural and pleasant workflow.
> I still can’t see a case where you would want the default Python behavior (global dependencies).
It's easier to work with if you're new to Python, lazy, or just not generally familiar with the concept of a "project". Tons of people use Python through Jupyter notebooks and install libraries to play with them in a notebook, and have no real reason to think about which installations are required for the current notebook until there's a conflict, or until they want to share their work (which might never happen).
Also as you're well aware, Python existed for a long time before the virtual environment concept.
> It's easier to work with if you're new to Python, lazy, or just not generally familiar with the concept of a "project".
If you have automatic virtual environments then it doesn't matter to these personas (including the Jupyter user) unless they need multiple projects or something in which case automatic virtual environments are strictly better.
> Also as you're well aware, Python existed for a long time before the virtual environment concept.
Correct, but I don't think that supports the idea that global environments have a use case that isn't better served by project-scoped environments.
I think OP's complaint is rather that using `uv` is leaky: now you need to learn all the underlying stuff AND uv as well.
The alternative, of course, is having Python natively support a combined tool. Which you can support while also not liking `uv` for the above reason.
I don't think that's true, most projects using uv don't rely on those tools at all, and you don't need to understand them. You just `uv sync` and do your work.
I'm just clarifying OP in response to parent comment that ignored the most important parts of OP's message. Your comment also seems to ignore the meat of OP's complaint. But maybe I'm interpreting OP's comment as "my use cases for these tools are complicated" because I, too, have complicated Python builds with exotic compiled dependencies.
Python versions and environments can also be solved at a more reliable abstraction level, e.g. if you are a heavy Nix user.
On the other hand, Nix and Bazel and friends are a lot of pain. I'm sure the tradeoff makes sense in a lot of situations, but not needing to bring in Nix or Bazel just to manage dependencies is a pretty big boon. It would be great to see some of the all-in-one build tools become more usable though. Maybe one day it will seem insane that every language ecosystem has its own build tool because there's some all-in-one tool that is just as easy to use as `(car)go build`!
Well Nix is the only sane way I know to manage fully reproducible envs that incorporate programs/scripts spanning multiple ecosystems. Very common situation in applied data analysis.
Nix is a 10x force multiplier for managing Linux systems. The fact that I can write Python, Go, bash, jq, whatever tool is right for the job of managing and configuring the system, is amazing. And on top of that I can patch any part of the entire system with just that: a patch from my fork on GitHub or anywhere else.
Top that off with first-class programming capabilities and modularization, and I can share common configuration and packages across systems. Add that those same customized packages can be directly included in a dev shell, making all of the amazing software out there available for tooling and support. It has really changed my outlook, and I have so much fun now not EVER dealing with tooling issues except when I explicitly upgrade my shell and nixpkgs version.
I just rebuilt our CI infrastructure with Nix and was able to configure multiple isolated dockerd daemons per host, calculate the subnet spread for all the networks, and write scripts configuring the env so you can run docker1 and hit daemon 1. Now we can saturate our CI machines with more parallel work without them fighting over Docker system resources like ports. I never would have attempted this without Nix. Being able to generate the entire system config tree and inspect systemd service configs before even applying them to a host cut my iteration loop to an all-time low in infrastructure land, where 10-15 minute lead times of building images to find out I'd misspelled Kafka as "kakfa" somewhere used to mean another 15-minute rebuild. Now I get almost instant feedback on most of these types of errors.
> Maybe one day it will seem insane that every language ecosystem has its own build tool because there's some all-in-one tool that is just as easy to use as `(car)go build`!
Yep: Nix
Unless you’re packaging anything or consuming packages or teasing out the conflicting advice from the community on which nix-related tooling to use or literally anything else of interest.
oh man, don't even bother with bazel... hermetic python builds are such a mess.
Yeah, I burn my face on that particular stove once every 3 years or so.
"other languages don't have 3 different tools for this." But other languages DO have 3 different tools so we should do that too!
> 1. It tries to do too many things. Please just do one thing and do it well. It's simultaneously trying to replace pip, pyenv, virtualenv, and ruff in one command.
I think there are more cases where pip, pyenv, and virtualenv are used together than not. It makes sense to bundle the features of the three into one. uv does not replace ruff.
> 2. You end up needing to use `uv pip` so it's not even a full replacement for pip.
uv pip is there for compatibility and to facilitate migration, but once you are fully on the uv workflow you rarely need `uv pip`, if ever.
> 3. It does not play well with Docker.
In what sense?
> 4. It adds more complexity. You end up needing to understand all of these new environment variables: `UV_TOOL_BIN_DIR`, `UV_SYSTEM_PYTHON`, `UV_LINK_MODE`, etc.
You don't need to touch them at all
> It tries to do too many things. Please just do one thing and do it well. It's simultaneously trying to replace pip, pyenv, virtualenv, and ruff in one command.
uv doesn’t try to replace ruff.
> You end up needing to use `uv pip` so it's not even a full replacement for pip.
"uv pip" doesn't use pip, it provides a low-level pip-compatible interface for uv, so it is, in fact, still uv replacing pip, with the speed and other advantages of uv when using that interface.
Also, while I’ve used uv pip and uv venv as part of familiarizing myself with the tool, I’ve never run into a situation where I need either of those low-level interfaces rather than the normal high-level interface.
> It does not play well with Docker.
How so?
There is an optional & experimental code formatting tool within uv (that just downloads ruff), which is what OP may be referring to: https://pydevtools.com/blog/uv-format-code-formatting-comes-...
uv's pip interface is like dipping one toe in the bathtub. Take a minute and try on the full managed interface instead: https://docs.astral.sh/uv/concepts/projects/dependencies. Your commands then become:
- uv add <package_name>
- uv sync
- uv run <command>
Feels very ergonomic, I don't need to think much, and it's so much faster.
So I have been doing Python for far too long and have all sorts of tooling I've accreted to make Python work well for me across projects and computers, and I never quite made the leap to Poetry and was suspicious of uv.
Happened to buy a new machine and decided to jump in the deep end and it's been glorious. I think the difference from your comment (and others in this chain) and my experience is that you're trying to make uv fit how you have done things. Jumping all the way in, I just . . . never needed virtualenvs. Don't really think about them once I sorted out a mistake I was making. uv init and you're pretty much there.
>You end up needing to use `uv pip` so it's not even a full replacement for pip
The only time I've used uv pip is on a project at work that isn't a uv-powered project. uv add should be doing what you need and it really fights you if you're trying to add something to global because it assumes that's an accident, which it probably is (but you can drop back to uv pip for that).
>`UV_TOOL_BIN_DIR`, `UV_SYSTEM_PYTHON`, `UV_LINK_MODE`, etc.
I've been using it for six months and didn't know those existed. I would suggest this is a symptom of trying to make it be what you're used to. I would also gently suggest those of us who have decades of Python experience may have a bit of Stockholm Syndrome around package management, packaging, etc.
> 1. It tries to do too many things. Please just do one thing and do it well. It's simultaneously trying to replace pip, pyenv, virtualenv, and ruff in one command.
In my experience it generally does all of those well. Are you running into issues with the uv replacements?
> 2. You end up needing to use `uv pip` so it's not even a full replacement for pip.
What do end up needing to use `uv pip` for?
> It tries to do too many things. Please just do one thing and do it well.
I disagree with this principle. Sometimes what I need is a kitset. I don't want to go shopping for things, or browse multiple docs. I just want it taken care of for me. I don't use uv so I don't know if the pieces fit together well but the kitset can work well and so can a la carte.
uv has played well with Docker in my experience, from dev containers to CI/CD to production image builds. Would be interested to hear what is not working for you.
The uv docs even have a whole page dedicated to Docker; you should definitely check that out if you haven't already: https://docs.astral.sh/uv/guides/integration/docker/
> 2. You end up needing to use `uv pip` so it's not even a full replacement for pip.
Needing pip and virtualenvs was enough to make me realize uv wasn't what I was looking for. If I still need to manage virtualenvs and call pip I'm just going to do so with both of these directly.
I had been hoping someone would introduce the non-virtualenv package management solution that every single other language has where there's a dependency list and version requirements (including of the language itself) in a manifest file (go.mod, package.json, etc) and everything happens in the context of that directory alone without shell shenanigans.
> I had been hoping someone would introduce the non-virtualenv package management solution that every single other language has where there's a dependency list and version requirements (including of the language itself) in a manifest file (go.mod, package.json, etc) and everything happens in the context of that directory alone without shell shenanigans.
Isn't that exactly a pyproject.toml via the uv add/sync/run interface? What is it missing that you need?
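For reference, a minimal sketch of such a manifest (project name and dependency are placeholders):

```toml
[project]
name = "myapp"                # placeholder
version = "0.1.0"
requires-python = ">=3.12"    # pins the language version, like go.mod's `go` directive
dependencies = [
    "requests>=2.31",         # placeholder dependency
]
```

`uv add` edits this file (and the `uv.lock` lockfile) for you, and `uv run` executes commands against a matching environment.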
> pyproject.toml
Ah ok I was missing this and this does sound like what I was expecting. Thank you!
What are you needing to use `uv pip` for? I don't think I ever call into pip from uv for anything nowadays. I typically just need to do `uv sync` and `uv run`, maybe sometimes `uvx` if I want to run some random 3rd party python script
> I had been hoping someone would introduce the non-virtualenv package management solution that every single other language has where there's a dependency list and version requirements (including of the language itself) in a manifest file (go.mod, package.json, etc) and everything happens in the context of that directory alone without shell shenanigans.
If you are using uv, you don’t need to do shell shenanigans, you just use uv run. So I'm not sure how uv with pyproject.toml doesn't meet this description (yes, the venv is still there, it is used exactly as you describe.)
In most cases, you don't really need to manage virtual envs though? uv commands that need a venv will just create one for you, or install to the existing one automatically.
Pixi is an alternative that you may want to try.
Really sounds like you're using it wrong, no? I completely forgot about virtualenvs, pip, and requirements.txt since I started using uv.
This. I was researching uv to replace my pipenv+pyenv setup, but after reading up a bit I decided to just give up. Pipenv is just straightforward and “just works”. Aside from being slow, not much is wrong with it. I’m not in the mood to start configuring uv, a tool that should take me 2 minutes and a `uv --help` to learn.
What doesn’t just work about uv in particular? You basically need three commands - uv add, uv sync, and uv run. Forget about virtual environments, and get back to working. No configuration necessary.
Slow doesn't really do it justice; I'd have to wait >5 minutes for pipenv to finish figuring out our lock file. uv does it in less than a second.
> Pipenv is just straightforward and “just works”
I have worked on numerous projects that started with pipenv and it has never "just worked", ever. Either there's some trivial dependency conflict that it can't resolve, or it's slow as molasses, or something or other. pipenv has been horrible to use. I started switching projects to pip-tools, and now I recommend using uv.
Yeah, I'm with you. I'm forcing myself to learn it because it looks like that's the way PyWorld is going. I don't dislike uv as much as poetry. But I guess I never really ran into issues using pyenv and pip. shrug Maybe I wasn't working on complex enough projects.
Yeah, I find that I like to use uv for one thing: quickly/efficiently getting a Python into a new venv for some project. A la:
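Something like this (a sketch; the exact Python version is whatever you want pinned):

```shell
# uv fetches the requested interpreter if needed and creates a plain venv from it.
uv venv --python 3.12 .venv
source .venv/bin/activate

# Then install pinned requirements into it, like normal.
uv pip install -r requirements.txt
```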
So here uv installs the Python version wanted. But it's just a venv. And we pip install using requirements.txt, like normal, within that venv. Someone, please tell me what's wrong with this. To me, this seems much less complicated than some uv-centric .toml config file, plus some uv-centric commands for more kinds of actions.
What do you mean it doesn't play well with docker?
If your pyproject.toml is set up properly you shouldn't need to use `uv pip` at all.
I'm using uv in two dozen containers with no issues at all. So not sure what you mean that it doesn't play well with Docker.
We use it for many projects. None of these are true for us.
Why do you need to use uv pip?
What problems you have in Docker?
I don't understand any of those env variables you listed, yet I use uv without problems.
> You end up needing to use `uv pip` so it's not even a full replacement for pip.
No you don't. That's just a set of compatibility approaches for people who can't let go of pip/venv. Move to uv and PEP 723, and the world's your oyster.
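For anyone who hasn't seen it: PEP 723 lets a single-file script declare its own requirements in a comment block, which `uv run` reads to provision an interpreter and environment on the fly. A minimal sketch (empty dependency list, just to show the shape):

```python
# /// script
# requires-python = ">=3.9"
# dependencies = []   # add e.g. "requests>=2" and `uv run` will install it
# ///

# `uv run script.py` reads the metadata block above; plain `python script.py`
# simply ignores it, since it's all comments.
import sys

print(f"running under Python {sys.version_info.major}.{sys.version_info.minor}")
```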
> It does not play well with Docker.
Huh? I use uv both during container build and container runtime, and it works just fine?
> You end up needing to understand all of these new environmental variables
Not encountered the need for any of these yet. Your comments on uv are so far out of line with all the uses I've seen that I'd love to hear what you're specifically doing that makes these into breaking points.
What problems do you encounter using it with Docker?
When do you use `uv pip`? I never use it. It feels like an edge case only command.
I would also add UV_NO_SYNC as something I had to learn. It comes up in combination with uv pip.
What's your use case for UV_NO_SYNC? I assume the option exists for a reason, but aside from maybe a modest performance improvement when working with a massive complex package environment, I'm not sure what problem it solves.
Some packages that I use for development need to be part of the virtual env, for example ipdb. So I do `uv pip install ipdb`.
But then, after `uv add somepackage`, `uv sync` happens and cleans up all extras. To keep extras, you need to run `uv sync --inexact`, but there is no env var for `--inexact`, so I end up doing the sync manually.
uv has support for dev-time dependencies https://docs.astral.sh/uv/concepts/projects/dependencies/#de...
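Concretely, `uv add --dev ipdb` records the package in a dev group that `uv sync` installs by default instead of pruning; the pyproject.toml entry looks roughly like:

```toml
[dependency-groups]
dev = [
    "ipdb",
]
```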
It's funny, I feel like half the reason I use docker is for python projects.
5. No concept of global/shell/local venv auto-activation, so get used to typing "uv run", or to manually recreating these concepts with shell stuff.
uv pip is a full reimplementation of pip. Way faster, better caching, less disk usage. What's not to like about it?
It's still one tool to orchestrate and run everything, which is preferable to many.
Yeah, I've moved away from it too, but it's a great tool. The rush of Rust tools is the best thing that has happened to Python in the past decade.
oh look, the average golang fan. here’s a challenge for you: explain _why_ the complexity is bad without:
- resorting to logical fallacies, or
- relying on your unstated assumption that all complexity is bad
I mean, I’ve had quite awful bugs from using pip, pyenv, and venv at the same time.