> no standard way of setting up an environment for MANY years hurt

Serious question: is that solved? I still see a forest of options, some of which depend on each other, and at last count my laptop has 38 python binaries. What's the standard way?

There's no "de jure" standard, but uv sure looks like it's on its way to becoming the "de facto" standard.

uv.

https://docs.astral.sh/uv/

It’s hard to call that a standard; it’s just the latest HN Rust-craze idolisation.

Sure, but it’s demonstrably better than Poetry, which was the best until uv.

If uv isn’t a standard, it’s because not enough people have tried it. It is obscenely good at its job.

uv is young, unstable and still missing details. It has multiple updates per month; nearly every week there are significant enhancements and bug fixes. It's not mature enough to be a standard yet, even though what it already offers is excellent. But let it grow; change takes time.

   standard
You keep using that word, I don't think it means what you think it means.

"Prepare to die."

False dichotomy. Been using pipenv for 8 years. At first it was a bit too slow, but at this point it gets the job done.

uv is an excellent piece of software regardless of the language used to write it. Really, if you do Python, it's worth giving it a try, especially script mode.
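
For instance, script mode lets you put dependency metadata inline (the PEP 723 block at the top) and run the file with `uv run`. A minimal sketch; this example declares no third-party dependencies, so plain `python` runs it too:

```python
# /// script
# requires-python = ">=3.9"
# dependencies = []
# ///
# `uv run demo.py` reads the metadata block above and provisions a matching
# interpreter and an isolated environment before executing the code below.
import json

payload = json.dumps({"tool": "uv", "mode": "script"})
print(payload)
```

Add a real package to the `dependencies` list and uv fetches it on the fly, with no manual venv or install step.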

Sure, but its quality, or your/my opinion of it, doesn’t make it ‘standard’, even if it will be one some day.

Eh, I’m not so sure.

We didn’t see adoption nearly this fast for poetry, pipenv, or conda (or hatch or PDM, but I never saw those as even reaching critical mass in the first place).

Those tools got pretty popular, but it took a long time and most folks found them to have a lot of tradeoffs (miles better than Python’s first party tooling, but still).

I’m not seeing that with “uv”. Other than concerns about Astral’s stewardship model (which could be valid!), I’m not seeing widespread “it works but is hard to use” dissatisfaction with it the way I do with, say, poetry.

Couple that with uv durably eliminating the need for pyenv/asdf/mise by removing the pain of local interpreter compilation entirely, and I do think that adds up to uv being fundamentally different in popularity and approach compared to prior tools. Is that “different” the same as “better”? Time will tell.

As to being written in Rust? Shrug. A ton of shops for whom uv has been transformative don’t even know or care what language it’s written in. Being Rust provides, in my opinion, two benefits: a) avoiding chicken-and-egg problems by writing the tool for managing a programming language environment in a different language that is b) not bash.

> avoiding chicken-and-egg problems by writing the tool for managing a programming language environment in a different language

I've heard this a lot, but I don't really understand the use case. It seems that people want to do development in Python, want to install and manage third-party Python packages, and know how to use command-line tools, but somehow they don't already have Python installed and would find it challenging to install directly? Building from source on Linux is a standard "get dev packages from the system package manager; configure, make and install" procedure that I've done many times (yes, putting it beside the system Python could break things, but you can trivially set an alternate install prefix, and anyway the system Python will usually be a version that meets the basic needs of most developers). Installing on Windows is a standard Windows installer experience.

Aside from that, people seem to imagine "chicken-and-egg" scenarios with pip making itself available in the environment. But this is a thoroughly (if inefficiently) solved problem. First off, for almost three years now pip has been able to install cross-environment (albeit with an ugly hack; I detail some of this in https://zahlman.github.io/posts/2025/01/07/python-packaging-...). Second, the standard library `venv` defaults to bootstrapping pip into new environments — taking advantage of the fact that pre-built Python packages are zip archives, and that Python has a protocol for running code from zip archives, which the pip package implements.
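
That zip-execution protocol is easy to see in action. Here's a minimal sketch (the file names are made up) that builds a throwaway archive containing a `__main__.py` and hands it straight to the interpreter:

```python
import os
import subprocess
import sys
import tempfile
import zipfile

# Python can execute a zip archive directly if it contains a top-level
# __main__.py -- the same mechanism the bundled pip wheel relies on.
with tempfile.TemporaryDirectory() as tmp:
    archive = os.path.join(tmp, "demo.zip")
    with zipfile.ZipFile(archive, "w") as zf:
        zf.writestr("__main__.py", "print('running from inside a zip')")
    result = subprocess.run(
        [sys.executable, archive], capture_output=True, text=True, check=True
    )

print(result.stdout.strip())
```

pip's bootstrap is the same trick at a larger scale: a zip of Python code that the interpreter can run without pip being "installed" anywhere first.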

The real bootstrapping issue I've heard about is https://github.com/pypa/packaging-problems/issues/342 , but this affects very few people — basically, Linux distro maintainers who want to "build" an entire Python toolchain "from source" even though it's all Python code that the runtime could bytecode-compile on demand anyway.

Whether or not you like it, and whether or not it is better, it is not in the standard lib, and is not the way the vast majority of people install python libraries.

You forgot to update your HN craze list. Zig is chic, Rust is out.

No, it’s not. Everywhere I look, uv is being adopted.

We're like a year into the uv hype cycle. It needs enough time to make sure it solves the issues of its predecessors.

So what if uv is everywhere you look? So were Poetry, pipenv, and so on. Give it time.

Tried all of them, dropped all of them. Stuck with uv; I’ll take my chances.

I'm using uv but in corporate places I'm seeing just Conda.

Yes. Try uv and never look back.

You still need pip-tools in a uv environment.

What for (honest question)? Doesn't uv handle locking?

uv for project management and pipx for user-/system-wide tool installation.

uv handles that too with "uv tool".

But does it create completely isolated, updatable tools with all of pipx's functionality?

Yep!

What is Astral's business model?

> Serious question: is that solved?

It depends on what "setting up" means.

Creating an environment, given that the Python binary it will use is already installed, is trivial (standard library functionality since late 2012). So is choosing which environment to use. So is installing pre-built packages, and even legacy source packages are pretty easy (but slow, and installation runs arbitrary code which is entirely needless for these) when they only contain pure Python code. Even dependency resolution is usually not too bad.
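
A minimal sketch of that standard-library path (the `venv` module has shipped since Python 3.3, released late 2012):

```python
import os
import tempfile
import venv

# Create an environment without bootstrapping pip (pass with_pip=True to
# bundle it); venv writes a pyvenv.cfg marker identifying the directory.
with tempfile.TemporaryDirectory() as tmp:
    env_dir = os.path.join(tmp, ".venv")
    venv.create(env_dir, with_pip=False)
    has_marker = os.path.isfile(os.path.join(env_dir, "pyvenv.cfg"))

print(has_marker)  # True
```

From the shell it's just `python -m venv .venv`; everything the third-party tools add sits on top of this.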

The big problems are things like

* building multi-language packages from source locally, because this is expected to set up temporary local build environments (and build tools have to avoid recursion there)

* using external non-Python dependencies (essentially unsolved, and everyone works around this by either vendoring stuff or by not declaring the dependency and failing at runtime) — see https://pypackaging-native.github.io/ for an overview of the problems and https://peps.python.org/pep-0725/ for what they're trying to standardize to deal with it

* dealing with metadata for source packages; in the really general case you have to build the source to get this (although the package-building API now provides a hook so that build backends can specifically prepare metadata). This is mainly because some packages have dependencies that depend on very particular platform details that (apparently) can't be expressed with the "environment marker" scheme in standard metadata (https://peps.python.org/pep-0508/#environment-markers)

* and, of course, figuring out which packages need to be in your environment (Python won't decide for you what your direct dependencies are) and managing that environment over time. The reason all these other tools popped up is because Pip only installs the packages and offers very basic environment inspection; it's only now starting to do anything with lockfiles, for example, now that there is finally a standard for them (https://peps.python.org/pep-0751/).
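
To make the environment-marker scheme concrete, here is what markers look like in standard requirements syntax (the package names and version bounds are just illustrative). The expression after the semicolon gates installation on platform details:

```
pywin32 >= 306 ; sys_platform == "win32"
uvloop >= 0.19 ; sys_platform != "win32" and python_version >= "3.9"
```

The source-metadata problem above is exactly the cases this vocabulary can't express.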

But if you mean, is there a standard toolchain that does everything and will be officially blessed by the core language developers, then no, you should not ever expect this. There is no agreement on what "everything" entails, and Python users (a large fraction of whom don't fit the traditional image of a "developer" at all) have widely varying workflows and philosophical/aesthetic preferences about that. Besides which, the core language team doesn't generally work on or care about the problem; they care about the interpreter first and foremost. Packaging is an arms-length consideration. Good news, though: the Python Packaging Authority (not at all authoritative, and named with tongue firmly in cheek, but a lot of people didn't get that) is stepping up and working on official governance (see https://peps.python.org/pep-0772/).

> at last count my laptop has 38 python binaries

Something has gone very wrong (unless you're on Windows, such that admin rights would be needed to create symlinks and by default `venv` doesn't try). To be clear, I mean with your setup, not with the tooling. You should only need one per distinct version of Python that your various environments use. I'd be happy to try to help if you'd like to shoot me an email (I use that Proton service, with the same username as here) and give more details on how things are currently set up and what you're trying to accomplish that way.