All of these tools are third-party and the Python core development team can't do anything to prevent people from inventing new ones. Even pip is technically at arm's length; it has special support in the standard library (Python releases will vendor a wheel for it, which is designed to be able to bootstrap itself for installation[0]), but is developed separately.
Standards are developed to allow existing tools to inter-operate; this entails allowing new tools to appear (and inter-operate), too.
This system was in some regards deliberate, specifically to support competition in "build backends". The background here is that many popular Python projects must interface with non-Python code shipped with the project; in many cases this is code in compiled languages (typically C, Fortran or Rust) and it's not always possible to pre-build for the user's system. This can get really, really complicated, and people need to connect to heavyweight build systems in some cases. The Python ecosystem standards are designed with the idea that installers can automatically obtain and use those systems when necessary.
And by doing all of this, Python core developers get to focus on Python itself.
Another important concern is that some bad choices were made initially with Setuptools, and we have been seeing a very long transition because of a very careful attitude towards backwards compatibility (even if it doesn't seem that way!) which in turn is motivated by the battle scars of the 2->3 transition. In particular, it used to be normal and expected that your project would use arbitrary Python code (in `setup.py` at the project root) simply to specify metadata. Further, `setup.py` generally expects to `import setuptools`, and might require a specific version of Setuptools; but it can't express its build-time Setuptools version requirement until the file is already running - a chicken-and-egg scenario.
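To make the problem concrete, here is a hedged sketch of why legacy `setup.py` files were so awkward for tooling: the metadata is the *output* of running arbitrary code, so an installer could not learn a project's name, version, or dependencies without executing the file. (The project name "demo" and the dependency names are hypothetical; no actual Setuptools call is shown.)

```python
# Sketch of a legacy-style setup.py's dynamic metadata (hypothetical
# project "demo"). Tools had to *execute* this code to discover what
# the project even was -- nothing here is readable declaratively.
import platform


def compute_metadata():
    # Dependencies decided by runtime logic, as many old projects did:
    deps = ["somelib>=1.0"]
    if platform.system() == "Windows":
        deps.append("pywin32")  # conditional dependency chosen at run time
    return {"name": "demo", "version": "1.0", "install_requires": deps}


meta = compute_metadata()
print(meta["name"], meta["version"])
```

In a real legacy file this dict would be passed to `setuptools.setup(...)`, which is exactly where the chicken-and-egg bites: the file must `import setuptools` before it can check which Setuptools version it needs.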
Modern projects use a declarative TOML file for "abstract" metadata instead (which is the source for the concrete metadata included in the actual build artifacts), but the whole ecosystem still has to support a lot of really outdated ways of doing things, in part because of how much abandonware is out there.
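For concreteness, a minimal modern `pyproject.toml` might look roughly like this (the project name "demo" and dependency are hypothetical; the backend shown is the Setuptools one):

```toml
# pyproject.toml: metadata an installer can read without executing any code.
[build-system]
requires = ["setuptools>=61"]          # build-time requirement, declared up front
build-backend = "setuptools.build_meta"

[project]
name = "demo"
version = "1.0"
dependencies = ["somelib>=1.0"]
```

Note how `[build-system].requires` dissolves the chicken-and-egg problem: the installer reads the build-time Setuptools requirement from static data before any project code runs.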
[0]: Wheels are zip-compressed, and Python can run code from a zip file, with some restrictions. The pip project is designed to make sure that this will work. The standard library provides a module "ensurepip" which locates this wheel and runs a bootstrap script from that wheel, which will then install into the current environment. Further, the standard library "venv", used to create virtual environments, defaults to using this bootstrap in the newly created environment.
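The zip-import mechanism the bootstrap relies on can be demonstrated in a few lines; this is a minimal sketch (module name `greet` and the temp-file layout are made up for illustration):

```python
# Minimal demonstration that Python can import code straight out of a
# zip archive -- the mechanism the vendored pip wheel relies on
# (wheels are zip-compressed).
import os
import sys
import tempfile
import zipfile

tmp = tempfile.mkdtemp()
zip_path = os.path.join(tmp, "bundle.zip")
with zipfile.ZipFile(zip_path, "w") as zf:
    zf.writestr("greet.py", "def hello():\n    return 'hi from a zip'\n")

sys.path.insert(0, zip_path)  # the stdlib zipimport machinery handles this
import greet

print(greet.hello())  # -> hi from a zip
```

The bootstrap path in practice is `python -m ensurepip`, and `python -m venv` invokes the same machinery in the fresh environment by default.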
It's helpful context but still seems like a lost opportunity for Python to provide the UI. It feels like every couple of years we are reworking the wheel and redefining how to publish software.
With Python over the years I can think of pip, pipx, setuptools, easy_install, distutils, venv, conda, the .egg and wheel formats, and now uv.
PHP stabilized with Composer, Perl with CPAN, Go with `go mod` and `go get` (built in).
Java and Swift had some competition with Gradle/Maven and SwiftPM/CocoaPods, but nothing as egregious.
File tree, dep tree, task DAG: how many ways can they be written?
> It feels like every couple years we are reworking the wheel
Almost literally: https://wheelnext.dev/
> how many ways can they be written?
It's not just a matter of how they're written. For Python specifically, build orchestration is a big deal. But also, you know, there are all the architecture ideas that make uv faster than pip. Smarter (and more generous) caching; hard-linking files where possible rather than copying them; parallel downloads (I tend to write this off but it probably does help a bit, even though the downloading process is intermingled with resolution); using multiple cores for precompiling bytecode (the one real CPU-intensive task for a large pure-Python installation).
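The hard-linking trick in particular is simple enough to sketch. Below, the cache and site-packages directories are stand-ins (temp dirs), and the fallback mirrors what an installer has to do when the cache and the environment live on different filesystems; this is an illustration of the idea, not uv's actual code:

```python
# Sketch of the "hard-link instead of copy" idea: link the cached file
# into place so installs share one on-disk copy (works only when source
# and destination are on the same filesystem).
import os
import shutil
import tempfile

cache = tempfile.mkdtemp()  # stand-in for a wheel cache
site = tempfile.mkdtemp()   # stand-in for site-packages

src = os.path.join(cache, "module.py")
with open(src, "w") as f:
    f.write("x = 1\n")

dst = os.path.join(site, "module.py")
try:
    os.link(src, dst)        # hard link: no file data is copied
except OSError:
    shutil.copy2(src, dst)   # cross-device (or unsupported): fall back to copy

print(os.path.exists(dst), os.stat(src).st_nlink)
```

When the link succeeds, `st_nlink` on the cached file goes to 2 and the "install" costs essentially nothing in disk space or I/O.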
It sounds great and I'm not against uv; it probably is the best. I'm wondering what's wrong with the Python community that 25 years sees 10 package managers. I'm not being cynical; it's a clinical/empirical question.
I don't think this is something "wrong with the Python community", any more than the proliferation of Linux distros, desktop environments etc. is something "wrong with the Linux community".
OK, well, thanks for helping me get some more context on this; I wasn't aware.