> you should just use virtualenv with pip
This is the most insulting take in the ongoing ruination of Python. You used to be able to avoid virtualenvs and install scripts and dependencies directly runnable from any shell. Now you get endlessly chastised for trying to use Python as a general purpose utility. Debian was a bastion of sanity with the split between dist_packages and site_packages but that's ruined now too.
Unless every Python dependency you ever used was available in your distro (and at that point you're no longer using pip, you're using dpkg...), this never worked well. What solves it well is PEP 723 and the tooling around it.
With PEP 723 and comfortable tooling (like uv), you now get scripts that are "actually directly runnable", not just "fake directly runnable, oops, forgot to apt-get install something" sorta-runnable, and they keep working reliably even when the stuff around them is updated.
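For reference, a PEP 723 inline-metadata header looks like the sketch below (the dependency on `rich` and the filename `hello.py` are just illustrative choices). A runner like uv reads the comment block and builds an ephemeral environment; a plain interpreter ignores it:

```python
# /// script
# requires-python = ">=3.8"
# dependencies = [
#     "rich",  # illustrative third-party dependency
# ]
# ///

# Under `uv run hello.py`, uv reads the comment block above, provisions an
# ephemeral venv containing rich, and runs the script inside it.
# Under plain `python hello.py`, the block is an inert comment.
try:
    from rich import print  # available when a PEP 723-aware runner provides it
except ImportError:
    pass  # plain interpreter without rich: keep the built-in print

message = "hello from a self-contained script"
print(message)
```

Because the metadata lives in the script itself, "directly runnable from any shell" actually holds: there is no separate requirements file to lose and no system package to forget.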
> You used to be able to avoid virtualenvs and install scripts and dependencies directly runnable from any shell.
This wasn't really the case; in principle anything you installed in the system Python environment, even "at user level", had the potential to pollute that environment and thus interfere with system tools written in Python. And if you did install it at system level, that became files within the environment your system package manager is managing, that it doesn't know how to deal with, because they didn't come from a system package.
But it's worse now because of how many system tools are written in Python — i.e., a mark of Python's success.
Notably, these tools commonly include the system package manager itself. Since you mentioned Debian (actually this is Mint, but ya know).
> Now you get endlessly chastised for trying to use Python as a general purpose utility.

No, you don't. Nothing prevents you from running scripts with the system Python that make use of system-provided libraries (including ones that you install later with the system package manager).
If you need something that isn't packaged by your distro, then of course you shouldn't expect your distro to be able to help with it, and of course you should expect to use an environment isolated from the distro's environment. In Python, virtual environments are the method of isolation. All reasonable tooling uses them, including uv.
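Concretely, the isolation step is a one-liner (the directory name `.venv` is a common convention, not a requirement):

```shell
# Create an isolated environment; nothing here touches the system Python.
python3 -m venv .venv

# The interpreter and pip inside it are scoped to this directory only;
# packages installed through them never reach the distro's dist-packages.
.venv/bin/python --version
.venv/bin/python -m pip --version
```

Tools like uv do exactly this under the hood; they just hide the bookkeeping.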
> Debian was a bastion of sanity with the split between dist_packages and site_packages but that's ruined now too.
It's not "ruined". If you choose to install the system package for pip and to use it with --break-system-packages, the consequences are on you, but you get the legacy behaviour back. And the system packages still put files separately in dist-packages. It's just that... doing this doesn't actually solve all the problems, fundamentally because of how the Python import system works.
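You can see where the split points on your own machine; a Debian-family system interpreter reports a `dist-packages` path here, while a venv or an upstream build reports `site-packages`:

```python
import sysconfig

# Directory where third-party pure-Python packages get installed for this
# interpreter: .../dist-packages on Debian/Ubuntu system Pythons,
# .../site-packages everywhere else (including inside a venv).
purelib = sysconfig.get_path("purelib")
print(purelib)
```

The import system searches whichever of these directories is on `sys.path`, which is exactly why the split alone can't prevent every conflict.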
Nowadays pip also defaults to installing to the user's home folder if you don't run it as root.
Basically the only thing missing from `pip install` being a smooth experience is something like npx to cleanly run the modules/binaries that were installed to that directory. As it stands, you still have to futz with the PATH variable to get those scripts to run correctly.
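The directories in question can be queried from the standard-library `site` module; the `bin` subdirectory printed below is what has to end up on PATH (it's `Scripts` on Windows) before user-installed entry points become runnable:

```python
import os
import site

# Where `pip install --user` puts importable packages:
user_site = site.getusersitepackages()

# Entry-point scripts land next to it, under <user base>/bin on POSIX
# (<user base>\Scripts on Windows); this is the directory you have to
# add to PATH by hand.
user_bin = os.path.join(site.getuserbase(), "bin")

print(user_site)
print(user_bin)
```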
> Nowadays pip also defaults to installing to the user's home folder if you don't run it as root.
This could still cause problems if you run system tools as that user.
I haven't checked (because I didn't install my distro's system package for pip, and because I use virtual environments properly), but I'm pretty sure the same marker-file protection (PEP 668's `EXTERNALLY-MANAGED` file) would apply to that folder. (There's no such folder on my system.)
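That guess matches how PEP 668 is specified: when the marker file is present, pip refuses `--user` installs too, not only system-wide ones. A quick way to check whether the running interpreter is protected (the marker's location in the stdlib directory is the one the PEP defines):

```python
import os
import sysconfig

# PEP 668: installers must refuse to modify this environment if the
# stdlib directory contains a file named EXTERNALLY-MANAGED.
marker = os.path.join(sysconfig.get_path("stdlib"), "EXTERNALLY-MANAGED")
protected = os.path.exists(marker)
print(marker, "->", "protected" if protected else "not protected")
```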
> Python as a general purpose utility
This ideology is what caused all the problems to begin with: base Python is built as if it's the only thing in the entire operating system's environment, while its packaging system is built in a way that makes that impossible without manually juggling package conflicts and incompatibilities.
It's because so many essential system tools now rely on Python, and if you install arbitrary code outside of a venv it can clobber the global namespace and break the core OS's guarantees.
I do agree it's annoying, and what they need to do is just provide an automatic "userspace" virtualenv for anything a user installs themselves... but that's a Pandora's box, tbh. (Do you do it per user? How does the user become aware of it?)
What they needed to do is allow side-by-side installs of different versions of the same distribution package and allow specifying or constraining versions at import time, then you wouldn't have the problem at all.
But that's probably not practical to retrofit given the ecosystem as it is now.
That couldn't happen with Debian's dist_packages, which was explicitly for the system tools managed by apt.
pipx solves this perfectly.
For "applications" (which are distributed on PyPI but include specified entry points for command-line use), yes. For development — installing libraries that your own code will use — you'll still generally need something else (although the restriction is really quite arbitrary).
Agreed! Sorry, my read was for apps. You can use --user with pip to install into the user site rather than the system site; however, it still causes overlap, which can be problematic.
This is very true! I was highly surprised when I installed Python from source and found out that this entire problem has been fixed for decades. You can have different Python versions in the same prefix just fine; you just need to pick a default one to install with `make install` and install all the others with `make altinstall`.
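A sketch of that from-source flow (the version numbers and prefix are illustrative; `altinstall` simply skips creating the unversioned `python3`/`pip3` names, so the existing default is left alone):

```shell
# Default interpreter for this prefix: also installs the
# unversioned python3 -> python3.12 links.
cd Python-3.12.4
./configure --prefix=/usr/local
make -j"$(nproc)"
sudo make install

# Additional version side by side: installs only python3.11;
# the unversioned python3 name is untouched.
cd ../Python-3.11.9
./configure --prefix=/usr/local
make -j"$(nproc)"
sudo make altinstall
```

Each version then keeps its own `lib/python3.X/site-packages` under the shared prefix, which is exactly the side-by-side behaviour the parent describes.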