> I don’t want a thing that hijacks my terminal and PATH, flips my world upside down and makes writing automated headless scripts and systemd services a huge pain.
pip and venv are not such things. The activation script is completely unnecessary; it's provided as a convenience for those to whom that workflow makes more sense.
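To make that concrete, here's a minimal sketch — the venv path is an arbitrary example, and nothing is ever sourced:

```shell
# Create a venv once, then address its tools by direct path forever after.
python3 -m venv /tmp/demo-venv

# The venv's own pip, invoked directly -- no `source activate`, no PATH changes:
/tmp/demo-venv/bin/pip --version

# The venv's own interpreter; sys.prefix confirms we're inside the venv:
/tmp/demo-venv/bin/python -c 'import sys; print(sys.prefix)'

# Installing works the same way: /tmp/demo-venv/bin/pip install <package>
```

This is exactly why activation is optional: `activate` just prepends the venv's `bin/` to PATH, and spelling out the path does the same job without touching your shell's environment.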
> Every time I need to containerize or setup a Python project for some arbitrary task, there’s always an issue with “Your Linux distro doesn’t support that version of Python anymore”
I can't fathom why. First off, surely your container image can just pin an older version of the distro? Second, right now I have Python versions 3.3 through 3.14 inclusive built from source on a very not-special consumer Linux distro, and 2.7 as well.
> and triggering an avalanche of new “you really shouldn’t install packages globally” messages, demanding new --yes-destroy-my-computer-dangerously-and-step-on-my-face-daddy flags and crashing my automated scripts from last year.
Literally all you need to do is make one virtual environment and install everything there; again, you can invoke its pip and python via direct paths, without sourcing anything or worrying about environment variables. Oh, and fix your automated scripts so that they'll do the right thing next time.
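For the headless/systemd case specifically, here's a sketch of how little is involved — the paths and the tiny task script are made-up examples:

```shell
# One-time setup: a venv at a fixed location your service or cron job knows about.
python3 -m venv /tmp/myapp-venv

# A stand-in for your actual automated task:
cat > /tmp/myapp-task.py <<'EOF'
import sys
print("running from", sys.prefix)   # proves this is the venv's interpreter
EOF

# This one line is all a cron entry or a systemd ExecStart= needs.
# No activation, no exported variables, nothing hijacking your shell:
/tmp/myapp-venv/bin/python /tmp/myapp-task.py
```

A systemd unit would just put that last command in `ExecStart=`; since the venv's interpreter is self-locating, the service needs no special environment at all.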
> I know that if I was a full-time Python dev there’s a “better way” that I’d know about.
Or, when you get the "you really shouldn't install packages globally" message, you could read it: it gives you detailed instructions about what to do, including pointing you at the documentation (https://peps.python.org/pep-0668/) for the policy change. Or do a minimum of research. You found out that venvs were a thing; search queries like "python venv best practices", "python why do I need a venv", "python pep 668 motivation", or "python why activate virtual environment" turn up lots of useful information.