> Python/pip though… Every time I need to containerize or setup a Python project for some arbitrary task, there’s always an issue with “Your Linux distro doesn’t support that version of Python anymore” [...]
How are you containerizing Python projects? What confuses me about your statement are the following things:

(1) How old must the Python version of those projects be to no longer be supported by any decent GNU/Linux distribution?
(2) Are you not using official Python docker images?
(3) What's pip gotta do with a Python version being supported?
(4) How does that "Your Linux distro doesn’t support that version of Python anymore" show itself? Is that a literal error message you are seeing?
> [...] demanding new `--yes-destroy-my-computer-dangerously-and-step-on-my-face-daddy` flags and crashing my automated scripts from last year
It seems you are talking about installing things into the system Python, which you shouldn't do. More questions:

(1) Why are you not using virtual environments?
(2) You claim Node.js projects are better in this regard, but all they do is create a `node_modules` folder. Why then is it a problem for you to create a virtual environment folder? Is it merely that one is automatic and the other isn't?
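For comparison, a venv is just a folder sitting next to the project, much like `node_modules`. A minimal sketch (project and package names are placeholders):

```sh
cd myproject                    # placeholder project
python -m venv .venv            # creates ./.venv, the Python analogue of ./node_modules
.venv/bin/pip install requests  # dependencies land in that folder, not system-wide
```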
> This was always my issue with pip and venv: I don’t want a thing that hijacks my terminal and PATH, flips my world upside down and makes writing automated headless scripts and systemd services a huge pain.
It is very easy to activate a venv for just one command: use a subshell in which you run `. venv/bin/activate && python ...(your program invocation here)...`. Aside from that, projects can be set up so that you don't even notice they are using a venv. For example, I usually create a Makefile that handles activating the venv, running the code, and all that for me; rarely, if ever, do I have to activate it manually. Since each line in a Makefile target runs in its own shell, nothing ever pollutes my actual top-level shell.
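A minimal sketch of that subshell pattern (the script name is a placeholder):

```sh
# The parentheses spawn a subshell; the activation dies with it,
# so the parent shell's PATH is never touched.
(. venv/bin/activate && python my_script.py)

# Or skip activation entirely and call the venv's interpreter by path:
venv/bin/python my_script.py
```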
> (1) How old must the Python version of those projects be, to not be supported any longer with any decent GNU/Linux distribution?
Debian-13 defaults to Python-3.13. Between Python-3.12 and Python-3.13 the support for `pkg_config` got dropped, so pip projects like
https://pypi.org/project/remt/
break. What I was not aware of: `venv`s need to be created with the version of Python they are supposed to be run with. So you need to have a downgraded Python executable first.
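A rough sketch of that workaround, assuming an older interpreter (here 3.12, e.g. from pyenv or a distro package) is already available on PATH:

```sh
python3.12 -m venv venv    # the venv is pinned to 3.12, not the distro's 3.13
venv/bin/pip install remt  # the project linked above
```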
> What I was not aware of: `venv`s need to be created with the version of Python they are supposed to be run with. So you need to have a downgraded Python executable first.
This is one of uv’s selling points. It will download the correct Python version automatically, create the venv using it, ensure that venv has your dependencies installed, and ensure that venv is active whenever you run your code. I’ve also been bitten by the issue you’re describing many times before, and previously had to use a mix of tools (e.g. pyenv + pipenv). Now uv does it all, and much better than any previous solution.
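A rough sketch of the same workflow with uv (the 3.12 pin and the script name are assumptions):

```sh
uv python install 3.12        # downloads a standalone CPython 3.12
uv venv --python 3.12         # creates .venv using that interpreter
uv pip install remt           # installs into .venv
uv run python my_script.py    # runs inside .venv, no manual activation needed
```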
> (2) Are you not using official Python docker images?
Would you help me make it work?
> How do I source it?

Instead of `python --version`, just use the `python` executable from within the venv. Sourcing is a concept for interactive shells.
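In other words (a minimal sketch; the script name is a placeholder):

```sh
./venv/bin/python --version      # no sourcing/activation required
./venv/bin/python my_script.py
```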
The python executable from the venv is not going to work inside the container, because by default it's a symlink. That venv was built on the host OS, and the symlink points at a Python binary that doesn't exist inside the container.
You could also pass the `--copies` parameter when creating the initial venv, so it contains copies instead of symlinks, but that is not going to work if you're on macOS or Windows (because the binary platform differs from the Linux running the container), or if your development Python is built against different library versions than the container you're starting.
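For reference, the `--copies` variant looks roughly like this, with the caveats above still applying:

```sh
python -m venv --copies venv
ls -l venv/bin/python    # now a real file rather than a symlink
```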
I'd recommend re-creating the virtual environment inside the Docker container.
The problem is you are mounting a virtual environment you have built in your development environment into a Docker container. Inside your virtual environment there's a `python` binary that in reality is a symlink to the python binary in your OS:
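(Output abbreviated and illustrative; the exact target depends on how and where the venv was created.)

```sh
$ ls -l venv/bin/python
lrwxrwxrwx ... venv/bin/python -> /usr/bin/python3.13
```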
So, when you mount that virtual environment in a container, it won't find the path to the python binary.

The most basic fix would be recreating the virtual environment inside the container, so from your project (approximately, I don't know the structure):
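Something along these lines; the image, file names, and venv location are assumptions, and putting the venv outside the bind mount avoids clobbering the one on the host:

```sh
docker run --rm -it -v "$PWD":/app -w /app python:3.13 \
    sh -c "python -m venv /opt/venv \
        && /opt/venv/bin/pip install -r requirements.txt \
        && /opt/venv/bin/python app.py"
```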
But if you're developing and don't want to rebuild the virtual environment each time you start the container, you could create a cache volume for uv; after the first-time installation, everything is going to be way faster (a rough sketch follows below).

You can also see some other examples, including a Docker Compose one that automatically updates your packages, here: https://docs.astral.sh/uv/guides/integration/docker/#develop...
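A rough sketch of the cache-volume idea, using uv's official image; the image tag, cache path (`/root/.cache/uv`, uv's default for root), and file names are assumptions to verify against the docs linked above:

```sh
docker run --rm -it \
    -v "$PWD":/app -w /app \
    -v uv-cache:/root/.cache/uv \
    ghcr.io/astral-sh/uv:python3.13-bookworm-slim \
    sh -c "uv venv /opt/venv \
        && VIRTUAL_ENV=/opt/venv uv pip install -r requirements.txt \
        && /opt/venv/bin/python app.py"
```

The named `uv-cache` volume survives container restarts, so repeat installs mostly hit the cache instead of the network.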