Python just works like a normal Unix program. Some people like that because they can reason about it the way they reason about any other utility, so it has advantages when using Python as a scripting language - which is what it was invented as. AI/ML/ASGI/blablahblah are just specific applications with problems that seem overwhelmingly important to their users.

Is this an AI-generated comment?

Node or PHP also work like normal Unix programs...

PHP and Node were not developed as general-purpose scripting languages for use at the command line, and they are very commonly used for specific purposes, so there's no need for them to be like Python.

I wonder what good you think insults do? I could insult your use of English, for example, but would that make my argument better?

Please do insult my use of English, it's a learning experience and it might be the same for others reading these comments.

> PHP and Node were not developed as general-purpose scripting languages for use at the command line, and they are very commonly used for specific purposes, so there's no need for them to be like Python.

Perl, Ruby, Lua - I can keep going. You're just nitpicking. Of all the programming languages I've used, practically only Python uses the venv [1] approach.

[1] Manual activation needed, a non-portable pile of dependencies per project (by design - fairly sure the documentation mentions this is not a supported use case, even across quasi-identical machines!!!), etc. I stand by my decision to call venvs a "fractal of bad design".

There are 5000 programs in my /usr/bin. They take settings from the current environment, which means you can set a variable on the command line for that process without finding that it is ineffectual because it has been overridden by some configuration file somewhere or by some weird "automatic" behaviour. "Clever" tools end up being a pain in the arse when you have to second-guess them and work around their behaviour.
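
For the record, "set a variable on the command line" means literally this, and it affects only that one process (the filename is just an example):

    # one-off override: the variable is set for this process only, nothing persists
    LC_ALL=C sort words.txt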

As things are, I can share a venv with several projects, or have one for my account if I don't want to break the system tools. I can even write a trivial bash function to activate a venv automatically based on whether a venv exists in my project. It's so trivial to do and yet generates all this discussion.
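
Something like this in ~/.bashrc does it (assuming the venv is called .venv in the project root - the name is just my convention):

    # wrap cd: if the directory we land in has a .venv, activate it
    cd() {
        builtin cd "$@" || return
        if [ -f ./.venv/bin/activate ]; then
            . ./.venv/bin/activate
        fi
    }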

As for non-portability, that's the most specious and pathetic argument of the lot. Who can be bothered to make every library portable to every distro? What is the point of distros if one demands to be able to run the same binaries everywhere? This is what containers were invented for. If you need the latest versions you might just, heaven forbid, have to compile something and fight with the problems of running it on a distro that it wasn't developed on.

I don’t understand what you’re talking about. Literally all common interpreted languages adhere to Unix standards: they take arguments, read environment variables, handle input and output streams, and load libraries.

Their single advantage over Python is that they are able to work fine without virtual environments, as they just load libraries from a relative path: That way, you can copy-paste a project directory, move it to another system with a copy of the interpreter binary, and… run the software. There is nothing clever about that; I’d even say Python's way of magically configuring a shell just to be able to locate files is the "clever" solution that nobody asked for!
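
A rough sketch of that with Node, say (paths and host made up):

    # the app's dependencies live under myapp/node_modules, so copying the directory is enough
    scp -r myapp/ me@otherbox:apps/
    ssh me@otherbox 'cd apps/myapp && node index.js'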

Portable as in "you can't copy a virtual env to an equivalent machine to deploy it". Look up the documentation, it's not recommended because of the hardcoded absolute paths or something like that :-|

Python venvs literally f*ed up the simplest form of deployment on the planet, scp. Yes, we have more complex solutions like Docker, another abomination (the software itself). Docker was invented in large part because of Python (not only, but it was a big factor).

Again, I use venvs. They're OK. But they're a stupid semi-abstraction.

https://xkcd.com/1987/

venvs try to economise on space by symlinking to the existing Python where possible... but of course there is a way to do a full copy:

    python -m venv --copies .myvenvdir
You can scp this, but the destination machine should be using the same version of Python and have whatever modules you're using from the system install in its own system install.
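
e.g. (host and paths made up; keeping the same absolute path on the destination also avoids breaking the hardcoded shebangs in .myvenvdir/bin):

    # copy the venv to the same absolute path, then sanity-check the interpreter
    scp -r /home/alice/project/.myvenvdir alice@staging:/home/alice/project/
    ssh alice@staging /home/alice/project/.myvenvdir/bin/python --version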

There's nothing nice about this, but it does protect you from a lot of issues where it might seem to work and then behave unexpectedly, e.g. if you had a different Python on the destination.

Docker doesn't just help with scripts - it also manages the problem of binary compatibility with C and C++ (and whatever) libraries. You may not have had this problem with C/C++ code, so you might imagine it's all about Python, but I can only say it's a misery that the C/C++ crowd have been suffering from for a very long time. How does one support 3 distros, each with a different version of libffmpeg installed, when you're making a media player? Sometimes there's a lot of "#if FFMPEG_MAJOR > 3" in the code to cope with it.

The distro developers build each package with a bunch of patches to adapt them to the distro they're in.

It's bad enough for the distro developers to curate all this when it's their job, but Python devs are living in a more minimally curated world, and in some applications, like ML, they are now dealing with wide and deep dependency trees that rival a distro's.

IMO perhaps "someone" should come up with some new distributions where they make everything work together.

From the programming-language side of things, the solution other languages use is to... just be fast. Java is fast enough, and so is C#. JavaScript/TypeScript can be fast-ish, Go is fast enough, Rust is definitely fast.

So a lot of other major ecosystems are just self-contained. All the "big" libraries are portable and written in the language itself, so they rarely plug into C/C++ (i.e. distribution/OS dependencies).

So Docker was basically built for slow programming languages and, I guess in a weird way, for C/C++, as you say? :-)))

If they were as easy to use as Python, everyone would use them, but they're not, because they're not as dynamic - that's the price to pay. https://pypy.org/ is the usual answer if you want speed with Python, but then you deal with glitches and libraries that don't all work, and only a few people care enough about speed to put up with that. https://www.graalvm.org/python/ is technically even more fancy, and by now it might also work most of the time.