This isn't a comment just about Python... but it should just work. There shouldn't be constant ceremony for getting and keeping environments running.

There are basically 0 other programming languages that use the "directory/shell integration activated virtual environment", outside of Python.

How does the rest of the world manage to survive without venvs? Config files in the directory. Shocking, really :-)))

> Config files in the directory.

The problem is, that would require support from the Python runtime itself (so that `sys.path` can be properly configured at startup) and it would have to be done in a way that doesn't degrade the experience for people who aren't using a proper "project" setup.
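For what it's worth, the closest thing the runtime reads today is the PYTHONPATH environment variable, but that's ambient shell state rather than a config file in the directory:

    $ PYTHONPATH=./vendor python script.py   # entries get added to sys.path at startup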

One of the big selling points of Python is that you can just create a .py file anywhere, willy-nilly, and execute the code with a Python interpreter, just as you would with e.g. a Bash script. And that you can incrementally build up from there, as you start out learning programming, to get a sense of importing files, and then creating meaningful "projects", and then thinking about packaging and distribution.

And how is that different from any other interpreted language? Node and PHP handle this just fine, and they don’t need a Rube Goldberg contraption to load dependencies from a relative directory or the system's library path. I really don’t get why Python people act like that’s some kind of wicked witchcraft?

Python's just working like a normal Unix program. Some people like that because they can reason about it the way they reason about any other utility, so it has advantages when using Python as a scripting language, which is what it was invented as. AI/ML/ASGI/blablahblah are just specific applications with problems that seem overwhelmingly important to their users.

Is this an AI-generated comment?

Node or PHP also work like normal Unix programs...

PHP and Node were not developed as general-purpose scripting languages for use at the command line, and they are very commonly used for specific purposes, so there's no need for them to be like Python.

I wonder what good you think insults do? I could insult your use of English for example but would that make my argument better?

Please do insult my use of English, it's a learning experience and it might be the same for others reading these comments.

> PHP and Node were not developed as general-purpose scripting languages for use at the command line, and they are very commonly used for specific purposes, so there's no need for them to be like Python.

Perl, Ruby, Lua, I can keep going. You're just nitpicking. Practically only Python uses the venv [1] approach, of all the programming languages I've used.

[1] Manual activation needed, a non-portable pile of dependencies per project (by design; fairly sure the documentation mentions this is not a supported use case, even across quasi-identical machines!!!), etc. I stand by my decision to call venvs a "fractal of bad design".

There are 5000 programs in my /usr/bin. They take environment settings from the current environment, which means you can set a variable on the command line for that process without finding that it is ineffectual because it has been overridden by some configuration file somewhere or by some weird "automatic" behaviour. "Clever" tools end up being a pain in the arse when you have to second-guess them and work around their behaviour.

As things are, I can share a venv with several projects, or have one for my account if I don't want to break the system tools. I can even write a trivial bash function to activate a venv automatically based on whether a venv exists in my project. It's so trivial to do and yet generates all this discussion.
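Something like this, say (a minimal sketch, assuming bash and a venv at ./.venv; deactivation on leaving the directory is left out):

    # in ~/.bashrc: wrap cd so a project venv activates on entry
    cd() {
        builtin cd "$@" || return
        if [ -f ./.venv/bin/activate ]; then
            source ./.venv/bin/activate
        fi
    }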

As for non-portability, that's the most specious and pathetic argument of the lot. Who can be bothered to make every library portable to every distro? What is the point of distros if one demands that one can run the same binaries everywhere? This is what containers were invented for. If you need the latest versions you might just, heaven forbid, have to compile something and fight with the problems of running it on a distro that it wasn't developed on.

I don’t understand what you’re talking about. Literally all common interpreted languages adhere to Unix standards: they take arguments, use environment variables, handle input and output streams, and load libraries.

Their single advantage over Python is that they work fine without virtual environments, as they just load libraries from a relative path: that way, you can copy-paste a project directory, move it to another system with a copy of the interpreter binary, and… run the software. There is nothing clever about that; I’d even say Python's way of magically configuring a shell just to be able to locate files is the "clever" solution that nobody asked for!
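To illustrate, this is roughly what the by-hand equivalent looks like in Python (a sketch; assumes the dependencies were vendored into ./vendor, e.g. via `pip install --target=vendor requests`):

    $ cat main.py
    import sys, pathlib
    # resolve imports from a folder that travels with the project directory
    sys.path.insert(0, str(pathlib.Path(__file__).resolve().parent / "vendor"))
    import requests  # found in ./vendor, so the whole directory can be copied as-is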

Portable as in "you can't copy a virtual env to an equivalent machine to deploy it". Look up the documentation; it's not recommended because of the hardcoded absolute paths or something like that :-|

Python venvs literally f*ed up the simplest form of deployment on the planet: scp. Yes, we have more complex solutions like Docker, another abomination (the software itself). Docker was invented in large part due to Python (not only, but it was a big factor).

Again, I use venvs. They're ok. But they're a stupid semi-abstraction.

https://xkcd.com/1987/

venvs try to economise on space by symlinking to the existing Python where possible... but of course there is a way to do a full copy:

    python -m venv --copies .myvenvdir
You can scp this, but the destination machine should be using the same version of Python and have whatever modules you're using from the system install in its own system install.

There's nothing nice about this, but it does protect you from a lot of issues where it might seem to work and then behave unexpectedly, e.g. if you had a different Python on the destination.

Docker doesn't just help with scripts - it also manages the problem of binary compatibility with C and C++ (and whatever) libraries. You may not have had this problem with C/C++ code, so you might imagine it's all about Python, but I can only say it's a misery that the C/C++ crowd have been suffering for a very long time. How does one support 3 distros, each with a different version of libffmpeg installed, when you're making a media player? Sometimes there's a lot of "#if FFMPEG_MAJOR > 3" in the code to cope with it.

The distro developers build each package with a bunch of patches to adapt them to the distro they're in.

It's bad enough for the distro developers to curate all this when it's their job, but Python devs are living in a more minimally curated world and, in some applications like ML, are now dealing with wide and deep dependency trees that rival a distro's.

IMO perhaps "someone" should come up with some new distributions where they make everything work together.

From the programming language side of things, the solution other languages use is to... just be fast. Java is fast enough; so is C#. JavaScript/TypeScript can be fast-ish, Go is fast enough, Rust is definitely fast.

So a lot of other major ecosystems are just self-contained. All the "big" libraries are portable and written in the language itself, so they rarely plug into C/C++ (aka distribution/OS dependencies).

So Docker was basically built for slow programming languages and, I guess in a weird way, for C/C++, as you say? :-)))

If they were as easy to use as Python, everyone would use them, but they're not, because they're not as dynamic; that's the price to pay. https://pypy.org/ is the usual answer if you want speed with Python, but then you deal with glitches and libraries that don't all work, and only a few people care enough about speed to put up with that. https://www.graalvm.org/python/ is technically even fancier, and by now it might also work most of the time.

There are path configuration files (*.pth) and you can configure sys.path in the script itself?

Yes, and in principle you can install each package into a separate folder (see the `--target` option for pip) and configure sys.path manually like that.
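A quick sketch of that approach (package and paths chosen just for illustration):

    $ pip install --target=./deps requests
    $ python -c "import sys; sys.path.insert(0, './deps'); import requests; print(requests.__version__)"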

For .pth files to work, they have to be in a place where the standard library `site` module will look. You can add your own logic to `sitecustomize.py` and/or `usercustomize.py`, but then you're really no better off than writing the sys.path manipulation logic yourself.
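For example (the path assumes a CPython 3.12 user install on Linux; the filename is arbitrary, and every non-import line naming an existing directory gets appended to sys.path):

    $ cat ~/.local/lib/python3.12/site-packages/extra_paths.pth
    /home/me/projects/shared-libs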

Many years ago, the virtual environment model was considered saner, for whatever reasons. (I've actually heard people cite performance considerations from having an overly long `sys.path`, but I really doubt that matters.) And it stuck.

The only word in the `source .venv/bin/activate` command that isn't a complete red flag that this was the wrong approach is probably `bin`. Everything else is so obviously wrong.

source - why are we using an OS-level command to activate a programming language's environment

.venv - why is this hidden anyway, doesn't that just make it more confusing for people coming to the language

activate - why is this the most generic name possible as if no other element in a system might need to be called the activate command over something as far down the chain as a python environment

Feels dirty every time I've had to type it out, and I find it particularly annoying when Python is pushed so much as a good first language and I see people paid at a senior level not understand this command.

> why are we using an OS level command to activate a programming language's environment

Because "activating an environment" means setting environment variables in the parent process (the shell that you use to run the command), which is otherwise impossible on Linux (see for example https://stackoverflow.com/questions/6943208).

> why is this hidden anyway, doesn't that just make it more confusing for people coming to the language

It doesn't have to be. You can call it anything you want, hidden or not, and you can put it anywhere in the filesystem. It so happens that many people adopted this convention because they liked having the venv in that location and hidden; and uv gives such venvs special handling (discovering and using them by default).

> why is this the most generic name possible as if no other element in a system might need to be called the activate command over something as far down the chain as a python environment

Because the entire point is that, when you need to activate the environment, the folder in question is not on the path (the purpose of the script is to put it on the path!).

If activating virtual environments shadows e.g. /usr/bin/activate on your system (because the added path will be earlier in $PATH), you can still access that with a full absolute path; or you can forgo activation and do things like `.venv/bin/python -m foo`, `.venv/bin/my-program-wrapper`, etc.

> Feels dirty every time I've had to type it out

I use this:

  $ type activate-local 
  activate-local is aliased to `source .local/.venv/bin/activate'
Notice that, again, you don't have to put it at .venv. I use a .local folder to store notes that I don't want to publish in my repo nor mention in my project's .gitignore; it in turn has

  $ cat .local/.gitignore 
  # Anything found in this subdirectory will be ignored by Git.
  # This is a convenient place to put unversioned files relevant to your
  # working copy, without leaving any trace in the commit history.
  *
> and I see people paid at a senior level not understand this command.

If you know anyone who's hiring....

Fair response. It's just that nothing else feels like this weird duct-taped-together bunch of hacks to work around the base language's design mistake of assuming it's a top-level part of the OS.

> which is otherwise impossible on Linux

Node, Rust, etc all manage it.

> Because the entire point is that...

I just mean there is a history of Python using overly generic naming: activate, easy_install. It just feels weird and dirty to me that you'd give such specific things names like these, and I think it's indicative of this ideology that Python is deep in the OS.

Maybe if I'd aliased the activate command a decade ago I wouldn't feel this way or think about it.

> Node, Rust, etc all manage it.

  $ (bash -c 'export foo=bar && echo $foo')
  bar
  $ echo $foo

  $
How do they work around this?

You're arguing an awful lot in favor of Python venvs for someone who doesn't really seem to know any other programming language ecosystems in depth.

Similar mindset to the original creators of venv, I imagine :-)

Or you don't realize the difference between something like "cargo run" and "python file.py".

They don’t use environment variables. See also git.

... And as I explained repeatedly in multiple other posts throughout the thread, you can also use virtual environments without activating them, in which case you are similarly not using environment variables.

The git model is based on automatic detection of the .git folder, by having it in a location determined by convention. Many higher-level tools in the Python ecosystem have provided analogous handling of virtual environments over the years. Uv is doing it now; historically pyenv was used for that sort of thing, for example.
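The discovery half is simple enough to sketch (a hypothetical helper, not any particular tool's actual code):

    # walk up from the current directory looking for a .venv, git-style
    find_venv() {
        local d="$PWD"
        while [ "$d" != "/" ]; do
            [ -x "$d/.venv/bin/python" ] && { printf '%s\n' "$d/.venv"; return 0; }
            d="$(dirname "$d")"
        done
        return 1
    }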

But when I said "which is otherwise impossible on Linux", I was specifically referring to the setting of environment variables, because OP asked why an activation script had to be sourced, and the reason is because that's what "activation" is.

This is a model that enough people liked using, many years ago, for it to become dominant. It creates the abstraction of being "in" the virtual environment, while giving you the flexibility to put the actual file tree wherever you want.

Maybe it's just me, but it shouldn't be necessary to manage this and a few other things to get a python script working.

uv has increased my usage of python for production purposes because it's maintainable by a larger group of people, and beginners can become competent that much quicker.

One could say... why do people not bother to learn the shell, or how programs get environment settings, or how to write a shell function to run activate for themselves, or how to create a tiny Makefile which would do all of this for them?

Surely the effort of programming the actual code is so significant that starting a tool is a minor issue?

Because the effort of programming the actual code is often less than figuring out how to glue together a dozen different inane tools with inexplicably different usage conventions. KISS always matters, and uv keeps it very simple. Small is often the opposite of simple.

I think this happens because in the ML world particularly, there's an unusually pressured situation.

Why are people not using the system Python? Perhaps it's too old, or not old enough, for some library that they have to use. This suggests there's a lot of change going on at the moment and it's not all synced up. I also suspect that people are using a very great number of different modules that change incompatibly all the time, and on top of that they need binary libraries of various kinds which are quite difficult to build and have their own dependencies that Python cannot install itself.

Rust has the advantage that they can build a world more completely out of rust and not worry as much about what's on the system already.

I'm glad uv is helping people.

> Why are people not using the system Python? Perhaps it's too old, or not old enough, for some library that they have to use.

If you're on a "stable" distro like Debian or Ubuntu LTS, that can be somewhere around 5 years old at the end of the stability period. And your system probably depends on its Python, so if you need a newer version of a library than the system's package manager provides you can't update it without risking breaking the system. Python itself has added several very nice new features in the last few versions, so anyone stuck on Ubuntu 22.04 LTS with Python 3.10 can't use newer Python features or libraries through their system's package manager.

I get it, but one can "just use Artix" or Arch or one of the other rolling distros where everything is always new. The only reason for being wedded to Ubuntu IS long-term compatibility and the feeling that Ubuntu will fix security bugs but otherwise keep things "old".

I'm comfortable in the shell.

I also value my time and life and some degree of standardization.

A language grows on its ability to create beginners, not to make the people who have learned it the hard way feel special at the expense of others.

Feeling special isn't important. What's important is the thought of something getting altered to fit a special use case which excludes yours... all because the people doing it don't know what the alternatives are.

Fortunately uv got written and we don't have a problem. I don't have to use it but I can when I want to.

The venv thing def stands out to me as being a bit of an outlier.

If uv makes it invisible it is a step forward.

By requiring special tooling to build and run your program, namely cargo, npm, etc., while in Python these are a bit more separate concerns.

what happens when you have two projects using different versions of node, etc? isn't that a massive headache?

not that it's great to start with, but it does happen, no?

You create a .node-version file and use fnm or nvm, and presto, when you cd into a project dir, the corresponding node version is activated.

Installing a particular node version also becomes as easy as

    fnm install 24

The rest of the world handles that through PATH or a PATH equivalent.

Either the package manager is invoked with a different PATH (one that contains the desired Node/Java/whatever version as a higher priority item than any other version on the system).

Or the package manager itself has some way to figure that out through its config file.

Or there is a package manager launch tool, just like pyenv or whatever, which does that for you.

In practice it's not that big of a deal, even for Maven, a tool created 21 years ago. As the average software dev, you figure that stuff out a few weeks into using the tool; maybe you get burnt a few times early on for misconfiguring it, and then you're on autopilot for the rest of your career.
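The first variant, for instance, is a one-liner (install location hypothetical):

    $ PATH="$HOME/.jdks/temurin-21/bin:$PATH" java -version   # this invocation resolves JDK 21 first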

Wait till you hear about Java's CLASSPATH and the idea of having a SINGLE, UNIFIED package dependency repo on your system, with no need for per-project dependency repos (node_modules), symlinks, or all of that stupidity.

CLASSPATH was introduced by Java in 1996, I think, and popularized for Java dependency management in 2004.
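With Maven, for instance, every project on the machine resolves dependencies out of one shared local repository (versions shown are just illustrative):

    $ ls ~/.m2/repository/org/slf4j/slf4j-api/
    1.7.36  2.0.13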

> The rest of the world handles that through PATH or a PATH equivalent.

Activating a venv is just setting a few environment variables, including PATH, and storing the old values so that you can put them back to deactivate the environment.

It's also putting dependencies in a certain folder in a setup that isn't even portable between machines. Probably some other stuff I'm forgetting.

Well, that is how Python does it as well; a venv is a script setting the PYTHONPATH.

This is wrong. A venv is a hierarchy of folders containing a config file, as well as some symlinks to Python (or stub executables on Windows). "How Python does it" is that when it starts up from the symlink, it looks for the config file at a relative path; if that checks out, it uses hard-coded rules to add another relative path to sys.path. PYTHONPATH is never involved.

The venvs created by the standard library `venv`, as well as by uv (and by the third-party `virtualenv` that formed the original basis for `venv`), also happen to include "activation" scripts that manipulate some environment variables. PYTHONPATH is not among these. The script manipulates PATH, so that the venv's symlink is on the path, and it may unset PYTHONHOME.
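You can look at the config file in question yourself; the exact keys vary by Python version and by which tool created the venv, but `python -m venv` on a typical Linux box writes something like:

    $ cat .venv/pyvenv.cfg
    home = /usr/bin
    include-system-site-packages = false
    version = 3.12.3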

Node.js does, if you use fnm or nvm.

It does, Python has essentially solved it for years.

lol no it hasn't

Why else is this discussion getting hundreds of comments?

For any random python tool out there, I had about a 60% chance it would work out of the box. uv is the first tool in the python ecosystem that has brought that number basically to 100%. Ironically, it's written in Rust because python does not lend itself well to distributing reliable, fast tools to end users.

It comes up again and again, because people don't realize that solutions already exist(ed) and they don't spend the time to figure things out.

I have managed reproducible Python services and software for multiple years now. This was already solved before uv, although uv does it faster and maybe offers a bit more comfort; I abstract away such tooling using a simple Makefile anyway.

The reason you are having such a bad time getting random Python projects to work out of the box is that the people creating them did not spend the effort to make them reproducible, meaning that they do not ensure the setup has the same versions and checksums of every direct and transitive dependency. This can be facilitated using various tools these days: poetry, uv, and I am sure there are more. People are just clueless and think that a requirements.txt file with a few loose versions slung in is sufficient. It is not, and you end up with non-working project setups like the ones you refer to.
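For example, one way to get fully pinned, checksummed installs is pip-tools (uv has the equivalent `uv pip compile`):

    $ echo "requests" > requirements.in
    $ pip-compile --generate-hashes requirements.in    # pins every transitive dep, with sha256 hashes
    $ pip install --require-hashes -r requirements.txt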

> For any random python tool out there, I had about a 60% chance it would work out of the box.

Had, past tense, because of the metadata situation and the lack of pre-built wheels. The ecosystem has moved on.

> uv is the first tool in the python ecosystem that has brought that number basically to 100%.

Show me a Python tool that you can install and have work out-of-box with uv, but cannot install and have work out-of-box with pip.

> Ironically, it's written in Rust because python does not lend itself well to distributing reliable, fast tools to end users.

I have repeatedly shown that the slowness of pip is overwhelmingly due to its terrible (organically developed on top of legacy cruft from an era where people simply didn't have the same requirements) architecture, not due to being written in Python. Most of the work of installation is simply not CPU-bound — why would it be? — and the main task that is (optional pre-compilation of Python source to .pyc) is one of the few things where uv is dependent on the Python runtime (which, in turn, will do the work in C).

The language (actually the standard implementation's build system) has. The problem is that programs and installations don't use it.

Big gap between essentially solving it and actually solving it.

The pieces actually all existed for sure.

I can even write scripts to make it all happen.

But uv remains a long-overdue universal command-line tool for Python.