The sticking point for me is the way tools like uv and poetry build everything around the idea of a "project". I don't want a separate environment for every project, and I don't want to start by creating a project. I want to start with an environment that has stuff in it, and I start fiddling around, and gradually something comes together that eventually will be pulled out into a separate project. From what I can see uv doesn't make this easy.

This was always my issue with pip and venv: I don’t want a thing that hijacks my terminal and PATH, flips my world upside down and makes writing automated headless scripts and systemd services a huge pain.

When I drop into a Node.js project, usually some things have changed, but I always know that if I need to, I can find all of my dependencies in my node_modules folder, and I can package up that folder and move it wherever I need to without breaking anything, needing to reset my PATH or needing to call `source` inside a Dockerfile (oh lord). Many people complain about Node and npm, but as someone who works on a million things, Node/npm is never something I need to think about.

Python/pip though… Every time I need to containerize or set up a Python project for some arbitrary task, there’s always an issue with “Your Linux distro doesn’t support that version of Python anymore”, forcing me to use a newer version than the project wants and triggering an avalanche of new “you really shouldn’t install packages globally” messages, demanding new --yes-destroy-my-computer-dangerously-and-step-on-my-face-daddy flags and crashing my automated scripts from last year.

And then there’s Conda, which has all of these problems and is also closed source (I think?) and has a EULA, which makes it an even bigger pain to automate cleanly. (And yes, I know about mamba and miniconda, but the default tool everyone uses should be the one that’s easy to work with.)

And yes, I know that if I was a full-time Python dev there’s a “better way” that I’d know about. But I think a desirable quality for languages/ecosystems is the ability for an outsider to drop in with general Linux/Docker knowledge and be able to package things up in a sometimes unusual way. And until uv, Python absolutely failed in this regard.

Having a directory like node_modules containing the dependencies is such an obviously good choice, it's sad how the Python steering council actively resists this with what I find to be odd arguments.

I think a lot of the decades old farce of Python package management would have been solved by this.

https://peps.python.org/pep-0582/

https://discuss.python.org/t/pep-582-python-local-packages-d...

It's literally what a venv does and it is very widespread to just make a venv per project, just like you are creating a node_modules per project.

This is not a good idea: this leads to longer build times and/or invalid builds (you build against different dependencies than declared in config).

Having a dependency cache and a build tool that knows where to look for it is a much superior solution.

(p)npm manages both fine with the dependency directory structure.

This is literally not possible.

If you have a local dependency repo and a dependency manifest, then during the build you can either:

1. Check if local repo is in sync - correct build, takes more time

2. Skip the check - risky build, but fast

If the dependencies are only in the cache directory, you can have both - correct and fast builds.

I don't follow. In pnpm there's a global cache at ~/.pnpm with versioned packages, and node_modules has symlinks to those. Dependencies are defined in package.json; transitive dependencies are versioned and SHA512-hashed in pnpm-lock.yaml.

E.g.

  $ ls -l ./node_modules/better-sqlite3
  ... node_modules/better-sqlite3 -> .pnpm/better-sqlite3@12.4.1/node_modules/better-sqlite3

You still need to have those symlinks checked. For example, if you switch to a branch with an updated package.json, you either need to check the symlinks or you risk an incorrect build.

Introducing a directory that needs to stay in sync with the dependency manifest will always lead to such problems. It is good that Python developers do not want to repeat that mistake.

Just run `pnpm install` after switching the branch. Or add `pnpm install` into the build step. And many build tools will do that automatically. If the deps are in sync with the manifest, that typically takes less than a second.

This is a problem I've never encountered in practice. And it's not like you don't have to update the dependencies in Python if they are different per-branch.

> This was always my issue with pip and venv: I don’t want a thing that hijacks my terminal and PATH ...

What's the "this" that is supposedly always your issue? Your comment is phrased as if you're agreeing with the parent comment but I think you actually have totally different requirements.

The parent comment wants a way to have Python packages on their computer that persist across projects, or don't even have a notion of projects. venv is ideal for that. You can make some "main" venv in your user directory, or a few different venvs (e.g. one for deep learning, one for GUIs, etc.), or however you like to organise it. Before making or running a script, you can activate whichever one you prefer and do exactly like parent commenter requested - make use of already-installed packages, or install new ones (just pip install) and they'll persist for other work. You can even switch back and forth between your venvs for the same script. Totally slapdash, because there's no formal record of which scripts need which packages but also no ceremony to making new code.
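For what it's worth, that slapdash multi-venv setup is just a handful of commands (the names, paths, packages and scratch.py below are all made up):

  # A couple of long-lived, non-project venvs
  python3 -m venv ~/venvs/main
  python3 -m venv ~/venvs/deeplearning
  # Activate whichever toolbox you want before hacking; installed packages persist across scripts
  source ~/venvs/main/bin/activate
  pip install requests numpy
  python scratch.py
  deactivate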

Whereas your requirements seem to be very project-based - that sounds to me like exactly the opposite point of view. Maybe I misunderstood you?

    > Python/pip though… Every time I need to containerize or set up a Python project for some arbitrary task, there’s always an issue with “Your Linux distro doesn’t support that version of Python anymore” [...]
How are you containerizing Python projects? What confuses me about your statement are the following things:

(1) How old must the Python version of those projects be, to not be supported any longer with any decent GNU/Linux distribution?

(2) Are you not using official Python docker images?

(3) What's pip gotta do with a Python version being supported?

(4) How does that "Your Linux distro doesn’t support that version of Python anymore" show itself? Is that a literal error message you are seeing?

    > [...] demanding new --yes-destroy-my-computer-dangerously-and-step-on-my-face-daddy flags and crashing my automated scripts from last year
It seems you are talking about installing things in system Python, which you shouldn't do. More questions:

(1) Why are you not using virtual environments?

(2) You are claiming Node.js projects to be better in this regard, but actually they are just creating a `node_modules` folder. Why then is it a problem for you to create a virtual environment folder? Is it merely that one is automatic and the other isn't?

    > This was always my issue with pip and venv: I don’t want a thing that hijacks my terminal and PATH, flips my world upside down and makes writing automated headless scripts and systemd services a huge pain.
It is very easy to activate a venv just for one command. Use a subshell, where you `. venv/bin/activate && python ...(your program invocation here)...`. Aside from that, projects can be set up so that you don't even see that they are using a venv. For example I usually create a Makefile that does the venv activating and running and all that for me. Rarely, if ever, do I have to activate it manually. Since each line in a Makefile target is running in its own shell, nothing ever pollutes my actual top level shell.
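For example, either of these keeps the venv out of the top-level shell entirely (my_script.py is a made-up name):

  # Run one command inside the venv via a subshell; nothing leaks into the parent shell
  (. venv/bin/activate && python my_script.py)
  # Or skip activation altogether and call the venv's interpreter by its path
  venv/bin/python my_script.py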

> (1) How old must the Python version of those projects be, to not be supported any longer with any decent GNU/Linux distribution?

Debian-13 defaults to Python-3.13. Between Python-3.12 and Python-3.13 the support for `pkg_config` got dropped, so pip projects like

https://pypi.org/project/remt/

break. What I was not aware of: `venv`s need to be created with the version of python they are supposed to be run with. So you need to have a downgraded Python executable first.

> What I was not aware of: `venv`s need to be created with the version of python they are supposed to be run with. So you need to have a downgraded Python executable first.

This is one of uv’s selling points. It will download the correct python version automatically, and create the venv using it, and ensure that venv has your dependencies installed, and ensure that venv is active whenever you run your code. I’ve also been bit by the issue you’re describing many times before, and previously had to use a mix of tools (eg pyenv + pipenv). Now uv does it all, and much better than any previous solution.
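A rough sketch of what that looks like, assuming a requirements.txt and a project that wants 3.12:

  # uv fetches the requested interpreter if the system doesn't have it, then builds the venv from it
  uv python install 3.12
  uv venv --python 3.12
  uv pip install -r requirements.txt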

> (2) Are you not using official Python docker images?

Would you help me make it work?

  docker run -it --rm -v$(pwd):/venv --entrypoint python python:3.12-alpine -m venv /venv/remt-docker-venv
How do I source it?

  cd remt-docker-venv/
  source bin/activate
  python --version
  bash: python: command not found

Instead of "python --version", just use the "python" executable from within the venv. Sourcing is a concept for interactive shells.

The python executable from the venv is not going to work inside the container, as it's a symlink by default. That venv was built on their host OS, and the symlink points at a Python binary that doesn't exist inside the container.

You could also pass the `--copies` parameter when creating the initial venv, so it's a copy and not symlinks, but that is not going to work if you're on macOS or Windows (because the binary platform is different to the Linux that's running the container), or if your development Python is built with different library versions than the container you're starting.

I'd recommend re-creating the virtual environment inside the Docker container.

The problem is you are mounting a virtual environment you have built in your development environment into a Docker container. Inside your virtual environment there's a `python` binary that in reality is a symlink to the python binary in your OS:

  cd .venv
  ls -l bin/python
  lrwxr-xr-x@ 1 myuser  staff  85 Oct 29 13:13 bin/python -> /Users/myuser/.local/share/uv/python/cpython-3.13.5-macos-aarch64-none/bin/python3.13
So, when you mount that virtual environment in a container, it won't find the path to the python binary.

The most basic fix would be recreating the virtual environment inside the container, so from your project (approximately, I don't know the structure):

   docker run -it --rm -v$(pwd):/app --entrypoint ash ghcr.io/astral-sh/uv:python3.12-alpine
  / # cd /app
  /app # uv pip install --system -r requirements.txt
  Using Python 3.12.12 environment at: /usr/local
  Resolved 23 packages in 97ms
  Prepared 23 packages in 975ms
  Installed 23 packages in 7ms
  [...]
  /app # python
  Python 3.12.12 (main, Oct  9 2025, 22:34:22) [GCC 14.2.0] on linux
  Type "help", "copyright", "credits" or "license" for more information.
But, if you're developing and don't wanna build the virtual environment each time you start the container, you could create a cache volume for uv, and after the first time installation, everything is going to be way faster:

  # First run
   docker run -ti --rm --volume .:/app --volume uvcache:/uvcache -e UV_CACHE_DIR="/uvcache" -e UV_LINK_MODE="copy" --entrypoint ash ghcr.io/astral-sh/uv:python3.12-alpine
  / # cd /app
  /app # uv pip install -r requirements.txt --system
  Using Python 3.12.12 environment at: /usr/local
  Resolved 23 packages in 103ms
  Prepared 23 packages in 968ms
  Installed 23 packages in 16ms
  [...]
  # Second run
   docker run -ti --rm --volume .:/app --volume uvcache:/uvcache -e UV_CACHE_DIR="/uvcache" -e UV_LINK_MODE="copy" --entrypoint ash ghcr.io/astral-sh/uv:python3.12-alpine
  / # cd /app
  /app # uv pip install -r requirements.txt --system
  Using Python 3.12.12 environment at: /usr/local
  Resolved 23 packages in 10ms
  Installed 23 packages in 21ms
You can also see some other examples, including a Docker Compose one that automatically updates your packages, here:

https://docs.astral.sh/uv/guides/integration/docker/#develop...

---

Edit notes:

  - UV_LINK_MODE="copy" is to avoid a warning when using the cache volume
  - Creating the venv with `--copies` and mounting it into the container would fail 
    if your host OS is not exactly the same as the containers, and also defeats in a 
    way the use of a versioned Python container

> demanding new --yes-destroy-my-computer-dangerously-and-step-on-my-face-daddy flags and crashing my automated scripts from last year.

Literally my case. I recently had to compile an abandoned six-year-old scientific package written in C with Python bindings. I wasn’t aware that modern versions of pip handle builds differently than they did six years ago: specifically, pip now builds wheels within an isolated environment. I was surprised to see a message indicating that %package_name% was not installed, yet I was still able to import it. By the second day, I eventually discovered the --no-build-isolation option of pip.
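Roughly what that ends up looking like; the package path and the build deps here are illustrative (your package will want its own):

  # Install the build dependencies into the current environment yourself...
  pip install setuptools wheel cython
  # ...then tell pip to build against them instead of an isolated build environment
  pip install --no-build-isolation ./old-c-extension-package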

As for not having to call 'source ...' in a Dockerfile: if you use the python executable from the virtualenv directly, it will be as if you've activated that virtualenv.

This works because of the relative path to the pyvenv.cfg file.

The way to activate a virtual environment in a docker container is to export a modified PATH and possibly change PYTHONHOME.
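In shell terms that's roughly the following (paths are illustrative; in a Dockerfile the same thing is an ENV instruction):

  # "Activates" the venv for everything that follows, no `source` needed
  export VIRTUAL_ENV=/app/.venv
  export PATH="/app/.venv/bin:$PATH"
  python -c "import sys; print(sys.prefix)"   # now reports /app/.venv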

I think my ultimate problem with venv is that virtual environments are solved by Docker. Sure sure, full time Python devs need a way to manage multiple Python and package versions on their machine and that’s fine. But whatever they need has to not get in my way when I come in to do DevOps stuff. If my project needs a specific version of Node, I don’t need nvm or n, I just install the version I want in my Dockerfile. Same with Go, same with most languages I use.

Python sticks out for having the arrogance to think that it’s special, that “if you’re using Python you don’t need Docker, we already solved that problem with venv and conda”. And like, that’s cute and all, but I frequently need to package Python code and code in another language into one environment, and the fact that their choice for “containerizing” things (venv/conda) plays rudely with every other language’s choice (Docker) is really annoying.

Then use a Docker container that has the right Python version already? There are official containers for that.

If that's not good enough for you, you could do some devops stuff and build a docker container in which you compile Python.

I don't see how it is different from some npm project. You just need to use the available resources correctly.

I don't understand why you can't just install Python in your container. How does venv make it hard?

> I don’t want a thing that hijacks my terminal and PATH, flips my world upside down and makes writing automated headless scripts and systemd services a huge pain.

pip and venv are not such things. The activation script is completely unnecessary, and provided as a convenience for those to whom that workflow makes more sense.

> Every time I need to containerize or set up a Python project for some arbitrary task, there’s always an issue with “Your Linux distro doesn’t support that version of Python anymore”

I can't fathom why. First off, surely your container image can just pin an older version of the distro? Second, right now I have Python versions 3.3 through 3.14 inclusive built from source on a very not-special consumer Linux distro, and 2.7 as well.

> and triggering an avalanche of new “you really shouldn’t install packages globally” messages, demanding new --yes-destroy-my-computer-dangerously-and-step-on-my-face-daddy flags and crashing my automated scripts from last year.

Literally all you need to do is make one virtual environment and install everything there, which again can use direct paths to pip and python without sourcing anything or worrying about environment variables. Oh, and fix your automated scripts so that they'll do the right thing next time.

> I know that if I was a full-time Python dev there’s a “better way” that I’d know about.

Or, when you get the "you really shouldn't install packages globally" message, you could read it — as it gives you detailed instructions about what to do, including pointing you at the documentation (https://peps.python.org/pep-0668/) for the policy change. Or do a minimum of research. You found out that venvs were a thing; search queries like "python venv best practices" or "python why do I need a venv" or "python pep 668 motivation" or "python why activate virtual environment" give lots of useful information.

> I don’t want a thing that hijacks my terminal and PATH

The shame is ... it never had to be that way. A venv is just a directory with a pyvenv.cfg, symlinks to an interpreter in bin, and a site-packages directory in lib. Running anything with venv/bin/python _is_ running in the virtual environment. Pip operations in the venv are just venv/bin/python -m pip ... . All the source/deactivate/shell nonsense obfuscating that reality did a disservice to a generation of python programmers.
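In other words, something like this (package choice is arbitrary):

  python3 -m venv demo-env
  demo-env/bin/python -m pip install requests         # no activation anywhere
  demo-env/bin/python -c "import requests; print(requests.__version__)"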

> The shame is ... it never had to be that way.

It isn't that way. Nothing is preventing you from running the venv's python executable directly.

But the original designer of the concept appears to have thought that activation was a useful abstraction. Setting environment variables certainly does a lot to create the feeling of being "in" the virtual environment.

Conda is open source. Not sure what you mean about an EULA. There are some license agreements if you use Anaconda, but if you just use conda-forge you don't have any entanglements with Anaconda the company. (I agree the nomenclature is confusing.)

I… I’m sorry to hear that. Wow. That is shockingly bad.

Seriously, this is why we have trademarks. If Anaconda and Conda (a made-up word that only makes sense as a nickname for Anaconda and thus sounds like it’s the same thing) are two projects by different entities, then whoever came second needs to change their name, and whoever came first should sue them to force them. Footguns like this should not be allowed to exist.

It's not like they're entirely separate and unrelated things. Anaconda is a company that created a program called Conda which can connect to various "channels" to get packages, and initially the main one was the Anaconda channel. Conda was open source but initially its development was all done by Anaconda. Gradually the Conda program was separated out and development was taken over by a community team. Also there is now conda-forge which is a community-run channel that you can use instead of the Anaconda one. And then there is also Mamba which is basically a faster implementation of Conda. That's why there's the awkward naming. It's not like there are competing groups with similar names, it's just things that started off being named similarly because they were built at one company, but gradually the pieces got separated off and moved to community maintenance.

Next to Anaconda/conda/mamba, you forgot micromamba.

Anaconda suddenly increased its licensing fees, much like Broadcom did with VMware; many companies stopped using it because of the sudden increase in costs.

https://blog.fulcrumgenomics.com/p/anaconda-licensing-change... https://www.theregister.com/2024/08/08/anaconda_puts_the_squ...

Conda was made by Anaconda, so there's no one to sue. It's like Chromium vs. Chrome.

What you describe is, I think, what most other people hate the most about Python: the fact that everything pollutes the global environment, which then becomes a mess of things depending on various versions, which also ends up breaking tools included in the OS, and suddenly your whole system is effed.

Exactly, and this is why we use virtual environments. Which then get dumped on by other people. People can't have it both ways, but they already have all the choices in Python's ecosystem. They just complain, because they are not aware how they can do it their way and didn't invest the time to nail down the workflow they like, or their workflow idea is self-contradictory.

> The fact that everything pollutes the global environment

This is not anything like a fact. For three years now (since the 3.11 release) Python distributions on Linux have in fact taken special measures to prevent the user from using tools other than the system package manager to install into the global environment. And for thirteen years (since the 3.3 release) Python has offered standard library functionality to create isolated environments specifically to avoid that problem. (And that functionality is based on a third party library with public releases going back eighteen years.)

Pip is designed around giving you the freedom to choose where those environments are (by separately creating them) and your strategy for maintaining them (from a separate environment per-project as a dev, to a single-environment-still-isolated-from-the-system for everything as a data scientist, and everything in between).

I've never quite forgiven Python for the time I tried to update it using yum; it failed, which then broke yum - not a fun place to end up.

Treating python as a project level dependency rather than a system level dependency is just an excellent design choice.

> Treating python as a project level dependency rather than a system level dependency

Nobody is treating Python as a project level dependency. Your Linux distro treats it as a system level dependency, which is exactly why you encountered the problem you did.

When you create a virtual environment, that does not install a Python version. It just makes symlinks to a base Python.

Building Python from source, and setting it up in a way that doesn't interfere with the package manager's space and will cause no problems, is easy on major distros. I have access to over a dozen builds right now, on Mint which is not exactly a "power user" distro (I didn't want to think about it too much when I initially switched from Windows).
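A sketch of such a build, with the version and prefix as examples only:

  # Build CPython into a private prefix so it never touches the distro's files
  wget https://www.python.org/ftp/python/3.12.8/Python-3.12.8.tgz
  tar xf Python-3.12.8.tgz && cd Python-3.12.8
  ./configure --prefix="$HOME/.local/python-3.12" && make -j"$(nproc)" && make install
  # Use it directly, or as the base interpreter for venvs
  "$HOME/.local/python-3.12/bin/python3.12" -m venv ~/venvs/py312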

System package management is a mess in the first place: if you have a program that uses Python, then all the packages it uses need to be installed globally, so you have Python packages bundled as system packages, which can conflict with that same package installed with pip.

> if you have a program that uses python then all the packages that it uses need to be installed globally

Only if that "program that uses Python" is itself provided by a system package for global installation.

> so you have python packages bundled as system packages which can conflict with that same package installed with pip.

Right; you can choose whether to use system packages entirely within the system-package ecosystem (and treat "it's written in Python" as an irrelevant implementation detail); or you can create an isolated environment so as to use Python's native ecosystem, native packaging, and native tools.

I don't know why anyone would expect to mingle the two without problems. Do you do that for other languages? When I tried out Ruby, its "Bundler" and "gem" system was similarly isolated from the system environment.

My sentiment coming to Python after getting used to the DX of Node

https://uploads.dailydot.com/2024/04/damn-bitch-you-live-lik...

My exact reaction when I visit front-end-land. It always surprises me that you can't just leave a project alone for some time, get back to it and continue where you left off. There are always a couple of days of getting the thing to work again, then figuring out how everyone is doing things now, because apparently that has to change every few months.

While I don’t do frontend and don’t want to defend that mess, I think that locking a specific version for each dependency would solve this

Uv allows you to lock versions too. And creates a .venv inside the project which, I guess, is similar to the node_modules directory.

For sure, I was referring to the state of python before uv. Specifically projects not being self-contained, easily portable, etc

You could already do these things before, you just spent much more time twiddling your thumbs waiting for lock files to be resolved

I'm not talking about the global environment. I want to be able to have whatever environments I want. But my point is that the environments are not in a 1-1 relationship with "projects".

This is easier to do with uv than it is with pip.

You can create venvs wherever you please and then just install stuff into them. Nobody forces the project onto you; at work we don't even use the .toml yet because it's relatively new, we still use a python_requirements.txt and install into a venv that is global to the system.

Way to go to create non-reproducible results. Basically, nothing production-ready is rolling out of that one.

This is what the OP wanted. I don't agree with doing it this way.

At work for us we use uv pip freeze to generate a more strict requirements file.

What I was referring to is:

> [...] at work we don't even use the .toml yet because it's relatively new, we still use a python_requirements.txt and install into a venv that is global to the system.

Unless your `python_requirements.txt` also carries checksums, like uv's lock files or poetry's lock files, that is. Though of course things get spicy and non-reproducible again if you then have multiple projects/services, each with their own `python_requirements.txt`, all installing into that same global venv ...

You can't just create yourself an "everything" environment with UV and then experiment with it? Honest question.

I think you're basically suggesting that you'd have a VM or something that has system-wide packages already preinstalled and then use UV on top of it?

If so, it's certainly not obvious. I mean look at the docs: https://docs.astral.sh/uv/getting-started/features/

I don't see anything resembling "environments" in the list of features or in the table of contents. In some sections there is stuff like "When working on a project with uv, uv will create a virtual environment as needed", but it's all about environments as tied to particular projects (and maybe tools).

You can use the `uv venv` and the `uv pip` stuff to create an environment and install stuff into it, but this isn't really different from normal venvs. And in particular it doesn't give me much benefit over conda/mamba.

I get that the project-based workflow is what a lot of people want, and I might even want it sometimes, but I don't want to be forced into foregrounding the project.

> I don't want to be forced into foregrounding the project.

The advantage of being forced to do this is other people (including yourself on a new laptop) can clone your project, run uv install and get working. It's the death of "works on my machine" and "well it'll take them a couple of weeks to properly get up and running".

> The advantage of being forced to do this is other people (including yourself on a new laptop) can clone your project

I know this might be a strange idea on HN, but tons of people writing code in Python, who need access to PyPI packages to do what they're doing, have no intention whatsoever of providing a clonable project to others, or sharing it in any other way, or indeed expecting anyone else on the planet to run the code.

Sure, that's fine. I'm saying I just won't get to that stage until later.

A "project" is largely a way of storing the definition of the state of an environment in a config file (pyproject.toml). It does a bit more, but you can basically ignore that initially.

It takes a couple of seconds to set up, and then you just use uv add instead of (uv) pip install to add things to the environment, and the project file is kept in sync with the state of the environment. What is it, for the workflow you describe, that you expect a tool to do that uv isn’t providing?

Just professional software engineering practices.

> And in particular it doesn't give me much benefit over conda/mamba.

How about the advantage of not taking an entire lunch break to resolve the environment every time you go to install a new library?

That was the biggest sticking point with conda/mamba for me. It's been a few years since I last used them but in particular with geospatial packages I would often run into issues.

This is not the case anymore. With libmamba I've never waited more than a few minutes for resolving dependencies and installing. uv is faster for sure, but speed is no longer a big problem with conda

libmamba solved this years ago. The dep solver is now much faster.

[deleted]

I assume if you stick a .venv in your home directory it'll essentially be available globally as you desired... and overridden by a project-specific .venv or similar

I think that user (like me until finding out just now) didn't know that you could make ad-hoc virtual environments with uv, and thought that instead you would have to make a directory with a pyproject.toml and install things into it by adding to that pyproject.toml

You can, but an "everything environment" is rarely recommended. How to do it: make uv use the --active environment to install things. Before using uv, activate that environment.

I have the same feeling, so I just looked it up and actually uv does support exactly that mode. It works the same as venv and pip but you just prefix a bunch of commands with "uv". Create a new virtual environment fooenv:

   uv venv fooenv
Activate virtual environment on Windows (yes I'm sorry that's what I'm currently typing on!):

   .\fooenv\Scripts\activate
Run some environment commands:

   uv pip install numpy
   uv pip freeze
   uv pip uninstall numpy
If you run python now, it will be in this virtual environment. You can deactivate it with:

   deactivate
Admittedly, there doesn't seem to be much benefit compared to traditional venv/pip for this use case.

This is covered in the section of the docs titled "The pip interface": https://docs.astral.sh/uv/pip/

> Admittedly, there doesn't seem to be much benefit compared to traditional venv/pip for this use case.

Performance?

Go ahead and do that then. uv is not preventing you from putting 10 projects within one folder.

It is still beneficial not to install stuff system-wide, since doing so makes it easy to forget which stuff you already have installed and which is a missing dependency.

Keeping track of dependencies is kind of part of a programmer's work, so as long as you're writing these things mostly for yourself, do whatever you like. And I say that as someone who treats everything like a project that I will forget about in 3 days and need to deploy on some server a year later.

I agree that having a reliable main environment for quick experiments is great! On Windows I just use the main Python installation as a global environment, since no system stuff depends on it; on Linux I tend to create a "main" environment in the home directory. Then I can still have per-project environments as needed (say with uv), for example for stuff that I need to deploy to the VPS.

Note that I'm mostly in the research/hobby environments - I think this approach (and Python in general, re: some other discussions here about the language) works really well, especially for the latter, but the closer you get to "serious" work, the more sense the project environment approach makes of course

Especially in research, where results don't mean much without reproducibility, the project environment approach makes sense.

You can set up uv inside your script, without a project.

Example: https://treyhunner.com/2024/12/lazy-self-installing-python-s...
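The rough shape of that approach, with scratch.py as a made-up name and requests as an arbitrary dependency:

  printf 'import requests; print(requests.__version__)\n' > scratch.py
  uv add --script scratch.py requests    # writes an inline metadata block into the file
  uv run scratch.py                      # uv builds an isolated env with requests and runs it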

I'm not talking about wanting single-file scripts, but about having a "sandbox" environment in which various things can be messed with before abstracting anything out into a project.

Doesn't the single-file script let you do exactly that?

If not, where do you see a meaningful difference?

Most people actually using python do not start off in scripts. Usually, I would mess around in IPython / Jupyter for a couple days until I have something I'm happy with. Then I'll "productionize" the project.

tbh this has been a sticking point for me too with uv (though I use it for everything now). I just want to start up a REPL with a bunch of stuff installed so I can try out a bunch of stuff. My solution now is to have a ~/tmp dir where I can mess around with all kinds of stuff (not just python) and there I have a uv virtualenv installed with all kinds of packages pre-installed.

> Usually, I would mess around in IPython / Jupyter for a couple days until I have something I'm happy with. Then I'll "productionize" the project.

Right, it's this. I get the feeling a lot of people here don't work that way though. I mean I can understand why in a sense, because if you're doing something for your job where your boss says "the project is X" then it's natural to start with a project structure for X. But when I'm going "I wonder if this will work..." then I want to start with the code itself and only "productionize" it later if it turns out to work.

> tbh this has been a sticking point for me too with uv (though I use it for everything now). I just want to start up a REPL with a bunch of stuff installed so I can try out a bunch of stuff.

I hope the people behind uv or someone else address this. A repl/notebook thing that is running on a .venv preinstalled with stuff defined in some config file.

> A repl/notebook thing that is running on a .venv preinstalled with stuff defined in some config file.

So, create a project as a playground, put what you want it to include (including something like Jupyter if you want notebooks) in the pyproject.toml and... use it for that?

What do you want a tool to do for that style of exploration that uv doesn't already do? If you want to extract stuff from that into a new, regular project, sure, that could maybe use some new tooling.

Do you need a prepackaged set of things to define the right “bunch of stuff” for the starting point? Because that will vary a lot by what your area of exploration is.

    uv run --with=numpy,pandas python

I have a directory called workspace where there’s a projects directory and the main area is for messing around. Just set up workspace once as a project.

But I don't want the sandbox linked in any way to a directory. I just want to be able to use it from anywhere. (This is what I can do with conda.)

You can activate the uv venv from anywhere just fine, just do source path_to_sandbox/.venv/bin/activate. Probably makes sense to define a shortcut for that, like activate sandbox. Your conda env is also linked to a directory, it’s just a hidden one; you can also create the uv one somewhere hidden. But I get it to some extent: conda has these large prefilled envs with a lot of stuff in them already that works together. Still, if you then end up needing anything else, you wait ages for the install. I find conda so unbearable by now that I voluntarily switch every conda thing I have left over to uv the second I need to touch the conda env.

A shell alias that activates workspace/sandbox?
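Something like this, with a made-up path:

  # Drop into the shared sandbox env from any directory
  alias sandbox='source ~/workspace/sandbox/.venv/bin/activate'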

Neither of the tools (uv, poetry) makes it the default, and the poetry designers/developers are kinda ignorant about this, but it can be done.

Serious question - what’s stopping you from having 1 large project called “sandbox”?

This is me exactly. I have a default dev environment, and a couple for experimenting with new versions of various packages. I don't need or want "projects" most of the time.

This is nice just for playing and exploring alone. It is a nightmare if you need reproducibility, need to send something to production, or even just want to share your work with someone. Maybe even to come back to your project a year later.

uv.lock is a blessing

Use pixi (which is built with uv) and use its "global" feature. It should solve what you wanted to solve: https://pixi.sh/dev/global_tools/introduction/

It doesn't.

uv has a script mode, a temp env mode, and a way to superimpose a temp env on top of an existing env.

See: https://www.bitecode.dev/p/uv-tricks

That's one of the selling points of the tool: you don't need a project, you don't need to activate anything, you don't even need to keep code around.

Yesterday I wanted to mess around with loguru in IPython. I just ran `uvx --with loguru ipython` and I was ready to go.

Not even a code file to open. Nothing to explicitly install nor to clean up.

For a tool that is that fantastic and creates such enthusiasm, I'm always surprised how few of its features people know about. It can do crazy stuff.

But then you don't need uv. The pain point uv solves is projects. Different projects with different dependencies (even the same but different versions), multiple people, teams, and environments trying to run the same code.

That gets problematic if environments go out of sync, or you need different versions of python or dependencies.

So you are right, you probably won't benefit a lot if you just have one big environment and that works for you, but once you pull things into a project, uv is the best tool out there atm.

You could also just create a starter project that has all the things you want, and then later on pull it out, that would be the same thing.

Interesting. I never start working on something without a rough idea of what I am working on, be that just researching something or a program; and uv makes it extremely easy to create a folder, and make it a project.

Could it be that you’re just used to separate environments causing so much pain that you avoid it unless you’re serious about what you’re doing?

If they are like me, they are just exploring different data, ideas and stuff - maybe without knowing/having decided if it will be a project. Many times it is nice to use Python as a tool to explore some data.

It unblocks that workflow; that's why it's so great. You can have a single script with inline dependencies that are auto-installed on execution. That can expand to importing other files, but there is very little setup tax to get started with a script and it does not block expansion.

It's not about single-file scripts, it's about having a "sandbox" environment in which various things can be messed with before abstracting anything out into a project.

This is a divide among different Python devs it seems. A lot of people are running scripts, which I will do eventually, but I spend a ton of time in a REPL experimenting and testing.

Yeah, it's this experiment-first workflow that's not so well supported.

Whenever I feel like doing that I just use "uv pip" and pretty much do the same things I'd do when using pip to messily install things in a typical virtual environment.

You set up a template project folder. In the script you can just add a uv command. In the age of Claude Code and Codex, creating a sandbox/container should be step one.

You can just `uv venv`? Or even uvx?

> I don't want a separate environment for every project

That is exactly 100% what I demand. Projects should be - must be - completely isolated from one another.

Quite frankly anything else seems utterly insane to me.