It would indeed be nice if there were a good solution to multi-gigabyte conda directories. Conda has been reproducible in my experience with pinned dependencies in the environment YAML... slow to build, sure, but reproducible.

I'd argue bzip2 compression was a mistake for Conda. There was a time when I built Conda packages for the CUDA libraries so conda could locally install the right version of CUDA for every project, but boy, it took forever for Conda to unpack those 100MB+ packages.

It seems they are using zstd now for the newer .conda package format, i.e., the old bzip2-compressed .tar.bz2 format is obsolete, so unpacking should be much faster.
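For reference, as far as I know a .conda file is just a zip archive wrapping a couple of zstd-compressed tarballs, which is a big part of why it unpacks so much faster than bzip2. Here's a rough sketch of pulling one apart by hand with the third-party zstandard package (the filename is a made-up placeholder, and this is just to illustrate the layout, not how conda itself does it):

```python
# Rough sketch: a .conda (v2) package is assumed to be a zip archive holding
# zstd-compressed tarballs (info-*.tar.zst and pkg-*.tar.zst) plus metadata.
# Requires: pip install zstandard
import tarfile
import zipfile

import zstandard


def extract_conda_package(path: str, dest: str) -> None:
    """Extract the inner tarballs of a .conda archive into dest."""
    with zipfile.ZipFile(path) as zf:
        for name in zf.namelist():
            if not name.endswith(".tar.zst"):
                continue  # skip metadata.json and anything unexpected
            with zf.open(name) as member:
                # Stream-decompress the zstd layer, then untar it directly.
                reader = zstandard.ZstdDecompressor().stream_reader(member)
                with tarfile.open(fileobj=reader, mode="r|") as tf:
                    tf.extractall(dest)


# Placeholder filename, just for illustration.
extract_conda_package("somepkg-1.2.3-py312_0.conda", "./somepkg-unpacked")
```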

uv does this by caching each version of a package once so it can be shared across projects/environments. So you still have to store those multi-gig directories, but you don't have so much duplication. If conda could do something similar, that would be great.
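As I understand it, the space saving comes from linking files out of one shared cache into each environment rather than copying them. Here's a toy sketch of that idea (not uv's or conda's actual code; the cache path is a made-up placeholder):

```python
# Toy sketch of the cache-and-link idea: package files live once in a shared
# cache, and each environment just hardlinks to them, so installing the same
# package into a second project costs almost no extra disk.
import os
import shutil
from pathlib import Path


def link_package_into_env(cache_pkg_dir: Path, env_site_packages: Path) -> None:
    """Mirror a cached package tree into an environment via hardlinks."""
    for src in cache_pkg_dir.rglob("*"):
        rel = src.relative_to(cache_pkg_dir)
        dst = env_site_packages / rel
        if src.is_dir():
            dst.mkdir(parents=True, exist_ok=True)
            continue
        dst.parent.mkdir(parents=True, exist_ok=True)
        try:
            os.link(src, dst)       # same bytes on disk, new directory entry
        except OSError:
            shutil.copy2(src, dst)  # fall back to a copy (e.g. across filesystems)


# Hypothetical cache location, just for illustration.
link_package_into_env(
    Path.home() / ".cache/pkgs/somepkg-1.2.3",
    Path("./.venv/lib/python3.12/site-packages"),
)
```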