The open source ecosystem has come very far and proven to be resilient. And while trust will remain a crucial part of any ecosystem, we urgently need to improve our tools and practices when it comes to sandboxing third-party code.
Almost every time I bump into uv in project work, the touted benefit is that it makes it easier to run projects with different Python versions and to avoid clashes between third-party dependencies - basically pyenv + venv + speed.
That sends a cold shiver down my spine, because it tells me that people are running all these different tools on their host machine with zero sandboxing.
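For what it's worth, even without dedicated tooling, a throwaway container already gives one layer of isolation. A rough sketch (image, paths, and entry point are illustrative, not a recommendation for any particular project):

```shell
# Run a project's third-party code inside a disposable container instead of
# on the host: mount only the project directory, cut off the network after
# you no longer need it, and throw the container away when done.
docker run --rm -it \
  -v "$PWD":/app -w /app \
  python:3.12-slim \
  bash -c "pip install -r requirements.txt && python main.py"
```

It's coarse-grained compared to a real sandbox, but it keeps `pip install` from touching the host environment at all.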
meh, not always. I do use uv IN docker all the time, it's quite handy
Honest question - what are the main benefits for you when you use it in docker?
ps. I feel like I've been doing python so long that my workflows have routed around a lot of legit problems :)
For us, the DX of uv for dependency management is much better than just using pip and requirements.txt.
To be clear though, we only use uv in the builder stage of our docker builds, there is no uv in the final image.
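The pattern described above (uv only in the builder stage, none in the final image) roughly looks like this - a hedged sketch, with image tags, file names, and the entry point as placeholder assumptions:

```dockerfile
# Builder stage: uv is copied in from its official image and used to
# resolve and install dependencies into a local virtualenv.
FROM python:3.12-slim AS builder
COPY --from=ghcr.io/astral-sh/uv:latest /uv /bin/uv
WORKDIR /app
COPY pyproject.toml uv.lock ./
RUN uv sync --frozen --no-dev

# Final stage: only the installed environment is copied over;
# uv itself never ships in the runtime image.
FROM python:3.12-slim
WORKDIR /app
COPY --from=builder /app/.venv /app/.venv
COPY . .
ENV PATH="/app/.venv/bin:$PATH"
CMD ["python", "main.py"]
```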
Main reason I now use uv is being able to specify a cooldown period. pip allows it, but only as an absolute timestamp, so it's pretty much useless..
And that doesn't prevent me from running it in a sandbox or VM for an additional layer of security.
> pip allows it but it's with a timestamp
A PR to be able to use a relative timestamp in pip was merged just last week
https://github.com/pypa/pip/pull/13837/commits
Mainly the "project" system. I only develop Python in my free time, not professionally, so I'm not as well versed in its ecosystem as I am in PHP's. There are tons of ways to do project-like setups and I don't want to deal with all of those. I used to do raw Python containers + requirements.txt, but the DX was absolutely not enjoyable. I'm just used to it now
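For anyone unfamiliar: the "project" system mentioned here builds on the standard `[project]` table in `pyproject.toml` (PEP 621), so a minimal project is just one file - names and versions below are illustrative:

```toml
[project]
name = "demo-app"
version = "0.1.0"
requires-python = ">=3.12"
dependencies = [
    "requests",
]
```

With that in place, a single `uv run` resolves the interpreter and dependencies instead of juggling a container plus a hand-maintained requirements.txt.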