I have used
pip freeze > requirements.txt
pip install -r requirements.txt
Way before an "official" lockfile existed.
Your requirements.txt becomes a lockfile, as long as you accept not using version ranges.
Having this in a single tool and so on is fine, but I don't understand the hype when it was basically already there.
That works for simple cases. Now, update a transitive dependency used by more than one dependency. You might get lucky and it'll just work.
Not sure how uv helps here, because I am not very familiar with it.
With pip, you update a dependency; it won't install if the versions aren't compatible, and it will if they are. Not sure where the issue is?
> it won't work if it's not compatible
This is very new behavior in pip. Not so long ago, imagine this:
You `pip install foo` which depends on `bar==1.0`. It installs both of those packages. Now you install `pip install baz` which depends on `bar==2.0`. It installs baz, and updates bar to 2.0. Better hope foo's compatible with the newer version!
I think pip only changed in the last year or two to resolve conflicts, or die noisily explaining why it couldn't be done.
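Roughly, with the hypothetical packages foo, bar, and baz from above:
pip install foo   # resolves foo and pulls in bar==1.0
pip install baz   # baz wants bar==2.0
# old pip: installs baz and silently upgrades bar to 2.0; hope foo still works
# pip 20.3+ (the new resolver, late 2020 if I remember right): refuses and reports the conflict instead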
Simple for simple cases - but then you update a dependency, which updates another dependency with a narrow version range (because one version had a security issue), which forces you to downgrade three other packages.
It can get complicated. The resolver in uv is part of its magic.
https://docs.astral.sh/uv/reference/internals/resolver/
JavaScript has truly rotted the brains of software developers.
You pull the security patches for your dependencies into your local, vetted PyPI repository. You control what you consider liabilities, and you don't get shocked by breakages in what should be minor versions.
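Concretely, that just means pointing pip at your internal index instead of pypi.org, e.g. (index URL and package name are made up):
pip install --index-url https://pypi.internal.example/simple some-package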
Of course, you have to be able to develop software and not just snap Legos together to manage a setup like that. Which is why uv is so popular.
You can make it a language flame war, but the Python ecosystem has had no problem making this bed for themselves. That's why people are complaining about running other people's projects, not setting up their own.
Sensible defaults would completely sidestep this, that's the popularity of uv. Or you can be an ass to people online to feel superior, which I'm sure really helps.
You're implying that I have to run a local PyPI just to update some dependencies for a project? When other languages somehow manage without that? No way I'm doing that.
Some organizations force you to use their internal dependency repos because the "IT department" or similar has blessed only certain versions in the name of "security" (or at least security theater.)
Inevitably, these versions are out-of-date. Sometimes, they are very, very out of date. "Sorry, I can only install [version from 5 years ago.]" is always great for productivity.
I ran into this recently with a third-party. You'd think a 5 year old version would trigger alarm bells...
This is the norm at all big tech companies now AFAIK.
I use 30 year old software regularly. Newer doesn't mean working.
Sure. I do a lot of retrocomputing and that's fine. I have OSes from the 80's running in emulators.
But when you're developing software, you want the newer stuff. Would you use MySQL 5.0 from 2005? No, you'd be out of your mind.
I'm wondering if people like you are getting paid to vet other people's libraries? Because with every modern project I have ever seen, the amount of library updates you'd have to vet wouldn't leave you much time for anything else the rest of the day.
He's a consultant. Making everyone else sound incompetent is part of the gig.
Cool, so how does that work when you're writing a library that you want to distribute to other people?
> Not sure how uv helps here, because I am not very familiar with it.
Which makes you part of the people the GP is referring to? Try using it in anger for a week; you'll come to understand.
It's like Sisyphus rolling a cube up a hill and being offered a sphere instead: "no thanks, I just push harder when I have to overcome the edges."
Can you elaborate on this? How are npm/cargo/etc. better than pip in this regard?
As far as I know, files like requirements.txt, package.json, cargo.toml are intended to be used as a snapshot of the dependencies in your project.
In case you need to update dependency A in a way that also affects dependencies B and C, I am not sure how one tool is better than another.
Open a requirements.txt and a package-lock.json next to each other and compare. Then you will know the answer to the question of what npm, cargo, and others are doing better than pip. Oh, did I sneak a "lock" in there? Damn right I did.
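Roughly the difference, with a made-up package and hash (`pip-compile --generate-hashes` produces the latter form):
# requirements.txt: usually just what you asked for
somepkg>=1.4
# a lockfile: every package, transitives included, pinned and hashed
somepkg==1.4.2 \
    --hash=sha256:...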
They will resolve a version that works for all dependencies if it exists.
I remember advocating for running nightly tests on every project/service I worked on, because inevitably one night one of the transitive dependencies would update and shit would break. And at least with the nightly test it forced things to break early, vs. when you needed to do something else like an emergency bug fix and ran into it then.
it won’t work of course, no one is that lucky :)
Even more importantly, uv forces you to do it right, like npm always did.
npm did not always do it right, and IMO still does not do it completely right (nor does pnpm, my preferred replacement for npm -- but it has `--frozen-lockfile` at least that forces it to do the right thing) because transitive dependencies can still be updated.
cargo can also update transitive dependencies (you need `--locked` to prevent that).
Ruby's Bundler does not, which is preferred and is the only correct default behaviour. Elixir's mix does not.
I don't know whether uv handles transitive dependencies correctly, but lockfiles should be absolute and strict for reproducible builds. Regardless, uv is an absolute breath of fresh air for this frequent Python tourist.
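For reference, the strict "install exactly what the lockfile says, or fail" invocations, as far as I know:
npm ci                           # npm
pnpm install --frozen-lockfile   # pnpm
cargo build --locked             # cargo
uv sync --locked                 # uv (errors if uv.lock is out of date)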
npm will not upgrade transitive dependencies if you have a lockfile. All the `--frozen-lockfile` or `npm ci` commands do is prevent upgrades if you have incompatible versions specified inside of `package.json`, which should never happen unless you have manually edited the `package.json` dependencies by hand.
(It also removes all untracked dependencies in node_modules, which you should also never have unless you've done something weird.)
I'm not sure when that behaviour might have changed, but I have seen it do so. Same with yarn when not specifying a frozen lockfile.
I switched to pnpm as my preferred package manager a couple of years ago because of this, and even that still requires explicit specification.
It was an unpleasant surprise, to say the least.
That works, more or less. But now you have a requirements.txt file with 300 dependencies. Which ones do you actually care about, and which are just transitive things that your top-level deps brought along for the ride? And a year later, when GitHub's Dependabot is telling you that you have a security vulnerability in some package you've never heard of, do you remember if you even care about that package in the first place, or if it's left-over cruft from that time you experimented with aiohttp instead of httpx?
I always just used pip-tools. Your requirements.in is the file that is human-readable and -writable, and sets your top-level deps and the version ranges you want. requirements.txt is your lockfile that you generate from .in with pip-compile. pip-compile writes out comments specifying from where each package in requirements.txt is being required.
uv does it a lot faster and generates requirements.txt files that are cross-platform, which is a nice improvement.
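The workflow, for anyone who hasn't seen it (file names are the pip-tools conventions):
pip-compile requirements.in -o requirements.txt      # pip-tools
uv pip compile requirements.in -o requirements.txt   # same idea, much faster
pip install -r requirements.txt                      # install the pinned set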
As a "pip is mostly fine" person: we would direct the result of `pip freeze` to a separate lock file, so you could still have your direct deps and then pin transitives and update from there.
Pip's solver could still cause problems in general on changes.
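i.e., something like this (the .lock file name is just an example):
pip install -r requirements.txt    # direct deps, loose ranges
pip freeze > requirements.lock     # pinned snapshot, transitives included
pip install -r requirements.lock   # reproducible install elsewhere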
uv having a better solver is nice. Being fast is also nice. Mainly, though, the fact that it feels like a tool that is maintained and can be improved upon without ripping one's hair out is a godsend.
Totally agree, uv's solver speed is exciting.
This is way less than what uv and other package managers do:
- dev dependencies (or other groups)
- distinguishing between direct and indirect dependencies (useful if you want to cut some fat from a project)
- dependencies with optional extra dependencies (if you remove the main one, it will delete the orphans when relevant)
It's not unachievable with pip and virtualenvs, but it's verbose and prone to human error (see the uv sketch below).
Like C: if you're careful enough, it can be memory safe. But teams would rather rely on memory safe languages.
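For the record, the uv commands covering the list above, roughly (package names are just examples):
uv add --dev pytest          # dev dependency group
uv tree                      # direct vs. transitive dependencies
uv add 'fastapi[standard]'   # dependency with optional extras
uv remove fastapi            # drops orphaned sub-dependencies from the lock too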
It also manages the runtime, so you can pin a specific runtime to a project. It is very useful and worth investigating.
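e.g.:
uv python install 3.12   # fetch a standalone interpreter
uv python pin 3.12       # write .python-version for the project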
I think it's a great modern tool, don't get me wrong.
But the main reason shouldn't be the "lockfile". I was replying to the parent comment mainly for that particular thing.
I am in the same boat. I like uv for its speed, the other niceties it brings, and being a single tool to manage different things. But the lockfile is not that big a deal. I never got Poetry either. Tried it in a project once and the lockfile was a pain with merges. I didn't spend much time on it, so maybe I didn't understand the tool and workflow or whatever, but pip and pip-tools were just fine working with requirements.txt.
Hey, just so you know, newer lockfiles are meant to fully replace old ones, you shouldn't bother with solving merge conflicts on these files, just accept all the new changes always.
What you SHOULD solve are conflicts in the packages/project file. Once solved, just create a new lockfile and replace the old one.
This applies to lockfiles on any project python or non-python.
Thank you for the tip. I don't run into them these days. The projects have matured and my area of work has shrunk, so changes to the dependencies are rare and my involvement in them is even rarer. But I will keep this in mind for the future.
The canonical way to do this with pip was using Constraints Files [1]. When you pollute your main requirements.txt it gets harder to see which package is an actual dependency of your project, and which ones are just sub-dependencies. Constraint files also let you not install a package if it's no longer a sub-dependency.
That being said, the uv experience is much nicer (also insanely fast).
[1] https://pip.pypa.io/en/stable/user_guide/#constraints-files
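A minimal example of the split (versions are illustrative):
# requirements.txt (direct deps only):
#   httpx>=0.27
# constraints.txt (pins, sub-dependencies included; only applied to packages that actually get installed):
#   httpx==0.27.2
#   anyio==4.4.0
pip install -r requirements.txt -c constraints.txt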
I've never even understood the virtual env dogma. I can see how version conflicts _could_ happen, but they never have. Admittedly, I'm surprised I never have issues installing globally, especially since others keep telling me what a terrible idea it is and how they had nightmare-scenario-X happen to them.
I write Python code for a living and no two projects I work on have the exact same dependencies. This is especially true when working with microservices, or working for multiple customers.
How do you work with multiple projects with different versions of the same dependencies? If you are using the “system python” for everything?
Not system Python (not least because that's a hassle to do these days anyway, with all the safeguards OS vendors have put in), but _my_ version of globally. My (user) global PyEnv version, for example.
Now having said that, I suspect PyEnv is doing some voodoo behind the scenes, because I occasionally see messages like "Package X wants version N, but you have version N1". I've never investigated them though, since both old and new packages seem to work just fine regardless.
I see. I think your Python usage is very limited. Because even when I was using pyenv, I still needed pyenv-virtualenv to make sure my repositories didn't cross-install different chains of dependencies into the same version.
> How do you work with multiple projects with different versions of the same dependencies?
You don't… you use the same versions for everything :)
it's very common for different projects to have different requirements, especially for fast moving libraries like transformers. if you rarely run python stuff it might not be a big deal, but i'd rather not have to reinstall stuff (especially big stuff like pytorch builds) every time i switch projects.
That's exactly it. Imagine your company has multiple Python repos, and one depends on foo>=1.0,<2.0, and another depends on foo>=2.0. Venvs let you configure completely isolated environments for each so that they can peacefully coexist. I would not for a moment consider using Python without virtualenvs, though I'm not opinionated about which tool manages them. Uv? Great. Poetry? Fine. `python -m venv`? Whatever. They all get the job done.
Honestly, I can't think of a single good reason not to want to use a venv for Python.
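e.g., with the hypothetical conflicting repos from above (directory names made up):
(cd service-a && python -m venv .venv && .venv/bin/pip install 'foo>=1.0,<2.0')
(cd service-b && python -m venv .venv && .venv/bin/pip install 'foo>=2.0')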
Using the same version of everything lets you have a much easier time when a vulnerability is discovered?
How so? That hasn’t been my experience.
Do you monitor CVEs?
They happen /all the time/.
For a long time there were even incompatibilities between the RHEL host Python version and the Python version the Red Hat Ansible team was shipping.
> They happen /all the time/.
So I keep hearing ;)
Meanwhile, on my machines ...
Sure, but if you manage a diverse fleet of apps that could be a decade or two old across thousands of servers, some of them being Python projects, you can really quickly run into issues between host and application.
I only ever had it be a problem with large, poorly maintained projects from work. You know, the kind that have two web frameworks required in the same project, and two ORMs, etc. ;-) That one I definitely put into a venv. But my stuff, no.
And then you're sunk the moment anyone else needs to run your code, or even if you just need to run your own code on another machine.
Never happened.
I salute you for never needing a new computer, ever.
Honestly, this feels like the difference between CMake and cargo: sure, CMake works and you can get it to do everything you need, you just need discipline, knowledge, and patience. On the other hand, you could just have a tool that does it all for you so you can get back to doing the actual work.
I don’t get the hype either. Every time I’ve tried to use tools like pyenv or pipenv they fall down when I try to install anything that doesn’t provide wheels (GDAL), so I give up and stick to pip and virtualenv. Does uv let me install GDAL without hassle?
Pyenv's a different animal. It's meant for installing multiple Python versions at once so that you're not stuck with whatever dog your base OS happens to ship.
Pipenv tried to be what uv is, but it never did seem to work right, and it had too many weird corner cases ("why is it suddenly taking 3 hours to install packages? why it is literally impossible to get it to upgrade one single dependency and not all the others?") to ever be a contender.
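For the interpreter part, pyenv is just (version is only an example):
pyenv install 3.11.9   # build and install that interpreter
pyenv local 3.11.9     # pin it for the current directory via .python-version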
Oops, you forgot to activate your venv and now your env is messed up.
It is indeed fairly simple to implement it, which is why it's so weird that it's never been implemented at a language level
Good luck if you need cross-platform `requirements.txt` files.
This is a good use case. Not sure how this is typically solved, I guess "requirements-os-version.txt"? A bit redundant and repetitive.
I would probably use something like this: https://stackoverflow.com/questions/17803829/how-to-customiz...
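For what it's worth, the stock answer is PEP 508 environment markers in a single requirements.txt, e.g. (pins are illustrative):
pywin32==306; sys_platform == "win32"
uvloop==0.19.0; sys_platform != "win32"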
But then you have to m x n x o it for different combinations of Python version, OS, CPU architecture, GPU make/model... uv will solve it for you in milliseconds.
How does uv solve that? Like, if you use dependencies that do not cross platforms very well?
uv finds a dependency resolution that works for all platforms by default, and can do things like fork the resolution and choose different versions based on platform or python version requirements.
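If I understand it correctly, `uv lock` does this universal resolution by default, and the pip-compatible interface exposes the same thing:
uv pip compile requirements.in --universal -o requirements.txt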