If you can install it with `pip install program-name`, it's usually packaged well enough to just work. But if it's a random GitHub repository with a requirements.txt that has no or very few version numbers, chances are that just running `pip install -r requirements.txt` will lead you down an hour+ rabbit hole of downgrading both your venv's Python version and various packages until you hit a combination close enough to the author's venv to actually work.
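For context, the difference looks roughly like this (the version numbers below are made up, purely to illustrate):

```
# requirements.txt as often found in research repos – unpinned,
# so "pip install -r requirements.txt" grabs whatever is newest today
numpy
scipy
torch

# requirements.txt that actually reproduces the author's environment
# (illustrative versions, not from any real project)
numpy==1.24.4
scipy==1.10.1
torch==2.0.1
```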
Usually happens to me when I find code for some research paper. Even something that's just three months old can be a real pain to get running.
I don't disagree with you, but in my experience even having a requirements.txt file is a luxury when it comes to scientific Python code: a lot of the time I end up having to figure out the dependencies purely from whatever the script imports.
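And even then, the import names often don't match what you actually have to `pip install`, so guessing from the imports alone is its own little puzzle. A few common examples:

```python
# Imports you might find in an undocumented research script;
# the PyPI package name often differs from the import name:
import cv2      # pip install opencv-python
import sklearn  # pip install scikit-learn
import yaml     # pip install pyyaml
import PIL      # pip install pillow
```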
If they can't be bothered to make a requirements.txt file, I'm not seeing how uv will be of much help...
uv basically makes that a default. You don't need to be bothered. Just `uv add` your dependencies and they end up in your `pyproject.toml`.
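Rough sketch of what that looks like (package names and version bounds here are just examples):

```
$ uv add requests numpy

# pyproject.toml afterwards contains something like:
[project]
dependencies = [
    "requests>=2.32",
    "numpy>=2.1",
]
```

uv also records the exact resolved versions in a `uv.lock` file, so someone cloning the repo can recreate the same environment with `uv sync`.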
Or use `uv add --script`. Then the dependencies get recorded in the script itself, which is great for single-file scripts. But obviously, that won't help if the author can't be bothered to take those steps.
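For anyone who hasn't seen it: `uv add --script` writes PEP 723 inline metadata into the file. Something like this, where the script name and Python bound are just an example:

```python
# After running: uv add --script fetch_results.py requests
# uv prepends an inline metadata block to the script:

# /// script
# requires-python = ">=3.12"
# dependencies = [
#     "requests",
# ]
# ///
import requests

print(requests.get("https://example.com").status_code)
```

`uv run fetch_results.py` then sets up an environment with those dependencies on the fly, no manual venv management needed.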
Ah, I get it now! The problem occurs when someone publishes something without version pinning, because package versions can become incompatible over time. I don't think I've ever installed something outside of what's available on PyPI, which is probably why I've never run into this issue.
Still, I would think it's rare for versions of different packages to become incompatible with each other?