For single-file Python scripts, which 99% of mine seem to be, you can simplify your life immensely by just putting this at the top of the script:
```
#!/usr/bin/env -S uv run --script
# /// script
# requires-python = ">=3.11"
# dependencies = [ "modules", "here" ]
# ///
```
The script now works like a standalone executable, and uv will magically install and use the specified modules.
> The script now works like a standalone executable
But whoever runs this has to install uv first, so not really standalone.
This is a PEP and not specific to uv: https://peps.python.org/pep-0723/
You need a runner for scripts that follow the PEP (more precisely, the packaging standard initially established by the PEP, hence the note about its historical status).
The two main runners I am aware of are uv and pipx. (Any compliant runner can be referenced in the shebang to make a script standalone where shebangs are supported.)
The shebang line references uv.
Is it time for a Debian `alternatives`-style system for PEP 723 compliant Python wrappers yet?
I could totally see `#!/usr/bin/python723` become a thing :)
And a shell
They gotta have a computer too. And a source of power.
And my ax... Oh, this is hackernews.
"I write code and am curious I am a hacker"
"Lol, no I break into computer systems I am a hacker"
"Geeze hell no I have an axe, I am an OG hacker"
I don’t think they need a shell unless uv itself requires it; the shebang is handled by the exec syscall.
Of course. Hence the bash shebang - the shebang is handled before any shell is involved. Thanks.
And an operating system
No, not a shell. Just a /usr/bin/env
Is that a dare? /s
Small price to pay for escaping python dependency hell.
As long as your `/usr/bin/env` supports `-S`, yes.
It will install and use distribution packages, to use PyPA's terminology; the term "module" generally refers to a component of an import package. Which is to say: the names you write here must be the names that you would use in a `uv pip install` command, not the names you `import` in the code, although they may align.
This is an ecosystem standard (https://peps.python.org/pep-0723/) and pipx (https://pipx.pypa.io) also supports it.
> As long as your `/usr/bin/env` supports `-S`
Linux coreutils have supported this since 2018 (coreutils 8.30); amusingly it is the same release that added `cp --reflink`. AFAIK you have to opt out by having `POSIXLY_CORRECT=1` or `POSIX_ME_HARDER=1` or `--pedantic` set in your environment. [1]
FreeBSD has supported this since 2008.
macOS has basically always supported this.
---
1. Amusingly, despite `POSIX_ME_HARDER` not being official, a large swath of coreutils support it. https://www.gnu.org/prep/standards/html_node/Non_002dGNU-Sta...
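If you're unsure whether your platform is in that list, a one-line check (assuming `/usr/bin/env` exists):

```shell
# Prints "ok" if env supports -S-style string splitting
/usr/bin/env -S echo ok
```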
If I were to put on my security hat, things like this give me shivers. It's one thing if you control the script and specified the dependencies. For any other use-case, you're trusting the script author to not install python dependencies that could be hiding all manner of defects or malicious intent.
This isn't a knock against UV, but more a criticism of dynamic dependency resolution. I'd feel much better about this if UV had a way to whitelist specific dependencies/dependency versions.
If you’re executing a script from an untrusted source, you should be examining it anyway. If it fails to execute because you haven’t installed the correct dependencies, that’s an inconvenience, not a lucky security benefit. You can write a reverse shell in Python with no dependencies and just a few lines of code.
it's a stretch to go from "executing a script with a build user" or "from a validated distro immutable package" to "allowing something to download evergreen code and install files everywhere on the system".
A vanilla Python script can write files, e.g. edit ~/.zshrc to create a sudo alias that executes code the next time you invoke sudo and type in your password.
uv installing deps is hardly more risky.
That's sneaky. Do any code scanners check for that class of vulnerability?
Scanning for external dependencies is common, but not so much for internal private libraries.
https://linuxsecurity.expert/compare/tools/linux-auditing-to... shows a few.
I've used Tiger/Saint/Satan/COPS in the distant past. But I think they're somewhat obsoleted by modern packaging and security like apparmor and selinux, not to mention docker and similar isolators.
Code scanners cannot protect you from code execution on your machine.
The point is that a plain script executes the code in front of you.
uv executes http://somemirror.com/some-version
Most people like their distro to vet these things. uv et al. had a reason to exist when Python 2 and 3 were a mess; I think that time is way behind us. pip is mostly for installing libraries, and even that is mostly already done by the distros.
You’re about to run an untrusted python script. The script can do whatever it wants to your system. Dependencies are the least of your worries.
The script is just a cat or vim away from audit. Its dependencies on the other hand…
This was very confusing!
I meant it’s easy to inspect your script’s logic — just look at it. Much harder to audit the code in its dependencies though…
A download and a cat away?
Sorry, I was half asleep! Meant that you can easily look at the code in the script and audit what it does – just run `cat` on it and you’re done!
But it’s much harder to inspect what the imports are going to do and be sure they’re free of any unsavory behavior.
If that’s your concern you should be auditing the script and the dependencies anyway, whether they’re in a lock file or in the script. It’s just as easy to put malicious stuff in a requirements.txt
Would you feel better with a script containing eval(requests.get("http://pypi.org/foo.py").text)?
It’s the script contents that count, not just dependencies.
Deno-style dependency version pinning doesn’t solve this problem unless you check every hash.
Is there anything new that uv gives you here though?
If you don't care about being ecosystem-compliant (and I am sure malware does not), it's only a few lines of Python to download the code and eval it.
I didn't see it in the comments, but FWIW you can pin specific dependency versions. You can use regular [dependency specifiers](https://packaging.python.org/en/latest/specifications/depend...), see [PEP 723](https://packaging.python.org/en/latest/specifications/inline...).
uv can still be redirected to a private PyPI mirror, which should be mandatory from a security and reliability perspective anyway.
""" uv is straightforward to install. There are a few ways, but the easiest (in my opinion) is this one-liner command — for Linux and Mac, it’s:
curl -LsSf https://astral.sh/uv/install.sh | sh """
Also isn't great. But that's how homebrew is installed, so ... shrug ... ?
Not to bash uv/homebrew, they are better than most _easy_ alternatives.
There's a completely irrational knee-jerk reaction to curl|sh. Do you trust the source or not? People who gripe about this will think nothing of downloading a tarball and running "make install", or downloading an executable and installing it in /usr/local/bin.
I will happily copy-paste this from any source I trust, for the same reason I'll happily install their software any other way.
It really depends on the use case. A one-off install on a laptop that I don't use for anything that gets close to production - fine by me.
For anything that I want to depend on, I prefer stronger auditability to ease of install. I get it, theoretically you can do the exact same thing with curl/sh as with git download/inspecting dependencies, installing the source and so on. But in reality, I'm lazy (and per another thread, a 70s hippie) and would like to nix any temptation to cut corners in the bud.
I hate that curl $SOMETHING | sh has become normalized. One does not _have_ to blindly pipe something to a shell. It's quite possible to pull the script in a manner that allows examination. That Homebrew also endorses this behaviour doesn't make it any less of a risky abdication of administrative agency.
But then I'm a weirdo that takes personal offense at tools hijacking my rc / PATH, and keep things like homebrew at arm's length, explicitly calling shellenv when I need to use it.
I also recommend the flag for a max release date of $current_date - that basically locks all package versions as of that date, without a verbose lock file!
(sadly, uv cannot detect the release date of some packages. I'm looking at you, yaml!)
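For reference, that cutoff can also live in the inline metadata itself, via uv's `exclude-newer` setting (date and package names below are illustrative):

```python
# /// script
# requires-python = ">=3.11"
# dependencies = [ "requests" ]
# [tool.uv]
# exclude-newer = "2025-01-01T00:00:00Z"
# ///
```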
I use this but I hate it.
I want to be able to ship a bundle which needs zero network access to run, but will run.
It is still frustratingly difficult to make portable Python programs.
I wouldn't be surprised if Astral's next product were something like this. It's so obvious, and there would be much interest from the ML crowd.
My current hobby language is janet. Creating a statically linked binary from a script in janet is trivial. You can even bring your own C libraries.
Zipapp comes close: https://docs.python.org/3/library/zipapp.html
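As a rough sketch of what zipapp gives you (the app name and contents here are made up), the standard library can bundle a directory into a single runnable `.pyz`:

```python
# Build a tiny zipapp and run it, all inside a temp directory.
import pathlib
import subprocess
import sys
import tempfile
import zipapp

with tempfile.TemporaryDirectory() as d:
    src = pathlib.Path(d, "myapp")
    src.mkdir()
    (src / "__main__.py").write_text('print("hello from zipapp")\n')

    target = pathlib.Path(d, "myapp.pyz")
    zipapp.create_archive(src, target, interpreter="/usr/bin/env python3")

    # The .pyz runs like a normal script; no network access needed.
    result = subprocess.run([sys.executable, str(target)],
                            capture_output=True, text=True)

print(result.stdout.strip())  # hello from zipapp
```

The catch, as noted, is that pure-Python dependencies must be bundled in (or unpacked at startup), which is where the bootstrap-script approach below comes in.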
I'm planning to distribute PAPER as a zipapp where the bootstrap script does some initial unpacking to put wheels in a temporary directory, then delegates to the contained code (similar to how the standard library `ensurepip` works). The code that creates the zipapp is in the repository and I'm thinking of genericizing that and making it separately available, too.
Although several variations on this theme already exist, I'm sure. https://github.com/pex-tool/pex/ is arguably one of them, but it's quite a bit bulkier than what I'm looking for.
uv doesn't support creating zipapps yet, but there is an open issue for it. For that reason, I'm still using PDM.
Have you tried Nuitka? It takes a little effort but it can compile your Python program to a single executable that runs without network access.
You don't have to remember this, instead remember `uv init --script myscript.py`
That's the same thing?
I think the point is you don't have to memorize the boilerplate.
This isn't really boilerplate; it declares dependencies needed in the environment, which cannot in general be inferred from static analysis of the code.
You can get uv to generate this and add dependencies to it, rather than writing it yourself.
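For concreteness, the flow looks something like this (the filename is illustrative); `uv add --script` edits the inline metadata block in place:

```shell
uv init --script myscript.py --python 3.11   # writes the shebang and metadata block
uv add --script myscript.py requests         # adds "requests" to its dependencies
uv run myscript.py                           # runs it in an ephemeral environment
```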
Yeah, tried it with some rest client + pyfzf (CLI swagger UI sort of), it was really fun. Near instant dependency handling.. pretty cool
Why is the ‘-S’ argument to ‘env’ needed? Based on the man page it doesn’t appear to be doing anything useful here, and in practice it doesn’t either.
> Based on the man page it doesn’t appear to be doing anything useful here
The man page tells me:
Without that, the system may try to treat the entirety of "uv run --script" as the program name and fail to find it. Depending on your env implementation and/or your shell, this may not be needed. See also: https://unix.stackexchange.com/questions/361794
Right, I didn’t think about the shebang case being different. Thanks!
Without -S, `uv run --script` would be treated as a binary name (including spaces) and you will get an error like "env: ‘uv run --script’: No such file or directory".
-S causes the string to be split on spaces and so the arguments are passed correctly.
On these systems, wouldn’t binfmt attempt to exec(“/usr/bin/env -S uv run --script”, “foo.py”) and fail anyway for the same reason?
No. The string is split to extract at most one argument. See: https://linux.die.net/man/2/execve
So in fact "-S" is not passed as a separate argument, but as a prefix in the first (and only) argument, and env then extracts it and acts accordingly:
```
$ /usr/bin/env "-S echo deadbeef"
deadbeef
```
Most systems have split on at least the first space for decades.
> uv will magically install and use the specified modules.
As long as you have internet access, and whatever repository it's drawing from is online, and you may get a different version of Python each time, ...
And electricity and running water and oh the inconvenience. How is this worse than getting a script file that expects you to install modules?
If I download a Python project from someone on the same network as me, and they wrote it for a different Python version than mine and gave me a requirements.txt, I need all those things anyway.
You can constrain Python version: https://peps.python.org/pep-0723/#:~:text=requires-python
I mean, if you use == constraints instead of >= you can avoid getting different versions, and if you’ve used it (or other things which combined have a superset of the requirements) you might have everything locally in your uv cache, too.
But, yes, python scripts with in-script dependencies plus uv to run them doesn't change dependency distribution, just streamlines use compared to manual setup of a venv per script.
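Concretely, exact pins in the inline block look like this (versions are illustrative):

```python
# /// script
# requires-python = ">=3.11"
# dependencies = [
#     "requests==2.32.3",  # == pin: same version resolved on every run
#     "rich==13.7.1",
# ]
# ///
```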
You can specify python version requirements in the comment, as the standard describes