For those using uv, you can at least partially protect yourself against such attacks by adding this to your pyproject.toml:

  [tool.uv]
  exclude-newer = "7 days"
or this to your ~/.config/uv/uv.toml:

  exclude-newer = "7 days"
This will prevent uv picking up any package version released within the last 7 days, hopefully allowing enough time for the community to detect any malware and yank the package version before you install it.

Pip maintainer here. To do this in pip (26.0+), you currently have to calculate the date manually, e.g. --uploaded-prior-to="$(date -u -d '3 days ago' '+%Y-%m-%dT%H:%M:%SZ')"

pip 26.1 (release scheduled for April 2026) will support the ISO 8601 day-duration format, which uv also supports, so you will be able to do --uploaded-prior-to=P3D, or set it via env vars or config files, as all pip options can be set in either.
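
As a side note, the `date -d '3 days ago'` form above is GNU-specific; BSD/macOS `date` uses different flags. A quick sketch of computing the cutoff timestamp on both (the pip flag itself is unchanged):

```shell
# Compute an RFC 3339 cutoff timestamp 3 days in the past.
# GNU coreutils date (Linux):
cutoff="$(date -u -d '3 days ago' '+%Y-%m-%dT%H:%M:%SZ')"
echo "$cutoff"
# BSD date (macOS) equivalent:
#   cutoff="$(date -u -v-3d '+%Y-%m-%dT%H:%M:%SZ')"
```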

Thanks!

I am a slow adopter of uv. I'll be honest, its speed has never been a big deal to me and, in general, it is YAPT (yet another package tool), but this one feature may make me reconsider. Pinning versions is less than perfect but I would really like to be able to stay XXX days behind exactly for this reason.

I think the python community, and really all package managers, need to promote standard cache servers as first class citizens as a broader solution to supply chain issues. What I want is a server that presents pypi with safeguards I choose. For instance, add packages to the local index that are no less than xxx days old (this uv feature), but also freeze that unless an update is requested or required by a security concern, scan security blacklists to remove/block packages and versions that have been found to have issues. Update the cache to allow a specific version bump. That kind of thing. Basically, I have several projects and I just want to do a pip install but against my own curated pypi. I know this is the intent of virtual envs/lock files, etc, but coordinating across projects and having my own server to grab from when builds happen (guaranteeing builds won't fail) is important. At a minimum it would be good to have a 'curated.json' or something similar that I could point pip/other package managers at to enforce package policies across projects. These supply chain attacks show that all it takes is a single update and you are in big trouble, so we, unfortunately, need more layers of defense.

> I think the python community, and really all package managers, need to promote standard cache servers as first class citizens as a broader solution to supply chain issues. What I want is a server that presents pypi with safeguards I choose. For instance, add packages to the local index that are no less than xxx days old (this uv feature), but also freeze that unless an update is requested or required by a security concern, scan security blacklists to remove/block packages and versions that have been found to have issues. Update the cache to allow a specific version bump. That kind of thing.

FWIW, https://pypi.org/project/bandersnatch/ is the standard tool for setting up a PyPI mirror, and https://github.com/pypi/warehouse is the codebase for PyPI itself (including the actual website, account management etc.).

If "my own curated pypi" extends as far as a whitelist of build artifacts, you can just make a local "wheelhouse" directory of those, and pass `--no-index` and `--find-links /path/to/wheelhouse` in your `pip install` commands (I'm sure uv has something analogous).

If everyone waited a week, then everybody would still be installing it at the same time, for the first time. This is not a solution.

A lot of automated scanners run during that week.

You don't have to run faster than the bear, you just have to run faster than the other guy.

True, but this is an automated way to ensure everyone runs at the exact same speed, so I don't know how that applies here.

Let security researchers, staff and automated malware scanners take a bite first.

Does this also delay delivery of security fixes? Is there an override mechanism for a log4j type event?

It delays everything. You can manually override some packages, but the community can't push through it.

RPM (YUM? DNF? RHEL?) lets me subscribe to security updates separately from updates. Does that concept exist in language distribution?

I don't know how it would. Hackers would just claim everything is a security update.

Unless maybe you give special permission to some trusted company to designate certain releases of packages they don't own as security patches... But that sounds untenable.

It would have to be handled by the repository owner (e.g. PyPI), similar to how quarantines are done.

Rather than being hopeful, why not start running uv inside a sandbox?

Why does your python package (CLI/web server/library) need full access to your entire disk at execution time?

You're doing all of your software development inside containers, all the time?

That is very inconvenient.

> That is very inconvenient.

Running everything (especially random third-party code) inside containers is not inconvenient at all for me.

In fact, I even open-sourced my setup - https://github.com/ashishb/amazing-sandbox

I'd argue it's not only not inconvenient, but also a great way of keeping your system clean of all the random system-wide dependencies you'll end up accumulating over the years.

Devcontainers are looking pretty good right now…

Why? Just open your entire editor/whatever inside a limited namespace and that's it no?
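
For concreteness, here is one way that could look with bubblewrap. This is an illustrative sketch, not a vetted security profile, and it assumes `bwrap` is installed:

```shell
# Define a wrapper that runs a command in a namespace which sees the system
# read-only and only the current project directory read-write.
sandboxed() {
  bwrap \
    --ro-bind /usr /usr --ro-bind /etc /etc \
    --symlink usr/bin /bin --symlink usr/lib /lib --symlink usr/lib64 /lib64 \
    --proc /proc --dev /dev --tmpfs /tmp \
    --bind "$PWD" "$PWD" --chdir "$PWD" \
    --unshare-all --share-net \
    "$@"
}
# e.g. sandboxed uv run python main.py
# The sandboxed process cannot read ~/.ssh or anything else outside $PWD.
```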

> Why? Just open your entire editor/whatever inside a limited namespace and that's it no?

How will that prevent `npm run dev` or `uv run python` from accessing files outside your current directory?

Do you know what linux namespaces are?

I do. It wasn't obvious that that's what you were referring to. If you use it regularly then that's great.

EDIT: This was caused by using an old version of uv (0.7.3); updating with `uv self update` to the latest version (0.11.2) resolved it. Original message below:

While the first form seems to work with `pyproject.toml`, the second form in the global `uv.toml` seems to only accept actual dates and not relative times. Trying to put a relative time (either in the form "7 days" or "P7D") results in a "failed to parse" error.

Love it! Let those pip users find the compromised packages for us uv users.

Until everyone waits 7 days to install everything so the compromise is discovered on the 8th day.

End result will be everyone runs COBOL only.

Or just scan all GitHub repos, find their .toml definition. Calculate the median and then add 7 days to that. That way you are always behind.

Or Forth with a scientific library, bound to the constraints. Put some HTTP library on top and an easy HTML interface for a browser with no JS/CSS3 support at all. It will look rusty but unexploitable.

Enterprise computing with custom software will make a comeback to avoid these pitfalls. I despise OpenJDK/Mono because of patents, but at least they come with complete defaults, and a 'normal' install is more than enough to ship a workable application for almost every OS. Ah, well, smartphones. Serious work is never done with those, even with high-end tablets. Maybe by commercials/salespeople, and that's it.

It's either that... or promoting reproducible environments with Guix everywhere. Your own Guix container, isolated, importing Pip/CPAN/CTAN/NPM/OPAM and who knows what else into a manifest file and ready to ship anywhere: as a Guix package, a Docker container (Guix can do that), a single DEB/RPM, or an AppImage ready to launch on any modern GNU/Linux with a desktop, and a lot more.

  > Or Forth with scientific library, bound to the constraints. Put some HTTP library on top and some easy HTML interface from a browser with no JS/CSS3 support at all. It will look rusty but unexploitable.
Let this be a lesson to you youngsters that nothing is unexploitable.

Forth has no standard library for interfacing with SQLite or any other database. You're either using 8th or the C ABI. Therefore, you'll most likely be concatenating SQL queries. Are you disciplined enough to make that properly secure? Do you know all the intricacies?

GForth might have them for sure (SQLite is small and supported even by jimtcl). Also, there's Factor, a Forth-inspired language.

I'm already ahead of you. I'm using `exclude-newer = "8 days"`

But not all projects exploited in a supply chain attack get exploited on the same day.

So when project A gets pwned on day 1 and then, following the attack, project B gets pwned on day 3, if users wait 7 days to upgrade, that leaves two days for the maintainers of project B to fix the mess: everybody will have noticed by the 8th day that package A was exploited, which leaves time for project B (and the other projects depending on either A or B) to adapt and fix the mess.

As a side note, during the first 7 days it could also happen that the maintainers of project A notice the shenanigans.

:-) That might not even be enough as I hear (but haven't verified) that Claude does a pretty good job of making sense out of legacy COBOL code!

I really wish uv had some sandboxing built in.

Please open an issue on the uv tracker! This is a design space we’re actively thinking about, and it’s valuable to hear user perspectives on what they would and wouldn’t want a sandbox to do.

Is “7 days” valid? The docs suggest it has to be an ISO 8601 period or an RFC 3339 timestamp.

https://docs.astral.sh/uv/reference/settings/#exclude-newer

"Accepts RFC 3339 timestamps (e.g., 2006-12-02T02:07:43Z), a "friendly" duration (e.g., 24 hours, 1 week, 30 days), or an ISO 8601 duration (e.g., PT24H, P7D, P30D)."
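
So, per that docs quote, any of these forms should be accepted in `pyproject.toml` (the commented lines are the equivalent alternatives):

```toml
[tool.uv]
exclude-newer = "7 days"                  # "friendly" duration
# exclude-newer = "P7D"                   # ISO 8601 duration
# exclude-newer = "2006-12-02T02:07:43Z"  # RFC 3339 timestamp
```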

Nice feature. However, uv is suspect at the moment, in the sense that it is designed as a pip replacement to overcome issues that only exist when supply chains are of a size that isn't safe to have.

So any project that uses uv, and any developer who tries to get uv into a project, is on average less safe than a project that just uses pip and a requirements.txt

Sorry - call me uninformed. But I do not really understand how choosing uv makes me less safe than using pip.

Care to explain? Would love to learn.

It is a bit of a leap. They are saying that if you are using uv, then you likely have a broad set of dependencies because you require a dependency management tool, therefore you are more susceptible to a supply chain attack by virtue of having a wider attack surface.

Ahhhhhh thanks a ton. Now I get it. Meaning I get what you are saying. Not what they were implying. But yeah. I can understand at least how one could arrive at that idea.

To me personally this idea still sounds a bit off - but as a heuristic it might have some merit in certain circumstances.

I really am not able to follow this line of reasoning; I am not sure what you said makes sense, or how uv having a security feature makes it on average less safe :/

I believe they are saying that by the time you need something like uv, your project already has too many dependencies. It's the unnecessarily large supply chain that's the problem, and uv exists to solve a problem that you should try to avoid in the first place.

I think uv is great, but I somewhat agree. We see this issue with node/npm. We need smaller supply chains and fewer dependencies overall, not just band-aiding over poor decisions with better dependency-management tooling.

Ah this simplifies what they were saying.

I agree that dependency management should be made easier. To be honest, I really like golang's approach: how its community works around dependencies, how golang has a really great stdlib to work with, and how the community likes to rely on very few dependencies for the most part as well.

Maybe second to that, Zig is interesting: although I see people using libraries, it's at a much lower level compared to rust/node/python.

Sadly, rust suffers from the same dependency issue as node/python.

This line of thought is honestly a bit silly - uv is just a package manager that actually does its job for resolving dependencies. You’re talking about a completely orthogonal problem.

> uv is just a package manager that actually does its job for resolving dependencies.

Pip resolves dependencies just fine. It just also lets you try to build the environment incrementally (which is actually useful, especially for people who aren't "developers" on a "project"), and is slow (for a lot of reasons).

uv is really only something you need if you already aren't managing dependencies responsibly, imo.

This is complete nonsense. pip has all the same problems that you say uv has.

The (not very convincing, IMO) argument is that pip becomes unergonomic beyond a certain dependency-tree size, leading people to use uv instead. Of course that's not the only or main reason people use uv, presumably.

Huh?

Wanting a better pip means I am unsafe?