> But hasn't all that foundational code been stable and wrung out already over the last 30+ years?

Not necessarily. The "HTTP signature verification code" sounds like it's invoking cryptography, and the sense I've had from watching the people who maintain cryptographic libraries is that the "foundational code" is the sort of stuff you should run away screaming from. In general, it seems to me to be the cryptography folks who have beaten the drum hardest for moving to Rust.

As for other kinds of parsing code, the various archive file formats aren't exactly evolving, so there's little reason to update them. On the other hand, this is exactly the kind of space where critical infrastructure has probably had very little investment in adversarial testing, past or present, so it's not clear that age has actually shaken out the security-critical bugs. Much as how OpenSSL had a trivially-exploitable, high-criticality vulnerability for two years before anybody noticed.

For actual cryptography code, the best path is formally verified implementations of the crypto algorithms, with parsers for wrapper formats like OpenPGP or PKCS#7 implemented in a memory-safe language.

You don't want the core cryptography implemented in Rust for Rust's sake when there's a formally verified Assembler version next to it. Formally verified _always_ beats anything else.

I should have clarified that I was primarily referring to the stuff dealing with all the wrapper formats (like PKIX certificate verification), not the core cryptographic algorithms themselves.

The core cryptographic algorithms, IMHO, should be written in a dedicated language for writing cryptographic algorithms so that they can get formally-verified constant-time assembly out of it without having to complain to us compiler writers that we keep figuring out how to deobfuscate their branches.

Sure. But assembly implementations by definition are not portable. And I don't know what it takes to write a formally verified library like this, but I bet it's very expensive.

In contrast, a Rust implementation can be compiled for many architectures easily, and is intrinsically safer than a C version.

Plus, cryptography and PKI are constantly evolving, so they can't benefit from decades-old trusted implementations.

> Formally verified _always_ beats anything else.

Formally verified in an obscure language where it's difficult to find maintainers does not beat something written in a more "popular" language, even if it hasn't been formally verified (yet?).

And these days I would (unfortunately) consider assembly as an "obscure language".

(At any rate, I assume Rust versions of cryptographic primitives will still have some inline assembly to optimize for different platforms, or, at the very least, make use of compiler intrinsics, which are safer than assembly, but still not fully safe.)

With crypto, you really want to just write the assembly, due to timing issues that higher level languages simply cannot guarantee.
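To make the timing issue concrete, here's a minimal Rust sketch (not taken from any particular library) contrasting a naive comparison with the usual constant-time accumulator idiom. Even the "constant-time" version only works as long as the compiler doesn't decide to re-introduce branches, which is exactly the worry with high-level languages:

```rust
// Naive comparison: returns as soon as bytes differ, so the time taken
// leaks how many leading bytes of the secret matched the input.
fn leaky_eq(a: &[u8], b: &[u8]) -> bool {
    a.len() == b.len() && a.iter().zip(b).all(|(x, y)| x == y) // short-circuits
}

// The usual constant-time idiom: OR all byte differences into one
// accumulator and inspect it only at the end, never branching on
// secret data. An optimizing compiler is still free to undo this.
fn ct_eq(a: &[u8], b: &[u8]) -> bool {
    if a.len() != b.len() {
        return false;
    }
    let mut diff: u8 = 0;
    for (x, y) in a.iter().zip(b) {
        diff |= x ^ y;
    }
    diff == 0
}
```

In practice, crypto libraries either write this kind of routine in assembly or route the accumulator through an optimization barrier so the compiler can't "helpfully" turn it back into an early-exit loop.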

It's insanely complex, particularly you want _verified_ crypto. Last year (or two years ago?) I had to fix a tiny typo in OpenSSL's ARM assembly for example, it was breaking APT and Postgres left and right, but only got triggered on AWS :D

You don't want to write the whole thing in assembly, just the parts that need to be constant time. Even those are better written as subroutines called from the main implementation.

Take BLAKE3 as an example. There's asm for the critical bits, but the structural parts that are going to be read most often are written in Rust, like the reference impl.
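A rough sketch of that layering (hypothetical names, not the actual BLAKE3 code): the structural code stays in safe Rust, and only a small kernel would be swapped for an optimized unsafe/asm version at a single dispatch point.

```rust
// Hypothetical stand-in for the hot compression kernel. In a real
// library this is the piece that gets hand-tuned SIMD/asm variants.
fn kernel_portable(state: &mut u64, block: &[u8]) {
    for &b in block {
        *state = state.wrapping_mul(31).wrapping_add(b as u64);
    }
}

// The structural code everyone actually reads: safe Rust with one
// dispatch point. A real library would runtime-detect CPU features
// here and call an unsafe optimized kernel, falling back to the
// portable one. Both kernels must produce identical results.
fn compress(state: &mut u64, block: &[u8]) {
    kernel_portable(state, block);
}
```

The point of the design is that the unsafe surface stays tiny and interchangeable: the optimized kernels can be audited against the portable one byte-for-byte, while the control flow around them remains ordinary safe code.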

Yes, for sure.

I would like a special-purpose language to exist precisely for writing cryptographic code where you always want the constant-time algorithm. In this niche language, "We found a 20% speed-up for Blemvich-Smith, oops, it actually isn't constant time on the Arrow Lake micro-code version 18 through 46" wouldn't even get into a nightly, let alone be released for use.

It seems that for reasons I don't understand this idea isn't popular and people really like hand rolling assembly.

There's been plenty, like RobustIsoCrypt or FaCT:

https://github.com/PLSysSec/FaCT

They struggle to guarantee constant time for subroutines within a non-constant time application, which is how most people want to use cryptography.

I do think this is pretty much the one use case for a true "portable assembler", where it basically is assembly except the compiler will do the register allocation and instruction selection for you (so you don't have to deal with, e.g., the case that `add32 y, x, 0xabcdef` isn't an encodable instruction because the immediate is too large).

You can't avoid those with NASA Power of 10 sorts of restrictions?

If you mean GnuPG, that is what Snowden used. It could be better than new software that may have new bugs. Memory safety is a very small part of cryptographic safety.

(New cryptographic software can also be developed by all sorts of people. In this case I'm not familiar, but we do know that GnuPG worked for the highest profile case imaginable.)

GPG works great if you use it to encrypt and decrypt emails manually, as the authors intended. The PGP/GPG algorithms were never intended for use in APIs or web interfaces.

Ironically, it was the urge not to roll your own cryptography that got people caught in GPG-related security vulnerabilities.