There's no dynamic memory allocation with (100%) SPARK. That's really limiting. You can write "unsafe" code, but then you have the same problems as plain Ada.

I thought SPARK got dynamic memory allocation when it adopted Rust-style ownership and borrowing (added to the SPARK 2014 language around 2019).
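A minimal sketch of the discipline that comment refers to, shown in Rust rather than SPARK since the rules are closely analogous: heap values have a single owner, reads go through borrows, and a move invalidates the old name so dangling pointers can't arise. The names here (`consume`, `owner`, `view`) are my own illustration, not from SPARK or the thread.

```rust
/// Takes ownership of a heap allocation and returns the value inside.
/// After the call, the caller's variable is invalid (moved-from).
fn consume(b: Box<i32>) -> i32 {
    *b // the Box is freed here, when ownership ends
}

fn main() {
    let owner = Box::new(41);  // allocate; `owner` holds sole ownership
    let view: &i32 = &owner;   // shared borrow: read-only access
    assert_eq!(*view, 41);     // the borrow ends after its last use
    let v = consume(owner);    // move: `owner` is now unusable
    // println!("{}", owner);  // would not compile: use after move
    assert_eq!(v + 1, 42);
    println!("ok");
}
```

The point is that allocation itself is allowed; what the checker forbids is any path on which two names could both free or mutate the same cell, which is how both Rust and SPARK's ownership pointers stay memory-safe without a garbage collector.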

That is true for parsers like libjs, but a crypto module or even the networking stack can still be written in SPARK, and those parts are much more safety-critical.

SPARK is not used for the whole system, but for the < 5% of the code that is safety/security-related in a good architecture.