Also, it is about time to let go of GC-phobia.
https://www.withsecure.com/en/solutions/innovative-security-...
https://www.ptc.com/en/products/developer-tools/perc
Note the following:
> This video illustrates the use case of Perc within the Aegis Combat System, a digital command and control system capable of identifying and tracking incoming threats and providing the war fighter with a solution to address threats. Aegis, developed by Lockheed Martin, is critical to the operation of the DDG-51, and Lockheed Martin has selected Perc as the operating platform for Aegis to address real-time requirements and response times.
Not all GCs are born alike.
> Not all GCs are born alike.
True. However, in the bounded-time GC space, few projects share the same definition of low-latency or real-time. So you have to find a language that meets all of your other desiderata and provides a GC that meets your timing requirements. Perc looks interesting; Metronome made similar promises about sub-ms latency. But I'd have to get over my JVM runtime phobia.
I consider one where human lives depend on it, for better or worse depending on the side, real-time enough.
Human lives often depend on processes that can afford to be quite slow. You can have a real time system requiring only sub-hour latency; the "realness" of a real-time deadline is quite distinct from the duration of that deadline.
I don’t have an issue with garbage collectors. Most code I write is GC’d.
The thing that actually convinced me to learn Rust was a program I wanted to use less memory: my initial Clojure version, compiled with GraalVM, hovered around 100 MB. When I rewrote it in Rust, it hovered around 500 KB.
It’s not completely apples to apples, and the laptop running this code has a ton of RAM anyway, but it’s still kind of awesome to see a 200x reduction in memory without significantly more complicated code.
A lot of the stuff I have to do in Rust for GC-less memory safety ends up being stuff I would have to do anyway in a GC’d language, e.g. making sure that one thread owns the memory after it has been transferred over a channel.
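To make the ownership point concrete, here's a minimal sketch (names and buffer contents are mine, not from the thread) of what Rust enforces at compile time: once a value is sent over an mpsc channel, the sending thread can no longer touch it, so "one thread owns the memory after transfer" isn't a convention you follow, it's a rule the compiler checks.

```rust
use std::sync::mpsc;
use std::thread;

fn main() {
    let (tx, rx) = mpsc::channel();
    let buf = vec![1u8, 2, 3]; // heap-allocated buffer

    let handle = thread::spawn(move || {
        // `buf` is moved into this closure, then moved into the channel.
        tx.send(buf).unwrap();
        // Uncommenting the next line is a compile error: value moved by `send`.
        // println!("{}", buf.len());
    });

    // The receiving side now has exclusive ownership of the buffer.
    let received = rx.recv().unwrap();
    handle.join().unwrap();
    println!("received {} bytes", received.len()); // prints "received 3 bytes"
}
```

In a GC'd language you'd enforce the same "don't touch it after handing it off" discipline by code review; here the borrow checker does it for free.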
That GC introduces latencies of ~1000µs. The article is about eliminating ~10µs context switching latencies. Completely different performance class. The "GC-phobia" is warranted if you care about software performance, throughput, and scalability.
DoD uses languages like Java in applications where raw throughput and low-latency is not critical to success. A lot of what AEGIS does is not particularly performance sensitive.
GC is fine; what scares me is using j*va in Aegis..
The OutOfMemoryError will happen after the rocket hits the target.
Real-time GCs can only guarantee a certain number of deallocations per second. Even with a very well-designed GC, there's no free lunch. A system which manages its memory explicitly will not need to risk overloading its GC.
I think you have that backwards; they can only guarantee a certain number of allocations per second (once the application hits steady-state the two are the same, but there are times when it matters)