No amount of chest-thumping about how good a programmer you are, and telling everyone else to "get good," has had any effect on the rate of CVEs caused by memory safety bugs that are trivial to introduce in a C program.

There are good reasons to use C. It's best to approach it with a clear mind and a practical understanding of its limitations. Be prepared to mitigate those shortcomings. It's no small task!
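To make "trivial to introduce" concrete, here is a minimal sketch of such a bug: the free is one call away from the use, nothing in the language objects, and typical compilers accept it quietly at default settings.

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

static void discard(char *p) {
    free(p);                        /* ownership quietly ends here */
}

int main(void) {
    char *name = malloc(16);
    if (!name) return 1;
    strcpy(name, "hello");
    discard(name);
    printf("%s\n", name);           /* use after free: undefined behavior */
    return 0;
}
```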

I am not sure the number of CVEs measures anything meaningful. The price of zero-days for important targets runs into the millions.

While I am sure there cannot be enough security, I am not at all sure the extreme focus on memory safety is worth it, and I am also not sure the added complexity of Rust is really worth it. I would prefer to simplify the stack and make C safer.

If that's your preference, you're going about it all wrong. Rust's safety is about culture, and you're looking at the technology. It's not that Rust doesn't have technology, but the technology isn't where you start.

This was the only notable failing of Sean's (abandoned) "Safe C++": it delivers all the technology a safe C++ culture would have needed, but there is no safe C++ culture, so it was waved away as unimportant.

The guy whose mine loses fifty miners in a roof collapse doesn't need better mining technology; inadequate technology isn't why those miners died, culture is. His mine didn't have a safety culture, probably because he didn't give a shit about safety, and his workers either shared this dismissal or had no choice in the matter.

Also "extreme focus" is a misinterpretation. It's not an extreme focus, it's just mandatory, it's like if you said humans have an "extreme focus" on breathing air, they really don't - they barely even think about breathing air - it was just mandatory so if you don't do it then I guess that stands out.

Let's turn it around: do you think the mining guy who does not care about safety will start caring about a safety culture because there is a new safety tool? And if it is mandated by government, will it be implemented in a meaningful way, or just on paper?

So there's a funny thing about mouthing the words: the way the human mind works, the easiest way to explain to ourselves why we're mouthing the words is that we agree with them. So in that sense, what seems like a useless paper exercise can be effective.

Also, relevantly here, nobody actually wants these terrible bugs. This is not A or B, Red or Blue; this is very much Cake or Death, and there just aren't any people queueing up for Death. There are people who don't particularly want Cake, but that's not the same thing at all.

It will certainly be implemented in a meaningful way if the consequences for the mining guy are harsh enough that there won't be a second failure by the same person.

Hence why I am so into cybersecurity laws. If this is the only way to make the C and C++ communities embrace a safety culture, instead of downplaying it as straitjacket programming like in the C vs Pascal/Modula-2 Usenet discussion days, then so be it.

At some point, in order to make C safer, you're going to have to introduce into the language some way of writing a more formal specification of the stack, the heap, and the lifetimes of references.

Maybe that could be through a type system. Maybe that could be through a more capable run-time system. We've tried these avenues through other languages, through experimental compilers, etc.
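As one small illustration of the run-time avenue (a sketch assuming the Boehm-Demers-Weiser collector is installed; the header location and link flags vary by system): swap `malloc`/`free` for collector-managed allocation, and use-after-free disappears by construction, because memory is reclaimed only once nothing can reach it.

```c
#include <stdio.h>
#include <string.h>
#include <gc.h>                     /* Boehm GC; build with: cc prog.c -lgc */

int main(void) {
    GC_INIT();                      /* initialize the collector */
    char *name = GC_MALLOC(16);     /* collector-managed allocation */
    if (!name) return 1;
    strcpy(name, "hello");
    /* No free() anywhere: the collector reclaims memory only when it is
     * unreachable, so a dangling read like the one in the earlier example
     * cannot occur. The trade-off is a less predictable run-time. */
    printf("%s\n", name);
    return 0;
}
```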

Without introducing anything new to the language, we have a plethora of tools at our disposal:

- Coq + Iris, or some other proof-automation framework with separation logic

- TLA+, Alloy, or some other form of model checking where full proofs are too burdensome or unnecessary

- AFL, Valgrind, and other fuzzing and dynamic- or static-analysis tools (see the sketch after this list)

- CompCert and other formally verified compilers

- MISRA and other coding guidelines

... and all of this to be used in tandem in order to really say that, for the parts specified and tested, we're confident there are no use-after-free errors or memory leaks. That is a lot of effort in order to make that statement. The vast, vast majority of software out there won't use even most of these tools. Most software developers argue that they'll never use formal methods in industry because it's just too hard. Maybe they'll use Valgrind, if you're lucky.
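To ground the fuzzing-and-dynamic-analysis item above, here is a minimal sketch of an AFL-style harness around a hypothetical `parse` function (all names here are invented for illustration): the fuzzer hunts for inputs that trigger the heap overflow, and a found input can be replayed under Valgrind, which reports the invalid write.

```c
#include <stdio.h>
#include <stdlib.h>

/* Hypothetical function under test: copies its input into a 16-byte
 * heap buffer, overflowing it whenever len > 16. */
static void parse(const char *buf, size_t len) {
    char *copy = malloc(16);
    if (!copy) return;
    for (size_t i = 0; i < len; i++)
        copy[i] = buf[i];           /* heap overflow when len > 16 */
    free(copy);
}

/* AFL-style harness: read the file named on the command line and hand
 * its contents to the code under test. Typical workflow (exact command
 * names vary by AFL version):
 *
 *   afl-cc -g harness.c -o harness
 *   afl-fuzz -i seeds -o findings -- ./harness @@
 *
 * A found input can then be replayed under Valgrind:
 *
 *   valgrind ./harness findings/default/crashes/<id>
 */
int main(int argc, char **argv) {
    if (argc != 2) return 1;
    FILE *f = fopen(argv[1], "rb");
    if (!f) return 1;
    if (fseek(f, 0, SEEK_END) != 0) { fclose(f); return 1; }
    long n = ftell(f);
    rewind(f);
    if (n < 0) { fclose(f); return 1; }
    char *buf = malloc(n ? (size_t)n : 1);
    if (buf && fread(buf, 1, (size_t)n, f) == (size_t)n)
        parse(buf, (size_t)n);
    free(buf);
    fclose(f);
    return 0;
}
```

Note what the tooling route buys you: the bug is still in the program, and you only learn about it if a tool happens to exercise it.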

Or -- you could add something to the language in order to prevent at least some of the errors by definition.
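C itself already contains a tiny version of this idea: `const` makes one class of errors impossible by definition, at compile time, rather than detectable after the fact. A minimal sketch:

```c
#include <stdio.h>

/* `const` in the signature is a machine-checked promise: this function
 * cannot write through `name`. Violations are compile-time errors, not
 * run-time surprises. */
static void greet(const char *name) {
    /* name[0] = 'X';   <- uncommenting this fails to compile:
     *                     "assignment of read-only location"        */
    printf("Hello, %s!\n", name);
}

int main(void) {
    greet("world");
    return 0;
}
```

Lifetime annotations or a borrow checker extend the same principle from "who may write through this pointer" to "how long this pointer remains valid."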

I'm not a big Rust user. Maybe it's not great and is too difficult to use; I don't know. And I do like C. I just think people need to be aware that writing safe C is really expensive, time-consuming, and difficult, and that nothing is guaranteed. It might be worth the effort to learn Rust or another language and at least get some guarantees; it's probably not as hard as writing safe C.

(Maybe not as safe as using Rust plus formal methods, but at least you'll be forced to think about your specification up front, before your code goes into production... and where you do have unsafe code, hopefully it will be small and not too hard to verify for correctness.)


The problem is not that the tools don't exist; lint was created in 1979 at Bell Labs, after all.

It is the lack of a culture of using them unless there is a government mandate to impose them, as in high-criticality computing.

I agree.

Definitely, but the idea is that its unique feature set is worth it.

Yeah, there are still good reasons to use it.