I’m about to start teaching C programming classes to first-year university CS students. Would you teach `defer` straight away to manage allocated memory?
My suggestion is no - first have them do it the hard way. This will help them build the skills for manual memory management in situations where `defer` is not available.
Once they do learn about defer they will come to appreciate it much more.
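As a sketch of what "the hard way" typically looks like in standard C: the classic goto-cleanup chain, where every exit path must release whatever was acquired so far. The function and resource here are illustrative, not from any particular curriculum.

```c
#include <stdio.h>
#include <stdlib.h>

/* Manual cleanup in standard C: one exit chain, resources released
   in reverse order of acquisition. This is the pattern that a future
   `defer` would condense into declarations next to the acquisitions. */
int process(size_t n) {
    int rc = -1;

    char *buf = malloc(n);
    if (!buf) goto done;

    FILE *f = fopen("/dev/null", "w");  /* stand-in for a real resource */
    if (!f) goto free_buf;

    /* ... real work would go here ... */
    rc = 0;

    fclose(f);
free_buf:
    free(buf);
done:
    return rc;
}
```

Once students have internalized keeping that chain consistent across every early return, the appeal of `defer` tends to explain itself.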
In university? No, absolutely not straight away.
The point of a CS degree is to know the fundamentals of computing, not the latest best practices in programming that abstract the fundamentals.
My university also taught best practices alongside that, every time. I am very grateful for that.
Absolutely the wrong take. You can teach CS with just pencil and paper, but that doesn’t advance the technology, it might only benefit academia in a narrow sense. CS students should be actively engineering software in addition to doing science.
It's still only in a TS, not in ISO C, if that matters.
No, but also skip malloc/free until late in the year. When heap allocation does come up, don't use example code that allocates and frees single structs; instead introduce concepts like arena allocators (bundling many items with the same maximum lifetime), pool allocators with generation-counted slots, and other memory-management strategies.
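To make the arena idea concrete, here is a minimal sketch (names and sizing are made up; a real one would handle growth and richer alignment): one big block, a bump pointer, and a single free for everything allocated from it.

```c
#include <stddef.h>
#include <stdlib.h>

/* Minimal bump-pointer arena: many allocations, one lifetime. */
typedef struct {
    unsigned char *base;
    size_t cap, used;
} Arena;

int arena_init(Arena *a, size_t cap) {
    a->base = malloc(cap);
    a->cap = cap;
    a->used = 0;
    return a->base != NULL;
}

void *arena_alloc(Arena *a, size_t n) {
    /* round the offset up to a safe alignment for any object type */
    size_t align = _Alignof(max_align_t);
    size_t off = (a->used + align - 1) & ~(align - 1);
    if (off + n > a->cap) return NULL;  /* sketch: no growth */
    a->used = off + n;
    return a->base + off;
}

void arena_free_all(Arena *a) {  /* releases every allocation at once */
    free(a->base);
    a->base = NULL;
    a->used = a->cap = 0;
}
```

The pedagogical point: students stop thinking about the lifetime of each object and start thinking about the lifetime of a group, which is usually the question that actually matters.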
Are there any C tutorials you know of that do that, so I can try to learn how to do it?
Shameless plug ;)
https://floooh.github.io/2018/06/17/handles-vs-pointers.html
This only covers one aspect though (pools indexed by generation-counted index handles to solve temporal memory safety, i.e. a runtime mitigation for use-after-free).
No. They need to understand memory failures. Teach them what it looks like when it's wrong. Then show them the tools to make things right. They'll never fully understand those tools if they don't understand the necessity of doing the right thing.
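In that spirit, a tiny classroom-style example of "what it looks like when it's wrong" (my own illustration, not from the comment): the bug compiles cleanly and may even appear to work, which is exactly the lesson; `-fsanitize=address` or valgrind makes it visible.

```c
#include <stdlib.h>

/* The failure mode worth demonstrating first. */
void use_after_free_demo(void) {
    int *p = malloc(sizeof *p);
    if (!p) return;
    free(p);
    /* *p = 42;   <-- use-after-free: may "work", may crash, may silently
                      corrupt the heap. AddressSanitizer reports it exactly. */
}

/* One small defensive tool to show afterwards: free through a
   pointer-to-pointer and null it, so a second use fails loudly.
   (The void** cast is a common but technically non-portable idiom.) */
void free_and_null(void **pp) {
    free(*pp);
    *pp = NULL;
}
```

Seeing the sanitizer's report for the commented-out line, then seeing the crash a null pointer produces instead, gives students a concrete sense of why "fail loudly" beats "maybe work".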
IMHO, it is in the best interest of your students to teach them standard C first.
There is a technical specification, so hopefully it will be standard C in the next version, especially given that gcc and clang already have implementations (and gcc has had a way to do it for a long time, although the syntax is quite different).
It is not yet a technical specification, just a draft for one, but this will hopefully change this year, and the defer patch has not been merged into GCC yet. So I guess it will become part of C at some point if experience with it is good, but at this time it is an extension.
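For reference, the long-standing gcc/clang mechanism alluded to above is the `cleanup` variable attribute; a minimal sketch (helper and variable names are mine), where the handler receives a pointer to the variable as it goes out of scope:

```c
static int cleaned_value = 0;

/* gcc/clang extension: called automatically with &x when x leaves scope. */
static void on_scope_exit(int *p) { cleaned_value = *p; }

static int cleanup_demo(void) {
    int x __attribute__((cleanup(on_scope_exit))) = 42;
    return x;  /* the return value is computed, then the handler runs */
}
```

This is roughly the effect the proposed `defer` syntax expresses directly, without a separate handler function - which is presumably why the comment calls the existing syntax "quite different".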
I was under the wrong assumption that defer was approved for the next standard already.
We will likely decide in March that it will become an ISO TS. Given the simplicity of the feature and its popularity, I would assume that it will become part of the standard eventually.
That’s great news!
Yes.
If you're teaching them to write an assembler, then it may be worth teaching them C, as a fairly basic language with a straightforward/naive mapping to assembly. But for basically any other context in which you'd be teaching first-year CS students a language, C is not an ideal language to learn as a beginner. Teaching C to first-year CS students just for the heck of it is like teaching medieval alchemy to first-year chemistry students.
I think I heard this in some cppcon video, from a uni teacher who had to get students to know both C and Python, so he experimented for several years:
learning Python first is the same difficulty as learning C first (because the main problem is the whole concept of programming),
and learning C after Python is harder than learning Python after C (because of pointers).
Learning C after Python made me actually understand Python semantics. Python also has pointers, you just don't get to control them.
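A small illustration of that point (my reading of it): in C you choose explicitly between copying a value and aliasing it through a pointer, and Python's `b = a` for objects behaves like the pointer version.

```c
typedef struct { int n; } Box;

int box_demo(void) {
    Box a = { 1 };

    Box by_value = a;    /* copy: independent object */
    by_value.n = 2;      /* a.n is still 1 */

    Box *by_ref = &a;    /* alias: this is what `b = a` does in Python */
    by_ref->n = 3;       /* a.n is now 3, changed through the alias */

    return a.n;
}
```

Seeing both behaviors side by side in C is what makes Python's "names are references to objects" model click.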
Absolutely, it's not their first language. In our curriculum C programming is part of the Operating Systems course and comes after Computer Architecture, where they see assembly. So its purpose is to be low-level, to help them understand what's under the hood. To learn programming itself they use other languages (currently Java, for better or worse, but I don't have a voice in that choice).
C is the best language to learn as a beginner.
At no point in human history has C been the best language for beginners. C was, like Javascript, hacked together in a weekend by someone who wished they could be using a better language. It was burdened with flaws from the outset and considered archaic in its design almost immediately. The best thing that can be said about the design of C is that it's at least a structured programming language, which is damning with faint praise.
> C was, like Javascript, hacked together in a weekend by someone who wished they could be using a better language.
Citation needed. C was evolved from B as part of the development of a popular OS. It did take a lot more time and consideration than a weekend.