That feature is actually from Pascal, and Modula-2, before making its way into Ada.
For some strange reason people always relate to Ada for it.
I would guess that Ada is simply better known. Keep in mind that tech exploded in the past ~3.5 decades, whereas those languages are much older and lost the popularity contest. If you ask most people about older languages, the replies get thin really quickly beyond the obvious C and (kind of wrongly, but well) C++. COBOL, Ada, Fortran, and Lisp are probably the ones people are most aware of, but other than that?
You've forgotten about BASIC, SNOBOL, APL, Forth, and PL/I. There were many interesting programming languages back then. Good times!
The first five languages I learned back in the 70s: FORTRAN, Pascal, PL/I, SNOBOL, APL. Then I was an Ada and Icon programmer in the 80s. In the 90s, it was C/C++ and I just never had the enthusiasm for it.
Icon (which came from SNOBOL) is one of the few programming languages I consider to embody truly new ideas. (Lisp, Forth, and Prolog are others that come to mind.)
Icon is an amazing language and I wish it was better known.
You probably know this but, for anyone else who is interested, the easiest way to get a feel for Icon nowadays may be through its descendant Unicon which is available at unicon.org.
I found Pascal more readable as a budding programmer. Later on, C's ability to just get out of the way and let me program what I wanted trumped Pascal's verbosity and opinionatedness.
I admit that the terseness of C's syntax can be off-putting. Still, it's just syntax; I am sorry you were dissuaded by it.
True.
I dabbled in some of them during some periods when I took a break from work. And also some, during work, in my free time at home.
Pike, ElastiC (not a typo), Icon, Rebol (and later Red), Forth, Lisp, and a few others that I don't remember now.
Not all of those are from the same period, either.
Heck, I can even include Python and Ruby in the list, because I started using them (at different times, with Python being first) much before they became popular.
For me it's because I learned Ada in college.
18 year old me couldn't appreciate how beautiful a language it is but in my 40s I finally do.
2000-2005, and your college was teaching Ada?
In 2005-2010, the most interesting language (in this direction) taught at my college was Haskell. I don't think any other language like Ada was being taught.
Yes, I learned it in a course that surveyed a bunch of different languages like Ada, Standard ML, and Assembly
Ada is sometimes taught as part of a survey course in Programming Languages. That’s how I learned a bit about it.
Turbo Pascal could check ranges on assignment with the {$R+} directive, and Delphi could check arithmetic overflow with {$Q+}. Of course, nobody wanted to waste the cycles to turn those on :)
Most Pascal compilers could do that actually.
Yeah, not wanting to waste cycles is how we ended up with the current systems languages, while Electron gets used all over the place.
I would argue that was one of the reasons why those languages lost.
I distinctly remember the arguments over functions that worked on an array of 10. Oh, you want an array of 12? Copy-paste the function to make it take an array of 12. What a load of BS.
It took Pascal years to drop that constraint, but by then C had already won.
I never ever wanted the compiler or runtime to check a subrange of ints. Ever. Overflow as a program crash would be better (that I do find useful), but arbitrary ranges chosen by the programmer? No thanks. To make matters worse, even intermediate results are checked against those ranges.
I realize this opinion is based only on my experience, so I would appreciate a counterexample where it is a benefit (and yes, I worked on production code written in Pascal, a French variant even, and the C it was migrated to was hilariously more readable and maintainable).
> I never ever wanted the compiler or runtime to check a subrange of ints. Ever.
Requiring values to be positive, requiring an index to fall within the bounds of an array, and requiring values to be non-zero so you never divide by zero are very, very common requirements and a common source of bugs when the assumptions are violated.
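To make that concrete, here's a minimal C sketch (the type and function names are made up for illustration) of the check that a Pascal subrange or an Ada range subtype performs automatically on every assignment:

    #include <assert.h>

    /* Hypothetical subrange 0..100, standing in for something like
       Ada's "subtype Percent is Integer range 0 .. 100". */
    typedef int percent_t;

    static percent_t percent_set(int v)
    {
        /* Hand-written version of the compiler-inserted range check:
           a violated assumption fails loudly here, instead of
           surfacing later as a divide-by-zero or a bad index. */
        assert(v >= 0 && v <= 100);
        return (percent_t)v;
    }

The point isn't this particular helper; it's that the language does this for you at every assignment, so the assumption can't silently drift out of sync with the code.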
Thankfully, instead of overflow, C gets you the freedom of UB-based optimizations.
Funny :)
It still results in overflow and while you are right that it's UB by the standard, it's still pretty certain what will happen on a particular platform with a particular compiler :)
No, optimizing compilers don't translate signed-integer overflow into platform-specific behavior: since it's UB, they'll freely make arithmetic or logic assumptions that can result in behavior no human can really predict without examining the generated machine code.
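A classic sketch of what that looks like (not from any particular codebase): this function is meant as an overflow check, but because signed overflow is UB, gcc and clang at -O2 may assume x + 1 > x always holds for int and fold the whole body to "return 1":

    /* Intended to detect wraparound after an increment. Under UB-based
       optimization the compiler is allowed to assume the overflow never
       happens, so the test can be folded to a constant true. */
    int increment_is_safe(int x)
    {
        return x + 1 > x;
    }

So the "predictable on this platform" intuition breaks down: the wraparound the hardware would produce never even makes it into the generated code.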
They are free to but not required. You can pick a different compiler, or you can configure your compiler to something else, if it provides such options.
I always found it surprising that people did not reject clang for aggressively optimizing based on UB, but instead complained about the language while still using clang with -O3.
Programmers don’t have much choice, since most compilers don’t really provide an option / optimization level that results in sane behavior for common UB footguns while providing reasonable levels of performance optimization.
The one exception I know of is CompCert but it comes with a non-free license.
I definitely do think the language committee should have constrained UB more to prevent standards-compliant compilers from generating code that completely breaks the expectations of even experienced programmers. Instead the language committees went the opposite route, removing C89/90 wording from subsequent standards that would have limited what compilers can do for UB.
The C89/C90 wording-change story is a myth. And I am not sure I understand your point about CompCert; its correctness proof only covers programs that have no UB. Programmers do have some choice, and also some voice, but I do not see them pushing for changes a lot.
The choice is moving to other languages because they don't believe WG14 or WG21 will ever sort this out, as many are doing nowadays.
This is my point: programmers apparently fail to understand that they would need to push for changes at the compiler level. The committee is supposed to standardize what exists; it has no real power to change anything against the will of the compiler vendors.
FYI, all major C compilers have flags to enforce the usual two's-complement rollover, even with all optimizations enabled. I always enable at least -fwrapv, even when I know the underlying CPU has well-defined overflow behavior (gcc knows this too, so the flag presumably becomes a no-op, but I've never validated that thought).
gcc has -fwrapv and -f[no-]strict-overflow, clang copied both, and MSVC has had a plethora of flags over the years (UndefIntOverflow, for example), so your guess is as good as mine as to which one still works as expected.
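For what it's worth, a quick usage sketch with the gcc spellings (clang accepts the same ones; I won't vouch for the MSVC side):

    # default: signed overflow is UB, checks like "x + 1 > x" may be folded away
    gcc -O2 overflow.c

    # -fwrapv: signed overflow is defined as two's-complement wraparound
    gcc -O2 -fwrapv overflow.c

    # -ftrapv: signed overflow aborts the program instead of wrapping
    gcc -O2 -ftrapv overflow.c

-ftrapv has a reputation for being spottily implemented, so if you actually want trapping behavior, the UBSan route (-fsanitize=signed-integer-overflow) is the more commonly recommended option these days.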
Compile-time user config checking?
Sorry? That's not possible...
I've seen it plenty of times. Safety-critical controllers have numeric bounds of stability; why wouldn't you want to encode that into the type?