> Nowadays, we have so much processing power available that the compiler can optimize the code for you, so the language doesn't have to follow hardware capabilities anymore.
That must be why builds today take just as long as in the 1990s, to produce a program that makes people wait just as long as in the 1990s, despite the hardware being thousands of times faster ;)
In reality, people just throw more work at the compiler until build times become "unbearable", and optimize their code only until it feels "fast enough". These limits of "unbearable" and "fast enough" are built into humans and don't change in a couple of decades.
Or as the ancient saying goes: "Software is a gas; it expands to fill its container."
At least we can build software systems that are a few orders of magnitude more complex than in the 90s for approximately the same price. The question is whether the extra complexity also offers extra value.
True, but a lot of that complexity is also just pointless boilerplate / busywork disguised as 'best practices'.
I'd be curious to see an example of how a "best practice" makes the build unbearable or the software slow?
Some C++ related 'best practices' off the top of my head:
- put each class into its own header/source file pair (a great way to explode your build times!)
- generally replace all raw pointers with shared_ptr or unique_ptr (a rough sketch of this one below)
- general software patterns like model-view-controller, a great way to turn a handful of lines of code into dozens of files with hundreds of lines each
- use exceptions for error handling (these days this is widely considered a bad idea, but it wasn't always)
- always prefer the C++ stdlib over self-rolled solutions
- etc etc etc...
It's been a while since I closely followed modern C++ development, so I'm sure there are a couple of new ones, and some which have fallen out of fashion.
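To make the smart-pointer one concrete, here's roughly what that advice amounts to in practice (Widget and the function names are just made up for illustration, not from any particular codebase):

```cpp
#include <memory>
#include <vector>

struct Widget {
    int id = 0;
};

// "Before" style: raw owning pointer, the caller has to remember to delete.
Widget* make_widget_raw(int id) {
    return new Widget{id};
}

// "After" style: unique_ptr documents ownership and releases the object
// automatically when it goes out of scope.
std::unique_ptr<Widget> make_widget(int id) {
    return std::make_unique<Widget>(Widget{id});
}

int main() {
    Widget* raw = make_widget_raw(1);
    delete raw;                          // easy to forget or to double-delete

    std::vector<std::unique_ptr<Widget>> widgets;
    widgets.push_back(make_widget(2));
    widgets.push_back(make_widget(3));
}   // both widgets are freed here without a single explicit delete
```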
> - put each class into its own header/source file pair (a great way to explode your build times!)
Only if you fail to use binary libraries in the process.
Apparently folks like to explode build times with header-only libraries nowadays, as if C and C++ were scripting languages.
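Roughly the difference in play (Parser and the file names are just invented for illustration): the header-only style recompiles the full definitions in every translation unit that includes the header, while the traditional declaration/definition split compiles them once and links them, possibly from a binary library:

```cpp
// parser.hpp -- header-only style: the full definition (and everything
// it #includes) is recompiled by every .cpp file that includes this header.
#pragma once
#include <string>

class Parser {
public:
    bool looks_valid(const std::string& input) {
        return !input.empty() && input.front() != '#';
    }
};

// The traditional split keeps only the declaration in parser.hpp:
//
//   class Parser {
//   public:
//       bool looks_valid(const std::string& input);   // declaration only
//   };
//
// and puts the body in parser.cpp, which is compiled once and linked,
// possibly from a prebuilt binary library.
```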
> - generally replace all raw pointers with shared_ptr or unique_ptr
Some folks care about safety.
I have written C applications with handles, doing two-way conversions between pointers and handles, and I am not talking about the 16-bit Windows memory model.
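Roughly what such a handle scheme can look like (the Object type, table size, and names are made up for illustration; written as C-style C++):

```cpp
// Hypothetical handle scheme: objects live in a fixed table and are
// referred to by small integer handles instead of raw pointers.
enum { MAX_OBJECTS = 64 };

struct Object { int value; bool in_use; };

static Object g_objects[MAX_OBJECTS];

typedef int Handle;   // valid handles are 0 .. MAX_OBJECTS-1, -1 means invalid

// handle -> pointer: bounds-checked lookup into the table
Object* object_from_handle(Handle h) {
    if (h < 0 || h >= MAX_OBJECTS || !g_objects[h].in_use) return nullptr;
    return &g_objects[h];
}

// pointer -> handle: the handle is just the object's index in the table
Handle handle_from_object(Object* obj) {
    if (obj < g_objects || obj >= g_objects + MAX_OBJECTS) return -1;
    return static_cast<Handle>(obj - g_objects);
}

// allocate a free slot and hand back its handle
Handle object_alloc() {
    for (int i = 0; i < MAX_OBJECTS; ++i) {
        if (!g_objects[i].in_use) {
            g_objects[i].in_use = true;
            return i;
        }
    }
    return -1;   // table full
}
```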
> - general software patterns like model-view-controller, a great way to turn a handful lines of code into dozens of files with hundreds of lines each
I am old enough to have used the Yourdon Structured Method in C applications.
> - use exceptions for error handling (although these days this is widely considered a bad idea, but it wasn't always)
Forced return code checks with automatic stack unwinding are still exceptions, even if they look different.
Also what about setjmp()/longjmp() all over the place?
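For anyone who hasn't run into it, roughly what that setjmp()/longjmp() style looks like (the error code and parse_config function are invented for the example):

```cpp
#include <csetjmp>
#include <cstdio>

enum { ERR_BAD_INPUT = 1 };   // hypothetical error code

static std::jmp_buf g_error_jump;

// Deep inside the call stack: jump straight back to the handler,
// skipping every intermediate return-code check (and, unlike C++
// exceptions, running no destructors on the way out).
static void parse_config(const char* text) {
    if (text == nullptr) {
        std::longjmp(g_error_jump, ERR_BAD_INPUT);
    }
    // ... actual parsing would go here ...
}

int main() {
    switch (setjmp(g_error_jump)) {   // returns 0 on the initial call,
    case 0:                           // and the longjmp value on "throw"
        parse_config(nullptr);        // triggers the longjmp above
        return 0;
    case ERR_BAD_INPUT:
        std::fprintf(stderr, "bad input\n");
        return ERR_BAD_INPUT;
    default:
        std::fprintf(stderr, "unexpected error\n");
        return 1;
    }
}
```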
> - always prefer the C++ stdlib over self-rolled solutions
Overconfidence that one knows better than the people paid to write compilers usually turns out badly, unless they are actually top developers.
There are plenty of modern best practices for C as well; that is how we try to avoid making a mess of what people like to think of as a portable assembler, and entire industries rely on MISRA, ISO 26262, and the like for that matter.
> put each class into its own header/source file pair (a great way to explode your build times!)
Is that really sufficient to explode build times on its own? Especially if you're just using the more basic C++ features (no template (ab)use in particular).
Not at all, you can write in the C subset that C++ supports and anti-C++ folks will still complain.
Meanwhile, the C builds done on UNIX workstations (AIX, Solaris, HP-UX) for our applications back in 2000 were taking about an hour per deployment target, hardly blazing fast.