It makes me wish more graphics programmers would jump over to application development - 16.7ms is a huge amount of time for them, and 60 frames per second is such a low target. 144 or bust.

I don't think graphics devs changing over would change much. They would probably not lament over 16ms, but they would quickly learn that performance does not matter much in application development, and start building their own abstraction layer cake.

It's not even that performance is unimportant in absolute terms, but rather that the general state of software is so abysmal that performance is the least of your problems as a user, so you're not going to get excited over it.

No need for graphics programmers; anyone who has been coding since the old days still remembers how to make use of data structures and algorithms, and how to do a lot with little hardware.

Maybe RAM prices will help bring those skills back.

Those practices make things hard to modify and update. As one example, I used to use binary formats for game data. The data is loaded into memory and is ready to use; at most, a few offsets need to be converted to pointers, in place.

Good:
* no extra memory needed
* can read directly from CD/DVD/HD to memory - not through buffered I/O
* no parsing time

Bad:
* change or add a single field to any part of the saved data and all previous files are now invalid

I'd solve this with a version number and basically print a message: "old version in file, rebuild your data".
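The version-number approach above can be sketched as follows; this is a minimal illustration in Python, and the magic tag, version constant, and record fields (hp, gold, name) are all made up for the example:

```python
import struct

# Hypothetical fixed binary layout for a saved record:
# a 4-byte magic tag, a 2-byte version, then the payload fields.
MAGIC = b"GDAT"
VERSION = 2
HEADER = struct.Struct("<4sH")    # magic, version
RECORD = struct.Struct("<ii16s")  # hp, gold, 16-byte name

def save(path, hp, gold, name):
    with open(path, "wb") as f:
        f.write(HEADER.pack(MAGIC, VERSION))
        f.write(RECORD.pack(hp, gold, name.encode()))

def load(path):
    with open(path, "rb") as f:
        magic, version = HEADER.unpack(f.read(HEADER.size))
        if magic != MAGIC:
            raise ValueError("not a game data file")
        if version != VERSION:
            # The fixed layout no longer matches; force a rebuild.
            raise ValueError(f"old version {version} in file, rebuild your data")
        hp, gold, raw = RECORD.unpack(f.read(RECORD.size))
        return hp, gold, raw.rstrip(b"\0").decode()
```

There is no parsing step: after the header check, the bytes on disk are the in-memory layout. The cost is exactly the one described above - bump a field and every old file fails the version check.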

Modern apps use JSON, Protobufs, etc. These are far slower to load and take far more memory, since there are usually two copies of the data in play: the raw data from the file and the parsed structures your app is actually using. And the parsing itself takes time.

But, they continue to work even with changes and they can more trivially be used across languages and apps.
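That flexibility can be shown in a few lines; a minimal sketch, where the "v1" save file and the hp/gold/name/mana fields are invented for illustration:

```python
import json

# A hypothetical "v1" save file, written before a 'mana' field existed.
old_file = '{"hp": 100, "gold": 50, "name": "hero"}'

def load_save(text):
    data = json.loads(text)  # the second in-memory copy: the parsed dict
    # A field added later simply falls back to a default,
    # so files written by the old version keep working.
    data.setdefault("mana", 0)
    return data

record = load_save(old_file)
```

Unlike the fixed binary layout, nothing breaks when the schema grows; the parser pays for that tolerance on every load.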

That's just one example out of hundreds where the abstraction is slower and uses more memory but provides flexibility the old approaches didn't.

No need for such extremes; it would already be enough to stop using Electron crap, or React on the terminal.

Followed by actually reading a data structures and algorithms book.

Finally, using compiled languages instead of interpreted ones.

No need to count bytes to fit into 48 KB.

I disagree. I use all of those because they let me get work done faster. I've shipped Electron apps to clients that are effectively just a wrapper around a command-line app because (1) I knew the client would much prefer a GUI over a command line, (2) I was on Mac and they were on Windows, and Electron trivially gave me a cross-platform app, (3) I could make the app look good far more easily with HTML/CSS than with native UI, and (4) iteration time is instant (change something, refresh). The clients don't care. Interpreted languages are fine.

And that is how we get the mess we are in today.

Exactly. It turns out there are strong business incentives that favour tall towers of abstractions. Then everyone's using Electron, so it doesn't really matter if it's slow. People just come to perceive computers as slow. I hate it.

And embedded too. But then again, they do what they do precisely because in that environment those skills are appreciated, and elsewhere they are not.

It's mostly on the business side: if the business doesn't care, developers have no choice. Of course the customers need to care too, but it looks like we don't care either... in general.

One tradeoff graphics programmers make is security. They typically work with raw pointers and custom memory-allocation strategies; memory safety comes after performance. There is not much in terms of sandboxing, bounds checking, etc. - these things are costly in performance terms, so they don't do them if they don't have to.

That's because performance is critical to games (where the graphics programmers usually are), and if the game crashes, it's no big deal as long as it doesn't happen so often as to seriously impact the normal gameplay experience. Exploits are to be expected and are sometimes kept deliberately if they lead to interesting gameplay; they are a staple of speedruns. Infinite money is fun in a game, but not in serious banking software...

I am all for performance, and I think the current situation is a shame, but there are tradeoffs. We need people who care about both performance and security (maybe embedded software developers who work on critical systems), but expect a 10x increase in costs.

That wouldn't make any difference. Graphics programmers spend a lot of effort on performance because spending a lot of $$$ (time) can make an improvement that people care about. For most applications, nobody cares enough about speed to pay the $$$ needed to make it fast.

Many application programmers could make things faster - but their boss says good enough, ship it, move on to a new feature that is worth far more to me.