I wish I could upvote this twice, and I will go further.

Computing as a whole has a human factors problem. The people most able to fix this problem - programmers - are also the people least likely to recognize that there is a problem to fix. Indeed, programmers are invested, both literally and subconsciously, in the current order. Programmers implicitly place themselves at the top of a computing class hierarchy that strictly sorts people into "developers" and "users" by whether they are paid or pay, and developments that let "normies" program to a useful level are staunchly opposed. The current front line of this debate is LLM programming, which is a goddamned revolution as far as non-programmers are concerned; programmers hate that non-programmers might be able to write programs, and respond with scary stories about security vulnerabilities. But it's not new - before they were sneering at LLMs, they were sneering at LabVIEW and telling scary stories about maintainability. Or Excel.

The smartphone revolution was really a user interface revolution, but in making computers accessible it also infantilized the users. This philosophy is now baked in to such an extent that not only are you not allowed root on most computers, but writing an "app" has a far higher barrier to entry than writing a shell script, or even a basic Hello World in C. Programming is becoming harder, not easier. Compare to the 80s, when a child or teen's "game console" might have been a ZX Spectrum or Commodore 64, which would boot right to BASIC - a language which, incidentally, remains a much better introduction to programming than Python.
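
For anyone who never saw one of those machines: the entire on-ramp was something like this (a from-memory sketch of Commodore-style BASIC, typed at the prompt the machine booted into - no toolchain, no install step, no app store):

  10 REM WHOLE PROGRAM: ASK FOR A NAME, GREET, REPEAT
  20 INPUT "WHAT IS YOUR NAME"; N$
  30 PRINT "HELLO, "; N$
  40 GOTO 20

The distance from "turn it on" to "my code is running" was a single screen.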

This reminds me of a tweet from a while back: https://x.com/jonathoda/status/960952613507231744

  My thesis: The theory and practice of programming is permeated with the sensibilities of high-functioning autistics like myself. De-nerding programming will unlock great benefits for all of humanity. We too will benefit, for despite our hubris we are also way over our heads.
There's a great deal of truth to that. Programming languages are made by nerds, for nerds, and that's a problem in a world that is becoming more automated every day.

> Compare to the 80s, when a child or teen's "game console" might have been a ZX Spectrum or Commodore 64, which would boot right to BASIC

Fully agreed here! I had a very unsettling conversation with a student interested in making video games. He was 18 and in a CS degree program, and he told me he had been into video games his whole life - but he never knew that the computer he had at home growing up could have been used to make them.

This floored me because, like you, I had the experience of booting up an old PC and having a programming language right there. Mine was QBasic on DOS, and I used it to make a rudimentary Zork-style text adventure game. I was 6 or 7, and tons of people my age had that same experience getting into programming.

I would have thought that in the 30 years since, with the proliferation of computing devices and especially game creation software, it would be easier than ever to get into making games. And in some ways it is, but in many ways it's also been heavily monetized and walled off, to the point that everyday people are discouraged from creating their own games. It's really quite sad, because we've actually learned a lot over the years about how to make computing more accessible, but we haven't invested enough in putting that knowledge into real products that people use.