I've done some research on this. I've asked classrooms of kids what words they would use to describe programming. When they are young, in elementary school, they are taught Scratch, and they describe programming as fun, exciting, challenging, a favorite activity, something they look forward to.
Then I surveyed older kids in middle school, after they transition immediately from Scratch to Python and Java using VS Code. The words students use to describe programming take a dark turn: hard, frustrating, scary, and not for me are the top sentiments. Programming starts out up there next to recess in terms of K-6 approval rating, but plummets to math-class status in just a few years.
I attribute the change to language and tool design. This change in sentiment happens exactly when kids are taken from tools designed for them and forced to use tools designed for professional programmers. There's a chasm between Scratch and "real" programming languages. As lauded as Python is as a beginner or learning language, it does not fill that gap. In fact, it's part of the problem -- the belief that it's a good enough first language prevents other, perhaps better suited, languages from being adopted. It may be a good language for dev-brained individuals, but others can be discouraged very easily by it and its tooling ecosystem. I teach graduate students who find it perplexing, so middle school students don't stand a chance.
I wish I could upvote this twice, and I will go further.
Computing as a whole has a human factors problem. The people most able to fix this problem - programmers - are also the people who least recognize that there is a problem to fix. Indeed programmers are invested, both literally and subconsciously, in the current order. Programmers implicitly place themselves at the top of a computing class hierarchy that strictly categorizes "developers" and "users" by whether they pay or are paid, and developments that facilitate "normies" programming to a useful level are staunchly opposed. The current front line of this debate is LLM programming, which is a goddamned revolution as far as non-programmers are concerned, and programmers hate that non-programmers might be able to write programs, and tell scary stories about security vulnerabilities. But it's not new - before they were sneering at LLMs, they were sneering at LabVIEW, and telling scary stories about maintainability. Or Excel.
The smartphone revolution was really a user interface revolution, but in making computers accessible it also infantilized the users. This philosophy is now baked in to such an extent that not only are you not allowed root on most computers, but writing an "app" has an exponentially higher barrier to entry than writing a shell script, or even a basic Hello World in C. Programming is becoming harder, not easier. Compare to the 80s, when a child or teen's "game console" might have been a ZX Spectrum or Commodore 64, which would boot right to BASIC - a language which, incidentally, remains a much better introduction to programming than Python.
This reminds me of a tweet from a while back: https://x.com/jonathoda/status/960952613507231744
There's a great deal of truth to that. Programming languages are made by nerds, for nerds, and that's a problem in a world that is becoming more automated every day.

> Compare to the 80s, when a child or teen's "game console" might have been a ZX Spectrum or Commodore 64, which would boot right to BASIC
Fully agreed here! I had a very unsettling conversation with a student interested in making video games. He was 18, in a CS degree program, and he told me he had been into video games his whole life, but he never knew you could use the computer he had at home growing up to make them.
This floored me because, like you, I had the experience of booting up an old PC and having access to a programming language right there. Mine was QBasic on DOS, and I used it to make a rudimentary Zork-style text adventure game. I was 6 or 7, and tons of people my age had that same experience getting into programming.
I would have thought that in the 30 years since, with the proliferation of computing devices and especially game creation software, it would be easier to get into making games today. And in some ways it is, but in many ways it has also been heavily monetized and walled off, to the point that everyday people are discouraged from creating their own games. It's really quite sad, because we've actually learned a lot over the years about how to make computing more accessible to people, but we haven't invested enough in putting that knowledge into real products that people use.