The future is one of perpetually dealing with the fallout from all the vibe coding, as the pool of people who'd have a shot at fixing it gets smaller and smaller. Shitty will be the new normal.

I feel like it will be like going back to the 80s, when PCs became the norm and most programmers and hobbyists could code without needing a university or a corporation. Thousands of shareware apps you had to navigate, everyone trying to solve the same problems from different angles.

I do agree quality will be missed, and shadow IT will again be a big issue, like at the end of the 80s and early 90s.

I imagine a much darker future, where almost every enterprise system known for stability becomes unstable.

Planes falling out of the sky, trains crashing into each other, pacemakers downloading updates and freezing

Coding on 8 and 16 bit home computers still required some skills that most vibe coders certainly lack.

> most programmers and hobbyists could code without needing a university or a corporation.

I don't think so. Back then, the pool of people doing such a thing basically self-selected for intelligent, motivated types who were capable of learning on their own. The new "programmers" "programming" via Claude Code are going to be very different from those hobbyists you're talking about.

This is a comically self-absorbed perspective.

Why are people making things with Claude Code if not because they’re motivated?

I think the point is that you had to be deeply curious and more of a "hacker" or "computer nerd" type to be able to figure things out.

But I think the same applies to not just AI but various tools that have abstracted away the complexity of things over the years.

For example, I would imagine the average person deploying some sort of web app or API today knows far less about networking and infrastructure than someone doing it 10 to 20 years ago.

Yes, exactly. I'm reminded of the articles detailing how Gen Z has fewer computer skills than previous generations because computing has become so abstracted -- turn on iPhone, tap button. "What's a directory?" -- files just kind of exist on these devices without any real notion of _where_, as far as the user knows. Stuff like that.

Compare that to say 30ish years ago. If you wanted to do something as simple as play a computer game you had to know how to navigate a command line, know about device drivers, make a boot disk, etc. Users were a whole lot closer to the realities of what makes computing work. And no internet, at least as we know it now. You really had to have a certain mindset to be a developer.

It's a far cry from "hey Claude make an app."

Knowing that genuine, unincentivized creativity is exceedingly rare (especially in the West), you can assume that the answer looks something like a carrot or a stick.

Because it's "easy"?

Because it's "easy" (until they hit a wall)

Once they hit a wall, that is where you find out whether they are motivated or not

> Once they hit a wall, that is where you find out whether they are motivated or not

Yep. That has to happen first.

Eventually there will be an incident with bad software at a hospital or bank that leaves some people dead or broke.

Then regulators will take things seriously.

This is exactly what Uncle Bob predicted in his talk "The Future Of Programming" [0] 10 years ago, way before LLMs.

[0] https://www.youtube.com/watch?v=ecIWPzGEbFc

Which is why the medical device software industry is so heavily regulated after the Therac-25 incident. Oh, wait, it's not.

https://en.wikipedia.org/wiki/Therac-25

What regulators?

> as the pool of people who'd have a shot at fixing it gets smaller and smaller

Sounds like job prospects to me.

> Shitty will be the new normal.

I’ve heard the same from the best devs I’ve known, and from some who only thought themselves the best, long before LLMs were ever a thing.

I’m sure others heard the same when JavaScript and Python became near ubiquitous. When PHP emerged. When C supplanted Fortran and COBOL. When these two took over from Assembly. When punch cards went the way of the dodo.

There’s always someone for whom shitty is becoming the new normal. If that makes it a rule, what do we make of that rule?

There are different magnitudes of shitty.

Also, we went from compilers with an IDE that had a debugger, a profiler, and built-in help, fit on a 3.5" disk, and loaded on machines with 640 KiB of RAM (Turbo Pascal), to chat apps and password managers that weigh hundreds of megabytes and regularly gobble up more than a gigabyte of memory because they ship with their own browser.

Something is lost along the way.
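If you want to check that gigabyte claim yourself, here's a minimal sketch (Node.js/TypeScript assumed; the MiB conversion is mine) that prints a process's resident set size:

```typescript
// Print this process's resident set size (RSS) in MiB.
// Electron apps spawn several processes, so a fair comparison
// sums the RSS of the main process and every renderer.
const rssMiB = process.memoryUsage().rss / (1024 * 1024);
console.log(`RSS: ${rssMiB.toFixed(1)} MiB`);
```

For scale: the whole Turbo Pascal setup described above ran in under a single one of those MiB.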

> I’m sure others heard the same when JavaScript and Python became near ubiquitous. When PHP emerged.

You heard right! Most JavaScript and PHP in the world _is_ profoundly shitty. It's taken 20 years of intense research to make JavaScript compilers that are almost good enough to mostly optimize away the design foibles of the language.
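To make that concrete, here's a hypothetical sketch of one such foible (TypeScript syntax; the "hidden class" machinery described in the comments is the standard engine technique in V8 and friends, but the example names are mine):

```typescript
// Engines build a "hidden class" per object layout and cache fast
// field offsets at each call site. Feed one site two layouts and the
// cache goes polymorphic: the JIT must check shapes at runtime instead
// of emitting a single direct field load.
type Point = { x: number; y: number; z?: number };

function length2d(p: Point): number {
  return Math.sqrt(p.x * p.x + p.y * p.y);
}

const a: Point = { x: 3, y: 4 };        // layout {x, y}
const b: Point = { x: 3, y: 4, z: 5 };  // layout {x, y, z}

// Perfectly legal JavaScript, quietly harder to optimize.
console.log(length2d(a), length2d(b));
```

Twenty years of inline caches and speculative JITs exist largely to claw back what static layouts give you for free.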

To be fair, with how powerful our computers are, it's a pity that Electron apps like Bitwarden and Spotify are so slow and consume so many resources. I do miss the time when a lot of apps were snappy.

In the future I am just going to argue that, because of LLMs, there is no reason to choose JavaScript, Java, Python, etc. for the size of the available workforce anymore. Only then will the technology itself be picked because it's fit for the job.

As you say - "good enough" is always the normal.

Maybe it’s a process. Many of the transitions you mentioned did bring shitty apps (not all of them: the ones replacing tech with tech were mostly OK; the ones democratizing dev did come with a quality drop), but eventually Darwinism will take effect and trim the long tail.

Coding per se is not hard. Proper engineering is. I do hope this change brings a change in focus (people training in algorithms, efficiency, solid development patterns), but I am afraid it won’t be the case.

"With a punchcard at least, I can verify what the input is! Unlike those new 'transistors' that are so unreliable!"

What do you think a transistor is?

I'm working on a possibly-quixotic tool to mitigate the "cognitive debt" from AI-assisted development. Not everybody agrees that this is a problem. Maybe some teams that are only writing specs and reviewing plans still understand their products adequately. If you have an opinion either way, I'd appreciate hearing from you.