Okay, so you observed one team that had an issue with AI code quality. What's your point?
In 1998, I'm sure there were newspaper companies who failed at transitioning online, didn't get any web traffic, had unreliable servers that crashed, etc. This says very little about what life would be like for the newspaper industry in 1999, 2000, 2005, 2010, and beyond.
I'm arguing that code quality very much still matters and will only continue to matter.
AI will get better at producing good, maintainable, explainable code, because that's what it takes to actually solve problems tractably. But saying "code quality doesn't matter because AI" is simply not true, both in my experience and as a prediction. Will AI do a better job in the future? Sure. But because its code quality improves, not because code quality is less important.
Well then sure, we can agree there, it's just a matter of phrasing then.
Then you may want to clarify what your phrasing meant, because I couldn't find a more charitable interpretation.
More and more software will be built by non-experts: software with smaller user bases and simpler use cases that doesn't need to be maintained much, if at all. "Poor AI code quality" matters much less for these than for, say, software written by developers at FAANG companies, since literally nobody will ever even look at the code.
Where we're headed is toward a world where a ton of software is ephemeral, apps literally created by AI out of thin air for a single use, and then gone.
Ephemeral in the same way the electrical wiring in an old house is ephemeral.
Which is to say, not at all.
Original wiring done by a professional, later changes by “vibe electrician” homeowners.
Every circuit might be a custom job, but they all accumulate into something a SWE calls “technical debt”.
Don’t like how the toaster and the microwave are on the same circuit even though they are in different parts of the kitchen? You’re lucky if you can even follow the wiring back to the circuit box to see how it was done. And the electrical box is such a mess, where would you even run a new circuit?
That’s the future we’re looking at.
No, ephemeral as in: I'll ask the AI to check my email, and it'll create a bespoke table UI on the fly right inside my AI assistant, and populate it with relevant email data. And I'll use it, and then it will disappear. Software created and destroyed in a moment.
Not all software is meant to be some permanent building block upon which other software sits.
When new technology arrives that makes earlier ways of doing things obsolete, the consistent pattern throughout history has been that existing experts and professionals significantly underestimate the changes to come, in large part because (a) they don't like those changes, and (b) they're too used to various constraints and priorities that used to be important but no longer are. In other words, they're judging the new tech through the lens of an older world, rather than through the lens of a newer world created by the new tech.
Yeah, I’ve built many one-off scripts in my day, and these days they take 100x less time.
There's almost no point in arguing about this anymore. Neither you nor the other person are going to be convinced. We just have to wait and see if a new crop of 100x productivity AI believer companies come along and unseat all the incumbents.
It seems that your opinion is based on expectations for the future then, which is notoriously difficult to predict.
It's not that hard to predict that obviously useful new technology is going to improve over time.
Guns, wheels, cars, ships, batteries, televisions, the internet, smartphones, airplanes, refrigeration, electric lighting, semiconductors, GPS, solar panels, antibiotics, printing presses, steam engines, radio, etc. The pattern is obvious, the forces are clear and well-studied.
If there is (1) a big gap between current capabilities and theoretical limits, (2) huge incentives for those who improve things, (3) no alternative tech that will replace or outcompete it, (4) broad social acceptance and adoption, and (5) no chance of the tech being lost or forgotten, then technological improvement is basically a guarantee.
These are all obviously true of AI coding.
That list cherry-picks all the successful cases where the technology improved while ignoring the many, many others where the technology plateaued and improved no further. That's dishonest.
It isn't even a good job of cherry-picking: we never got mainstream supersonic passenger aircraft after the Concorde because aerospace technology hasn't advanced far enough to make it economically viable, and the slowdown in progress and massively increasing costs of cutting-edge semiconductor processes are very well known.
You're not factoring in the list of constraints I provided.
There's no broad social acceptance of supersonic flight because it creates incredibly loud sonic booms that the public doesn't want to deal with. And despite that, it's still a bad counterexample, as companies continue to innovate in this area e.g. Boom Supersonic.
At best you can say, "It's taking longer than expected," but my point was never that it will happen on any specific schedule. It took 400 years for guns to advance from the primitive fire lances in China to weapons with lock mechanisms in the 1400s. Those long time frames only prove my point more strongly. Progress WILL happen when there is appetite and acceptance and incentive and room to grow, and time is no obstacle. It's one of the more certain things in human history, and the forces behind it have been well studied.
Just as certain: the people whose jobs are made obsolete by these new technologies often remain in denial until they are forgotten.
If code quality only stops mattering in 400 years (whatever that definition happens to be), then the prediction is worthless in terms of what you should do today. You use it to argue that code quality is unimportant now, but if it's a 400-year payoff, you've made the wrong bet.
Surely you don't think AI coding technology will be as slow to develop as guns were.
We're obviously talking about 1-10 years here, not 100-1000 years.
It’s really hard to predict where exponential progress will freeze. I was reading the other day that the field seems to have stagnated again, with no really meaningful ideas to overcome the inherent bottlenecks we’ve hit in terms of diminishing returns from scaling. I’m not a pessimist or an unbridled optimist, but I think it’s fundamentally difficult to predict, and the law of averages suggests someone will end up crowing about being right.
In contrast to AI/AI companies, which have no negative externalities?
But hindsight is 20/20, as they say. In 2020, people predicted that Facebook Horizon would only go one direction: always improving, becoming as pervasive as the internet. So when you predict that the design and architecture capabilities of models will continue to improve, thus making code quality irrelevant, you sound very confident. And if in five years you are right, you will brag about it here. If not, well, I for one will not track you down and rub it in your face. Peace out.
You're confusing betting on a company/product vs betting on technological improvement in general.
It is absolutely the case that virtual reality technology will only get better over time. Maybe it'll take 5, or 10, or 20, or 40 years, but it's almost a certainty that we'll eventually see better AR/VR tech in the future than we have in the past.
Would you bet against that? You'd be crazy to imo.
There's a kid outside the window of the place I'm staying who's been in the yard playing and talking with people online through his VR headset for like 2+ hours. He's living in the future. Whatever happens, he and his friends are going to continue to be interested in more of this.
Whether what they're using in 20 years is produced by the company formerly known as Facebook or not is a whole different question.
The newspaper industry is the perfect analogy, because it is effectively dead. Wholesale dead. Here and there, the biggest, most world-renowned papers are still alive, on life support... NYT, WSJ, etc. But they're all dead. Their death has caused the absolute destruction of an entire industry sector and has given gangrene to adjacent industries, which will soon succumb to it as well. The point about 1998 wasn't that there was a transition that demanded careful attention and wise strategy, but that death was coming for the industry no matter what anyone did to stop it.
The death of newspapers is quite the spectacle too. No one seems to understand how bad it is... the youngest generation can't even seem to recognize that anything is missing. We've effectively amateurized journalism so that only grifters and talentless hacks want to attempt it, and only in tiny little soundbites on Twitter or other social media (and they're quickly finding out how it might be more lucrative to do propaganda for foreign governments or MLM charlatanism). When the death of the software industry is complete, it too will have been completely amateurized, the youngest generation will not even appreciate that people used to make it for a living, and the few amateurs doing it will start to comprehend how much more lucrative it will be to just make poorly disguised malware.