The acceleration of AI has thrown into sharp relief that we have long lumped all sorts of highly distinct practices under this giant umbrella called "coding". I use CC extensively, and yet I still find myself constantly editing by hand. It turns out CC is really bad at writing Kubernetes operators. I'd bet it's equally bad at things like database engines or most cutting-edge systems design problems. Maybe it will get better at these specific things with time, but it seems like there will always be a cutting edge that requires plenty of human thought to get right. But if you're doing something that's basically already been done thousands of times in slightly different ways, CC will totally do it with 95% reliability. I'm OK with that.

It's also important to step back and realize that this goes way beyond coding. Coding is just the deepest tooth of the jagged frontier. In 3 years there will be blog posts lamenting the "death of law firms" and the "death of telemedicine". Maybe in 10 years it will be the death of everything. We're all in the same boat, and this boat is taking us to a world where everyone is more empowered, not less. But still, in any field there will be a cutting edge that requires real ingenuity to push forward.

I think there's clearly a difference of opinion based on what you work on. Some people were working on things that pre-CC models couldn't handle and then CC could, and it changed their opinions quickly. I expect (but of course cannot prove) that the same will happen with the area you're describing. And then your opinion may change.

I expect it to, eventually. But then the cutting edge will have simply moved to something else.

I agree that it's very destabilizing. It's sort of like inflation for expertise: you spend all this time and effort saving up expertise, and then those savings rapidly lose value. At the same time, your ability to acquire new expertise has accelerated (because LLMs are often excellent private tutors), which is analogous to an inflation-adjusted wage increase.

There are a ton of variables. Will hallucinations ever become negligible? My money is on "no" as long as the architecture is basically just transformers. How will compiling training data evolve over time? My money is on "it will get more expensive". How will legislators react? I sure hope not by suppressing competition. As long as markets and VC are functioning properly, it should only become easier to be a founder, so outsized corporate profits will be harder to lock in.