A few days ago I read an article from HumanLayer. They mentioned shipping a week's worth of collaborative work in less than a day. That's one data point from a single project.

- Has anyone found Claude Code able to generate documentation for parts of the codebase that:

(a). Doesn't explode exponentially in maintenance time, yet still helps Claude understand and iterate without falling over, hallucinating, or designing poorly?

(b). Makes code reviewers' lives easier? If so, how?

I think the key issue for me is that the time a human takes to *verify*/*maintain* plans is not much less than the time it would take them to write a plan detailed enough that many AI models could easily implement it.

The hype tweets are pretty tiresome, especially when there's no way to judge the vibe-code cruft and demoware factor.

Especially on bootstrap/setup, AIs are fantastic at cutting out massive amounts of time, which is a huge boon for our profession. But core logic? I think that's where the not-really-saving-time studies are coming from.

I'm surprised there aren't already faux-academic B-school productivity studies coming out to counter that (sponsored by AI money, of course), but then again I don't read B-school journals.

I actually wonder if the half-life decay of the critical mass of vibe code will almost perfectly coincide with the crash/vroosh of labor leaving the profession to clean it up. It might be a mini-Y2K event, just without the single dramatic day.