Indeed. The capacity to do the hard parts of software engineering well may be our best indicator of AGI.

I don't think LLMs alone are going to get there. They might be a key component in a more powerful system, but they might also be a very impressive dead end.

Sometimes I think we're like cats that stumbled upon the ability to make mirrors. Many cats react as if there's another cat in the mirror, and I wonder whether AGI is just us believing we can make more cats if we build the perfect mirror.