IMO LLMs have demonstrated how uncomplicated the man behind the curtain really is. If we do happen to achieve AGI, it will likely have many of the problems associated with the real thing, which often fails in spectacular fashion.