> This kind of circuitry—to plan forwards and back—was learned by the model without explicit instruction; it just emerged from trying to predict the next word in other poems.

This author has no idea what's going on.

The AI didn't just happen to start predicting the next word in other poems; it was explicitly trained to do exactly that. It then sucked in a bunch of poems and parroted them back out.
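To be concrete about what "explicitly instructed" means: the training objective literally is next-word prediction. Here's a toy sketch (a bigram count model over a made-up corpus, not a transformer, but the objective has the same shape: minimize the negative log-probability assigned to each actual next word):

```python
import math
from collections import Counter, defaultdict

# Tiny stand-in for "a bunch of poems".
corpus = "the cat sat on the mat the cat ran".split()

# Count how often each word follows each preceding word.
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def next_word_prob(prev, nxt):
    """P(next word | previous word), estimated from the counts."""
    counts = follows[prev]
    return counts[nxt] / sum(counts.values())

# The training objective: cross-entropy, i.e. sum of -log P(actual next word).
loss = -sum(math.log(next_word_prob(p, n)) for p, n in zip(corpus, corpus[1:]))
print(f"total next-word loss: {loss:.3f}")  # lower = better at predicting the corpus
```

A real model swaps the count table for a neural network and conditions on thousands of prior tokens, but "predict the next word, penalized by cross-entropy" is the whole instruction it was given.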

And... the author drastically over-represents its success with a likely cherry-picked example. When I gave Claude lines to rhyme with, it gave me back "flicker" to rhyme with "killer" and "function" to rhyme with "destruction". Of the 10 rhymes I tried, only two actually matched two syllables ("later/creator" and "working/shirking"). I'm not sure how many iterations the author had to run to find a truly unusual rhyme like "rabbit/grab it", but it's pretty obviously selection bias.

And...

I actually agree with the other poster who says that part of this stochastic parrot argument is about humans wanting to feel special. Exceptionalism runs deep: we want to believe our group (be it our nation, our species, etc.) is better than other groups. It's often wrong: I don't think we're particularly unique in most individual respects; it's more the combination of things that makes us special, if we are at all.

AIs are obviously stochastic parrots if you know how they work. The research is largely public, and unless something different is going on in non-public research, they're all just varieties of stochastic parroting.
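And "stochastic parroting" isn't a slur here, it's a mechanical description: generate text by repeatedly sampling the next word from a distribution learned from the training data. A toy bigram version (real models condition on far longer contexts, but the sampling loop is the same idea):

```python
import random
from collections import Counter, defaultdict

# "Training": count which words follow which in a toy corpus.
corpus = "the cat sat on the mat and the cat ran to the mat".split()
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def parrot(start, length, rng):
    """Generate text by sampling each next word in proportion to how
    often it followed the current word in the training corpus."""
    out = [start]
    for _ in range(length):
        counts = follows[out[-1]]
        if not counts:  # dead end: this word never preceded anything
            break
        words, weights = zip(*counts.items())
        out.append(rng.choices(words, weights=weights)[0])
    return " ".join(out)

print(parrot("the", 8, random.Random(0)))
```

Every bigram it emits appeared somewhere in training; the only novelty is which learned continuation the dice pick. The open question is whether that stops being true in any interesting sense when the "context" is thousands of tokens and the distribution comes from a trillion-parameter network.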

But these systems were designed in part based on how the human brain works, and I do not think it's in evidence at all that humans aren't stochastic parrots. The problem is that we don't have a clear definition of what it means to understand something that is clearly distinct from being a stochastic parrot. At a certain level of complexity, a stochastic parrot is likely indistinguishable from someone who truly understands the concepts.

I think ultimately the big challenge for AI isn't that it is a stochastic parrot (and it is one): a sufficiently complex and sufficiently trained stochastic parrot can probably be just as intelligent as a human.

I think the bigger challenge is that entire classes of data simply have not been made available to AI, and can't be with current technology: sensory data, the kind of data a baby gets from doing something and seeing what happens, real-time experimentation. A big part of why humans are still ahead of AI is that we have a lot of implicit training we haven't been able to articulate, let alone pass on to AI.