I wouldn't say words are metaphors. I think of a metaphor as something like a pointer, whereas words are the actual mapping of arbitrary sounds onto concepts that are relatively universally relatable to human experience. Children can learn language easily by associating those sounds with contemporaneous objects and situations from their own experience. I feel like AIs will continue to be regurgitative borg-brains-in-a-vat until they have some semblance of a relatable, "lived" experience that they can map words onto, instead of just analyzing the patterns.

That's folk science; you've not studied linguistics.

That's a fallacy; you've not studied logic.

https://en.wikipedia.org/wiki/Argument_from_authority

Besides, I was largely agreeing with and building on what you said (with a quibble over one word choice), so I'm not really sure what you're objecting to.

Of course I've studied logic, particularly the Eastern traditions that don't require it. Language is inherently paradoxical; its inception and development defy logic through endless contradictions. It can't be revealed by logic. The very idea that language contains things called words that are scientifically visible in the label/conduit-metaphor paradox suggests that language is probably our biggest contradiction.

Logic shows that you can prove anything from a contradiction. No logic means everything is true. So I don't think rejecting logic is conducive to constructive conversation.
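
For what it's worth, a minimal sketch of that first point (the principle of explosion, ex falso quodlibet) in Lean 4, with P and Q as placeholder propositions:

    -- Principle of explosion: from a proof of both P and ¬P, any Q follows.
    -- `absurd` takes a proof of a proposition and a proof of its negation
    -- and closes any goal whatsoever.
    example (P Q : Prop) (h : P ∧ ¬P) : Q :=
      absurd h.1 h.2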

"Multiple contradictions in logic are complex statements where two or more propositions, taken together, assert the logical impossibility of their own truth, often resulting in two mutually exclusive conclusions"

If you understand logic, you may grasp that logic was not required to reach any analytic conclusions about multiple contradictions; scientific discourse does just fine without it and probably needs to jettison logic en route to new formats.