They made it implicitly, otherwise this:
>(2) language only somewhat models the world
is completely irrelevant.
Everyone is only 'somewhat modeling' the world: humans, animals, and LLMs.
Completely relevant, because LLMs only "somewhat model" humans' "somewhat modeling" of the world...
LLMs aren't modeling "humans modeling the world" - they're modeling patterns in data that reflect the world directly. When an LLM learns physics from textbooks, scientific papers, and code, it's learning the same compressed representations of reality that humans use, not a "model of a model."
By your argument, someone who learned quantum mechanics through language (textbooks, lectures) would only have access to "humans' modeling of humans' modeling of quantum mechanics" - an infinite regress that's clearly absurd.