The language used to describe LLM behaviour, such as "training" and "reasoning", has led people to treat LLMs like humans rather than as a new and different kind of entity, one that requires us to update our set of rules.

If I were the first person to invent a car, for example, and I named its method of locomotion "walking", would you treat it the same as a human and let it "walk" in all the same places humans walk? After all, it's simply using kinetic energy and friction to propel itself along the ground, as we do.

Because a car is so obviously different from a human, we intuitively understand that it requires an alteration to our rules if we are to coexist with it peacefully. LLMs are far more abstract, so we don't intuitively grasp the same distinction, and we continue to treat them as if they should be bound by the same rules and laws as we are.