To some extent you are describing Searle's "Chinese Room" argument[1].
It's been discussed a lot recently, but anyone who has interacted with LLMs at a deeper level will tell you there is *something* there; I'm not sure you'd call it "intelligence," exactly. Then again, there is plenty of evidence pointing the other way. I guess this is a long-winded way of saying "we don't really know what's going on"...
If an LLM were intelligent, wouldn't it get bored?
Why should it?