While training something on TinyStories, one of the early-stage outputs was
"Once upon a time, in a small town, there was a huge town"
which I thought was a brilliant start for an interesting story. Just that slight deviation from expectation provides an excellent scaffold for your imagination to take hold.
Of course the rest of the generation was utter rubbish, but if you could deviate from the expected on demand at key points, while keeping the rest internally consistent, that would be great.
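One crude way to approximate "deviate at key points" is to modulate sampling temperature per position: keep it low for most tokens so the story stays coherent, and spike it at a few chosen positions so an unlikely token can slip in. This is only a minimal sketch of that idea; `get_logits`, the position list, and the temperature values are all hypothetical placeholders, not anything from an actual TinyStories setup.

```python
import math
import random

def sample(logits, temperature=1.0):
    """Sample a token index from raw logits, scaled by temperature.
    Higher temperature flattens the distribution, so 'surprising'
    low-probability tokens get picked more often."""
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    r = random.random()
    cum = 0.0
    for i, p in enumerate(probs):
        cum += p
        if r < cum:
            return i
    return len(probs) - 1

def generate(get_logits, n_tokens, surprise_positions,
             base_t=0.7, surprise_t=1.5):
    """Generate n_tokens ids, raising temperature only at the
    positions in surprise_positions (a hypothetical choice of
    'key points'); get_logits maps the tokens so far to logits."""
    out = []
    for pos in range(n_tokens):
        t = surprise_t if pos in surprise_positions else base_t
        out.append(sample(get_logits(out), temperature=t))
    return out
```

The harder, unsolved part is of course choosing the surprise positions and keeping the continuation consistent with whatever oddity was sampled; this sketch only covers the mechanical sampling side.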
Poetry is full of figures of speech: words and phrases that intentionally depart from common logic and common knowledge. So LLMs can actually be probabilistic, accidental generators of figures of speech. Every bit of nonsense an LLM generates can be interpreted as a potential metaphor or an oxymoron. The sentence you mentioned sounds like a badass metaphor to me: "Once upon a time, in a small town, there was a huge town". In other words: the town might be small in size, but it has soul.