> So it's a bit like asking sales people technical questions.
Maybe that's the best anthropomorphic analogy of LLMs. Like good sales people completely disconnected from reality, but finely tuned to give you just the answer you want.
> So it's a bit like asking sales people technical questions.
> Maybe that's the best anthropomorphic analogy of LLMs. Like good sales people completely disconnected from reality, but finely tuned to give you just the answer you want.
Well no, the problem was that the bot didn't give them the answer they wanted. It's more like "finely tuned to waffle around pretending to be knowledgeable, but lacking technical substance".
Kind of like a bad salesperson. The best salespeople I've had the pleasure of knowing were not afraid to learn the technical background of their products.
The best anthropomorphic analogy for LLMs is no anthropomorphic analogy :)
Anthropomorphizing sales people involves the same constraints, so I'd allow it.