> Personifying the LLM as being capable of knowing truths seems like a risky pattern to me.

I can see why I got downvoted now. People must think I'm another Blake Lemoine at Google, claiming LLMs are sentient.

> If you find truth in what the LLM says, that comes from YOU; it's not because the LLM in some way can know what is true

I thought that went without saying. I assign the truthiness of LLM output according to my educational background and experience. What I'm saying is that sometimes it helps to take a good hard look in the mirror. I didn't think that would be controversial when talking about LLMs, yet people rush to remind me that the mirror is not sentient. It feels like an insecurity on the part of many.

> I didn't think that would be controversial when talking about LLMs, yet people rush to remind me that the mirror is not sentient. It feels like an insecurity on the part of many.

For what it's worth, I never thought you perceived the LLM as sentient. Though I see the overlap - one of the reasons I don't consider LLM output to be "truth" is that there is no sense in which the LLM _knows_ what is true or not. So it's just ... stuff, and often sycophantic stuff at that.

The mirror is a better metaphor. If there is any "uncomfortable truth" surfaced in the way I think you have described, it is only the meaning you make from the inanimate stream of words received from the LLM. And inasmuch as the output is interesting or useful to you, great.