> I didn't think that would be controversial when talking about LLMs, with people rushing to remind me that the mirror is not sentient. It feels like an insecurity on the part of many.

For what it's worth, I never thought you perceived the LLM as sentient. Though I see the overlap - one of the reasons I don't consider LLM output to be "truth" is that there is no sense in which the LLM _knows_ what is true or not. So it's just ... stuff, and often sycophantic stuff at that.

The mirror is a better metaphor. If there is any "uncomfortable truth" surfaced in the way I think you have described, it is only the meaning you make from the inanimate stream of words received from the LLM. And inasmuch as the output is interesting or useful for you, great.