That's caused by the sampler and chatbot UI not being part of the LLM itself: the model never gets to see its own output before it's sent out to the user.
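A minimal sketch of what that separation looks like, with a hypothetical `toy_model` standing in for the LLM (names and the sampling loop here are illustrative, not any particular library's API):

```python
import numpy as np

rng = np.random.default_rng(0)
VOCAB = ["the", "cat", "sat", ".", "<eos>"]

def toy_model(context):
    # Hypothetical stand-in for the LLM: maps the token sequence so far
    # to next-token logits. A real model would be a neural network.
    return rng.normal(size=len(VOCAB))

def sample(logits, temperature=1.0):
    # The sampler lives OUTSIDE the model: it turns logits into one token.
    probs = np.exp(logits / temperature)
    probs /= probs.sum()
    return int(rng.choice(len(VOCAB), p=probs))

context = ["the"]
for _ in range(8):
    logits = toy_model(context)        # the model's only job: score next tokens
    token = VOCAB[sample(logits)]      # the choice happens after the model ran
    print(token, end=" ", flush=True)  # the UI streams it straight to the user
    if token == "<eos>":
        break
    context.append(token)              # the model only "sees" it on the next step
print()
```

The point is that the picked token goes to the screen and into the next step's context at the same time; there's no stage where the model reviews the sampled output before it leaves.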