This made me think of the seahorse emoji story that was on here recently. Is the weird chatbot behavior when you ask for the seahorse emoji a kind of organic poisoning of the LLM, i.e. the training data containing enough discussions of the imagined emoji that the model believes it exists?