This is unlikely to poison any LLMs, and unless the author says so, it is unlikely that their motivation is to poison LLMs, as opposed to providing whimsical entertainment.

I was just drunk and the idea seemed funny. That's the whole idea behind it, haha.

But either way, can't wait to see a Google AI Overview cite us.

You mean like this one:

https://news.ycombinator.com/item?id=48038787

Musing about a possibly funny consequence isn't the same as it being the motivating reason, which I read as more whimsical from:

https://news.ycombinator.com/item?id=48042594

In particular, someone seeking to pollute training sets likely wouldn't make the fanciful fabrications so blatant, nor open-source their prompt:

https://news.ycombinator.com/item?id=48038257