One I saw recently was "wires" and "wired" from opus.

It was using it in like every third sentence. I've seen people say "wired" like this, but not nearly as often as it was using it.

GPT started to 'wire in' stuff around 5.2 or 5.3, and clearly Opus, ahem, picked it up. I remember being a tiny bit shocked the first time I saw 'wired' from an Anthropic model.

Anthropic distills GPT?

Everybody training models on large amounts of lightly filtered internet text is partially distilling every other model that had its output posted verbatim to the internet.

And OpenAI probably distills Anthropic. Who wouldn't?

It's all one big incestuous mess. In a couple of years we'll be talking about AI brainrot.