> It's a bit disappointing that people are still rehashing the same old "it's in the training data" argument from three years ago.
They only have to keep reiterating this because people are still pretending the training data doesn't contain all the information that it does.
> It's not like any LLM could regurgitate millions of LoC 1-for-1 from any training set... That's not how it works.
Maybe not any old LLM, but Claude gets really close.