"AI models collapse when trained on recursively generated data"

https://news.ycombinator.com/item?id=41058194

That's not what I asked for, and it's not relevant.

The claim was made that the models are "suffering", at this exact moment, because they have been recursively feeding on their own output.

I want evidence the current models are "suffering" right now, and I want further evidence that suggests this suffering is due to recursive data ingestion.

Some year-old article with no relevance to today, talking about hypotheticals of indiscriminately gorging on recursive data, is not evidence of either of the things I asked for.

Did you mean the current models that are still stuck in 2023?

> what's the latest year of data you're trained on

> ChatGPT said: My training goes up to April 2023.

There's a reason they're not willing to update the training corpus even with GPT-5.

> Some year-old article with no relevance to today

The current models are based on even older training data, so I guess you should disregard those too if you're choosing to judge things purely by their age.