AI Is Eating Itself to Death — and Nobody Is Stopping It


526 citations. Zero solutions deployed at scale. Inside the research that terrifies AI labs.

Delanoe Pirard


Nine mirrors. Nine generations. Each reflection loses what made the original unique. By the ninth, only noise remains. This is model collapse.

Frederic Bartlett’s Telephone Game

In 1932, British psychologist Frederic Bartlett conducted an experiment that would become legendary. He asked a group of participants to read a Native American folk tale, “The War of the Ghosts,” then retell it from memory to another person, who retold it in turn, and so on.

By the seventh reproduction, the story was unrecognizable. The spirits and canoes had vanished. The elements culturally foreign to the British participants had been replaced by bland generalizations. The tale had been “normalized,” stripped of everything that made it unique.

Bartlett called this method serial reproduction. The takeaway: each link in the chain preserves what feels familiar and discards the rest. Rare information disappears first.

In 2024, a team of researchers from Cambridge, Toronto, and Oxford published a disturbing finding in Nature (Shumailov et al., 2024): artificial intelligence models do exactly the same thing. They called it model collapse: a progressive, irreversible degradation of AI models trained on synthetic data. Except AI does it…
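The dynamic is easy to see in miniature. Below is a toy sketch (my own illustration, not the Nature paper's experimental setup): each "generation" trains on the previous generation's output by resampling from its empirical distribution. Any item that fails to be sampled even once is gone forever, so rare items vanish first, just like the spirits and canoes in Bartlett's chain.

```python
import random

def serial_resample(corpus, generations=9, sample_size=50, seed=42):
    """Toy model-collapse sketch: each generation's 'training set' is
    drawn (with replacement) from the previous generation's output.
    Once an item misses a draw, it can never reappear, so the number
    of distinct items can only shrink over the generations."""
    rng = random.Random(seed)
    current = list(corpus)
    distinct_per_gen = [len(set(current))]
    for _ in range(generations):
        # next generation sees only what the previous one produced
        current = rng.choices(current, k=sample_size)
        distinct_per_gen.append(len(set(current)))
    return distinct_per_gen

# A corpus dominated by two common tokens, plus twenty rare ones:
corpus = ["the"] * 40 + ["a"] * 40 + [f"rare{i}" for i in range(20)]
history = serial_resample(corpus)
print(history)  # distinct-item count per generation, never increasing
```

The common tokens survive every round; the rare ones are steadily absorbed into oblivion. Real model collapse is the continuous analogue: the tails of the data distribution thin out first, then the whole distribution narrows.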