“Our primary conclusion across all scenarios is that without enough fresh real data in each generation of an autophagous loop, future generative models are doomed to have their quality (precision) or diversity (recall) progressively decrease,” they added. “We term this condition Model Autophagy Disorder (MAD).”
Interestingly, this may become a more pressing problem as generative AI output makes up a growing share of the content online, and therefore of future training data.
The key point here is that humans train on other humans, not on themselves, and they are also constantly exposed to the real world.
If you lock a human in a box and only let them interact with themselves they go a bit funny in the head very quickly.
The reason is different from what is happening with AI, though. Sensory deprivation, extreme isolation, and the Ganzfeld effect lead to hallucinations because the brain apparently needs a constant stream of stimuli to keep functioning; deprived of real input, it starts generating its own from imagination.
With AI it is the other way around: instead of inventing new content, a model loses information. Each generation re-estimates a probability distribution from a finite sample of the previous generation's output, so rare, low-probability modes get under-sampled, and once a mode drops out of the training data it can never come back.
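A minimal sketch of this sampling effect (my own toy illustration, not the paper's experiment): we repeatedly draw a finite sample from a discrete distribution and re-estimate the probabilities from the sample counts, simulating an autophagous loop with no fresh real data. Any category that happens to receive zero samples in one generation vanishes permanently, so the support of the distribution can only shrink, a crude analogue of the diversity (recall) loss the authors describe.

```python
import random
from collections import Counter

random.seed(0)  # fixed seed so the toy run is reproducible

def resample_distribution(probs, n_samples):
    """Draw n_samples from probs, then re-estimate probabilities
    from the empirical counts (categories with zero count disappear)."""
    categories = list(probs)
    weights = [probs[c] for c in categories]
    draws = random.choices(categories, weights=weights, k=n_samples)
    counts = Counter(draws)
    return {c: counts[c] / n_samples for c in counts}

# "Real" data: 10 categories, several of them rare.
true_probs = dict(enumerate(
    [0.30, 0.20, 0.15, 0.10, 0.10, 0.05, 0.04, 0.03, 0.02, 0.01]))

probs = true_probs
support_sizes = [len(probs)]  # how many distinct categories survive
for generation in range(30):
    probs = resample_distribution(probs, n_samples=50)
    support_sizes.append(len(probs))

# The number of surviving categories can never increase, and the
# rare ones tend to die off within a few generations.
print(support_sizes)
```

Mixing fresh samples from `true_probs` into each generation (the paper's "fresh real data" condition) counteracts the collapse, because extinct categories can be reintroduced.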