Generative AI relies on a massive body of training material, primarily made up of human-authored content haphazardly scraped from the internet.
Scientists are still trying to understand what will happen when these AI models run out of that content and have to rely on synthetic, AI-generated data instead, closing a potentially dangerous loop. Studies have found that models trained on this recycled AI output begin to degrade, a failure mode researchers call "model collapse," which can eventually turn their neural networks into mush. As the AI iterates on its own recycled content, it starts to produce increasingly bland and often mangled outputs.
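The article stays at the conceptual level, but the feedback loop it describes can be sketched with a toy experiment (not from the article; a common statistical illustration of model collapse): fit a simple model to data, sample from the fit, refit on those synthetic samples, and repeat. In the hedged sketch below, each generation inherits the previous generation's estimation errors, so the fitted distribution drifts away from the original and its spread tends to shrink, a rough statistical analogue of the "increasingly bland" outputs described above.

```python
# Toy illustration of model collapse: each "generation" of a simple model
# (a 1-D Gaussian) is fit only to synthetic samples drawn from the previous
# generation's fit. Estimation error compounds, so the fitted distribution
# typically drifts away from the original data and its spread tends to decay.
import numpy as np

rng = np.random.default_rng(0)

# Generation 0: "human-authored" data from the true distribution N(0, 1).
data = rng.normal(loc=0.0, scale=1.0, size=200)

for generation in range(20):
    # "Train" this generation's model: estimate mean and spread from its data.
    mu, sigma = data.mean(), data.std()
    print(f"gen {generation:2d}: mean={mu:+.3f}  std={sigma:.3f}")

    # The next generation never sees the original data, only synthetic
    # samples drawn from the model that was just fit.
    data = rng.normal(loc=mu, scale=sigma, size=200)
```

Running it prints the fitted mean and standard deviation per generation; because no fresh human data ever re-enters the loop, the estimates wander and the variance tends to erode rather than recover.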
Read Full Article: https://futurism.com/artificial-intelligence/ai-cultural-stagnation
