Op-ed by Karen Rønde, published on 06/02/2025
As AI slop spreads across the internet, concerns about the future of high-quality information are growing. Without accurate and relevant human-generated data, model collapse -- whereby generative artificial intelligence trains on its own output and gradually degrades -- seems inevitable. The tech giants, well aware of this risk, have cut corners and skirted copyright law in their pursuit of training data for their large language models.