Ex-GTA Boss Dan Houser Likens AI To Mad Cow Disease
""I personally don't think it will because AI is going to eventually eat itself. As far as I understand it--which is really a superficial understanding--the models scour the internet for information, but the internet is going to get more and more full of information made by the models. So, it's sort of like when we fed cows with cows and got mad cow disease.""
""I can't see how the information will get better if they're running out of data. [AI] will do some tasks brilliantly, but it's not going to do every task brilliantly, and it's going to become this sort of mirror of itself.""
AI systems trained on internet data risk a feedback loop: as the web fills with AI-generated content, models that scour online sources increasingly ingest the outputs of other models, shrinking the pool of diverse, original training data. That loop can degrade performance on tasks outside narrow capabilities, leaving models echoing their own behavior. The idea relates to the "dead internet" concept and to research suggesting wider social and economic implications, including manipulation, political influence, and profiteering. AI can still deliver rapid, task-specific results, but it faces long-term data-quality limits.
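The feedback loop described above is studied in machine-learning research under the name "model collapse," and it can be illustrated with a toy simulation (a sketch of the general idea, not anything from the article): repeatedly fit a simple Gaussian model to a dataset, then replace the dataset with samples drawn from that fitted model. With no fresh real data entering the loop, the estimated spread of the data tends to shrink across generations, a crude analogue of models training on other models' output.

```python
import random
import statistics

def generation_step(samples, n=10):
    """Fit a Gaussian to the samples, then produce a new 'generation'
    of data drawn purely from that fitted model (no fresh real data)."""
    mu = statistics.fmean(samples)
    sigma = statistics.stdev(samples)
    return [random.gauss(mu, sigma) for _ in range(n)]

random.seed(0)
data = [random.gauss(0.0, 1.0) for _ in range(10)]  # the original "real" data
spreads = [statistics.stdev(data)]

# 300 generations of model-trained-on-model-output
for _ in range(300):
    data = generation_step(data)
    spreads.append(statistics.stdev(data))

print(f"initial spread: {spreads[0]:.4f}")
print(f"spread after 300 generations: {spreads[-1]:.6f}")
```

With a small sample size per generation the estimated standard deviation performs a random walk with a downward drift in log space, so the diversity of the data collapses over time. The mechanism is the same one the quote gestures at: a model fed on its own outputs loses the variety present in the original data.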
Read at GameSpot