'Model collapse': Scientists warn against letting AI eat its own tail | TechCrunch
Briefly

We discover that indiscriminately learning from data produced by other models causes 'model collapse' - a degenerative process whereby, over time, models forget the true underlying data distribution ...
AI models are pattern-matching systems at heart: They learn patterns in their training data, then match prompts to those patterns, filling in the most likely next dots on the line...
But the thing is, models gravitate toward the most common output. They won't give you a controversial snickerdoodle recipe, just the most popular, ordinary one...
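
To make the mechanism concrete, here is a toy sketch (an illustration only, not the researchers' actual experiments): a "model" that simply estimates a categorical distribution from samples, with each new generation trained on the previous generation's output. Rare categories that happen to draw zero samples get probability zero and never return, so the tails of the distribution vanish over generations. The category probabilities and sample sizes below are made up for illustration.

    # Toy sketch of model collapse: each generation is "trained" (a frequency
    # estimate) on samples produced by the previous generation. Rare outcomes
    # eventually draw zero samples, their probability drops to zero, and they
    # can never reappear -- the distribution's tail is forgotten.
    import numpy as np

    rng = np.random.default_rng(0)

    true_probs = np.array([0.55, 0.25, 0.12, 0.05, 0.02, 0.01])  # original data, with a tail
    n_samples = 200                                              # training samples per generation

    probs = true_probs
    for gen in range(30):
        draws = rng.choice(len(true_probs), size=n_samples, p=probs)
        counts = np.bincount(draws, minlength=len(true_probs))
        probs = counts / counts.sum()    # next "model" trained only on model output
        if gen % 5 == 0:
            print(f"gen {gen:2d}: surviving categories = {np.count_nonzero(probs)}, "
                  f"probs = {np.round(probs, 3)}")

Running this, the rarest categories tend to disappear within a few generations, while the most common one soaks up ever more probability mass, which is the same "gravitate toward the most common output" effect described above, just in miniature.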
Read at TechCrunch