"Model Collapse is a degenerative process affecting generations of learned generative models, where generated data end up polluting the training set of the next generation of models; being trained on polluted data, they then misperceive reality," they explain.
The deep answer is, 'It depends.' The higher-level answer is, 'Yeah, kinda.'
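The degenerative loop the authors describe can be illustrated with a toy experiment: fit a simple model to data, then train the next generation only on samples the previous one generated, and watch the fitted distribution drift away from reality. Below is a minimal Python sketch; the Gaussian setup, the tiny sample size, and the `fit`/`generate` helpers are my own illustration, not the paper's actual experiment.

```python
import random
import statistics

# Toy sketch of the feedback loop: a "model" here is just a fitted
# Gaussian (mean, stdev), and each generation is trained only on data
# sampled from the previous generation's model.

random.seed(0)

def fit(data):
    # "Train" a model: estimate mean and standard deviation from data.
    return statistics.mean(data), statistics.stdev(data)

def generate(model, n):
    # Sample synthetic "training data" from a fitted model.
    mu, sigma = model
    return [random.gauss(mu, sigma) for _ in range(n)]

n = 10  # deliberately tiny samples, to exaggerate the effect
data = [random.gauss(0.0, 1.0) for _ in range(n)]  # gen 0: "real" data
stdevs = []
for generation in range(200):
    model = fit(data)
    stdevs.append(model[1])
    data = generate(model, n)  # next generation sees only synthetic data

# The fitted spread tends to shrink across generations: the tails of the
# original distribution are gradually forgotten ("misperceiving reality").
print(f"gen 0 stdev: {stdevs[0]:.3f}  gen 199 stdev: {stdevs[-1]:.3f}")
```

With small samples, estimation error compounds each round and the estimated spread collapses toward zero, which is one concrete way generated data "pollute" the next model's view of reality.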