The rapid growth of AI models, particularly ChatGPT, has given rise to a detrimental phenomenon termed 'model collapse', in which generative AI outputs diminish in quality as systems learn from increasingly inferior, AI-contaminated data. The article draws a parallel to the demand for 'low-background steel' produced before the first nuclear detonations, suggesting a similar scarcity of high-quality information that predates the AI surge. Historical naval battleships, specifically from WW1 and WW2, are highlighted as vital sources of this steel, underscoring how technological advances can have unexpected consequences for the availability and quality of a resource.
The rapid rise of AI-generated content has created the risk of 'model collapse', in which output quality deteriorates as systems learn from their own inferior outputs (a feedback loop sketched in the toy example after these points).
The finite pool of pre-ChatGPT data is becoming increasingly valuable, much like 'low-background steel', which is essential for certain scientific applications.
The pollution of useful data by AI output parallels the radioactive contamination of steel produced after the first nuclear explosions, which is what made pre-nuclear 'low-background' steel valuable in the first place.
Maurice Chiodo emphasizes the historical significance of scuttled battleships in providing an 'almost infinite supply' of valuable low-background steel for modern applications.
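To make the feedback loop concrete, here is a minimal toy sketch (not from the article) of the dynamic behind model collapse: a simple statistical model is repeatedly refit on samples drawn from its own previous output, and sampling error compounds from one generation to the next. The Gaussian model, sample size, and generation count are arbitrary assumptions chosen purely for illustration.

```python
import random
import statistics

def train_generation(samples):
    """'Train' a toy model by fitting a normal distribution to the data."""
    mu = statistics.mean(samples)
    sigma = statistics.stdev(samples)
    return mu, sigma

def generate(mu, sigma, n):
    """Produce synthetic data by sampling from the fitted model."""
    return [random.gauss(mu, sigma) for _ in range(n)]

# Generation 0: 'real' human-made data with its full, original variance.
data = [random.gauss(0.0, 1.0) for _ in range(200)]

for gen in range(15):
    mu, sigma = train_generation(data)
    print(f"generation {gen:2d}: mean={mu:+.3f}, stdev={sigma:.3f}")
    # Each new model sees only the previous model's output, so estimation
    # noise accumulates and the distribution drifts away from the original.
    data = generate(mu, sigma, 200)
```

Running the loop shows the fitted mean and spread wandering away from the original distribution as generations pass, which is the toy analogue of quality degrading once models train mostly on machine-generated data.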