Music production
from www.scientificamerican.com, 3 days ago
AI music is booming, and the player piano saw it coming
Listeners struggle to distinguish AI-generated music from human-made music, a sign of AI's growing role in the music industry.
Galen Buckwalter, a 69-year-old research psychologist and quadriplegic, participated in a brain implant study to contribute to science that aids those with paralysis. The six chips in his brain decode movement intention, allowing him to operate a computer and feel sensations in his fingers again.
By the early 1900s, player pianos had evolved to reproduce a human performance more fully, including subtle dynamics such as tempo changes and use of the damper pedal. The human role went from deskilled to fully deprecated as electric motors replaced foot-powered bellows. With the Seeburg Lilliputian Model L, the only job left for humans who wanted to play the piano in the 1920s was to put in a coin.
One scientist at MIT, Cyrus Clarke, is working to do just that. Alongside a team of fellow researchers, Clarke has developed a physical machine called the Anemoia Device, which uses a generative AI model to analyze an archival photograph, describe it in a short sentence, and, following the user's own inputs, convert that description into a unique fragrance. The word "anemoia" was coined by author John Koenig and included in his 2021 book, The Dictionary of Obscure Sorrows.
The results show that generative AI systems themselves tend toward homogenization when used autonomously and repeatedly; they also suggest that AI systems currently operate this way by default. The experiment may appear beside the point, since most people don't ask AI systems to endlessly describe and regenerate their own images. But the convergence to a set of bland, stock images happened without retraining: no new data was added, and nothing was learned. The collapse emerged purely from repeated use.