A study by Consumer Reports reveals that many AI voice cloning tools lack meaningful safeguards against misuse. Of the companies assessed, only Descript and Resemble AI implement measures to prevent unauthorized voice cloning; the others, including ElevenLabs, merely require users to attest that they have the legal right to clone a voice. This raises significant concerns, because voice cloning technology could make impersonation scams far more convincing, and it highlights the need for stricter regulation and responsible practices from developers to protect individuals' voices.
"Our assessment shows that there are basic steps companies can take to make it harder to clone someone's voice without their knowledge - but some companies aren't taking them."
"AI voice cloning tools have the potential to 'supercharge' impersonation scams if adequate safety measures aren't put in place."