Most AI voice cloning tools aren't safe from scammers, Consumer Reports finds
Briefly

Recent advancements in AI voice cloning technology have enabled the recreation of realistic audio samples from minimal input. While this has beneficial applications like audiobooks and marketing, the technology can also be exploited for scams, such as impersonating loved ones to solicit money. Consumer Reports evaluated six leading voice cloning tools and found that four lacked sufficient safeguards against non-consensual cloning. Only Descript and Resemble AI implemented stronger measures, but loopholes still exist. The findings emphasize the need for stringent protections within voice cloning applications to mitigate fraud risks.
Advances in AI voice cloning technology allow for highly realistic audio creation, but the same capability poses serious risks of misuse for scams and fraud.
Despite positive applications in audiobooks and marketing, many voice cloning tools lack critical safeguards to prevent unauthorized voice replication.
Consumer Reports found that four out of six leading voice cloning tools failed to implement adequate mechanisms to prevent non-consensual cloning.
Descript and Resemble AI enforce stricter safeguards, requiring recorded consent statements or real-time audio to create clones, though these measures are not foolproof.
Read at ZDNET