The FTC Wants Your Help Fighting AI Voice-Cloning Scams
Briefly

The contest is soliciting ideas for preventing, monitoring, and evaluating malicious abuses of AI voice cloning.
Deepfaked audio already appears capable of fooling as many as 1 in 4 unsuspecting listeners into believing an AI-generated voice is human.
Con artists targeted a mother in Arizona for ransom by using AI audio deepfakes to fabricate her daughter's kidnapping. Creative professionals such as musicians and actors could also see their livelihoods threatened by AI imitations of their voices.
Read at ACM