
"A year ago, I spoke to several cybersecurity leaders at companies like SoftBank and Mastercard who were already sounding alarms about AI-powered impersonation threats, including deepfakes and voice clones. They warned that fraud would evolve quickly: The first wave of scams were about scammers using deepfakes to pretended to be someone you know. But attackers would soon begin using AI-generated video and audio to impersonate strangers from trusted sources, such as a help-desk rep from your bank or an IT administrator at work."
"A year later, this is exactly what's happening: The Identity Theft Resource Center reported a 148% surge in impersonation scams between April 2024 and March 2025, driven by scammers spinning up fake business websites, deploying lifelike AI chatbots, and generating voice agents that sound indistinguishable from real company representatives. In 2024 alone, the Federal Trade Commission recorded $2.95 billion in losses tied to impersonation scams."
Against that backdrop, a new company, imper.ai, has launched publicly with $28 million in funding led by Redpoint Ventures and Battery Ventures. The company aims to identify impersonation attacks in real time by analyzing device telemetry, network diagnostics, and other digital breadcrumbs that attackers cannot easily fake.
Read at Fortune