
"Before artificial intelligence tools proliferated - making it possible to realistically impersonate someone, in photos, sound and video - "proof of life" could simply mean sending a grainy image of a person who's been abducted. That's no longer true. "With AI these days you can make videos that appear to be very real. So we can't just take a video and trust that that's proof of life because of advancements in AI," Heith Janke, the FBI chief in Phoenix, said at a news conference Thursday."
"Hoaxes - whether high or low-tech - have long challenged law enforcement, especially when it comes to high-profile cases such as Nancy Guthrie's disappearance last weekend from her home in the Tucson area. As technology has advanced, criminals have grown savvy and used it to their benefit, confusing police and the public and masking their identities. The FBI in December warned that people posing as kidnappers can provide what appears to be a real photo or video of a loved one, along with demands for money."
A daughter publicly appealed to a kidnapper for proof of life after her 84-year-old mother disappeared, while raising concerns about deepfakes. Advances in artificial intelligence enable realistic impersonation in photos, audio and video, making traditional proof of life unreliable. Law enforcement warns that images or videos alone can be manipulated and should not be trusted as sole verification. Hoaxes and technological manipulation have complicated investigations, with criminals using digital tools to confuse police and the public and to mask their identities. Investigators report no confirmed deepfake images in the case, have received purported ransom notes, and have not identified suspects.
Read at Fortune