
"Marina vd Roest hadn't faced the man who abused her in decades when she first sat down in front of the laptop. Confronted with his realistic, blinking, speaking face, she felt "scared ... like a little child again." "Sometimes I had to close the laptop and get my breath back before opening it and continuing with the conversation," she says. Vd Roest is one of the first people to have tried out a radical new form of therapy."
"Many people now count chatbots among their friends, therapists, and lovers, while griefbots mimic deceased loved ones. The technology can be dangerous; chatbots have been tied to some psychotic episodes and suicides. Deepfake therapy of the kind vd Roest tried is closely monitored by clinicians, and the avatar is voiced by a trained clinician. The same approach could prove hugely risky if attempted solo."
"Vd Roest had suffered from decades of post-traumatic stress disorder following her abuse. She had tried traditional therapy, as well as interventions like eye movement desensitization and reprocessing therapy, where a patient is asked to recall traumatic events while experiencing auditory, visual, or tactile stimuli. While it was temporarily effective, her PTSD returned, prompting her to try out the experimental approach."
In summary: Marina vd Roest confronted an A.I.-generated deepfake of her abuser and experienced intense fear, describing herself as feeling like a small child again. The experimental therapy places survivors face-to-face with realistic, blinking, speaking avatars that are voiced and monitored by trained clinicians to reduce risk. Vd Roest turned to it after decades of PTSD following her abuse; traditional therapy and eye movement desensitization and reprocessing had provided only temporary relief. A two-person pilot study in 2022 produced positive participant responses, and a larger clinical study is currently underway in the Netherlands, with results expected next year.
Read at Slate Magazine