When Face Recognition Doesn't Know Your Face Is a Face
Briefly

"Autumn Gardiner thought updating her driving license would be straightforward. After getting married last year, she headed to the local Department of Motor Vehicles office in Connecticut to get her name changed on her license. While she was there, Gardiner recalls, officials said she needed to update her photo. That's when things started to go wrong. Every time staff tried to take her photo, Gardiner says, the system would reject it."
"Gardiner, who works as a grant manager for an environmental conservation charity, is one of a small number of people globally who live with Freeman-Sheldon syndrome. As more staff members at the DMV were called to help, Gardiner says she started to believe the rejected photos were being caused by her facial difference. The camera system didn't seem to work for her, she says. "It was humiliating and weird. Here's this machine telling me that I don't have a human face," Gardiner says."
"Gardiner isn't alone. Around half a dozen people living with a variety of facial differences-from birthmarks to craniofacial conditions-tell WIRED they are increasingly struggling to participate in modern life as identity verification software, which often is powered by machine learning, is quickly becoming commonplace. Some of those living with facial differences tell WIRED they have undergone multiple surgeries and experienced stigma for their entire lives, which is now being echoed by the technology they are forced to interact with."
An estimated 100 million people live with facial differences. Face recognition and automated identity verification systems are increasingly used to access services and prove identity. People with congenital or acquired facial differences, including Freeman-Sheldon syndrome, birthmarks, and craniofacial conditions, face repeated rejections from camera-based systems when updating IDs or accessing services. These verification failures cause humiliation, require extra staff intervention, and lead to practical exclusion from essential processes. Many affected individuals have undergone multiple surgeries and endured lifelong stigma that is now reflected in technological barriers. Without accessible alternatives, machine-learning verification tools can reproduce bias and create access gaps.
Read at WIRED