
"The app is named Mobile Fortify. Simply pointing a phone's camera at their intended target and scanning the person's face allows Mobile Fortify to pull data on an individual from multiple federal and state databases, some of which federal courts have deemed too inaccurate for arrest warrants. The US Department of Homeland Security has used Mobile Fortify to scan faces and fingerprints in the field more than 100,000 times, according to a lawsuit brought by Illinois and Chicago against the federal agency, earlier this month."
"Here we have ICE using this technology in exactly the confluence of conditions that lead to the highest false match rates, says Nathan Freed Wessler, deputy director of the ACLU's speech, privacy and technology project. A false result from this technology can turn somebody's life totally upside down. The larger implications for democracy are chilling, too, he notes: ICE is effectively trying to create a biometric checkpoint society."
"Use of the app has inspired backlash on the streets, in courts, and on Capitol Hill. Protesters are using a variety of tactics to fight back. They include recording masked agents, using burner phones and donated dashboard cameras, according to the Washington Post. Underpinning resistance to ICE's use of facial recognition are doubts about the technology's efficacy. Research has uncovered higher error rates in identifying women and people of color than for scans of white faces."
Mobile Fortify is a smartphone app that uses facial recognition to scan faces and fingerprints and pull records from multiple federal and state databases. DHS has reportedly used the app in the field more than 100,000 times. The app marks a shift from prior facial-recognition use largely limited to investigations and ports of entry. Some source databases have been deemed too inaccurate for arrest warrants. Civil-rights advocates warn that false matches can devastate individuals and could create a biometric "checkpoint" society. Use of the app has prompted protests, lawsuits, congressional scrutiny, and concerns about bias and higher error rates for women and people of color.
Read at www.theguardian.com