"Computers once performed facial recognition rather imprecisely, by identifying people's facial features and measuring the distances among them - a crude method that did not reliably result in matches. But recently, the technology has improved significantly, because of advances in artificial intelligence. A.I. software can analyze countless photos of people's faces and learn to make impressive predictions about which images are of the same person; the more faces it inspects, the better it gets. Clearview is deploying this approach using billions of photos from the public internet. By testing legal and ethical limits around the collection and use of those images, it has become the front-runner in the field. "
"The lack of answers the Jacksonville sheriff's office have provided in Lynch's case is representative of the problems that facial recognition poses across the country. "It's considered an imperfect biometric," said Garvie, who in 2016 created a study on facial recognition software, published by the Center on Privacy and Technology at Georgetown Law, called The Perpetual Line-Up. "There's no consensus in the scientific community that it provides a positive identification of somebody.""
"However, there is some concern about how accurate these new procedures will be. Apparently the facial recognition technology doesn't recognize all people will the same accuracy. White women and black people aren't as easily recognized as white men, meaning there could be some mismatching of identities. Some are also concerned that this is crossing the line in terms of passenger privacy."
"North America, Central America, and Caribbean
In the U.S., a 2016 study showed that half of American adults were already captured in some kind of facial recognition network. More recently, the Department of Homeland Security unveiled its "Biometric Exit" plan, which aims to use facial recognition technology on nearly all air travel passengers by 2023, to verify compliance with visa status."
"But researchers at New York University's AI Now Institute have issued a strong warning against not only ubiquitous facial recognition, but its more sinister cousin: so-called affect recognition, technology that claims it can find hidden meaning in the shape of your nose, the contours of your mouth, and the way you smile. If that sounds like something dredged up from the 19th century, that's because it sort of is."
"The facial recognition system is currently being rolled out across six of Prof Shen's classes.
"The new system saves time and reduces the workload of students," Prof Shen told the Beijing News. "Out of one hundred students, it usually only fails to recognise one student."
But obviously, not everyone is a fan."
"People in Soho, Piccadilly Circus, and Leicester Square are being told by the London Metropolitan Police to submit to a trial of the force's notoriously inaccurate, racially biased facial recognition system, which clocks in an impressive error-rate of 98% (the system has been decried by Professor Paul Wiles, the British biometrics commissioner, as an unregulated mess)."
"Affectiva is running a program that pays drivers to help train its emotion-recognition system. The company sends drivers a kit including cameras and other sensors to place within their vehicles. These record a person's facial expressions, gestures, and tone of voice on the road. That data is then labeled by trained specialists for a range of emotions, and fed into deep neural networks."
"Activists from Fight for the Future prowled the halls of Congress in "jumpsuits with phone strapped to their heads conducting live facial recognition surveillance" to "show why this tech should be banned.""
"Georgetown University's Center on Privacy and Technology highlighted the April 2017 episode in Garbage In, Garbage Out, a report on what it says are flawed practices in law enforcement's use of facial recognition.
The report says security footage of the thief was too pixelated and produced no matches, while high-quality images of Harrelson returned several possible matches and led to one arrest."
"Cellebrite, the Israeli-founded and now Japanese-owned company behind some of the software, claims a wider rollout would solve problems over failures to disclose crucial digital evidence that have led to the collapse of a series of rape trials and other prosecutions in the past year. However, the move by police has prompted concerns over privacy and the potential for software to introduce bias into processing of criminal evidence."