
Digital Society: Group items tagged "facial recognition bias"

Rite Aid facial recognition misidentified Black, Latino and Asian people as 'likely' shoplifters

  •  
    "Rite Aid facial recognition misidentified Black, Latino and Asian people as 'likely' shoplifters Surveillance systems incorrectly and without customer consent marked shoppers as 'persons of interest', an FTC settlement says Johana Bhuiyan and agencies Wed 20 Dec 2023 14.29 EST Last modified on Thu 21 Dec 2023 12.04 EST Rite Aid used facial recognition systems to identify shoppers that were previously deemed "likely to engage" in shoplifting without customer consent and misidentified people - particularly women and Black, Latino or Asian people - on "numerous" occasions, according to a new settlement with the Federal Trade Commission. As part of the settlement, Rite Aid has been forbidden from deploying facial recognition technology in its stores for five years."
Parents Against Facial Recognition

  •  
    "To Lawmakers and School Administrators: As parents and caregivers, there is nothing more important to us than our children's safety. That's why we're calling for an outright ban on the use of facial recognition in schools. We're concerned about this technology spreading to our schools, infringing on our kids' rights and putting them in danger. We don't even know the psychological impacts this constant surveillance can have on our children, but we do know that violating their basic rights will create an environment of mistrust and will make it hard for students to succeed and grow. The images collected by this technology will become a target for those wishing to harm our children, and could put them in physical danger or at risk of having their biometric information stolen or sold. The well-known bias built into this technology will put Black and brown children, girls, and gender noncomforming kids in specific danger. Facial recognition creates more harm than good and should not be used on the children we have been entrusted to protect. It should instead be immediately banned."
In facial recognition challenge, top-ranking algorithms show bias against Black women | ...

  •  
    "The results are unfortunately not surprising - countless studies have shown that facial recognition is susceptible to bias. A paper last fall by University of Colorado, Boulder researchers demonstrated that AI from Amazon, Clarifai, Microsoft, and others maintained accuracy rates above 95% for cisgender men and women but misidentified trans men as women 38% of the time."
Recognising (and addressing) bias in facial recognition tech - the Gender Shades Audit ...

  •  
    "What if facial recognition technology isn't as good at recognising faces as it has sometimes been claimed to be? If the technology is being used in the criminal justice system, and gets the identification wrong, this can cause serious problems for people (see Robert Williams' story in "Facing up to the problems of recognising faces")."
Smile, Your Face Is Now in a Database - Benjamin Powers - Medium

  •  
    ""My concern is that a facial recognition rejection can [create] bias," said Rudolph. "So, if someone has a lot of faith in this technology and thinks that it's foolproof, and someone is rejected by this system, that customs officer or gate agent may be predisposed to saying this person is traveling with fraudulent credentials. That's a crime and a serious issue.""
The coded gaze: biased and understudied facial recognition technology / Boing Boing

  •  
    " "Why isn't my face being detected? We have to look at how we give machines sight," she said in a TED Talk late last year. "Computer vision uses machine-learning techniques to do facial recognition. You create a training set with examples of faces. However, if the training sets aren't really that diverse, any face that deviates too much from the established norm will be harder to detect.""
Surveillance Technology: Everything, Everywhere, All at Once

  •  
    "Countries around the world are deploying technologies-like digital IDs, facial recognition systems, GPS devices, and spyware-that are meant to improve governance and reduce crime. But there has been little evidence to back these claims, all while introducing a high risk of exclusion, bias, misidentification, and privacy violations. It's important to note that these impacts are not equal. They fall disproportionately on religious, ethnic, and sexual minorities, migrants and refugees, as well as human rights activists and political dissidents."
Speech recognition algorithms may also have racial bias | Ars Technica

  •  
    "These systems weren't set up to be biased; it's likely that they were simply trained on a subset of the diversity of accents and usages present in the United States. But, as we become ever more reliant on these systems, making them less frustrating for all their users should be a priority."
Dressing for the Surveillance Age | The New Yorker

  •  
    "Apart from biases in the training databases, it's hard to know how well face-recognition systems actually perform in the real world, in spite of recent gains. Anil Jain, a professor of computer science at Michigan State University who has worked on face recognition for more than thirty years, told me, "Most of the testing on the private venders' products is done in a laboratory environment under controlled settings. In real practice, you're walking around in the streets of New York. It's a cold winter day, you have a scarf around your face, a cap, maybe your coat is pulled up so your chin is partially hidden, the illumination may not be the most favorable, and the camera isn't capturing a frontal view.""
Emojify

  •  
    "We want to start a conversation about emotion recognition technology. Explore the site, watch the video, play a game and add your thoughts to our research. Or turn on your camera to activate our very own emotion recognition machine...will it 'emojify' you? "
Police trial AI software to help process mobile phone evidence | UK news | The Guardian

  •  
    "Cellebrite, the Israeli-founded and now Japanese-owned company behind some of the software, claims a wider rollout would solve problems over failures to disclose crucial digital evidence that have led to the collapse of a series of rape trials and other prosecutions in the past year. However, the move by police has prompted concerns over privacy and the potential for software to introduce bias into processing of criminal evidence."
Algorithms associating appearance and criminality have a dark past | Aeon Ideas

  •  
    "However, the recent study's seemingly high-tech attempt to pick out facial features associated with criminality borrows directly from the 'photographic composite method' developed by the Victorian jack-of-all-trades Francis Galton - which involved overlaying the faces of multiple people in a certain category to find the features indicative of qualities like health, disease, beauty and criminality."