
Public Service Internet: group items tagged "machine learning"


Ian Forrester

[1607.06520] Man is to Computer Programmer as Woman is to Homemaker? Debiasing Word Embeddings

  •  The blind application of machine learning runs the risk of amplifying biases present in data. Such a danger is facing us with word embedding, a popular framework to represent text data as vectors which has been used in many machine learning and natural language processing tasks. We show that even word embeddings trained on Google News articles exhibit female/male gender stereotypes to a disturbing extent. This raises concerns because their widespread use, as we describe, often tends to amplify these biases. Geometrically, gender bias is first shown to be captured by a direction in the word embedding. Second, gender neutral words are shown to be linearly separable from gender definition words in the word embedding. Using these properties, we provide a methodology for modifying an embedding to remove gender stereotypes, such as the association between the words receptionist and female, while maintaining desired associations such as between the words queen and female. We define metrics to quantify both direct and indirect gender biases in embeddings, and develop algorithms to "debias" the embedding. Using crowd-worker evaluation as well as standard benchmarks, we empirically demonstrate that our algorithms significantly reduce gender bias in embeddings while preserving its useful properties such as the ability to cluster related concepts and to solve analogy tasks. The resulting embeddings can be used in applications without amplifying gender bias.
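The geometric claims in the abstract (a gender direction in the embedding, gender-neutral words separable from gender-definitional ones) translate into a short sketch. The NumPy code below is a minimal illustration, not the authors' released implementation: it assumes a dictionary `emb` mapping words to unit vectors, estimates a gender direction from a handful of definitional pairs (the paper derives the subspace with PCA over such pairs plus curated word lists), and projects that component out of a gender-neutral word like "receptionist".

```python
import numpy as np

# Hypothetical embedding table: word -> unit-length vector (e.g. loaded from word2vec).
# The names `emb` and DEFINITIONAL_PAIRS are illustrative, not from the paper's code.
DEFINITIONAL_PAIRS = [("she", "he"), ("woman", "man"), ("her", "him"), ("female", "male")]

def gender_direction(emb, pairs=DEFINITIONAL_PAIRS):
    """Estimate a gender direction from definitional word pairs.

    The paper identifies a gender subspace with PCA over several such pairs;
    averaging the normalised difference vectors is a one-dimensional stand-in.
    """
    diffs = []
    for a, b in pairs:
        d = emb[a] - emb[b]
        diffs.append(d / np.linalg.norm(d))
    g = np.mean(diffs, axis=0)
    return g / np.linalg.norm(g)

def neutralize(vec, g):
    """Remove the component of a gender-neutral word's vector along the gender direction."""
    v = vec - np.dot(vec, g) * g
    return v / np.linalg.norm(v)

# Example: after neutralizing, "receptionist" should sit no closer to "she" than to "he".
# g = gender_direction(emb)
# emb["receptionist"] = neutralize(emb["receptionist"], g)
```

The paper's full "hard debiasing" also includes an equalize step that makes definitional pairs such as (queen, king) equidistant from all neutralized words; that step is omitted from this sketch for brevity.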
Ian Forrester

The Partnership on AI

  •  In support of our mission to benefit people and society, the Partnership on AI intends to conduct research, organize discussions, share insights, provide thought leadership, consult with relevant third parties, respond to questions from the public and media, and create educational material that advances the understanding of AI technologies including machine perception, learning, and automated reasoning.
Ian Forrester

London cops urged to scrap use of 'biased' facial recognition at Notting Hill Carnival ...

  •  London's Metropolitan Police have been urged to back down on plans to once again use facial recognition software at next weekend's Notting Hill Carnival. Privacy groups including Big Brother Watch, Liberty and Privacy International have written to police commissioner Cressida Dick (PDF) calling for a U-turn on the use of the tech. Automated facial recognition technology will snap the party-goers' faces and run them against a database. The aim is to alert cops to people who are banned from the festival or are wanted by the police, presumably so they can take immediate action. The tech was first tested at the festival - where relations between police and revellers are often strained - last year, but it failed to identify anyone.
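As a rough sketch of the "run them against a database" step described in the item above, the Python snippet below compares a captured face embedding against a watchlist using cosine similarity and a fixed threshold. It is a generic illustration under assumed names and parameters, not a description of the system the Met actually deployed.

```python
import numpy as np

def match_against_watchlist(face_vec, watchlist, threshold=0.6):
    """Compare one captured face embedding against a watchlist of stored embeddings.

    `face_vec` is a vector produced by some face-embedding model, and `watchlist`
    maps an identifier to a stored vector; both the model and the 0.6 cosine
    threshold are placeholders, not details of any real deployment.
    """
    face_vec = face_vec / np.linalg.norm(face_vec)
    best_id, best_score = None, threshold
    for person_id, ref in watchlist.items():
        score = float(np.dot(face_vec, ref / np.linalg.norm(ref)))
        if score > best_score:
            best_id, best_score = person_id, score
    # A non-None result is what would raise an alert for officers to review.
    return best_id, (best_score if best_id is not None else None)
```

In a pipeline like this, the choice of threshold trades false matches against missed matches, which is why accuracy and bias in the underlying model are central to the privacy groups' objections.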