Public Service Internet: group items tagged "results"

Ian Forrester

Google faulted for racial bias in image search results for black teenagers - The Washin...

    If you searched for "three white teenagers" on Google Images earlier this month, the search spat up shiny, happy people in droves - an R.E.M. song in JPG format. The images, mostly stock photos, displayed young Caucasian men and women laughing, holding sports equipment or caught whimsically mid-selfie. If you searched for "three black teenagers," the algorithm offered an array of mug shots.

Do Google's 'unprofessional hair' results show it is racist? | Technology | The Guardian

    The search term brings back mainly results of black women, which some say is evidence of bias. But the algorithms may just be reflecting the wider social landscape.

Patient Home Monitoring Service Leaks Private Medical Data O

    Kromtech Security Researchers have discovered another publicly accessible Amazon S3 repository. This time it contained medical data in 316,363 PDF reports in the form of weekly blood test results. Many of these were multiple reports on individual patients; it appears that each patient had weekly test results totaling around 20 files. Even so, an estimated 150,000+ people were affected by the leak.

FitnessSyncer joins your health and fitness clouds into one Dashboard and Str...

    FitnessSyncer unifies your data in one convenient place so you can make informed decisions toward better results. Analyze your data in our customizable dashboard, stream, exportable calendar, daily analyzer, and more! You're already doing the hard work

AWS Service Terms - 57.10!

    "57.10 Acceptable Use; Safety-Critical Systems. Your use of the Lumberyard Materials must comply with the AWS Acceptable Use Policy. The Lumberyard Materials are not intended for use with life-critical or safety-critical systems, such as use in operation of medical equipment, automated transportation systems, autonomous vehicles, aircraft or air traffic control, nuclear facilities, manned spacecraft, or military use in connection with live combat. However, this restriction will not apply in the event of the occurrence (certified by the United States Centers for Disease Control or successor body) of a widespread viral infection transmitted via bites or contact with bodily fluids that causes human corpses to reanimate and seek to consume living human flesh, blood, brain or nerve tissue and is likely to result in the fall of organized civilization."

[1607.06520] Man is to Computer Programmer as Woman is to Homemaker? Debiasing Word Emb...

    The blind application of machine learning runs the risk of amplifying biases present in data. Such a danger is facing us with word embedding, a popular framework to represent text data as vectors which has been used in many machine learning and natural language processing tasks. We show that even word embeddings trained on Google News articles exhibit female/male gender stereotypes to a disturbing extent. This raises concerns because their widespread use, as we describe, often tends to amplify these biases. Geometrically, gender bias is first shown to be captured by a direction in the word embedding. Second, gender neutral words are shown to be linearly separable from gender definition words in the word embedding. Using these properties, we provide a methodology for modifying an embedding to remove gender stereotypes, such as the association between the words receptionist and female, while maintaining desired associations such as between the words queen and female. We define metrics to quantify both direct and indirect gender biases in embeddings, and develop algorithms to "debias" the embedding. Using crowd-worker evaluation as well as standard benchmarks, we empirically demonstrate that our algorithms significantly reduce gender bias in embeddings while preserving its useful properties such as the ability to cluster related concepts and to solve analogy tasks. The resulting embeddings can be used in applications without amplifying gender bias.
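    The geometric idea in the abstract - bias captured by a direction, removed by projection - can be sketched in a few lines. This is a minimal illustration of the "neutralize" step, not the paper's full algorithm; the 3-d vectors and the word names here are made up for demonstration.

    ```python
    import numpy as np

    def neutralize(vec, gender_direction):
        """Remove the component of `vec` along the gender direction,
        leaving a vector orthogonal to it (zero projection on the bias axis)."""
        g = gender_direction / np.linalg.norm(gender_direction)
        return vec - np.dot(vec, g) * g

    # Toy stand-ins: in the paper the direction comes from pairs like she - he.
    g = np.array([1.0, 0.0, 0.0])            # hypothetical gender direction
    receptionist = np.array([0.6, 0.2, 0.4])  # hypothetical biased word vector

    debiased = neutralize(receptionist, g)
    # debiased now has zero component along g: [0.0, 0.2, 0.4]
    ```

    After neutralizing, a gender-neutral word like "receptionist" is equidistant from the two ends of the gender direction, which is the property the paper's debiasing aims for.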

Machine Bias: There's Software Used Across the Country to Predict Future Criminals. And...

    Algorithms used in American criminal justice are found to be biased against black defendants.

Google engineer apologizes after Photos app tags two black people as gorillas | The Verge

    Google came under fire this week after its new Photos app categorized photos in one of the most racist ways possible. On June 28th, computer programmer Jacky Alciné found that the feature kept tagging pictures of him and his girlfriend as "gorillas."