Home/ ITGSonline/ Group items tagged ai bias

dr tech

A Robot, A Recruiter & A REST API Walk Into A Bar… - Peterson Technology Part... - 0 views

  •  
    "One great way to tell the difference is to ask AI recruiting companies what they use artificial intelligence, machine learning and/or deep learning for. Hopefully the hiring firm can explain what it's using the new technology for and not just that it is. If not, it's time to dig a bit deeper."
dr tech

Smart lie-detection system to tight... - Information Centre - Research & Innovation - E... - 0 views

  •  
    "The unique approach to 'deception detection' analyses the micro-gestures of travellers to figure out if the interviewee is lying."
dr tech

Technologist Vivienne Ming: 'AI is a human right' | Technology | The Guardian - 0 views

  •  
    "At the heart of the problem that troubles Ming is the training that computer engineers receive and their uncritical faith in AI. Too often, she says, their approach to a problem is to train a neural network on a mass of data and expect the result to work fine. She berates companies for failing to engage with the problem first - applying what is already known about good employees and successful students, for example - before applying the AI."
dr tech

I Tried Predictim AI That Scans for 'Risky' Babysitters - 0 views

  •  
    "The founders of Predictim want to be clear with me: Their product-an algorithm that scans the online footprint of a prospective babysitter to determine their "risk" levels for parents-is not racist. It is not biased. "We take ethics and bias extremely seriously," Sal Parsa, Predictim's CEO, tells me warily over the phone. "In fact, in the last 18 months we trained our product, our machine, our algorithm to make sure it was ethical and not biased. We took sensitive attributes, protected classes, sex, gender, race, away from our training set. We continuously audit our model. And on top of that we added a human review process.""
dr tech

Police trial AI software to help process mobile phone evidence | UK news | The Guardian - 0 views

  •  
    "Cellebrite, the Israeli-founded and now Japanese-owned company behind some of the software, claims a wider rollout would solve problems over failures to disclose crucial digital evidence that have led to the collapse of a series of rape trials and other prosecutions in the past year. However, the move by police has prompted concerns over privacy and the potential for software to introduce bias into processing of criminal evidence."
dr tech

Computer says no: why making AIs fair, accountable and transparent is crucial | Science... - 0 views

  •  
    "In October, American teachers prevailed in a lawsuit with their school district over a computer program that assessed their performance. The system rated teachers in Houston by comparing their students' test scores against state averages. Those with high ratings won praise and even bonuses. Those who fared poorly faced the sack. The program did not please everyone. Some teachers felt that the system marked them down without good reason. But they had no way of checking if the program was fair or faulty: the company that built the software, the SAS Institute, regards its algorithm as a trade secret and would not disclose its workings."
dr tech

The Age of the Algorithm - 99% Invisible - 0 views

  •  
    "But the answer to how he was chosen is actually an algorithm, a computer program that crunched through reams of data, looking at how much each passenger had paid for their ticket, what time they checked in, how often they flew on United, and whether they were part of a rewards program. The algorithm likely determined that Dr. Dao was one of the least valuable customers on the flight at the time."
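The excerpt describes a weighted scoring of ticket price, check-in time, flight frequency and rewards membership. A minimal sketch of that kind of passenger-value scoring might look as follows; the feature names, weights and threshold are invented for illustration, since the airline's actual model is not public.

```python
# Hypothetical sketch of a passenger-value score like the one the quote
# describes. All weights and features are assumptions, not the real model.

def passenger_value(fare_paid, hours_before_departure_checked_in,
                    flights_last_year, in_rewards_program):
    """Return a single 'value' score; the lowest scorer is bumped first."""
    score = 0.0
    score += fare_paid * 1.0                              # fare dominates
    score += min(hours_before_departure_checked_in, 24) * 2.0  # early check-in
    score += flights_last_year * 5.0                      # loyalty by frequency
    score += 100.0 if in_rewards_program else 0.0         # rewards bonus
    return score

def least_valuable(passengers):
    """Pick the passenger the model scores lowest."""
    return min(passengers, key=lambda p: passenger_value(**p["features"]))

passengers = [
    {"name": "A", "features": dict(fare_paid=450,
                                   hours_before_departure_checked_in=20,
                                   flights_last_year=12,
                                   in_rewards_program=True)},
    {"name": "B", "features": dict(fare_paid=120,
                                   hours_before_departure_checked_in=2,
                                   flights_last_year=1,
                                   in_rewards_program=False)},
]
print(least_valuable(passengers)["name"])  # the low-fare, infrequent flyer: B
```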
dr tech

The coded gaze: biased and understudied facial recognition technology / Boing Boing - 0 views

  •  
    " "Why isn't my face being detected? We have to look at how we give machines sight," she said in a TED Talk late last year. "Computer vision uses machine-learning techniques to do facial recognition. You create a training set with examples of faces. However, if the training sets aren't really that diverse, any face that deviates too much from the established norm will be harder to detect.""
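The point in the quote — that a model trained on a narrow set of examples struggles with anything far from the norm — can be shown with a toy one-dimensional "detector". A single number stands in for a face here; the values and tolerance are invented for illustration and have nothing to do with any real face-detection system.

```python
# Toy illustration of the training-set-diversity point: a detector fitted
# only to a narrow range of examples rejects inputs that deviate from it,
# while one fitted to a diverse set accepts the same input.
from statistics import mean, pstdev

def train(examples):
    """Summarise the training set as a centre and a spread."""
    return mean(examples), pstdev(examples)

def detects(model, sample, tolerance=2.0):
    """Accept a sample only if it lies near the training norm."""
    centre, spread = model
    return abs(sample - centre) <= tolerance * spread

# Training set clustered tightly around one kind of example.
narrow_model = train([0.48, 0.50, 0.52, 0.49, 0.51])

# Training set covering a wider range of examples.
diverse_model = train([0.2, 0.4, 0.5, 0.6, 0.8])

sample = 0.75  # deviates from the narrow norm
print(detects(narrow_model, sample))   # False: outside the narrow norm
print(detects(diverse_model, sample))  # True: covered by the diverse set
```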