ITGSonline: group items tagged "people and machines", "machine learning", "bias"
I Tried Predictim AI That Scans for 'Risky' Babysitters

    "The founders of Predictim want to be clear with me: Their product, an algorithm that scans the online footprint of a prospective babysitter to determine their 'risk' levels for parents, is not racist. It is not biased. 'We take ethics and bias extremely seriously,' Sal Parsa, Predictim's CEO, tells me warily over the phone. 'In fact, in the last 18 months we trained our product, our machine, our algorithm to make sure it was ethical and not biased. We took sensitive attributes, protected classes, sex, gender, race, away from our training set. We continuously audit our model. And on top of that we added a human review process.'"