Home/ GAVNet Collaborative Curation/ Group items tagged default

Steve Bosserman

How We Made AI As Racist and Sexist As Humans

  • Artificial intelligence may have cracked the code on certain tasks that typically require human smarts, but in order to learn, these algorithms need vast quantities of data that humans have produced. They hoover up that information, rummage around in search of commonalities and correlations, and then offer a classification or prediction (whether that lesion is cancerous, whether you’ll default on your loan) based on the patterns they detect. Yet they’re only as clever as the data they’re trained on, which means that our limitations—our biases, our blind spots, our inattention—become theirs as well.
  • The majority of AI systems used in commercial applications—the ones that mediate our access to services like jobs, credit, and loans—are proprietary, their algorithms and training data kept hidden from public view. That makes it exceptionally difficult for an individual to interrogate the decisions of a machine or to know when an algorithm, trained on historical examples checkered by human bias, is stacked against them. And forget about trying to prove that AI systems may be violating human rights legislation.
  • Data is essential to the operation of an AI system. And the more complicated the system—the more layers in the neural nets, to translate speech or identify faces or calculate the likelihood someone defaults on a loan—the more data must be collected.
  • The power of the system is its “ability to recognize that correlations occur between gender and professions,” says Kathryn Hume. “The downside is that there’s no intentionality behind the system—it’s just math picking up on correlations. It doesn’t know this is a sensitive issue.” There’s a tension between the futuristic and the archaic at play in this technology. AI is evolving much more rapidly than the data it has to work with, so it’s destined not just to reflect and replicate biases but also to prolong and reinforce them.
  • And sometimes, even when ample data exists, those who build the training sets don’t take deliberate measures to ensure their diversity.
  • But not everyone will be equally represented in that data.
  • Accordingly, groups that have been the target of systemic discrimination by institutions that include police forces and courts don’t fare any better when judgment is handed over to a machine.
  • A growing field of research, in fact, now looks to apply algorithmic solutions to the problems of algorithmic bias.
  • Still, algorithmic interventions only do so much; addressing bias also demands diversity in the programmers who are training machines in the first place.
  • A growing awareness of algorithmic bias isn’t only a chance to intervene in our approaches to building AI systems. It’s an opportunity to interrogate why the data we’ve created looks like this and what prejudices continue to shape a society that allows these patterns in the data to emerge.
  • Of course, there’s another solution, elegant in its simplicity and fundamentally fair: get better data.
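The mechanism the annotations describe—a model that "rummages around in search of correlations" and then replays them—can be illustrated with a deliberately tiny sketch. This is not from the article; the data, group names, and numbers are all invented, and the "model" is the simplest possible one (predict each group's majority historical outcome), chosen to make the bias-propagation visible:

```python
from collections import defaultdict

# Hypothetical toy history of loan outcomes as (group, repaid) pairs.
# Group "b" is underrepresented, and its historical labels skew
# negative -- an artifact of past decisions, not of creditworthiness.
history = [("a", 1)] * 80 + [("a", 0)] * 20 + [("b", 1)] * 5 + [("b", 0)] * 15

# A minimal "model": count outcomes per group, then predict the
# majority historical outcome for each group.
counts = defaultdict(lambda: [0, 0])  # group -> [denied, repaid]
for group, label in history:
    counts[group][label] += 1

def predict(group):
    denied, repaid = counts[group]
    return 1 if repaid >= denied else 0

# The model has no intentionality; it just replays the pattern:
print(predict("a"))  # 1 -- approve
print(predict("b"))  # 0 -- deny
```

Nothing in the code "knows" that group membership is a sensitive attribute; the disparity in the output exists only because it exists in the training data—which is the article's point, and why "get better data" is offered as the fundamental fix.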
Steve Bosserman

The Fourth Industrial Revolution: Proceedings of a Workshop-in Brief

  • The Forum's perspective on present and future technological and societal changes is captured in their 'Principled Framework for the Fourth Industrial Revolution.' Philbeck explained the four principles that characterize the Fourth Industrial Revolution.
    * Think systems, not technologies. Individual technologies are interesting, but it is their systemic impact that matters. Emerging technologies challenge our societal values and norms, sometimes for good, but sometimes also in negative ways; the Fourth Industrial Revolution will have civilization-changing impact: on species, on the planet, on geopolitics, and on the global economy. Philbeck suggested that wealth creation and aggregation supported by this phase of technological innovation may challenge societal commitments to accessibility, inclusivity, and fairness and create the need for relentless worker re-education. As Philbeck stated, "The costs for greater productivity are often externalized to stakeholders who are not involved in a particular technology's development."
    * Empowering, not determining. The Forum urges an approach to the Fourth Industrial Revolution that honors existing social principles. "We need to take a stance toward technology and technological systems that empowers society and acts counter to fatalistic and deterministic views, so that society and its agency is not nullified," said Philbeck. "Technologies are not forces; we have the ability to shape them and decide on how they are applied."
    * Future by design, and not by default. Seeking a future by design requires active governance. There are many types of governance: by individuals, by governments, by civic society, and by companies. Philbeck argued that failure to pay attention to critical governance questions in consideration of the Fourth Industrial Revolution means societies are likely to allow undemocratic, random, and potentially malicious forces to shape the future of technological systems and th
Steve Bosserman

When the state is unjust, citizens may use justifiable violence | Aeon Ideas

  • Here’s a philosophical exercise. Imagine a situation in which a civilian commits an injustice, the kind against which you believe it is permissible to use deception, subterfuge or violence to defend yourself or others. For instance, imagine your friend makes an improper stop at a red light, and his dad, in anger, yanks him out of the car, beats the hell out of him, and continues to strike the back of his skull even after your friend lies subdued and prostrate. May you use violence, if it’s necessary to stop the father? Now imagine the same scene, except this time the attacker is a police officer in Ohio, and the victim is Richard Hubbard III, who in 2017 experienced just such an attack as described. Does that change things? Must you let the police officer possibly kill Hubbard rather than intervene?
  • Most people answer yes, believing that we are forbidden from stopping government agents who violate our rights. I find this puzzling. On this view, my neighbours can eliminate our right of self-defence and our rights to defend others by granting someone an office or passing a bad law. On this view, our rights to life, liberty, due process and security of person can disappear by political fiat – or even when a cop has a bad day. In When All Else Fails: The Ethics of Resistance to State Injustice (2019), I argue instead that we may act defensively against government agents under the same conditions in which we may act defensively against civilians. In my view, civilian and government agents are on a par, and we have identical rights of self-defence (and defence of others) against both. We should presume, by default, that government agents have no special immunity against self-defence, unless we can discover good reason to think otherwise. But it turns out that the leading arguments for special immunity are weak.