KI-Network: group items tagged "bias"

Cognitive bias cheat sheet - Better Humans - 0 views

  •  
    Types of cognitive bias, categorised into four main problem areas.

Convict-spotting algorithm criticised - BBC News - 0 views

  •  
    Researchers trained an algorithm using more than 1,500 photos of Chinese citizens, hundreds of them convicts. They said the program was then able to correctly identify criminals in further photos 89% of the time. But the research, which has not been peer reviewed, has been criticised by criminology experts who say the AI may reflect bias in the justice system.

Bias, not robots on the rampage, is the key test of artificial intelligence | Business ... - 0 views

  •  
    Hidden biases may be written inadvertently into the algorithms used to decide who gets a job interview or who qualifies for a loan or for parole. If a data set considers the word "programmer" closer to the word "man" than "woman," or if you build a system that learns from Wikipedia, where only 17 per cent of profiles of notable people are women, these biases will be perpetuated in the machine.
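    The embedding effect described above can be sketched in a few lines. The vectors and words below are invented purely for illustration; real systems such as word2vec learn embeddings with hundreds of dimensions from large text corpora, but the comparison works the same way.

    ```python
    import math

    # Toy 3-dimensional word embeddings, invented for illustration only.
    # The skew (programmer nearer to man than to woman) is built in here
    # to mimic what a model trained on biased text can learn.
    embeddings = {
        "programmer": [0.9, 0.2, 0.1],
        "man":        [0.8, 0.3, 0.1],
        "woman":      [0.3, 0.8, 0.2],
    }

    def cosine_similarity(a, b):
        """Cosine of the angle between two vectors; 1.0 means same direction."""
        dot = sum(x * y for x, y in zip(a, b))
        norm_a = math.sqrt(sum(x * x for x in a))
        norm_b = math.sqrt(sum(x * x for x in b))
        return dot / (norm_a * norm_b)

    sim_man = cosine_similarity(embeddings["programmer"], embeddings["man"])
    sim_woman = cosine_similarity(embeddings["programmer"], embeddings["woman"])
    print(round(sim_man, 3), round(sim_woman, 3))
    ```

    Any downstream system that ranks or filters by these distances (a CV screener, a search ranker) silently inherits the skew, which is why the annotation above calls the bias "perpetuated in the machine".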

Data Bias Is Becoming A Massive Problem | Digital Tonto - 0 views

  •  
    Machines, even virtual ones, have biases. They are designed, necessarily, to favour some kinds of data over others. Unfortunately, we rarely question the judgments of mathematical models and, in many cases, their biases can pervade and distort operational reality, creating unintended consequences that are hard to undo.

Thinking, Fast and Slow - Wikipedia, the free encyclopedia - 0 views

  •  
    "Thinking, Fast and Slow is a 2011 book by Nobel Prize winner in Economics Daniel Kahneman which summarizes research that he conducted over decades, often in collaboration with Amos Tversky.[1][2] It covers all three phases of his career: his early days working on cognitive bias, his work on prospect theory, and his later work on happiness. The book's central thesis is a dichotomy between two modes of thought: System 1 is fast, instinctive and emotional; System 2 is slower, more deliberative, and more logical. The book delineates cognitive biases associated with each type of thinking, starting with Kahneman's own research on loss aversion. From framing choices to substitution, the book highlights several decades of academic research to suggest that we place too much confidence in human judgment."

World Bank report - Bias and Behaviour - 0 views

  •  
    Chapter from the World Development Report on Mind, Behaviour.

Machine Learning And Human Bias: An Uneasy Pair | TechCrunch - 1 views

  •  
    Humans are biased, and the biases we encode into machines are then scaled and automated. This is not inherently bad (or good), but it raises the question: how do we operate in a world increasingly consumed with "personal analytics" that can predict race, religion, gender, age, sexual orientation, health status and much more?

5 Common Mental Errors That Sway You From Making Good Decisions - The Mission - Medium - 0 views

  •  
    Common decision biases, with some nice graphics.

SAPVoice: Make Sure Your Hiring Algorithms Are Legal: Four Machine Learning Questions T... - 0 views


Why Facts Don't Change Our Minds - The New Yorker - 0 views

  • In a study conducted at Yale, graduate students were asked to rate their understanding of everyday devices, including toilets, zippers, and cylinder locks. They were then asked to write detailed, step-by-step explanations of how the devices work, and to rate their understanding again. Apparently, the effort revealed to the students their own ignorance, because their self-assessments dropped. (Toilets, it turns out, are more complicated than they appear.) Sloman and Fernbach see this effect, which they call the “illusion of explanatory depth,” just about everywhere. People believe that they know way more than they actually do. What allows us to persist in this belief is other people. In the case of my toilet, someone else designed it so that I can operate it easily. This is something humans are very good at. We’ve been relying on one another’s expertise ever since we figured out how to hunt together, which was probably a key development in our evolutionary history. So well do we collaborate, Sloman and Fernbach argue, that we can hardly tell where our own understanding ends and others’ begins. “One implication of the naturalness with which we divide cognitive labor,” they write, is that there’s “no sharp boundary between one person’s ideas and knowledge” and “those of other members” of the group.

Durham Police AI to help with custody decisions - BBC News - 0 views

  •  
    "Police in Durham are preparing to go live with an artificial intelligence (AI) system designed to help officers decide whether or not a suspect should be kept in custody."

Google (GOOG) explains how artificial intelligence becomes biased against women and min... - 0 views

  •  
    Google is trying to educate the masses on how AI can accidentally perpetuate the biases held by its makers.

How white engineers built racist code - and why it's dangerous for black people | Techn... - 1 views

  •  
    Researchers at the MIT Media Lab think that facial recognition software has problems recognizing black faces because its algorithms are usually written by white engineers, who dominate the technology sector. These engineers build on pre-existing code libraries, typically written by other white engineers.