
Home/ TOK Friends/ Group items tagged bias


Javier E

Cognitive Biases and the Human Brain - The Atlantic - 1 views

  • Present bias shows up not just in experiments, of course, but in the real world. Especially in the United States, people egregiously undersave for retirement—even when they make enough money to not spend their whole paycheck on expenses, and even when they work for a company that will kick in additional funds to retirement plans when they contribute.
  • When people hear the word bias, many if not most will think of either racial prejudice or news organizations that slant their coverage to favor one political position over another. Present bias, by contrast, is an example of cognitive bias—the collection of faulty ways of thinking that is apparently hardwired into the human brain. The collection is large. Wikipedia’s “List of cognitive biases” contains 185 entries, from actor-observer bias (“the tendency for explanations of other individuals’ behaviors to overemphasize the influence of their personality and underemphasize the influence of their situation … and for explanations of one’s own behaviors to do the opposite”) to the Zeigarnik effect (“uncompleted or interrupted tasks are remembered better than completed ones”)
  • If I had to single out a particular bias as the most pervasive and damaging, it would probably be confirmation bias. That’s the effect that leads us to look for evidence confirming what we already think or suspect, to view facts and ideas we encounter as further confirmation, and to discount or ignore any piece of evidence that seems to support an alternate view
  • ...48 more annotations...
  • Confirmation bias shows up most blatantly in our current political divide, where each side seems unable to allow that the other side is right about anything.
  • The whole idea of cognitive biases and faulty heuristics—the shortcuts and rules of thumb by which we make judgments and predictions—was more or less invented in the 1970s by Amos Tversky and Daniel Kahneman
  • Tversky died in 1996. Kahneman won the 2002 Nobel Prize in Economics for the work the two men did together, which he summarized in his 2011 best seller, Thinking, Fast and Slow. Another best seller, last year’s The Undoing Project, by Michael Lewis, tells the story of the sometimes contentious collaboration between Tversky and Kahneman
  • Another key figure in the field is the University of Chicago economist Richard Thaler. One of the biases he’s most linked with is the endowment effect, which leads us to place an irrationally high value on our possessions.
  • In an experiment conducted by Thaler, Kahneman, and Jack L. Knetsch, half the participants were given a mug and then asked how much they would sell it for. The average answer was $5.78. The rest of the group said they would spend, on average, $2.21 for the same mug. This flew in the face of classic economic theory, which says that at a given time and among a certain population, an item has a market value that does not depend on whether one owns it or not. Thaler won the 2017 Nobel Prize in Economics.
  • “The question that is most often asked about cognitive illusions is whether they can be overcome. The message … is not encouraging.”
  • Kahneman and others draw an analogy based on an understanding of the Müller-Lyer illusion, two parallel lines with arrows at each end. One line’s arrows point in; the other line’s arrows point out. Because of the direction of the arrows, the latter line appears shorter than the former, but in fact the two lines are the same length.
  • At least with the optical illusion, our slow-thinking, analytic mind—what Kahneman calls System 2—will recognize a Müller-Lyer situation and convince itself not to trust the fast-twitch System 1’s perception
  • That’s not so easy in the real world, when we’re dealing with people and situations rather than lines. “Unfortunately, this sensible procedure is least likely to be applied when it is needed most,” Kahneman writes. “We would all like to have a warning bell that rings loudly whenever we are about to make a serious error, but no such bell is available.”
  • Because biases appear to be so hardwired and inalterable, most of the attention paid to countering them hasn’t dealt with the problematic thoughts, judgments, or predictions themselves
  • Is it really impossible, however, to shed or significantly mitigate one’s biases? Some studies have tentatively answered that question in the affirmative.
  • what if the person undergoing the de-biasing strategies was highly motivated and self-selected? In other words, what if it was me?
  • I met with Kahneman
  • Over an apple pastry and tea with milk, he told me, “Temperament has a lot to do with my position. You won’t find anyone more pessimistic than I am.”
  • “I see the picture as unequal lines,” he said. “The goal is not to trust what I think I see. To understand that I shouldn’t believe my lying eyes.” That’s doable with the optical illusion, he said, but extremely difficult with real-world cognitive biases.
  • In this context, his pessimism relates, first, to the impossibility of effecting any changes to System 1—the quick-thinking part of our brain and the one that makes mistaken judgments tantamount to the Müller-Lyer line illusion
  • The most effective check against them, as Kahneman says, is from the outside: Others can perceive our errors more readily than we can.
  • “slow-thinking organizations,” as he puts it, can institute policies that include the monitoring of individual decisions and predictions. They can also require procedures such as checklists and “premortems,”
  • A premortem attempts to counter optimism bias by requiring team members to imagine that a project has gone very, very badly and write a sentence or two describing how that happened. Conducting this exercise, it turns out, helps people think ahead.
  • “My position is that none of these things have any effect on System 1,” Kahneman said. “You can’t improve intuition.
  • Perhaps, with very long-term training, lots of talk, and exposure to behavioral economics, what you can do is cue reasoning, so you can engage System 2 to follow rules. Unfortunately, the world doesn’t provide cues. And for most people, in the heat of argument the rules go out the window.
  • Kahneman describes an even earlier Nisbett article that showed subjects’ disinclination to believe statistical and other general evidence, basing their judgments instead on individual examples and vivid anecdotes. (This bias is known as base-rate neglect.)
  • over the years, Nisbett had come to emphasize in his research and thinking the possibility of training people to overcome or avoid a number of pitfalls, including base-rate neglect, fundamental attribution error, and the sunk-cost fallacy.
  • Nisbett’s second-favorite example is that economists, who have absorbed the lessons of the sunk-cost fallacy, routinely walk out of bad movies and leave bad restaurant meals uneaten.
  • When Nisbett has to give an example of his approach, he usually brings up the baseball-phenom survey. This involved telephoning University of Michigan students on the pretense of conducting a poll about sports, and asking them why there are always several Major League batters with .450 batting averages early in a season, yet no player has ever finished a season with an average that high.
  • About half give the right answer: the law of large numbers, which holds that outlier results are much more frequent when the sample size (at bats, in this case) is small. Over the course of the season, as the number of at bats increases, regression to the mean is inevitable.
  • When Nisbett asks the same question of students who have completed the statistics course, about 70 percent give the right answer. He believes this result shows, pace Kahneman, that the law of large numbers can be absorbed into System 2—and maybe into System 1 as well, even when there are minimal cues.
  • “We’ve tested Michigan students over four years, and they show a huge increase in ability to solve problems. Graduate students in psychology also show a huge gain.”
  • “I know from my own research on teaching people how to reason statistically that just a few examples in two or three domains are sufficient to improve people’s reasoning for an indefinitely large number of events.”
  • Nisbett suggested another factor: “You and Amos specialized in hard problems for which you were drawn to the wrong answer. I began to study easy problems, which you guys would never get wrong but untutored people routinely do … Then you can look at the effects of instruction on such easy problems, which turn out to be huge.”
  • Nisbett suggested that I take “Mindware: Critical Thinking for the Information Age,” an online Coursera course in which he goes over what he considers the most effective de-biasing skills and concepts. Then, to see how much I had learned, I would take a survey he gives to Michigan undergraduates. So I did.
  • The course consists of eight lessons by Nisbett—who comes across on-screen as the authoritative but approachable psych professor we all would like to have had—interspersed with some graphics and quizzes. I recommend it. He explains the availability heuristic this way: “People are surprised that suicides outnumber homicides, and drownings outnumber deaths by fire. People always think crime is increasing” even if it’s not.
  • When I finished the course, Nisbett sent me the survey he and colleagues administer to Michigan undergrads
  • It contains a few dozen problems meant to measure the subjects’ resistance to cognitive biases
  • I got it right. Indeed, when I emailed my completed test, Nisbett replied, “My guess is that very few if any UM seniors did as well as you. I’m sure at least some psych students, at least after 2 years in school, did as well. But note that you came fairly close to a perfect score.”
  • Nevertheless, I did not feel that reading Mindware and taking the Coursera course had necessarily rid me of my biases
  • For his part, Nisbett insisted that the results were meaningful. “If you’re doing better in a testing context,” he told me, “you’ll jolly well be doing better in the real world.”
  • The New York–based NeuroLeadership Institute offers organizations and individuals a variety of training sessions, webinars, and conferences that promise, among other things, to use brain science to teach participants to counter bias. This year’s two-day summit will be held in New York next month; for $2,845, you could learn, for example, “why are our brains so bad at thinking about the future, and how do we do it better?”
  • Philip E. Tetlock, a professor at the University of Pennsylvania’s Wharton School, and his wife and research partner, Barbara Mellers, have for years been studying what they call “superforecasters”: people who manage to sidestep cognitive biases and predict future events with far more accuracy than the pundits
  • One of the most important ingredients is what Tetlock calls “the outside view.” The inside view is a product of fundamental attribution error, base-rate neglect, and other biases that are constantly cajoling us into resting our judgments and predictions on good or vivid stories instead of on data and statistics
  • Most promising are a handful of video games. Their genesis was in the Iraq War.
  • In 2006, seeking to prevent another mistake of that magnitude, the U.S. government created the Intelligence Advanced Research Projects Activity (IARPA), an agency designed to use cutting-edge research and technology to improve intelligence-gathering and analysis. In 2011, IARPA initiated a program, Sirius, to fund the development of “serious” video games that could combat or mitigate what were deemed to be the six most damaging biases: confirmation bias, fundamental attribution error, the bias blind spot (the feeling that one is less biased than the average person), the anchoring effect, the representativeness heuristic, and projection bias (the assumption that everybody else’s thinking is the same as one’s own).
  • Together with collaborators who included staff from Creative Technologies, a company specializing in games and other simulations, and Leidos, a defense, intelligence, and health research company that does a lot of government work, Morewedge devised Missing. Some subjects played the game, which takes about three hours to complete, while others watched a video about cognitive bias. All were tested on bias-mitigation skills before the training, immediately afterward, and then finally after eight to 12 weeks had passed.
  • “The literature on training suggests books and classes are fine entertainment but largely ineffectual. But the game has very large effects. It surprised everyone.”
  • He said he saw the results as supporting the research and insights of Richard Nisbett. “Nisbett’s work was largely written off by the field, the assumption being that training can’t reduce bias.”
  • even the positive results reminded me of something Daniel Kahneman had told me. “Pencil-and-paper doesn’t convince me,” he said. “A test can be given even a couple of years later. But the test cues the test-taker. It reminds him what it’s all about.”
  • Morewedge told me that some tentative real-world scenarios along the lines of Missing have shown “promising results,” but that it’s too soon to talk about them.
  • In the future, I will monitor my thoughts and reactions as best I can
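The law-of-large-numbers point in the baseball-phenom survey above can be checked with a quick simulation. This is an illustrative sketch, not Nisbett’s actual survey or data: the true hitting skill, player count, and at-bat counts are assumed purely for demonstration.

```python
import random

random.seed(42)
TRUE_AVG = 0.270   # assumed true hitting skill, identical for every player
PLAYERS = 200

def batting_avg(at_bats):
    """Simulate a player's batting average over a given number of at bats."""
    hits = sum(random.random() < TRUE_AVG for _ in range(at_bats))
    return hits / at_bats

# Early season (~20 at bats) vs. a full season (~500 at bats).
early = [batting_avg(20) for _ in range(PLAYERS)]
full = [batting_avg(500) for _ in range(PLAYERS)]

# Small samples produce flukey .450+ averages; large samples regress
# to the mean, and the outliers vanish.
print("early .450+:", sum(a >= 0.450 for a in early))
print("full .450+:", sum(a >= 0.450 for a in full))
```

With small samples several players start "hot," yet over a full season no one sustains .450 — exactly the pattern the survey asks students to explain.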
honordearlove

Is This How Discrimination Ends? A New Approach to Implicit Bias - The Atlantic - 0 views

  • “There are a lot of people who are very sincere in their renunciation of prejudice,” she said. “Yet they are vulnerable to habits of mind. Intentions aren’t good enough.”
  • the psychological case for implicit racial bias—the idea, broadly, is that it’s possible to act in prejudicial ways while sincerely rejecting prejudiced ideas. She demonstrated that even if people don’t believe racist stereotypes are true, those stereotypes, once absorbed, can influence people’s behavior without their awareness or intent.
  • While police in many cases maintain that they used appropriate measures to protect lives and their own personal safety, the concept of implicit bias suggests that in these crucial moments, the officers saw these people not as individuals—a gentle father, an unarmed teenager, a 12-year-old child—but as members of a group they had learned to associate with fear.
  • ...8 more annotations...
  • In fact, studies demonstrate bias across nearly every field and for nearly every group of people. If you’re Latino, you’ll get less pain medication than a white patient. If you’re an elderly woman, you’ll receive fewer life-saving interventions than an elderly man. If you are a man being evaluated for a job as a lab manager, you will be given more mentorship, judged as more capable, and offered a higher starting salary than if you were a woman. If you are an obese child, your teacher is more likely to assume you’re less intelligent than if you were slim. If you are a black student, you are more likely to be punished than a white student behaving the same way.
  • Mike Pence, for instance, bristled during the 2016 vice-presidential debate: “Enough of this seeking every opportunity to demean law enforcement broadly by making the accusation of implicit bias whenever tragedy occurs.” And two days after the first presidential debate, in which Hillary Clinton proclaimed the need to address implicit bias, Donald Trump asserted that she was “essentially suggesting that everyone, including our police, are basically racist and prejudiced.”
  • Still other people, particularly those who have been the victims of police violence, also reject implicit bias—on the grounds that there’s nothing implicit about it at all.
  • Bias is woven through culture like a silver cord woven through cloth. In some lights, it’s brightly visible. In others, it’s hard to distinguish. And your position relative to that glinting thread determines whether you see it at all.
  • All of which is to say that while bias in the world is plainly evident, the exact sequence of mental events that cause it is still a roiling question.  Devine, for her part, told me that she is no longer comfortable even calling this phenomenon “implicit bias.” Instead, she prefers “unintentional bias.” The term implicit bias, she said, “has become so broad that it almost has no meaning.”
  • Weeks afterwards, students who had participated noticed bias more in others than did students who hadn’t participated, and they were more likely to label the bias they perceived as wrong. Notably, the impact seemed to last: Two years later, students who took part in a public forum on race were more likely to speak out against bias if they had participated in the training.
  • This hierarchy matters, because the more central a layer is to self-concept, the more resistant it is to change. It’s hard, for instance, to alter whether or not a person values the environment. But if you do manage to shift one of these central layers, Forscher explained, the effect is far-reaching.
  • And if there’s one thing the Madison workshops do truly shift, it is people’s concern that discrimination is a widespread and serious problem. As people become more concerned, the data show, their awareness of bias in the world grows, too.
huffem4

Does Unconscious Bias Training Really Work? - 1 views

  • The first step towards impacting unconscious bias is awareness. We must have an understanding that this issue exists in the first place—no one is exempt from having bias and being prejudiced.
  • In order for unconscious bias training to be effective, it has to be ongoing and long-term
  • It’s essential to look at the linkage between unconscious bias and behaviors. Because unconscious bias is not something we are actively aware of, it in itself is difficult to actually eradicate. If we understand some of the many ways in which our biases seep into our work behaviors, we may be better equipped to improve those behaviors. It is a more effective practice to analyze how unconscious bias can manifest in the workplace when hiring employees, evaluating employee performance and in the overall treatment of employees.
  • ...3 more annotations...
  • Bias can be thought of as a malleable and quickly-adapting entity. It is important to anticipate situations which are likely to lead to bias or have led to discrimination in the past, and create systems to eliminate or lessen the likelihood of these behaviors from occurring.
  • Another aspect of unconscious bias training should include the standardization of company policies, protocol, and procedures.
  • when unconscious bias training is implemented, it is imperative to have measures in place to assess incremental changes and progress. How will you then learn whether the training was successful if you don’t know what point you started at? Data should be collected at several stages of the training intervention, which can ensure the effectiveness of the training.
oliviaodon

Identifying and Avoiding Bias in Research - 0 views

  • Bias can occur in the planning, data collection, analysis, and publication phases of research. Understanding research bias allows readers to critically and independently review the scientific literature and avoid treatments which are suboptimal or potentially harmful. A thorough understanding of bias and how it affects study results is essential for the practice of evidence-based medicine.
  • Bias is not a dichotomous variable. Interpretation of bias cannot be limited to a simple inquisition: is bias present or not? Instead, reviewers of the literature must consider the degree to which bias was prevented by proper study design and implementation. As some degree of bias is nearly always present in a published study, readers must also consider how bias might influence a study's conclusions
  • Chance and confounding can be quantified and/or eliminated through proper study design and data analysis. However, only the most rigorously conducted trials can completely exclude bias as an alternate explanation for an association.
Javier E

You're Not Going to Change Your Mind - The New York Times - 0 views

  • A troubling feature of political disagreement in the United States today is that many issues on which liberals and conservatives hold divergent views are questions not of value but of fact. Is human activity responsible for global warming? Do guns make society safer? Is immigration harmful to the economy? Though undoubtedly complicated, these questions turn on empirical evidence.
  • Unfortunately, people do not always revise their beliefs in light of new information. On the contrary, they often stubbornly maintain their views. Certain disagreements stay entrenched and polarized.
  • A common explanation is confirmation bias
  • ...11 more annotations...
  • the psychological tendency to favor information that confirms our beliefs
  • If this explanation is right, then there is a relatively straightforward solution to political polarization: We need to consciously expose ourselves to evidence that challenges our beliefs to compensate for our inclination to discount it.
  • But what if confirmation bias isn’t the only culprit? It recently struck us that confirmation bias is often conflated with “telling people what they want to hear,” which is actually a distinct phenomenon known as desirability bias, or the tendency to credit information you want to believe.
  • we decided to conduct an experiment that would isolate these biases
  • The results, which we report in a forthcoming paper in the Journal of Experimental Psychology: General, were clear and robust. Those people who received desirable evidence — polls suggesting that their preferred candidate was going to win — took note and incorporated the information into their subsequent belief.
  • In contrast, those people who received undesirable evidence barely changed their belief about which candidate was most likely to win.
  • we observed a general bias toward the desirable evidence.
  • What about confirmation bias? To our surprise, those people who received confirming evidence — polls supporting their prior belief about which candidate was most likely to win — showed no bias in favor of this information.
  • They tended to incorporate this evidence into their subsequent belief to the same extent as those people who had their prior belief disconfirmed. In other words, we observed little to no bias toward the confirming evidence.
  • Our study suggests that political belief polarization may emerge because of people’s conflicting desires, not their conflicting beliefs per se
  • This is rather troubling, as it implies that even if we were to escape from our political echo chambers, it wouldn’t help much. Short of changing what people want to believe, we must find other ways to unify our perceptions of reality.
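The asymmetric updating the authors describe can be sketched in a toy model. This is my illustration, not the paper’s actual model: the learning weights and poll numbers are assumed. Two observers see an identical, roughly balanced stream of polls, but each gives full weight only to desirable results and discounts the rest, so their beliefs diverge.

```python
LEARN_RATE = 0.3  # assumed weight given to desirable evidence
DISCOUNT = 0.1    # assumed weight given to undesirable evidence

def update(belief, poll, wants_a_to_win):
    """Move belief toward the poll, but slower when the poll is unwelcome."""
    desirable = (poll > 0.5) == wants_a_to_win
    weight = LEARN_RATE if desirable else DISCOUNT
    return belief + weight * (poll - belief)

polls = [0.6, 0.4, 0.55, 0.45, 0.6, 0.4]  # mixed, roughly even evidence

a_fan = b_fan = 0.5  # both start undecided about A's chances
for p in polls:
    a_fan = update(a_fan, p, wants_a_to_win=True)
    b_fan = update(b_fan, p, wants_a_to_win=False)

# Identical evidence, diverging beliefs: the A supporter ends above 0.5,
# the B supporter below it.
print(round(a_fan, 3), round(b_fan, 3))
```

Even with evenly split polls, the two observers end up on opposite sides of 50 percent — the polarization-from-desire mechanism the study points to.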
kushnerha

Facebook's Bias Is Built-In, and Bears Watching - The New York Times - 2 views

  • Facebook is the world’s most influential source of news. That’s true according to every available measure of size — the billion-plus people who devour its News Feed every day, the cargo ships of profit it keeps raking in, and the tsunami of online traffic it sends to other news sites.
  • But Facebook has also acquired a more subtle power to shape the wider news business. Across the industry, reporters, editors and media executives now look to Facebook the same way nesting baby chicks look to their engorged mother — as the source of all knowledge and nourishment, the model for how to behave in this scary new-media world. Case in point: The New York Times, among others, recently began an initiative to broadcast live video. Why do you suppose that might be? Yup, the F word. The deal includes payments from Facebook to news outlets, including The Times.
  • Yet few Americans think of Facebook as a powerful media organization, one that can alter events in the real world. When blowhards rant about the mainstream media, they do not usually mean Facebook, the mainstreamiest of all social networks. That’s because Facebook operates under a veneer of empiricism. Many people believe that what you see on Facebook represents some kind of data-mined objective truth unmolested by the subjective attitudes of fair-and-balanced human beings.
  • ...11 more annotations...
  • None of that is true. This week, Facebook rushed to deny a report in Gizmodo that said the team in charge of its “trending” news list routinely suppressed conservative points of view. Last month, Gizmodo also reported that Facebook employees asked Mark Zuckerberg, the social network’s chief executive, if the company had a responsibility to “help prevent President Trump in 2017.” Facebook denied it would ever try to manipulate elections.
  • Even if you believe that Facebook isn’t monkeying with the trending list or actively trying to swing the vote, the reports serve as timely reminders of the ever-increasing potential dangers of Facebook’s hold on the news.
  • The question isn’t whether Facebook has outsize power to shape the world — of course it does, and of course you should worry about that power. If it wanted to, Facebook could try to sway elections, favor certain policies, or just make you feel a certain way about the world, as it once proved it could do in an experiment devised to measure how emotions spread online.
  • There is no evidence Facebook is doing anything so alarming now. The danger is nevertheless real. The biggest worry is that Facebook doesn’t seem to recognize its own power, and doesn’t think of itself as a news organization with a well-developed sense of institutional ethics and responsibility, or even a potential for bias. Neither does its audience, which might believe that Facebook is immune to bias because it is run by computers.
  • That myth should die. It’s true that beyond the Trending box, most of the stories Facebook presents to you are selected by its algorithms, but those algorithms are as infused with bias as any other human editorial decision.
  • “With Facebook, humans are never not involved. Humans are in every step of the process — in terms of what we’re clicking on, who’s shifting the algorithms behind the scenes, what kind of user testing is being done, and the initial training data provided by humans.” Everything you see on Facebook is therefore the product of these people’s expertise and considered judgment, as well as their conscious and unconscious biases — apart from possible malfeasance or potential corruption. It’s often hard to know which, because Facebook’s editorial sensibilities are secret. So are its personalities: Most of the engineers, designers and others who decide what people see on Facebook will remain forever unknown to its audience.
  • Facebook also has an unmistakable corporate ethos and point of view. The company is staffed mostly by wealthy coastal Americans who tend to support Democrats, and it is wholly controlled by a young billionaire who has expressed policy preferences that many people find objectionable.
  • You could argue that none of this is unusual. Many large media outlets are powerful, somewhat opaque, operated for profit, and controlled by wealthy people who aren’t shy about their policy agendas — Bloomberg News, The Washington Post, Fox News and The New York Times, to name a few. But there are some reasons to be even more wary of Facebook’s bias. One is institutional. Many mainstream outlets have a rigorous set of rules and norms about what’s acceptable and what’s not in the news business.
  • Those algorithms could have profound implications for society. For instance, one persistent worry about algorithmic-selected news is that it might reinforce people’s previously held points of view. If News Feed shows news that we’re each likely to Like, it could trap us into echo chambers and contribute to rising political polarization. In a study last year, Facebook’s scientists asserted the echo chamber effect was muted.
  • are Facebook’s engineering decisions subject to ethical review? Nobody knows.
  • The other reason to be wary of Facebook’s bias has to do with sheer size. Ms. Caplan notes that when studying bias in traditional media, scholars try to make comparisons across different news outlets. To determine if The Times is ignoring a certain story unfairly, look at competitors like The Washington Post and The Wall Street Journal. If those outlets are covering a story and The Times isn’t, there could be something amiss about the Times’s news judgment. Such comparative studies are nearly impossible for Facebook. Facebook is personalized, in that what you see on your News Feed is different from what I see on mine, so the only entity in a position to look for systemic bias across all of Facebook is Facebook itself. Even if you could determine the spread of stories across all of Facebook’s readers, what would you compare it to?
ilanaprincilus06

You're more biased than you think - even when you know you're biased | News | The Guardian - 0 views

  • there’s plenty of evidence to suggest that we’re all at least somewhat subject to bias
  • Tell Republicans that some imaginary policy is a Republican one, as the psychologist Geoffrey Cohen did in 2003, and they’re much more likely to support it, even if it runs counter to Republican values. But ask them why they support it, and they’ll deny that party affiliation played a role. (Cohen found something similar for Democrats.)
  • those who saw the names were biased in favour of famous artists. But even though they acknowledged the risk of bias, when asked to assess their own objectivity, they didn’t view their judgments as any more biased as a result.
  • ...7 more annotations...
  • Even when the risk of bias was explicitly pointed out to them, people remained confident that they weren’t susceptible to it
  • “Even when people acknowledge that what they are about to do is biased,” the researchers write, “they still are inclined to see their resulting decisions as objective.”
  • we have a cognitive bias to the effect that we’re uniquely immune to cognitive biases.
  • It turns out the bias also applies to bias. In other words, we’re convinced that we’re better than most at not falling victim to bias.
  • “used a strategy that they thought was biased,” the researchers note, “and thus they probably expected to feel some bias when using it. The absence of that feeling may have made them more confident in their objectivity.”
  • why it’s often better for companies to hire people, or colleges to admit students, using objective checklists, rather than interviews that rely on gut feelings.
  • Bias spares nobody.
Javier E

Our Biased Brains - NYTimes.com - 0 views

  • The human brain seems to be wired so that it categorizes people by race in the first one-fifth of a second after seeing a face
  • Racial bias also begins astonishingly early: Even infants often show a preference for their own racial group. In one study, 3-month-old white infants were shown photos of faces of white adults and black adults; they preferred the faces of whites. For 3-month-old black infants living in Africa, it was the reverse.
  • in evolutionary times we became hard-wired to make instantaneous judgments about whether someone is in our “in group” or not — because that could be lifesaving. A child who didn’t prefer his or her own group might have been at risk of being clubbed to death.
  • I encourage you to test yourself at implicit.harvard.edu. It’s sobering to discover that whatever you believe intellectually, you’re biased about race, gender, age or disability.
  • unconscious racial bias turns up in children as soon as they have the verbal skills to be tested for it, at about age 4. The degree of unconscious bias then seems pretty constant: In tests, this unconscious bias turns out to be roughly the same for a 4- or 6-year-old as for a senior citizen who grew up in more racially oppressive times.
  • Many of these experiments on in-group bias have been conducted around the world, and almost every ethnic group shows a bias favoring its own. One exception: African-Americans.
  • in contrast to other groups, African-Americans do not have an unconscious bias toward their own. From young children to adults, they are essentially neutral and favor neither whites nor blacks.
  • even if we humans have evolved to have a penchant for racial preferences from a very young age, this is not destiny. We can resist the legacy that evolution has bequeathed us.
  • “We wouldn’t have survived if our ancestors hadn’t developed bodies that store sugar and fat,” Banaji says. “What made them survive is what kills us.” Yet we fight the battle of the bulge and sometimes win — and, likewise, we can resist a predisposition for bias against other groups.
  • Deep friendships, especially romantic relationships with someone of another race, also seem to mute bias
katedriscoll

Confirmation bias - Catalog of Bias - 0 views

  • Confirmation bias occurs when an individual looks for and uses the information to support their own ideas or beliefs. It also means that information not supporting their ideas or beliefs is disregarded. Confirmation bias often happens when we want certain ideas to be true. This leads individuals to stop gathering information when the retrieved evidence confirms their own viewpoints, which can lead to preconceived opinions (prejudices) that are not based on reason or factual knowledge. Individuals then pick out the bits of information that confirm their prejudices. Confirmation bias has a long history. In 1620, Francis Bacon described confirmation bias as: “Once a man’s understanding has settled on something (either because it is an accepted belief or because it pleases him), it draws everything else also to support and agree with it. And if it encounters a larger number of more powerful countervailing examples, it either fails to notice them, or disregards them, or makes fine distinctions to dismiss and reject them, and all this with much dangerous prejudice, to preserve the authority of its first Conceptions.” (Bacon 1620)
  • The impact of confirmation bias can be at the level of the individual all the way up to institution level. DuBroff showed that confirmation bias influenced expert guidelines on cholesterol and was highly prevalent when conflicts of interests were present (DuBroff 2017). He found that confirmation bias occurred due to a failure to incorporate evidence, or through misrepresentation of the evidence, which had the potential to skew guideline recommendations
Javier E

Opinion | Bias Is a Big Problem. But So Is 'Noise.' - The New York Times - 1 views

  • The word “bias” commonly appears in conversations about mistaken judgments and unfortunate decisions. We use it when there is discrimination, for instance against women or in favor of Ivy League graduates
  • the meaning of the word is broader: A bias is any predictable error that inclines your judgment in a particular direction. For instance, we speak of bias when forecasts of sales are consistently optimistic or investment decisions overly cautious.
  • Society has devoted a lot of attention to the problem of bias — and rightly so
  • when it comes to mistaken judgments and unfortunate decisions, there is another type of error that attracts far less attention: noise.
  • To see the difference between bias and noise, consider your bathroom scale. If on average the readings it gives are too high (or too low), the scale is biased
  • It is hard to escape the conclusion that sentencing is in part a lottery, because the punishment can vary by many years depending on which judge is assigned to the case and on the judge’s state of mind on that day. The judicial system is unacceptably noisy.
  • While bias is the average of errors, noise is their variability.
  • Although it is often ignored, noise is a large source of malfunction in society.
  • The average difference between the sentences that two randomly chosen judges gave for the same crime was more than 3.5 years. Considering that the mean sentence was seven years, that was a disconcerting amount of noise.
  • If it shows different readings when you step on it several times in quick succession, the scale is noisy.
  • How much of a difference would you expect to find between the premium values that two competent underwriters assigned to the same risk?
  • Executives in the insurance company said they expected about a 10 percent difference.
  • But the typical difference we found between two underwriters was an astonishing 55 percent of their average premium — more than five times as large as the executives had expected.
  • Many other studies demonstrate noise in professional judgments. Radiologists disagree on their readings of images and cardiologists on their surgery decisions
  • Wherever there is judgment, there is noise — and more of it than you think.
  • Noise causes error, as does bias, but the two kinds of error are separate and independent.
  • A company’s hiring decisions could be unbiased overall if some of its recruiters favor men and others favor women. However, its hiring decisions would be noisy, and the company would make many bad choices
  • Where does noise come from?
  • There is much evidence that irrelevant circumstances can affect judgments.
  • for instance, a judge’s mood, fatigue and even the weather can all have modest but detectable effects on judicial decisions.
  • people can have different general tendencies. Judges often vary in the severity of the sentences they mete out: There are “hanging” judges and lenient ones.
  • People can have not only different general tendencies (say, whether they are harsh or lenient) but also different patterns of assessment (say, which types of cases they believe merit being harsh or lenient about).
  • Underwriters differ in their views of what is risky, and doctors in their views of which ailments require treatment.
  • Once you become aware of noise, you can look for ways to reduce it.
  • independent judgments from a number of people can be averaged (a frequent practice in forecasting)
  • Guidelines, such as those often used in medicine, can help professionals reach better and more uniform decisions
  • imposing structure and discipline in interviews and other forms of assessment tends to improve judgments of job candidates.
  • No noise-reduction techniques will be deployed, however, if we do not first recognize the existence of noise.
  • Organizations and institutions, public and private, will make better decisions if they take noise seriously.
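
The bathroom-scale analogy above can be put in numbers. A minimal sketch (the readings are invented for illustration, not data from the article): bias is the mean of the errors, noise is their variability, and averaging independent judgments shrinks only the noise.

```python
import statistics

def bias_and_noise(readings, true_value):
    """Bias is the mean of the errors; noise is their variability (std dev)."""
    errors = [r - true_value for r in readings]
    return statistics.mean(errors), statistics.pstdev(errors)

# A scale that reads high on average (biased) and inconsistently (noisy).
true_weight = 70.0
readings = [71.0, 73.0, 70.5, 72.5, 73.0]
b, n = bias_and_noise(readings, true_weight)
print(f"bias = {b:+.2f} kg, noise = {n:.2f} kg")  # bias = +2.00 kg, noise = 1.05 kg

# Averaging independent readings shrinks noise (roughly by 1/sqrt(N)),
# but it leaves a shared bias untouched: the average here is still 2 kg high.
avg = statistics.mean(readings)
print(f"average of {len(readings)} readings: {avg:.2f} kg")
```

This is why averaging judgments (the forecasting practice mentioned above) is a noise-reduction technique rather than a bias-reduction one: many noisy-but-unbiased judges average out toward the truth, while judges who share the same bias do not.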
caelengrubb

How Cognitive Bias Affects Your Business - 0 views

  • Human beings often act in irrational and unexpected ways when it comes to business decisions, money, and finance.
  • Behavioral finance tries to explain the difference between what economic theory predicts people will do and what they actually do in the heat of the moment. 
  • There are two main types of biases that people commit causing them to deviate from rational decision-making: cognitive and emotional.
  • Cognitive errors result from incomplete information or the inability to analyze the information that is available. These cognitive errors can be classified as either belief perseverance or processing errors
  • Processing errors occur when an individual fails to manage and organize information properly, which can be due in part to the mental effort required to compute and analyze data.
  • Conservatism bias, where people emphasize original, pre-existing information over new data.
  • Base rate neglect is the opposite effect, whereby people put too little emphasis on the original information. 
  • Confirmation bias, where people seek information that affirms existing beliefs while discounting or discarding information that might contradict them.
  • Anchoring and adjustment happens when somebody fixates on a target number, such as the result of a calculation or valuation.
  • Hindsight bias occurs when people perceive actual outcomes as reasonable and expected, but only after the fact.
  • Sample size neglect is an error made when people infer too much from a too-small sample size.
  • Mental accounting is when people earmark certain funds for certain goals and keep them separate. When this happens, the risk and reward of projects undertaken to achieve these goals are not considered as an overall portfolio and the effect of one on another is ignored.
  • Availability bias, or recency bias skews perceived future probabilities based on memorable past events
  • Framing bias is when a person will process the same information differently depending on how it is presented and received.
  • Cognitive errors in the way people process and analyze information can lead them to make irrational decisions which can negatively impact business or investing decisions
  • These information processing errors could have arisen to help primitive humans survive in a time before money or finance came into existence.
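
Base rate neglect, listed above, is easiest to see worked through Bayes’ rule. In this sketch the prevalence, sensitivity, and false-positive rate are invented illustration numbers, not figures from the article:

```python
def posterior(prior, sensitivity, false_positive_rate):
    """P(condition | positive test), via Bayes' rule."""
    p_positive = sensitivity * prior + false_positive_rate * (1 - prior)
    return sensitivity * prior / p_positive

# Neglecting the 1% base rate, intuition often answers "about 99%".
p = posterior(prior=0.01, sensitivity=0.99, false_positive_rate=0.05)
print(f"P(condition | positive) = {p:.1%}")  # about 16.7%
```

Because the condition is rare, false positives from the healthy 99% of the population swamp the true positives, so the post-test probability is far below the test’s headline accuracy.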
huffem4

The Zero-Sum Bias: When People Think that Everything is a Competition - Effectiviology - 1 views

  • The zero-sum bias is a cognitive bias that causes people to mistakenly view certain situations as being zero-sum, meaning that they incorrectly believe that one party’s gains are directly balanced by other parties’ losses.
  • This bias can shape people’s thinking and behavior in a variety of situations, both on an individual scale as well as on a societal one, so it’s important to understand it.
  • this bias encourages belief in an antagonistic nature of social relationships. It can generally be said to affect people on two scales. Individual scale: the zero-sum bias causes people to mistakenly assume that there is intra-group competition for a certain resource, between them and other members of a certain social group. Group scale: the zero-sum bias causes people to mistakenly assume that there is inter-group competition for a certain resource, between their group and other groups.
  • the issue with the zero-sum bias is that it causes people to believe that situations are zero-sum, when that’s not actually the case.
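
The distinction at the heart of the bias can be shown with a toy numerical contrast (all figures invented for illustration): in a genuinely zero-sum split, gains must sum to a constant, while in a positive-sum trade both parties can come out ahead.

```python
# Zero-sum: a fixed pot split between two parties, so one side's gain is
# exactly the other side's loss.
pot = 100
alice_share, bob_share = 60, 40
assert alice_share + bob_share == pot  # shares always sum to the same constant

# Positive-sum: a trade in which each side values what it receives more
# than what it gives up. (Valuations are made-up illustration numbers.)
alice_values = {"book": 5, "pen": 9}   # Alice owns the book, wants the pen
bob_values = {"book": 8, "pen": 3}     # Bob owns the pen, wants the book
alice_gain = alice_values["pen"] - alice_values["book"]   # +4
bob_gain = bob_values["book"] - bob_values["pen"]         # +5
print(alice_gain, bob_gain)  # both positive: value is created, not just moved
```

The zero-sum bias amounts to modeling the second situation as if it were the first.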
peterconnelly

Why You Need To Beat Confirmation Bias To Win Your Customers - 0 views

  • Confirmation bias is the tendency to interpret information in a way that is always consistent with existing beliefs. Simply, it occurs when someone views information in a positive, affirming light, even though the information could be telling a drastically different story.
  • While confirmation bias can often be chalked up to human nature, in a business setting, failure to adequately evaluate and respond to information can be a legitimate issue for a company, and particularly its marketing team.
  • This is where confirmation bias can be dangerous because it’s easy for brands to assume customers view the company through the same lens they do and have a similar opinion of the company, but this isn’t necessarily true.
  • To beat confirmation bias, it’s vital for brands to face reality when measuring the success of their campaigns and gauging customer brand perception.
  • As a marketer, it can be easy to fall victim to simply measuring customer behavioural data (like click-through rates, ad engagement, unsubscribes, etc.). However, the best marketing teams lean on an outside-in perspective to improve customer experiences and create a better brand reputation.
  • Additionally, marketing teams would be wise to connect with potential customers who did not convert. They will be most able to point out the pain points in a marketing campaign that are dissuading buyers. It’s just as important to ask consumers why they didn’t interact with a brand as it is to ask why they do.
  • Obtaining these insights allows marketers to gauge specifics into how their competitors are beating them out, and conversely, how they can improve their product and boost customer retention.
  • At the end of the day, customers want to know that their feedback is valued and has the power to drive change. Companies that choose to neglect customer feedback are prime examples of the dangers of confirmation bias.
  • It doesn’t matter how marketing teams or their company define success, it’s up to the customer.
  • For example, in industries like video tech and security, where the community is extremely tight knit, marketing teams must have a deep understanding of their audience’s business needs to have any chance of selling to them. Brands that understand consumer perception and needs, will be able to personalize messages to their target audience and create a more positive customer experience. By gauging and adapting to direct feedback, marketing teams can avoid the dangers of confirmation bias, and make wholesale changes that will turn customers into brand champions.
katedriscoll

Avoiding Psychological Bias in Decision Making - From MindTools.com - 0 views

  • In this scenario, your decision was affected by confirmation bias. With this, you interpret market information in a way that confirms your preconceptions – instead of seeing it objectively – and you make wrong decisions as a result. Confirmation bias is one of many psychological biases to which we're all susceptible when we make decisions. In this article, we'll look at common types of bias, and we'll outline what you can do to avoid them.
  • Psychologists Daniel Kahneman, Paul Slovic, and Amos Tversky introduced the concept of psychological bias in the early 1970s. They published their findings in their 1982 book, "Judgment Under Uncertainty." They explained that psychological bias – also known as cognitive bias – is the tendency to make decisions or take action in an illogical way. For example, you might subconsciously make selective use of data, or you might feel pressured to make a decision by powerful colleagues. Psychological bias is the opposite of common sense and clear, measured judgment. It can lead to missed opportunities and poor decision making.
  • Below, we outline five psychological biases that are common in business decision making. We also look at how you can overcome them, and thereby make better decisions.
caelengrubb

Believing in Overcoming Cognitive Biases | Journal of Ethics | American Medical Associa... - 0 views

  • Cognitive biases contribute significantly to diagnostic and treatment errors
  • A 2016 review of their roles in decision making lists 4 domains of concern for physicians: gathering evidence, interpreting evidence, taking action, and evaluating decisions
  • Confirmation bias is the selective gathering and interpretation of evidence consistent with current beliefs and the neglect of evidence that contradicts them.
  • It can occur when a physician refuses to consider alternative diagnoses once an initial diagnosis has been established, despite contradicting data, such as lab results. This bias leads physicians to see what they want to see
  • Anchoring bias is closely related to confirmation bias and comes into play when interpreting evidence. It refers to physicians’ practices of prioritizing information and data that support their initial impressions, even when first impressions are wrong
  • When physicians move from deliberation to action, they are sometimes swayed by emotional reactions rather than rational deliberation about risks and benefits. This is called the affect heuristic, and, while heuristics can often serve as efficient approaches to problem solving, they can sometimes lead to bias
  • Further down the treatment pathway, outcomes bias can come into play. This bias refers to the practice of believing that good or bad results are always attributable to prior decisions, even when there is no valid reason to do so
  • The dual-process theory, a cognitive model of reasoning, can be particularly relevant in matters of clinical decision making
  • This theory is based on the argument that we use 2 different cognitive systems, intuitive and analytical, when reasoning. The former is quick and uses information that is readily available; the latter is slower and more deliberate.
  • Consideration should be given to the difficulty physicians face in employing analytical thinking exclusively. Beyond constraints of time, information, and resources, many physicians are also likely to be sleep deprived, work in an environment full of distractions, and be required to respond quickly while managing heavy cognitive loads
  • Simply increasing physicians’ familiarity with the many types of cognitive biases—and how to avoid them—may be one of the best strategies to decrease bias-related errors
  • The same review suggests that cognitive forcing strategies may also have some success in improving diagnostic outcomes
  • Afterwards, the resident physicians were debriefed on both case-specific details and on cognitive forcing strategies, interviewed, and asked to complete a written survey. The results suggested that resident physicians further along in their training (ie, postgraduate year three) gained more awareness of cognitive strategies than resident physicians in earlier years of training, suggesting that this tool could be more useful after a certain level of training has been completed
  • A 2013 study examined the effect of a 3-part, 1-year curriculum on recognition and knowledge of cognitive biases and debiasing strategies in second-year residents
  • Cognitive biases in clinical practice have a significant impact on care, often in negative ways. They sometimes manifest as physicians seeing what they want to see rather than what is actually there. Or they come into play when physicians make snap decisions and then prioritize evidence that supports their conclusions, as opposed to drawing conclusions from evidence
  • Fortunately, cognitive psychology provides insight into how to prevent biases. Guided reflection and cognitive forcing strategies deflect bias through close examination of our own thinking processes.
  • During medical education and consistently thereafter, we must provide physicians with a full appreciation of the cost of biases and the potential benefits of combatting them.
pier-paolo

Opinion | The Paradox of Disclosure - The New York Times - 0 views

  • A popular remedy for a conflict of interest is disclosure — informing the buyer (or the patient, etc.) of the potential bias of the seller (or the doctor, etc.)
  • disclosure often has the opposite of its intended effect, not only increasing bias in advisers but also making advisees more likely to follow biased advice.
  • But my research has found that people are still more likely to follow this advice because the disclosure creates increased pressure to follow the adviser’s recommendation.
  • For example, surgeons are more likely to recommend surgery than non-surgeons. Radiation-oncologists recommend radiation more than other physicians. This is known as specialty bias.
  • patients with localized prostate cancer (a condition that has multiple effective treatment options) who heard their surgeon disclose his or her specialty bias were nearly three times more likely to have surgery than those patients who did not hear their surgeon reveal such a bias.
  • To be sure, physicians who disclose a financial conflict of interest or a specialty bias do not necessarily give poor advice.
  • When bias is unavoidable, as with specialty bias, options such as patient educational materials could alert patients to this problem without hearing it directly from the physician. Another solution could be multidisciplinary treatment consultations, in which patients meet multiple specialists at the same time.
  • Consumers should be aware of their reactions to disclosure and take time out to reconsider their options and seek second opinions. And advisers and policy makers must understand the potential unintended consequences when using disclosure as a solution to manage bias.
Javier E

Research Shows That the Smarter People Are, the More Susceptible They Are to Cognitive ... - 0 views

  • While philosophers, economists, and social scientists had assumed for centuries that human beings are rational agents—reason was our Promethean gift—Kahneman, the late Amos Tversky, and others, including Shane Frederick (who developed the bat-and-ball question), demonstrated that we’re not nearly as rational as we like to believe.
  • When people face an uncertain situation, they don’t carefully evaluate the information or look up relevant statistics. Instead, their decisions depend on a long list of mental shortcuts, which often lead them to make foolish decisions.
  • in many instances, smarter people are more vulnerable to these thinking errors. Although we assume that intelligence is a buffer against bias—that’s why those with higher S.A.T. scores think they are less prone to these universal thinking mistakes—it can actually be a subtle curse.
  • they wanted to understand how these biases correlated with human intelligence.
  • self-awareness was not particularly useful: as the scientists note, “people who were aware of their own biases were not better able to overcome them.”
  • Perhaps our most dangerous bias is that we naturally assume that everyone else is more susceptible to thinking errors, a tendency known as the “bias blind spot.”
  • This “meta-bias” is rooted in our ability to spot systematic mistakes in the decisions of others—we excel at noticing the flaws of friends—and inability to spot those same mistakes in ourselves.
  • it applies to every single bias under consideration, from anchoring to so-called “framing effects.” In each instance, we readily forgive our own minds but look harshly upon the minds of other people.
  • intelligence seems to make things worse.
  • the driving forces behind biases—the root causes of our irrationality—are largely unconscious, which means they remain invisible to self-analysis and impermeable to intelligence. In fact, introspection can actually compound the error, blinding us to those primal processes responsible for many of our everyday failings. We spin eloquent stories, but these stories miss the point. The more we attempt to know ourselves, the less we actually understand.
Javier E

The Conservative War on Liberal Media Has a Long History - Nicole Hemmer - The Atlantic - 0 views

  • Ailes made conservative news popular and profitable, but he was not the first to mingle partisanship with news. The twinned concepts of balance and bias were not his legacy but his inheritance. Long before Fox News, before Ailes and Rush Limbaugh and Sean Hannity, there was a conservative media complex in the United States refining a theory of liberal media bias.
  • The idea of “fair and balanced” partisan media has its roots in the 1940s and 1950s. Human Events, the right-wing newsweekly founded in 1944, was dedicated to publishing the “facts” other outlets overlooked.
  • By the early 1960s, Human Events arrived at this formulation of its mission: In reporting the news, Human Events is objective; it aims for accurate representation of the facts. But it is not impartial. It looks at events through eyes that are biased in favor of limited constitutional government, local self-government, private enterprise, and individual freedom.
  • In distinguishing between objectivity and impartiality, Human Events’ editors created a space where “bias” was an appropriate journalistic value, one that could work in tandem with objectivity.
  • two events in the early 1960s convinced the right that creating conservative media wasn’t enough to achieve balance. Conservatives would also have to discredit existing media.
  • Conservative discontent with the FCC focused on the Fairness Doctrine
  • Conservatives felt the Fairness Doctrine unfairly tilted the playing field against them. Though devised to encourage controversial broadcasting, in practice the doctrine often led broadcasters to avoid controversy so they wouldn’t have to give away free airtime. To conservatives, avoiding controversy inevitably meant silencing right-wing voices.
  • the right repeatedly challenged the central assumptions the FCC—and Americans more broadly—made about journalism. For much of the 20th century, journalists cleaved to the idea of objectivity. Opinion and analysis had their place, but that place was distinct and separate from the news. Conservative broadcasts, on the other hand, were by their very nature opinion. Fairness dictated these partisan broadcasters provide airtime for a response.
  • Conservatives saw the media landscape differently. They viewed objectivity as a mask concealing entrenched liberal bias, hiding the slanted reporting that dominated American media. Because of this, the right believed fairness did not require a response to conservative broadcasts; conservative broadcasts were the response. Unable to bring the FCC around to their position, conservatives increasingly saw the commission as a powerful government agency dedicated to maintaining media’s liberal tilt.
  • In calling coverage of Goldwater “unfounded in fact,” Manion was making another argument to which conservatives anchored their charges of liberal bias: Established media did not just slant the news—they fabricated it. And if established media couldn’t be counted on for truth, the argument went, then surely they should be required to offer both sides of the argument. In the years that followed, conservatives began an active campaign against liberal bias
  • The combined forces of the administration and its conservative media-research wing had an effect. By 1971 CBS Radio had launched Spectrum, a debate show featuring conservatives like Stan Evans, James Kilpatrick, and Phyllis Schlafly. That same year 60 Minutes pitted conservative Kilpatrick against liberal Nicholas von Hoffman in a regular segment called “Point/Counterpoint.” By then, even the publisher of Human Events, in the midst of selling his paper as an alternative to liberal media, had to admit that conservatives were popping up all over established media—even the editorial pages of “that holy house organ of Liberalism—the New York Times.”
  • So balance and bias became part of the American news diet long before Ailes entered the conservative media game. Why does that matter? It makes Ailes’s successes at Fox News far more understandable—and far less Ailes-centric. By the time Ailes entered the game, the American right had spent a generation seeking out conservative alternatives to the “liberal media,” and America’s news media was already in the midst of a revolution that made Fox News possible.
katedriscoll

Confirmation Bias | Simply Psychology - 0 views

  • Confirmation Bias is the tendency to look for information that supports, rather than rejects, one’s preconceptions, typically by interpreting evidence to confirm existing beliefs while rejecting or ignoring any conflicting data (American Psychological Association).
  • experiment by Peter Wason (1960) in which the subjects were to find the experimenter’s rule for sequencing numbers. Its results showed that the subjects chose responses that supported their hypotheses while rejecting contradictory evidence, and even though their hypotheses were not correct, they became confident in them quickly (Gray, 2010, p. 356). Though such evidence of the confirmation bias has appeared in psychological literature throughout history, the term ‘confirmation bias’ was first used in a 1977 paper detailing an experimental study on the topic (Mynatt, Doherty, & Tweney, 1977).
  • This type of confirmation bias explains people’s search for evidence in a one-sided way to support their hypotheses or theories. Experiments have shown that people pose tests/questions designed to yield “yes” if their favored hypothesis is true, and ignore alternative hypotheses that are likely to give the same result. This is also known as the congruence heuristic (Baron, 2000, pp. 162-64). Though the preference for affirmative questions itself may not be a bias, experiments have shown that congruence bias does exist.
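
Wason’s sequencing task can be sketched as a toy simulation (an illustration of the logic, not Wason’s actual protocol). Here the experimenter’s hidden rule is simply “any increasing triple,” and a subject who tests only triples fitting their narrower “increase by 2” hypothesis never receives a disconfirming answer:

```python
def experimenter_rule(triple):
    """The hidden rule: any strictly increasing triple of numbers."""
    a, b, c = triple
    return a < b < c

# Confirming tests only: every triple fits the subject's "increase by 2"
# hypothesis, so every answer comes back "yes" and confidence grows.
confirming_tests = [(2, 4, 6), (10, 12, 14), (1, 3, 5)]
print([experimenter_rule(t) for t in confirming_tests])  # [True, True, True]

# A disconfirming test: (1, 2, 50) violates the "increase by 2" hypothesis
# yet still satisfies the hidden rule, exposing the hypothesis as too narrow.
print(experimenter_rule((1, 2, 50)))  # True
```

The only way to discover that the hypothesis is wrong is to run tests designed to fail it, which is exactly the kind of test confirmation bias steers people away from.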