
TOK Friends: Group items tagged cognitive-bias


Javier E

Cognitive Biases and the Human Brain - The Atlantic - 1 views

  • Present bias shows up not just in experiments, of course, but in the real world. Especially in the United States, people egregiously undersave for retirement—even when they make enough money to not spend their whole paycheck on expenses, and even when they work for a company that will kick in additional funds to retirement plans when they contribute.
  • When people hear the word bias, many if not most will think of either racial prejudice or news organizations that slant their coverage to favor one political position over another. Present bias, by contrast, is an example of cognitive bias—the collection of faulty ways of thinking that is apparently hardwired into the human brain. The collection is large. Wikipedia’s “List of cognitive biases” contains 185 entries, from actor-observer bias (“the tendency for explanations of other individuals’ behaviors to overemphasize the influence of their personality and underemphasize the influence of their situation … and for explanations of one’s own behaviors to do the opposite”) to the Zeigarnik effect (“uncompleted or interrupted tasks are remembered better than completed ones”).
  • If I had to single out a particular bias as the most pervasive and damaging, it would probably be confirmation bias. That’s the effect that leads us to look for evidence confirming what we already think or suspect, to view facts and ideas we encounter as further confirmation, and to discount or ignore any piece of evidence that seems to support an alternate view
  • Confirmation bias shows up most blatantly in our current political divide, where each side seems unable to allow that the other side is right about anything.
  • The whole idea of cognitive biases and faulty heuristics—the shortcuts and rules of thumb by which we make judgments and predictions—was more or less invented in the 1970s by Amos Tversky and Daniel Kahneman
  • Tversky died in 1996. Kahneman won the 2002 Nobel Prize in Economics for the work the two men did together, which he summarized in his 2011 best seller, Thinking, Fast and Slow. Another best seller, last year’s The Undoing Project, by Michael Lewis, tells the story of the sometimes contentious collaboration between Tversky and Kahneman
  • Another key figure in the field is the University of Chicago economist Richard Thaler. One of the biases he’s most linked with is the endowment effect, which leads us to place an irrationally high value on our possessions.
  • In an experiment conducted by Thaler, Kahneman, and Jack L. Knetsch, half the participants were given a mug and then asked how much they would sell it for. The average answer was $5.78. The rest of the group said they would spend, on average, $2.21 for the same mug. This flew in the face of classic economic theory, which says that at a given time and among a certain population, an item has a market value that does not depend on whether one owns it or not. Thaler won the 2017 Nobel Prize in Economics.
  • “The question that is most often asked about cognitive illusions is whether they can be overcome. The message … is not encouraging.”
  • that’s not so easy in the real world, when we’re dealing with people and situations rather than lines. “Unfortunately, this sensible procedure is least likely to be applied when it is needed most,” Kahneman writes. “We would all like to have a warning bell that rings loudly whenever we are about to make a serious error, but no such bell is available.”
  • Kahneman and others draw an analogy based on an understanding of the Müller-Lyer illusion, two parallel lines with arrows at each end. One line’s arrows point in; the other line’s arrows point out. Because of the direction of the arrows, the latter line appears shorter than the former, but in fact the two lines are the same length.
  • At least with the optical illusion, our slow-thinking, analytic mind—what Kahneman calls System 2—will recognize a Müller-Lyer situation and convince itself not to trust the fast-twitch System 1’s perception
  • Because biases appear to be so hardwired and inalterable, most of the attention paid to countering them hasn’t dealt with the problematic thoughts, judgments, or predictions themselves
  • Is it really impossible, however, to shed or significantly mitigate one’s biases? Some studies have tentatively answered that question in the affirmative.
  • what if the person undergoing the de-biasing strategies was highly motivated and self-selected? In other words, what if it was me?
  • I met with Kahneman
  • Over an apple pastry and tea with milk, he told me, “Temperament has a lot to do with my position. You won’t find anyone more pessimistic than I am.”
  • “I see the picture as unequal lines,” he said. “The goal is not to trust what I think I see. To understand that I shouldn’t believe my lying eyes.” That’s doable with the optical illusion, he said, but extremely difficult with real-world cognitive biases.
  • In this context, his pessimism relates, first, to the impossibility of effecting any changes to System 1—the quick-thinking part of our brain and the one that makes mistaken judgments tantamount to the Müller-Lyer line illusion
  • The most effective check against them, as Kahneman says, is from the outside: Others can perceive our errors more readily than we can.
  • “slow-thinking organizations,” as he puts it, can institute policies that include the monitoring of individual decisions and predictions. They can also require procedures such as checklists and “premortems.”
  • A premortem attempts to counter optimism bias by requiring team members to imagine that a project has gone very, very badly and write a sentence or two describing how that happened. Conducting this exercise, it turns out, helps people think ahead.
  • “My position is that none of these things have any effect on System 1,” Kahneman said. “You can’t improve intuition. Perhaps, with very long-term training, lots of talk, and exposure to behavioral economics, what you can do is cue reasoning, so you can engage System 2 to follow rules. Unfortunately, the world doesn’t provide cues. And for most people, in the heat of argument the rules go out the window.”
  • Kahneman describes an even earlier article by Richard Nisbett, the University of Michigan psychologist, that showed subjects’ disinclination to believe statistical and other general evidence, basing their judgments instead on individual examples and vivid anecdotes. (This bias is known as base-rate neglect.)
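To make base-rate neglect concrete, here is a minimal worked example in Python. The scenario and numbers are my own illustration, not from Nisbett's article: a single vivid result (a positive test) feels close to proof, but Bayes' rule with the base rate included gives a much smaller probability.

```python
# Illustrative numbers, not from the article: a condition with a 1% base
# rate and a test that is 90% sensitive with a 10% false-positive rate.
base_rate = 0.01       # P(condition)
sensitivity = 0.90     # P(positive | condition)
false_positive = 0.10  # P(positive | no condition)

# Bayes' rule: P(condition | positive test)
p_positive = sensitivity * base_rate + false_positive * (1 - base_rate)
posterior = sensitivity * base_rate / p_positive

# The vivid individual result suggests ~90%; the base rate drags it to ~8%.
print(f"P(condition | positive test) = {posterior:.1%}")  # 8.3%
```

Ignoring the 1 percent base rate and trusting the vivid anecdote (the test result) overstates the probability by an order of magnitude.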
  • over the years, Nisbett had come to emphasize in his research and thinking the possibility of training people to overcome or avoid a number of pitfalls, including base-rate neglect, fundamental attribution error, and the sunk-cost fallacy.
  • Nisbett’s second-favorite example is that economists, who have absorbed the lessons of the sunk-cost fallacy, routinely walk out of bad movies and leave bad restaurant meals uneaten.
  • When Nisbett has to give an example of his approach, he usually brings up the baseball-phenom survey. This involved telephoning University of Michigan students on the pretense of conducting a poll about sports, and asking them why there are always several Major League batters with .450 batting averages early in a season, yet no player has ever finished a season with an average that high.
  • About half give the right answer: the law of large numbers, which holds that outlier results are much more frequent when the sample size (at bats, in this case) is small. Over the course of the season, as the number of at bats increases, regression to the mean is inevitable. (A minimal simulation below illustrates the point.)
  • When Nisbett asks the same question of students who have completed the statistics course, about 70 percent give the right answer. He believes this result shows, pace Kahneman, that the law of large numbers can be absorbed into System 2—and maybe into System 1 as well, even when there are minimal cues.
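A quick simulation shows the law of large numbers doing the work here. The numbers are assumptions for illustration (a true .300 hitter, a .450 cutoff), not data from Nisbett's survey:

```python
import random

random.seed(0)
TRUE_AVG = 0.300  # assumed true skill of a good hitter
TRIALS = 10_000

def batting_avg(at_bats):
    hits = sum(random.random() < TRUE_AVG for _ in range(at_bats))
    return hits / at_bats

early = sum(batting_avg(20) >= 0.450 for _ in range(TRIALS)) / TRIALS
full = sum(batting_avg(500) >= 0.450 for _ in range(TRIALS)) / TRIALS

print(f"P(avg >= .450 over  20 at-bats): {early:.1%}")  # roughly 10%
print(f"P(avg >= .450 over 500 at-bats): {full:.2%}")   # essentially 0%
```

Outlier averages are common over 20 at-bats and vanish over 500, which is exactly the answer the survey was fishing for.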
  • “We’ve tested Michigan students over four years, and they show a huge increase in ability to solve problems. Graduate students in psychology also show a huge gain.”
  • “I know from my own research on teaching people how to reason statistically that just a few examples in two or three domains are sufficient to improve people’s reasoning for an indefinitely large number of events.”
  • Nisbett suggested another factor: “You and Amos specialized in hard problems for which you were drawn to the wrong answer. I began to study easy problems, which you guys would never get wrong but untutored people routinely do … Then you can look at the effects of instruction on such easy problems, which turn out to be huge.”
  • Nisbett suggested that I take “Mindware: Critical Thinking for the Information Age,” an online Coursera course in which he goes over what he considers the most effective de-biasing skills and concepts. Then, to see how much I had learned, I would take a survey he gives to Michigan undergraduates. So I did.
  • The course consists of eight lessons by Nisbett—who comes across on-screen as the authoritative but approachable psych professor we all would like to have had—interspersed with some graphics and quizzes. I recommend it. He explains the availability heuristic this way: “People are surprised that suicides outnumber homicides, and drownings outnumber deaths by fire. People always think crime is increasing” even if it’s not.
  • When I finished the course, Nisbett sent me the survey he and colleagues administer to Michigan undergrads
  • It contains a few dozen problems meant to measure the subjects’ resistance to cognitive biases
  • I got it right. Indeed, when I emailed my completed test, Nisbett replied, “My guess is that very few if any UM seniors did as well as you. I’m sure at least some psych students, at least after 2 years in school, did as well. But note that you came fairly close to a perfect score.”
  • Nevertheless, I did not feel that reading Mindware and taking the Coursera course had necessarily rid me of my biases
  • For his part, Nisbett insisted that the results were meaningful. “If you’re doing better in a testing context,” he told me, “you’ll jolly well be doing better in the real world.”
  • The New York–based NeuroLeadership Institute offers organizations and individuals a variety of training sessions, webinars, and conferences that promise, among other things, to use brain science to teach participants to counter bias. This year’s two-day summit will be held in New York next month; for $2,845, you could learn, for example, “why are our brains so bad at thinking about the future, and how do we do it better?”
  • Philip E. Tetlock, a professor at the University of Pennsylvania’s Wharton School, and his wife and research partner, Barbara Mellers, have for years been studying what they call “superforecasters”: people who manage to sidestep cognitive biases and predict future events with far more accuracy than the pundits
  • One of the most important ingredients is what Tetlock calls “the outside view.” The inside view is a product of fundamental attribution error, base-rate neglect, and other biases that are constantly cajoling us into resting our judgments and predictions on good or vivid stories instead of on data and statistics
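One simple way to operationalize the outside view, sketched here under my own assumptions rather than as Tetlock's actual procedure, is to anchor on the reference-class base rate and give the case-specific story only partial weight:

```python
def outside_view_forecast(base_rate, inside_estimate, weight_on_inside=0.3):
    """Blend a reference-class base rate with a case-specific estimate.

    Illustrative helper, not Tetlock's actual method: the base rate is
    the anchor, and the vivid inside-view story gets only partial weight.
    """
    return (1 - weight_on_inside) * base_rate + weight_on_inside * inside_estimate

# Example: ventures of this kind succeed ~10% of the time (base rate),
# but the specific story in front of us feels like a 60% bet (inside view).
print(outside_view_forecast(0.10, 0.60))  # 0.25, pulled back toward the data
```

The design choice is the point: the statistics, not the story, set the starting value, so fundamental attribution error and base-rate neglect have less room to operate.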
  • Most promising are a handful of video games. Their genesis was in the Iraq War
  • In 2006, seeking to prevent another mistake of that magnitude, the U.S. government created the Intelligence Advanced Research Projects Activity (IARPA), an agency designed to use cutting-edge research and technology to improve intelligence-gathering and analysis. In 2011, IARPA initiated a program, Sirius, to fund the development of “serious” video games that could combat or mitigate what were deemed to be the six most damaging biases: confirmation bias, fundamental attribution error, the bias blind spot (the feeling that one is less biased than the average person), the anchoring effect, the representativeness heuristic, and projection bias (the assumption that everybody else’s thinking is the same as one’s own).
  • Together with collaborators who included staff from Creative Technologies, a company specializing in games and other simulations, and Leidos, a defense, intelligence, and health research company that does a lot of government work, Morewedge devised Missing. Some subjects played the game, which takes about three hours to complete, while others watched a video about cognitive bias. All were tested on bias-mitigation skills before the training, immediately afterward, and then finally after eight to 12 weeks had passed.
  • “The literature on training suggests books and classes are fine entertainment but largely ineffectual. But the game has very large effects. It surprised everyone.”
  • He said he saw the results as supporting the research and insights of Richard Nisbett: “Nisbett’s work was largely written off by the field, the assumption being that training can’t reduce bias.”
  • even the positive results reminded me of something Daniel Kahneman had told me. “Pencil-and-paper doesn’t convince me,” he said. “A test can be given even a couple of years later. But the test cues the test-taker. It reminds him what it’s all about.”
  • Morewedge told me that some tentative real-world scenarios along the lines of Missing have shown “promising results,” but that it’s too soon to talk about them.
  • In the future, I will monitor my thoughts and reactions as best I can
caelengrubb

Believing in Overcoming Cognitive Biases | Journal of Ethics | American Medical Association - 0 views

  • Cognitive biases contribute significantly to diagnostic and treatment errors
  • A 2016 review of their roles in decision making lists 4 domains of concern for physicians: gathering evidence, interpreting evidence, taking action, and evaluating decisions
  • Confirmation bias is the selective gathering and interpretation of evidence consistent with current beliefs and the neglect of evidence that contradicts them.
  • It can occur when a physician refuses to consider alternative diagnoses once an initial diagnosis has been established, despite contradicting data, such as lab results. This bias leads physicians to see what they want to see
  • Anchoring bias is closely related to confirmation bias and comes into play when interpreting evidence. It refers to physicians’ practices of prioritizing information and data that support their initial impressions, even when first impressions are wrong
  • When physicians move from deliberation to action, they are sometimes swayed by emotional reactions rather than rational deliberation about risks and benefits. This is called the affect heuristic, and, while heuristics can often serve as efficient approaches to problem solving, they can sometimes lead to bias
  • Further down the treatment pathway, outcomes bias can come into play. This bias refers to the practice of believing that good or bad results are always attributable to prior decisions, even when there is no valid reason to do so
  • The dual-process theory, a cognitive model of reasoning, can be particularly relevant in matters of clinical decision making
  • This theory is based on the argument that we use 2 different cognitive systems, intuitive and analytical, when reasoning. The former is quick and uses information that is readily available; the latter is slower and more deliberate.
  • Consideration should be given to the difficulty physicians face in employing analytical thinking exclusively. Beyond constraints of time, information, and resources, many physicians are also likely to be sleep deprived, work in an environment full of distractions, and be required to respond quickly while managing heavy cognitive loads
  • Simply increasing physicians’ familiarity with the many types of cognitive biases—and how to avoid them—may be one of the best strategies to decrease bias-related errors
  • The same review suggests that cognitive forcing strategies may also have some success in improving diagnostic outcomes
  • Afterwards, the resident physicians were debriefed on both case-specific details and on cognitive forcing strategies, interviewed, and asked to complete a written survey. The results suggested that resident physicians further along in their training (ie, postgraduate year three) gained more awareness of cognitive strategies than resident physicians in earlier years of training, suggesting that this tool could be more useful after a certain level of training has been completed
  • A 2013 study examined the effect of a 3-part, 1-year curriculum on recognition and knowledge of cognitive biases and debiasing strategies in second-year residents
  • Cognitive biases in clinical practice have a significant impact on care, often in negative ways. They sometimes manifest as physicians seeing what they want to see rather than what is actually there. Or they come into play when physicians make snap decisions and then prioritize evidence that supports their conclusions, as opposed to drawing conclusions from evidence
  • Fortunately, cognitive psychology provides insight into how to prevent biases. Guided reflection and cognitive forcing strategies deflect bias through close examination of our own thinking processes.
  • During medical education and consistently thereafter, we must provide physicians with a full appreciation of the cost of biases and the potential benefits of combatting them.
caelengrubb

How Cognitive Bias Affects Your Business - 0 views

  • Human beings often act in irrational and unexpected ways when it comes to business decisions, money, and finance.
  • Behavioral finance tries to explain the difference between what economic theory predicts people will do and what they actually do in the heat of the moment. 
  • There are two main types of biases that people commit causing them to deviate from rational decision-making: cognitive and emotional.
  • Cognitive errors result from incomplete information or the inability to analyze the information that is available. These cognitive errors can be classified as either belief perseverance or processing errors
  • Processing errors occur when an individual fails to manage and organize information properly, which can be due in part to the mental effort required to compute and analyze data.
  • Conservatism bias, where people emphasize original, pre-existing information over new data.
  • Base rate neglect is the opposite effect, whereby people put too little emphasis on the original information. 
  • Confirmation bias, where people seek information that affirms existing beliefs while discounting or discarding information that might contradict them.
  • Anchoring and adjustment happens when somebody fixates on a target number, such as the result of a calculation or valuation, and then adjusts away from it insufficiently.
  • Hindsight bias occurs when people perceive actual outcomes as reasonable and expected, but only after the fact.
  • Sample size neglect is an error made when people infer too much from a too-small sample size.
  • Mental accounting is when people earmark certain funds for certain goals and keep them separate. When this happens, the risk and reward of projects undertaken to achieve these goals are not considered as an overall portfolio and the effect of one on another is ignored. (A short numeric sketch after this list makes this concrete.)
  • Availability bias, or recency bias, skews perceived future probabilities based on memorable past events
  • Framing bias is when a person will process the same information differently depending on how it is presented and received.
  • Cognitive errors in the way people process and analyze information can lead them to make irrational decisions which can negatively impact business or investing decisions.
  • These information processing errors could have arisen to help primitive humans survive in a time before money or finance came into existence.
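To see what mental accounting ignores, here is a small portfolio-arithmetic sketch; the weights, volatilities, and correlation are assumed purely for illustration:

```python
import math

# Assumed numbers for illustration: two earmarked "accounts" with equal
# weights, 20% volatility each, and a correlation of -0.5 (they partly hedge).
w1, w2 = 0.5, 0.5
vol1, vol2 = 0.20, 0.20
corr = -0.5

# Viewed separately (mental accounting), each bucket just looks 20% volatile.
# Viewed as one portfolio, the covariance term cuts overall risk in half:
portfolio_vol = math.sqrt(
    (w1 * vol1) ** 2 + (w2 * vol2) ** 2 + 2 * w1 * w2 * vol1 * vol2 * corr
)
print(f"Combined portfolio volatility: {portfolio_vol:.1%}")  # 10.0%
```

Bucket-by-bucket thinking misses the interaction term entirely, which is the "effect of one on another" the annotation describes.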
ilanaprincilus06

You're more biased than you think - even when you know you're biased | News | The Guardian - 0 views

  • there’s plenty of evidence to suggest that we’re all at least somewhat subject to bias
  • Tell Republicans that some imaginary policy is a Republican one, as the psychologist Geoffrey Cohen did in 2003, and they’re much more likely to support it, even if it runs counter to Republican values. But ask them why they support it, and they’ll deny that party affiliation played a role. (Cohen found something similar for Democrats.)
  • Those who saw the names were biased in favour of famous artists. But even though they acknowledged the risk of bias, when asked to assess their own objectivity, they didn’t view their judgments as any more biased as a result.
  • Even when the risk of bias was explicitly pointed out to them, people remained confident that they weren’t susceptible to it
  • “Even when people acknowledge that what they are about to do is biased,” the researchers write, “they still are inclined to see their resulting decisions as objective.”
  • we have a cognitive bias to the effect that we’re uniquely immune to cognitive biases.
  • It turns out the bias also applies to bias. In other words, we’re convinced that we’re better than most at not falling victim to bias.
  • “used a strategy that they thought was biased,” the researchers note, “and thus they probably expected to feel some bias when using it. The absence of that feeling may have made them more confident in their objectivity.”
  • why it’s often better for companies to hire people, or colleges to admit students, using objective checklists, rather than interviews that rely on gut feelings.
  • Bias spares nobody.
huffem4

The Zero-Sum Bias: When People Think that Everything is a Competition - Effectiviology - 1 views

  • The zero-sum bias is a cognitive bias that causes people to mistakenly view certain situations as being zero-sum, meaning that they incorrectly believe that one party’s gains are directly balanced by other parties’ losses.
  • This bias can shape people’s thinking and behavior in a variety of situations, both on an individual scale as well as on a societal one, so it’s important to understand it.
  • this bias encourages belief in an antagonistic nature of social relationships. It can generally be said to affect people on two scales: on the individual scale, the zero-sum bias causes people to mistakenly assume that there is intra-group competition for a certain resource, between them and other members of a certain social group; on the group scale, it causes people to mistakenly assume that there is inter-group competition for a certain resource, between their group and other groups.
  • the issue with the zero-sum bias is that it causes people to believe that situations are zero-sum, when that’s not actually the case.
katedriscoll

Frontiers | A Neural Network Framework for Cognitive Bias | Psychology - 0 views

  • Human decision-making shows systematic simplifications and deviations from the tenets of rationality (‘heuristics’) that may lead to suboptimal decisional outcomes (‘cognitive biases’). There are currently three prevailing theoretical perspectives on the origin of heuristics and cognitive biases: a cognitive-psychological, an ecological and an evolutionary perspective. However, these perspectives are mainly descriptive and none of them provides an overall explanatory framework for the underlying mechanisms of cognitive biases. To enhance our understanding of cognitive heuristics and biases we propose a neural network framework for cognitive biases, which explains why our brain systematically tends to default to heuristic (‘Type 1’) decision making. We argue that many cognitive biases arise from intrinsic brain mechanisms that are fundamental for the working of biological neural networks. To substantiate our viewpoint, we discern and explain four basic neural network principles: (1) Association, (2) Compatibility, (3) Retainment, and (4) Focus. These principles are inherent to (all) neural networks which were originally optimized to perform concrete biological, perceptual, and motor functions. They form the basis for our inclinations to associate and combine (unrelated) information, to prioritize information that is compatible with our present state (such as knowledge, opinions, and expectations), to retain given information that sometimes could better be ignored, and to focus on dominant information while ignoring relevant information that is not directly activated. The supposed mechanisms are complementary and not mutually exclusive. For different cognitive biases they may all contribute in varying degrees to distortion of information. The present viewpoint not only complements the earlier three viewpoints, but also provides a unifying and binding framework for many cognitive bias phenomena.
  • The cognitive-psychological (or heuristics and biases) perspective (Evans, 2008; Kahneman and Klein, 2009) attributes cognitive biases to limitations in the available data and in the human information processing capacity (Simon, 1955; Broadbent, 1958; Kahneman, 1973, 2003; Norman and Bobrow, 1975)
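As a toy illustration of the first principle listed above, Association, here is a sketch of my own (not the authors' model): a bare Hebbian update that strengthens a connection whenever two inputs co-occur will associate even statistically independent features.

```python
import random

random.seed(1)
LEARNING_RATE = 0.1
weight = 0.0  # association strength between two unrelated features

# Two statistically independent features, each active on 30% of trials.
# A bare Hebbian rule only strengthens on co-activation and never decays,
# so even chance co-occurrence (~9% of trials) builds a lasting association.
for _ in range(200):
    a = random.random() < 0.3
    b = random.random() < 0.3
    if a and b:  # "fire together, wire together"
        weight += LEARNING_RATE * (1 - weight)

print(f"Learned association between unrelated features: {weight:.2f}")
```

Because the rule only ever strengthens, chance co-occurrence accumulates; this is the flavor of mechanism the paper argues underlies our inclination to combine unrelated information.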
katedriscoll

Avoiding Psychological Bias in Decision Making - From MindTools.com - 0 views

  • In this scenario, your decision was affected by confirmation bias. With this, you interpret market information in a way that confirms your preconceptions – instead of seeing it objectively – and you make wrong decisions as a result. Confirmation bias is one of many psychological biases to which we're all susceptible when we make decisions. In this article, we'll look at common types of bias, and we'll outline what you can do to avoid them.
  • Psychologists Daniel Kahneman, Paul Slovic, and Amos Tversky introduced the concept of psychological bias in the early 1970s. They published their findings in their 1982 book, "Judgment Under Uncertainty." They explained that psychological bias – also known as cognitive bias – is the tendency to make decisions or take action in an illogical way. For example, you might subconsciously make selective use of data, or you might feel pressured to make a decision by powerful colleagues. Psychological bias is the opposite of common sense and clear, measured judgment. It can lead to missed opportunities and poor decision making.
  • Below, we outline five psychological biases that are common in business decision making. We also look at how you can overcome them, and thereby make better decisions.
Javier E

Research Shows That the Smarter People Are, the More Susceptible They Are to Cognitive ... - 0 views

  • While philosophers, economists, and social scientists had assumed for centuries that human beings are rational agents—reason was our Promethean gift—Kahneman, the late Amos Tversky, and others, including Shane Frederick (who developed the bat-and-ball question), demonstrated that we’re not nearly as rational as we like to believe.
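For readers who don't know the bat-and-ball question mentioned above, here it is with the arithmetic worked out (the problem is the standard one from Frederick's Cognitive Reflection Test; the code is just a check):

```python
# The bat-and-ball question: a bat and a ball cost $1.10 together, and the
# bat costs $1.00 more than the ball. System 1 shouts "10 cents"; algebra:
#   ball + (ball + 1.00) = 1.10  =>  2*ball = 0.10  =>  ball = 0.05
ball = (1.10 - 1.00) / 2
bat = ball + 1.00
assert abs((ball + bat) - 1.10) < 1e-9 and abs((bat - ball) - 1.00) < 1e-9
print(f"ball = ${ball:.2f}, bat = ${bat:.2f}")  # ball = $0.05, bat = $1.05
```

A 10-cent ball would make the bat $1.10 and the total $1.20, which is the intuitive but wrong answer most people blurt out.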
  • When people face an uncertain situation, they don’t carefully evaluate the information or look up relevant statistics. Instead, their decisions depend on a long list of mental shortcuts, which often lead them to make foolish decisions.
  • in many instances, smarter people are more vulnerable to these thinking errors. Although we assume that intelligence is a buffer against bias—that’s why those with higher S.A.T. scores think they are less prone to these universal thinking mistakes—it can actually be a subtle curse.
  • they wanted to understand how these biases correlated with human intelligence.
  • self-awareness was not particularly useful: as the scientists note, “people who were aware of their own biases were not better able to overcome them.”
  • Perhaps our most dangerous bias is that we naturally assume that everyone else is more susceptible to thinking errors, a tendency known as the “bias blind spot.”
  • This “meta-bias” is rooted in our ability to spot systematic mistakes in the decisions of others—we excel at noticing the flaws of friends—and inability to spot those same mistakes in ourselves.
  • it applies to every single bias under consideration, from anchoring to so-called “framing effects.” In each instance, we readily forgive our own minds but look harshly upon the minds of other people.
  • intelligence seems to make things worse.
  • the driving forces behind biases—the root causes of our irrationality—are largely unconscious, which means they remain invisible to self-analysis and impermeable to intelligence. In fact, introspection can actually compound the error, blinding us to those primal processes responsible for many of our everyday failings. We spin eloquent stories, but these stories miss the point. The more we attempt to know ourselves, the less we actually understand.
jaxredd10

What Is Cognitive Bias? - 0 views

  • Because of this, subtle biases can creep in and influence the way you see and think about the world. The concept of cognitive bias was first introduced by researchers Amos Tversky and Daniel Kahneman in 1972. Since then, researchers have described a number of different types of biases that affect decision-making in a wide range of areas including social behavior, cognition, behavioral economics, education, management, healthcare, business, and finance.
  • People sometimes confuse cognitive biases with logical fallacies, but the two are not the same. A logical fallacy stems from an error in a logical argument, while a cognitive bias is rooted in thought processing errors often arising from problems with memory, attention, attribution, and other mental mistakes.
katedriscoll

Confirmation bias - 0 views

  • In psychology and cognitive science, confirmation bias (or confirmatory bias) is a tendency to search for or interpret information in a way that confirms one's preconceptions, leading to statistical errors. Confirmation bias is a type of cognitive bias and represents an error of inductive inference toward confirmation of the hypothesis under study. Confirmation bias is a phenomenon wherein decision makers have been shown to actively seek out and assign more weight to evidence that confirms their hypothesis, and ignore or underweigh evidence that could disconfirm their hypothesis. As such, it can be thought of as a form of selection bias in collecting evidence.
katedriscoll

The Confirmation Bias: Why People See What They Want to See - Effectiviology - 0 views

  • The confirmation bias is a cognitive bias that causes people to search for, favor, interpret, and recall information in a way that confirms their preexisting beliefs. For example, if someone is presented with a lot of information on a certain topic, the confirmation bias can cause them to only remember the bits of information that confirm what they already thought.The confirmation bias influences people’s judgment and decision-making in many areas of life, so it’s important to understand it. As such, in the following article you will first learn more about the confirmation bias, and then see how you can reduce its influence, both in other people’s thought process as well as in your own.
katedriscoll

What Is the Function of Confirmation Bias? | SpringerLink - 1 views

  • Confirmation bias is one of the most widely discussed epistemically problematic cognitions, challenging reliable belief formation and the correction of inaccurate views. Given its problematic nature, it remains unclear why the bias evolved and is still with us today. To offer an explanation, several philosophers and scientists have argued that the bias is in fact adaptive. I critically discuss three recent proposals of this kind before developing a novel alternative, what I call the ‘reality-matching account’
  • Confirmation bias is typically viewed as an epistemically pernicious tendency. For instance, Mercier and Sperber (2017: 215) maintain that the bias impedes the formation of well-founded beliefs, reduces people’s ability to correct their mistaken views, and makes them, when they reason on their own, “become overconfident” (Mercier 2016: 110).
huffem4

Infographic: 11 Cognitive Biases That Influence Political Outcomes - 1 views

  • when searching for facts, our own cognitive biases often get in the way.
  • The media, for example, can exploit our tendency to assign stereotypes to others by only providing catchy, surface-level information.
  • People exhibit confirmation bias when they seek information that only affirms their pre-existing beliefs. This can cause them to become overly rigid in their political opinions, even when presented with conflicting ideas or evidence.
  • In one experiment, participants chose to either support or oppose a given sociopolitical issue. They were then presented with evidence that was conflicting, affirming, or a combination of both. In all scenarios, participants were most likely to stick with their initial decisions. Of those presented with conflicting evidence, just one in five changed their stance. Furthermore, participants who maintained their initial positions became even more confident in the superiority of their decision—a testament to how influential confirmation bias can be.
  • Coverage bias, in the context of politics, is a form of media bias where certain politicians or topics are disproportionately covered.
katedriscoll

Cognitive Bias: Understanding How It Affects Your Decisions - 0 views

  • A cognitive bias is a flaw in your reasoning that leads you to misinterpret information from the world around you and to come to an inaccurate conclusion. Because you are flooded with information from millions of sources throughout the day, your brain develops ranking systems to decide which information deserves your attention and which information is important enough to store in memory. It also creates shortcuts meant to cut down on the time it takes for you to process information. The problem is that the shortcuts and ranking systems aren’t always perfectly objective because their architecture is uniquely adapted to your life experiences
  • Anchoring bias is the tendency to rely heavily on the first information you learn when you are evaluating something. In other words, what you learn early in an investigation often has a greater impact on your judgment than information you learn later. In one study, for example, researchers gave two groups of study participants some written background information about a person in a photograph. Then they asked them to describe how they thought the people in the photos were feeling. People who read more negative background information tended to infer more negative feelings, and people who read positive background information tended to infer more positive feelings. Their first impressions heavily influenced their ability to infer emotions in others.
  • Another common bias is the tendency to give greater credence to ideas that come to mind easily. If you can immediately think of several facts that support a judgment, you may be inclined to think that judgment is correct. For example, if a person sees multiple headlines about shark attacks in a coastal area, that person might form a belief that the risk of shark attacks is higher than it is. The American Psychological Association points out that when information is readily available around you, you’re more likely to remember it. Information that is easy to access in your memory seems more reliable.
caelengrubb

Looking inward in an era of 'fake news': Addressing cognitive bias | YLAI Network - 0 views

  • In an era when everyone seems eager to point out instances of “fake news,” it is easy to forget that knowing how we make sense of the news is as important as knowing how to spot incorrect or biased content
  • While the ability to analyze the credibility of a source and the veracity of its content remains an essential and often-discussed aspect of news literacy, it is equally important to understand how we as news consumers engage with and react to the information we find online, in our feeds, and on our apps
  • People process information they receive from the news in the same way they process all information around them — in the shortest, quickest way possible
  • When we consider how we engage with the news, some shortcuts we may want to pay close attention to, and reflect carefully on, are cognitive biases.
  • These shortcuts, also called heuristics, streamline our problem-solving process and help us make relatively quick decisions.
  • In fact, without these heuristics, it would be impossible for us to process all the information we receive daily. However, the use of these shortcuts can lead to “blind spots,” or unintentional ways we respond to information that can have negative consequences for how we engage with, digest, and share the information we encounter
  • Confirmation bias is the tendency to seek out and value information that confirms our pre-existing beliefs while discarding information that proves our ideas wrong.
  • Cognitive biases are best described as glitches in how we process information
  • Echo chamber effect refers to a situation in which we are primarily exposed to information, people, events, and ideas that already align with our point of view.
  • Anchoring bias, also known as “anchoring,” refers to people’s tendency to consider the first piece of information they receive about a topic as the most reliable
  • The framing effect is what happens when we make decisions based on how information is presented or discussed, rather than its actual substance.
  • Fluency heuristic occurs when a piece of information is deemed more valuable because it is easier to process or recall
  • Everyone operates under one or more cognitive biases. So, when searching for and reading the news (or other information), it is important to be aware of how these biases might shape how we make sense of this information.
  • In conclusion, we may not be able to control the content of the news — whether it is fake, reliable, or somewhere in between — but we can learn to be aware of how we respond to it and adjust our evaluations of the news accordingly.
katedriscoll

Cognitive Biases: What They Are and How They Affect People - Effectiviology - 0 views

  • A cognitive bias is a systematic pattern of deviation from rationality, which occurs due to the way our cognitive system works. Accordingly, cognitive biases cause us to be irrational in the way we search for, evaluate, interpret, judge, use, and remember information, as well as in the way we make decisions.
  • Cognitive biases affect every area of our life, from how we form our memories, to how we shape our beliefs, and to how we form relationships with other people. In doing so, they can lead to both relatively minor issues, such as forgetting a small detail from a past event, as well as to major ones, such as choosing to avoid an important medical treatment that could save our life.Because cognitive biases can have such a powerful and pervasive influence on ourselves and on others, it’s important to understand them. As such, in the following article you will learn more about cognitive biases, understand why we experience them, see what types of them exist, and find out what you can do in order to mitigate them successfully.
katedriscoll

What Is Confirmation Bias? | Psychology Today - 0 views

  • Confirmation bias occurs from the direct influence of desire on beliefs. When people would like a certain idea or concept to be true, they end up believing it to be true. They are motivated by wishful thinking. This error leads the individual to stop gathering information when the evidence gathered so far confirms the views or prejudices one would like to be true.
  • Confirmation bias can also be found in anxious individuals, who view the world as dangerous. For example, a person with low self-esteem is highly sensitive to being ignored by other people, and they constantly monitor for signs that people might not like them. Thus, if you are worried that someone is annoyed with you, you are biased toward all the negative information about how that person acts toward you. You interpret neutral behavior as indicative of something negative.
Javier E

Why the Past 10 Years of American Life Have Been Uniquely Stupid - The Atlantic - 0 views

  • Social scientists have identified at least three major forces that collectively bind together successful democracies: social capital (extensive social networks with high levels of trust), strong institutions, and shared stories.
  • Social media has weakened all three.
  • gradually, social-media users became more comfortable sharing intimate details of their lives with strangers and corporations. As I wrote in a 2019 Atlantic article with Tobias Rose-Stockwell, they became more adept at putting on performances and managing their personal brand—activities that might impress others but that do not deepen friendships in the way that a private phone conversation will.
  • the stage was set for the major transformation, which began in 2009: the intensification of viral dynamics.
  • Before 2009, Facebook had given users a simple timeline––a never-ending stream of content generated by their friends and connections, with the newest posts at the top and the oldest ones at the bottom
  • That began to change in 2009, when Facebook offered users a way to publicly “like” posts with the click of a button. That same year, Twitter introduced something even more powerful: the “Retweet” button, which allowed users to publicly endorse a post while also sharing it with all of their followers.
  • “Like” and “Share” buttons quickly became standard features of most other platforms.
  • Facebook developed algorithms to bring each user the content most likely to generate a “like” or some other interaction, eventually including the “share” as well.
  • Later research showed that posts that trigger emotions––especially anger at out-groups––are the most likely to be shared.
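The shift described in the annotations above, from a chronological timeline to engagement-optimized ranking, can be sketched in a few lines. This is a hypothetical scoring model, not Facebook's actual algorithm:

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    timestamp: int               # higher = newer
    predicted_engagement: float  # model's estimate of likes/shares/comments

posts = [
    Post("friend_a", timestamp=100, predicted_engagement=0.02),
    Post("friend_b", timestamp=99, predicted_engagement=0.01),
    Post("stranger", timestamp=50, predicted_engagement=0.35),  # outrage bait
]

# Pre-2009-style feed: newest first, regardless of predicted reaction.
chronological = sorted(posts, key=lambda p: p.timestamp, reverse=True)
# Post-2009-style feed: whatever the model predicts will get clicks rises.
engagement_ranked = sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)

print([p.author for p in chronological])      # ['friend_a', 'friend_b', 'stranger']
print([p.author for p in engagement_ranked])  # ['stranger', 'friend_a', 'friend_b']
```

Once the sort key is predicted engagement rather than recency, content that best provokes reactions, per the research cited above anger at out-groups, rises to the top of every feed.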
  • By 2013, social media had become a new game, with dynamics unlike those in 2008. If you were skillful or lucky, you might create a post that would “go viral” and make you “internet famous”
  • If you blundered, you could find yourself buried in hateful comments. Your posts rode to fame or ignominy based on the clicks of thousands of strangers, and you in turn contributed thousands of clicks to the game.
  • This new game encouraged dishonesty and mob dynamics: Users were guided not just by their true preferences but by their past experiences of reward and punishment,
  • As a social psychologist who studies emotion, morality, and politics, I saw this happening too. The newly tweaked platforms were almost perfectly designed to bring out our most moralistic and least reflective selves. The volume of outrage was shocking.
  • It was just this kind of twitchy and explosive spread of anger that James Madison had tried to protect us from as he was drafting the U.S. Constitution.
  • The Framers of the Constitution were excellent social psychologists. They knew that democracy had an Achilles’ heel because it depended on the collective judgment of the people, and democratic communities are subject to “the turbulency and weakness of unruly passions.”
  • The key to designing a sustainable republic, therefore, was to build in mechanisms to slow things down, cool passions, require compromise, and give leaders some insulation from the mania of the moment while still holding them accountable to the people periodically, on Election Day.
  • The tech companies that enhanced virality from 2009 to 2012 brought us deep into Madison’s nightmare.
  • a less quoted yet equally important insight, about democracy’s vulnerability to triviality.
  • Madison notes that people are so prone to factionalism that “where no substantial occasion presents itself, the most frivolous and fanciful distinctions have been sufficient to kindle their unfriendly passions and excite their most violent conflicts.”
  • Social media has both magnified and weaponized the frivolous.
  • It’s not just the waste of time and scarce attention that matters; it’s the continual chipping-away of trust.
  • a democracy depends on widely internalized acceptance of the legitimacy of rules, norms, and institutions.
  • when citizens lose trust in elected leaders, health authorities, the courts, the police, universities, and the integrity of elections, then every decision becomes contested; every election becomes a life-and-death struggle to save the country from the other side
  • The most recent Edelman Trust Barometer (an international measure of citizens’ trust in government, business, media, and nongovernmental organizations) showed stable and competent autocracies (China and the United Arab Emirates) at the top of the list, while contentious democracies such as the United States, the United Kingdom, Spain, and South Korea scored near the bottom (albeit above Russia).
  • The literature is complex—some studies show benefits, particularly in less developed democracies—but the review found that, on balance, social media amplifies political polarization; foments populism, especially right-wing populism; and is associated with the spread of misinformation.
  • When people lose trust in institutions, they lose trust in the stories told by those institutions. That’s particularly true of the institutions entrusted with the education of children.
  • Facebook and Twitter make it possible for parents to become outraged every day over a new snippet from their children’s history lessons––and math lessons and literature selections, and any new pedagogical shifts anywhere in the country
  • The motives of teachers and administrators come into question, and overreaching laws or curricular reforms sometimes follow, dumbing down education and reducing trust in it further.
  • young people educated in the post-Babel era are less likely to arrive at a coherent story of who we are as a people, and less likely to share any such story with those who attended different schools or who were educated in a different decade.
  • former CIA analyst Martin Gurri predicted these fracturing effects in his 2014 book, The Revolt of the Public. Gurri’s analysis focused on the authority-subverting effects of information’s exponential growth, beginning with the internet in the 1990s. Writing nearly a decade ago, Gurri could already see the power of social media as a universal solvent, breaking down bonds and weakening institutions everywhere it reached.
  • he notes a constructive feature of the pre-digital era: a single “mass audience,” all consuming the same content, as if they were all looking into the same gigantic mirror at the reflection of their own society.
  • The digital revolution has shattered that mirror, and now the public inhabits those broken pieces of glass. So the public isn’t one thing; it’s highly fragmented, and it’s basically mutually hostile
  • Facebook, Twitter, YouTube, and a few other large platforms unwittingly dissolved the mortar of trust, belief in institutions, and shared stories that had held a large and diverse secular democracy together.
  • I think we can date the fall of the tower to the years between 2011 (Gurri’s focal year of “nihilistic” protests) and 2015, a year marked by the “great awokening” on the left and the ascendancy of Donald Trump on the right.
  • Twitter can overpower all the newspapers in the country, and stories cannot be shared (or at least trusted) across more than a few adjacent fragments—so truth cannot achieve widespread adherence.
  • After Babel, nothing really means anything anymore––at least not in a way that is durable and on which people widely agree.
  • Politics After Babel
  • “Politics is the art of the possible,” the German statesman Otto von Bismarck said in 1867. In a post-Babel democracy, not much may be possible.
  • The ideological distance between the two parties began increasing faster in the 1990s. Fox News and the 1994 “Republican Revolution” converted the GOP into a more combative party.
  • So cross-party relationships were already strained before 2009. But the enhanced virality of social media thereafter made it more hazardous to be seen fraternizing with the enemy or even failing to attack the enemy with sufficient vigor.
  • What changed in the 2010s? Let’s revisit that Twitter engineer’s metaphor of handing a loaded gun to a 4-year-old. A mean tweet doesn’t kill anyone; it is an attempt to shame or punish someone publicly while broadcasting one’s own virtue, brilliance, or tribal loyalties. It’s more a dart than a bullet
  • from 2009 to 2012, Facebook and Twitter passed out roughly 1 billion dart guns globally. We’ve been shooting one another ever since.
  • The “devoted conservatives” comprised 6 percent of the U.S. population.
  • the warped “accountability” of social media has also brought injustice—and political dysfunction—in three ways.
  • First, the dart guns of social media give more power to trolls and provocateurs while silencing good citizens.
  • a small subset of people on social-media platforms are highly concerned with gaining status and are willing to use aggression to do so.
  • Across eight studies, Bor and Petersen found that being online did not make most people more aggressive or hostile; rather, it allowed a small number of aggressive people to attack a much larger set of victims. Even a small number of jerks were able to dominate discussion forums,
  • Additional research finds that women and Black people are harassed disproportionately, so the digital public square is less welcoming to their voices.
  • Second, the dart guns of social media give more power and voice to the political extremes while reducing the power and voice of the moderate majority.
  • The “Hidden Tribes” study, by the pro-democracy group More in Common, surveyed 8,000 Americans in 2017 and 2018 and identified seven groups that shared beliefs and behaviors.
  • Social media has given voice to some people who had little previously, and it has made it easier to hold powerful people accountable for their misdeeds
  • The group furthest to the left, the “progressive activists,” comprised 8 percent of the population. The progressive activists were by far the most prolific group on social media: 70 percent had shared political content over the previous year. The devoted conservatives followed, at 56 percent.
  • These two extreme groups are similar in surprising ways. They are the whitest and richest of the seven groups, which suggests that America is being torn apart by a battle between two subsets of the elite who are not representative of the broader society.
  • they are the two groups that show the greatest homogeneity in their moral and political attitudes.
  • likely a result of thought-policing on social media:
  • political extremists don’t just shoot darts at their enemies; they spend a lot of their ammunition targeting dissenters or nuanced thinkers on their own team.
  • Finally, by giving everyone a dart gun, social media deputizes everyone to administer justice with no due process. Platforms like Twitter devolve into the Wild West, with no accountability for vigilantes.
  • Enhanced-virality platforms thereby facilitate massive collective punishment for small or imagined offenses, with real-world consequences, including innocent people losing their jobs and being shamed into suicide
  • we don’t get justice and inclusion; we get a society that ignores context, proportionality, mercy, and truth.
  • Since the tower fell, debates of all kinds have grown more and more confused. The most pervasive obstacle to good thinking is confirmation bias, which refers to the human tendency to search only for evidence that confirms our preferred beliefs
  • search engines were supercharging confirmation bias, making it far easier for people to find evidence for absurd beliefs and conspiracy theories
  • The most reliable cure for confirmation bias is interaction with people who don’t share your beliefs. They confront you with counterevidence and counterargument.
  • In his book The Constitution of Knowledge, Jonathan Rauch describes the historical breakthrough in which Western societies developed an “epistemic operating system”—that is, a set of institutions for generating knowledge from the interactions of biased and cognitively flawed individuals
  • English law developed the adversarial system so that biased advocates could present both sides of a case to an impartial jury.
  • Newspapers full of lies evolved into professional journalistic enterprises, with norms that required seeking out multiple sides of a story, followed by editorial review, followed by fact-checking.
  • Universities evolved from cloistered medieval institutions into research powerhouses, creating a structure in which scholars put forth evidence-backed claims with the knowledge that other scholars around the world would be motivated to gain prestige by finding contrary evidence.
  • Part of America’s greatness in the 20th century came from having developed the most capable, vibrant, and productive network of knowledge-producing institutions in all of human history
  • But this arrangement, Rauch notes, “is not self-maintaining; it relies on an array of sometimes delicate social settings and understandings, and those need to be understood, affirmed, and protected.”
  • This, I believe, is what happened to many of America’s key institutions in the mid-to-late 2010s. They got stupider en masse because social media instilled in their members a chronic fear of getting darted
  • it was so pervasive that it established new behavioral norms backed by new policies seemingly overnight
  • Participants in our key institutions began self-censoring to an unhealthy degree, holding back critiques of policies and ideas—even those presented in class by their students—that they believed to be ill-supported or wrong.
  • The stupefying process plays out differently on the right and the left because their activist wings subscribe to different narratives with different sacred values.
  • The “Hidden Tribes” study tells us that the “devoted conservatives” score highest on beliefs related to authoritarianism. They share a narrative in which America is eternally under threat from enemies outside and subversives within; they see life as a battle between patriots and traitors.
  • they are psychologically different from the larger group of “traditional conservatives” (19 percent of the population), who emphasize order, decorum, and slow rather than radical change.
  • The traditional punishment for treason is death, hence the battle cry on January 6: “Hang Mike Pence.”
  • Right-wing death threats, many delivered by anonymous accounts, are proving effective in cowing traditional conservatives
  • The wave of threats delivered to dissenting Republican members of Congress has similarly pushed many of the remaining moderates to quit or go silent, giving us a party ever more divorced from the conservative tradition, constitutional responsibility, and reality.
  • The stupidity on the right is most visible in the many conspiracy theories spreading across right-wing media and now into Congress.
  • The Democrats have also been hit hard by structural stupidity, though in a different way. In the Democratic Party, the struggle between the progressive wing and the more moderate factions is open and ongoing, and often the moderates win.
  • The problem is that the left controls the commanding heights of the culture: universities, news organizations, Hollywood, art museums, advertising, much of Silicon Valley, and the teachers’ unions and teaching colleges that shape K–12 education. And in many of those institutions, dissent has been stifled:
  • Liberals in the late 20th century shared a belief that the sociologist Christian Smith called the “liberal progress” narrative, in which America used to be horrifically unjust and repressive, but, thanks to the struggles of activists and heroes, has made (and continues to make) progress toward realizing the noble promise of its founding.
  • It is also the view of the “traditional liberals” in the “Hidden Tribes” study (11 percent of the population), who have strong humanitarian values, are older than average, and are largely the people leading America’s cultural and intellectual institutions.
  • when the newly viralized social-media platforms gave everyone a dart gun, it was younger progressive activists who did the most shooting, and they aimed a disproportionate number of their darts at these older liberal leaders.
  • Confused and fearful, the leaders rarely challenged the activists or their nonliberal narrative in which life at every institution is an eternal battle among identity groups over a zero-sum pie, and the people on top got there by oppressing the people on the bottom. This new narrative is rigidly egalitarian––focused on equality of outcomes, not of rights or opportunities. It is unconcerned with individual rights.
  • The universal charge against people who disagree with this narrative is not “traitor”; it is “racist,” “transphobe,” “Karen,” or some related scarlet letter marking the perpetrator as one who hates or harms a marginalized group.
  • The punishment that feels right for such crimes is not execution; it is public shaming and social death.
  • anyone on Twitter had already seen dozens of examples teaching the basic lesson: Don’t question your own side’s beliefs, policies, or actions. And when traditional liberals go silent, as so many did in the summer of 2020, the progressive activists’ more radical narrative takes over as the governing narrative of an organization.
  • This is why so many epistemic institutions seemed to “go woke” in rapid succession that year and the next, beginning with a wave of controversies and resignations at The New York Times and other newspapers, and continuing on to social-justice pronouncements by groups of doctors and medical associations
  • The problem is structural. Thanks to enhanced-virality social media, dissent is punished within many of our institutions, which means that bad ideas get elevated into official policy.
  • In a 2018 interview, Steve Bannon, the former adviser to Donald Trump, said that the way to deal with the media is “to flood the zone with shit.” He was describing the “firehose of falsehood” tactic pioneered by Russian disinformation programs to keep Americans confused, disoriented, and angry.
  • artificial intelligence is close to enabling the limitless spread of highly believable disinformation. The AI program GPT-3 is already so good that you can give it a topic and a tone and it will spit out as many essays as you like, typically with perfect grammar and a surprising level of coherence.
  • Renée DiResta, the research manager at the Stanford Internet Observatory, explained that spreading falsehoods—whether through text, images, or deep-fake videos—will quickly become inconceivably easy. (She co-wrote the essay with GPT-3.) A sketch of this kind of topic-and-tone prompting appears after these excerpts.
  • American factions won’t be the only ones using AI and social media to generate attack content; our adversaries will too.
  • In the 20th century, America’s shared identity as the country leading the fight to make the world safe for democracy was a strong force that helped keep the culture and the polity together.
  • In the 21st century, America’s tech companies have rewired the world and created products that now appear to be corrosive to democracy, obstacles to shared understanding, and destroyers of the modern tower.
  • What changes are needed?
  • I can suggest three categories of reforms––three goals that must be achieved if democracy is to remain viable in the post-Babel era.
  • We must harden democratic institutions so that they can withstand chronic anger and mistrust, reform social media so that it becomes less socially corrosive, and better prepare the next generation for democratic citizenship in this new age.
  • Harden Democratic Institutions
  • we must reform key institutions so that they can continue to function even if levels of anger, misinformation, and violence increase far above those we have today.
  • Reforms should reduce the outsize influence of angry extremists and make legislators more responsive to the average voter in their district.
  • One example of such a reform is to end closed party primaries, replacing them with a single, nonpartisan, open primary from which the top several candidates advance to a general election that also uses ranked-choice voting (an instant-runoff tally, the most common form of ranked-choice voting, is sketched after these excerpts).
  • A second way to harden democratic institutions is to reduce the power of either political party to game the system in its favor, for example by drawing its preferred electoral districts or selecting the officials who will supervise elections
  • These jobs should all be done in a nonpartisan way.
  • Reform Social Media
  • Social media’s empowerment of the far left, the far right, domestic trolls, and foreign agents is creating a system that looks less like democracy and more like rule by the most aggressive.
  • it is within our power to reduce social media’s ability to dissolve trust and foment structural stupidity. Reforms should limit the platforms’ amplification of the aggressive fringes while giving more voice to what More in Common calls “the exhausted majority.”
  • the main problem with social media is not that some people post fake or toxic stuff; it’s that fake and outrage-inducing content can now attain a level of reach and influence that was not possible before
  • Perhaps the biggest single change that would reduce the toxicity of existing platforms would be user verification as a precondition for gaining the algorithmic amplification that social media offers.
  • One of the first orders of business should be compelling the platforms to share their data and their algorithms with academic researchers.
  • Prepare the Next Generation
  • Childhood has become more tightly circumscribed in recent generations––with less opportunity for free, unstructured play; less unsupervised time outside; more time online. Whatever else the effects of these shifts, they have likely impeded, for many young adults, the development of abilities needed for effective self-governance.
  • Depression makes people less likely to want to engage with new people, ideas, and experiences. Anxiety makes new things seem more threatening. As these conditions have risen and as the lessons on nuanced social behavior learned through free play have been delayed, tolerance for diverse viewpoints and the ability to work out disputes have diminished among many young people
  • Students did not just say that they disagreed with visiting speakers; some said that those lectures would be dangerous, emotionally devastating, a form of violence. Because rates of teen depression and anxiety have continued to rise into the 2020s, we should expect these views to continue in the generations to follow, and indeed to become more severe.
  • The most important change we can make to reduce the damaging effects of social media on children is to delay entry until they have passed through puberty.
  • The age should be raised to at least 16, and companies should be held responsible for enforcing it.
  • Let them out to play. Stop starving children of the experiences they most need to become good citizens: free play in mixed-age groups of children with minimal adult supervision.
  • while social media has eroded the art of association throughout society, it may be leaving its deepest and most enduring marks on adolescents. A surge in rates of anxiety, depression, and self-harm among American teens began suddenly in the early 2010s. (The same thing happened to Canadian and British teens, at the same time.) The cause is not known, but the timing points to social media as a substantial contributor—the surge began just as the large majority of American teens became daily users of the major platforms.
  • What would it be like to live in Babel in the days after its destruction? We know. It is a time of confusion and loss. But it is also a time to reflect, listen, and build.
  • In recent years, Americans have started hundreds of groups and organizations dedicated to building trust and friendship across the political divide, including BridgeUSA, Braver Angels (on whose board I serve), and many others listed at BridgeAlliance.us. We cannot expect Congress and the tech companies to save us. We must change ourselves and our communities.
  • when we look away from our dysfunctional federal government, disconnect from social media, and talk with our neighbors directly, things seem more hopeful. Most Americans in the More in Common report are members of the “exhausted majority,” which is tired of the fighting and is willing to listen to the other side and compromise. Most Americans now see that social media is having a negative impact on the country, and are becoming more aware of its damaging effects on children.
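The AI excerpt above says GPT-3 will "spit out as many essays as you like" given a topic and a tone. Here is a minimal sketch of what that workflow looks like in practice. Everything in it is an illustrative assumption rather than anything from the article: the legacy OpenAI Python client (openai<1.0), the text-davinci-003 model name, the prompt wording, and the placeholder API key are all stand-ins.

```python
# Hypothetical sketch of topic-and-tone essay generation.
# Assumes the legacy OpenAI Python client (openai<1.0); model name,
# prompt wording, and API key are illustrative, not from the article.
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder, not a real key

def generate_essays(topic: str, tone: str, n: int = 3) -> list[str]:
    """Request n short essays about `topic`, written in the given `tone`."""
    prompt = f"Write a short persuasive essay about {topic}. Use a tone that is {tone}."
    response = openai.Completion.create(
        model="text-davinci-003",
        prompt=prompt,
        max_tokens=400,   # roughly a few paragraphs per completion
        temperature=0.9,  # higher temperature -> more varied essays
        n=n,              # several independent completions per request
    )
    return [choice.text.strip() for choice in response.choices]

essays = generate_essays("local school-board elections", "indignant")
print(len(essays), "essays generated")
```

The details will age, but the excerpt's point survives them: one short function, looped over topics, produces plausible essays at a volume no human editorial process can match.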
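The primary-reform excerpt mentions ranked-choice voting without specifying a tallying rule. Instant-runoff is the most common form, so here is a minimal sketch of how such a tally works; the ballots and candidate names are invented for illustration, and real election law adds tie-breaking and exhausted-ballot provisions that this omits.

```python
# Minimal instant-runoff tally: repeatedly eliminate the last-place
# candidate until someone holds a majority of the remaining ballots.
from collections import Counter

def instant_runoff(ballots: list[list[str]]) -> str:
    """Each ballot ranks candidates from most to least preferred."""
    candidates = {name for ballot in ballots for name in ballot}
    while True:
        # Count each ballot toward its highest-ranked surviving candidate.
        tally = Counter(
            next(name for name in ballot if name in candidates)
            for ballot in ballots
            if any(name in candidates for name in ballot)
        )
        total = sum(tally.values())
        leader, votes = tally.most_common(1)[0]
        if votes * 2 > total:  # majority of remaining ballots
            return leader
        # No majority: eliminate the last-place candidate and recount.
        candidates.discard(min(tally, key=tally.get))

ballots = [
    ["A", "B", "C"],
    ["B", "C", "A"],
    ["C", "B", "A"],
    ["B", "A", "C"],
    ["A", "C", "B"],
]
print(instant_runoff(ballots))  # "B" wins after "C" is eliminated
```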
Blair Peterson

Cognitive bias cheat sheet - 2 views

  • Information overload sucks, so we aggressively filter. Noise becomes signal.
  • Lack of meaning is confusing, so we fill in the gaps. Signal becomes a story.
  • Need to act fast lest we lose our chance, so we jump to conclusions. Stories become decisions.
  • This isn’t getting easier, so we try to remember the important bits. Decisions inform our mental models of the world.