
TOK Friends: Group items tagged “confirmation”


katedriscoll

Confirmation bias - Catalog of Bias - 0 views

  • Confirmation bias occurs when an individual looks for and uses information to support their own ideas or beliefs, while disregarding information that does not support them. Confirmation bias often happens when we want certain ideas to be true. It leads individuals to stop gathering information once the retrieved evidence confirms their own viewpoints, which can produce preconceived opinions (prejudices) that are not based on reason or factual knowledge. Individuals then pick out the bits of information that confirm their prejudices. Confirmation bias has a long history. In 1620, Francis Bacon described it as follows: “Once a man’s understanding has settled on something (either because it is an accepted belief or because it pleases him), it draws everything else also to support and agree with it. And if it encounters a larger number of more powerful countervailing examples, it either fails to notice them, or disregards them, or makes fine distinctions to dismiss and reject them, and all this with much dangerous prejudice, to preserve the authority of its first Conceptions.” (Bacon 1620)
  • The impact of confirmation bias can range from the individual level all the way up to the institutional level. DuBroff showed that confirmation bias influenced expert guidelines on cholesterol and was highly prevalent when conflicts of interest were present (DuBroff 2017). He found that confirmation bias occurred through a failure to incorporate evidence, or through misrepresentation of the evidence, either of which had the potential to skew guideline recommendations.
peterconnelly

Why You Need To Beat Confirmation Bias To Win Your Customers - 0 views

  • Confirmation bias is the tendency to interpret information in a way that is always consistent with existing beliefs. Simply, it occurs when someone views information in a positive, affirming light, even though the information could be telling a drastically different story.
  • While confirmation bias can often be chalked up to human nature, in a business setting, failure to adequately evaluate and respond to information can be a legitimate issue for a company, and particularly its marketing team.
  • This is where confirmation bias can be dangerous because it’s easy for brands to assume customers view the company through the same lens they do and have a similar opinion of the company, but this isn’t necessarily true.
  • ...7 more annotations...
  • To beat confirmation bias, it’s vital for brands to face reality when measuring the success of their campaigns and gauging customer brand perception.
  • As a marketer, it can be easy to fall victim to simply measuring customer behavioural data (like click-through rates, ad engagement, unsubscribes, etc.). However, the best marketing teams lean on an outside-in perspective to improve customer experiences and create a better brand reputation.
  • Additionally, marketing teams would be wise to connect with potential customers who did not convert. They will be most able to point out the pain points in a marketing campaign that are dissuading buyers. It’s just as important to ask consumers why they didn’t interact with a brand as it is to ask why they do.
  • Obtaining these insights allows marketers to gauge exactly how their competitors are beating them out and, conversely, how they can improve their product and boost customer retention.
  • At the end of the day, customers want to know that their feedback is valued and has the power to drive change. Companies that choose to neglect customer feedback are prime examples of the dangers of confirmation bias.
  • It doesn’t matter how marketing teams or their company define success; it’s up to the customer.
  • For example, in industries like video tech and security, where the community is extremely tight-knit, marketing teams must have a deep understanding of their audience’s business needs to have any chance of selling to them. Brands that understand consumer perception and needs will be able to personalize messages to their target audience and create a more positive customer experience. By gauging and adapting to direct feedback, marketing teams can avoid the dangers of confirmation bias and make wholesale changes that will turn customers into brand champions.
Javier E

You're Not Going to Change Your Mind - The New York Times - 0 views

  • A troubling feature of political disagreement in the United States today is that many issues on which liberals and conservatives hold divergent views are questions not of value but of fact. Is human activity responsible for global warming? Do guns make society safer? Is immigration harmful to the economy? Though undoubtedly complicated, these questions turn on empirical evidence.
  • Unfortunately, people do not always revise their beliefs in light of new information. On the contrary, they often stubbornly maintain their views. Certain disagreements stay entrenched and polarized.
  • A common explanation is confirmation bias
  • ...11 more annotations...
  • the psychological tendency to favor information that confirms our beliefs
  • If this explanation is right, then there is a relatively straightforward solution to political polarization: We need to consciously expose ourselves to evidence that challenges our beliefs to compensate for our inclination to discount it.
  • But what if confirmation bias isn’t the only culprit? It recently struck us that confirmation bias is often conflated with “telling people what they want to hear,” which is actually a distinct phenomenon known as desirability bias, or the tendency to credit information you want to believe.
  • we decided to conduct an experiment that would isolate these biases
  • The results, which we report in a forthcoming paper in the Journal of Experimental Psychology: General, were clear and robust. Those people who received desirable evidence — polls suggesting that their preferred candidate was going to win — took note and incorporated the information into their subsequent belief.
  • In contrast, those people who received undesirable evidence barely changed their belief about which candidate was most likely to win.
  • we observed a general bias toward the desirable evidence.
  • What about confirmation bias? To our surprise, those people who received confirming evidence — polls supporting their prior belief about which candidate was most likely to win — showed no bias in favor of this information.
  • They tended to incorporate this evidence into their subsequent belief to the same extent as those people who had their prior belief disconfirmed. In other words, we observed little to no bias toward the confirming evidence.
  • Our study suggests that political belief polarization may emerge because of people’s conflicting desires, not their conflicting beliefs per se.
  • This is rather troubling, as it implies that even if we were to escape from our political echo chambers, it wouldn’t help much. Short of changing what people want to believe, we must find other ways to unify our perceptions of reality.
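The asymmetry the study reports can be pictured with a toy update rule in which evidence is weighted by its desirability rather than by whether it confirms a prior. This is a hypothetical sketch for illustration only, not the authors' analysis; the prior, poll values, and weights are invented.

```python
def update_belief(prior, evidence, weight):
    """Move a probability estimate toward the evidence by a given weight."""
    return prior + weight * (evidence - prior)

# A voter believes their preferred candidate has a 60% chance of winning
# and wants that candidate to win.
prior = 0.60

# Desirable evidence (a favorable poll) is weighted heavily...
after_good_poll = update_belief(prior, evidence=0.80, weight=0.5)
# ...while undesirable evidence (an unfavorable poll) barely registers.
after_bad_poll = update_belief(prior, evidence=0.40, weight=0.1)

print(round(after_good_poll, 2))  # 0.7  -- belief moves a lot
print(round(after_bad_poll, 2))   # 0.58 -- belief barely changes
```

Note that neither update depends on whether the poll confirms the prior, only on whether its content is wanted, which is the distinction between desirability bias and confirmation bias drawn above.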
katedriscoll

The Confirmation Bias: Why People See What They Want to See - Effectiviology - 0 views

  • The confirmation bias is a cognitive bias that causes people to search for, favor, interpret, and recall information in a way that confirms their preexisting beliefs. For example, if someone is presented with a lot of information on a certain topic, the confirmation bias can cause them to remember only the bits of information that confirm what they already thought. The confirmation bias influences people’s judgment and decision-making in many areas of life, so it’s important to understand it. As such, in the following article you will first learn more about the confirmation bias, and then see how you can reduce its influence, both in other people’s thought processes and in your own.
oliviaodon

Confirmation bias in science: how to avoid it | Ars Technica - 0 views

  • One of the most common arguments against a scientific finding is confirmation bias: the scientist or scientists only look for data that confirms a desired conclusion. Confirmation bias is remarkably common—it is used by psychics, mediums, mentalists, and homeopaths, just to name a few. As you may guess from such a list, deliberate use of confirmation bias is held in low esteem by scientists, and allowing confirmation bias to get the better of your results is regarded as a particularly sad form of incompetence. Yet, whenever science meets some ideological barrier, scientists are accused of, at best, self-deception, and, at worst, deliberate fraud.
katedriscoll

Confirmation bias - 0 views

  • In psychology and cognitive science, confirmation bias (or confirmatory bias) is a tendency to search for or interpret information in a way that confirms one's preconceptions, leading to statistical errors. Confirmation bias is a type of cognitive bias and represents an error of inductive inference toward confirmation of the hypothesis under study. Confirmation bias is a phenomenon wherein decision makers have been shown to actively seek out and assign more weight to evidence that confirms their hypothesis, and ignore or underweigh evidence that could disconfirm their hypothesis. As such, it can be thought of as a form of selection bias in collecting evidence.
katedriscoll

Confirmation Bias | Simply Psychology - 0 views

  • Confirmation Bias is the tendency to look for information that supports, rather than rejects, one’s preconceptions, typically by interpreting evidence to confirm existing beliefs while rejecting or ignoring any conflicting data (American Psychological Association).
  • experiment by Peter Wason (1960) in which the subjects were to find the experimenter’s rule for sequencing numbers. Its results showed that the subjects chose responses that supported their hypotheses while rejecting contradictory evidence, and even though their hypotheses were not correct, they became confident in them quickly (Gray, 2010, p. 356). Though such evidence of the confirmation bias has appeared in psychological literature throughout history, the term ‘confirmation bias’ was first used in a 1977 paper detailing an experimental study on the topic (Mynatt, Doherty, & Tweney, 1977).
  • This type of confirmation bias explains people’s search for evidence in a one-sided way to support their hypotheses or theories. Experiments have shown that people devise tests/questions designed to yield “yes” if their favored hypothesis were true, and ignore alternative hypotheses that are likely to give the same result. This is also known as the congruence heuristic (Baron, 2000, pp. 162-64). Though the preference for affirmative questions may not itself be a bias, experiments have shown that congruence bias does exist.
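Wason's rule-discovery task described above can be sketched in code. The experimenter's actual rule is broad (any ascending triple of numbers), while a typical subject guesses something narrower ("each number increases by 2"); purely confirmatory tests all pass, so the wrong hypothesis is never challenged. A minimal sketch; the specific triples are illustrative, not from Wason's study.

```python
def experimenter_rule(triple):
    """Wason's actual rule: any strictly ascending sequence."""
    a, b, c = triple
    return a < b < c

def subject_hypothesis(triple):
    """A typical subject's narrower guess: each number goes up by 2."""
    a, b, c = triple
    return b - a == 2 and c - b == 2

# Confirmatory testing: only try triples the hypothesis predicts are valid.
confirming_tests = [(2, 4, 6), (10, 12, 14), (1, 3, 5)]
# Every confirming test passes, so the (wrong) hypothesis goes unchallenged.
assert all(experimenter_rule(t) for t in confirming_tests)

# A disconfirming test -- a triple the hypothesis says should FAIL --
# is what actually distinguishes the two rules.
probe = (1, 2, 3)
print(subject_hypothesis(probe))   # False: the hypothesis predicts "no"
print(experimenter_rule(probe))    # True: the real rule says "yes"
```

Only the disconfirming probe reveals the gap between the two rules, which is why the one-sided, "yes"-seeking strategy described in the excerpt leaves subjects confidently wrong.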
Javier E

Cognitive Biases and the Human Brain - The Atlantic - 1 views

  • Present bias shows up not just in experiments, of course, but in the real world. Especially in the United States, people egregiously undersave for retirement—even when they make enough money to not spend their whole paycheck on expenses, and even when they work for a company that will kick in additional funds to retirement plans when they contribute.
  • When people hear the word bias, many if not most will think of either racial prejudice or news organizations that slant their coverage to favor one political position over another. Present bias, by contrast, is an example of cognitive bias—the collection of faulty ways of thinking that is apparently hardwired into the human brain. The collection is large. Wikipedia’s “List of cognitive biases” contains 185 entries, from actor-observer bias (“the tendency for explanations of other individuals’ behaviors to overemphasize the influence of their personality and underemphasize the influence of their situation … and for explanations of one’s own behaviors to do the opposite”) to the Zeigarnik effect (“uncompleted or interrupted tasks are remembered better than completed ones”)
  • If I had to single out a particular bias as the most pervasive and damaging, it would probably be confirmation bias. That’s the effect that leads us to look for evidence confirming what we already think or suspect, to view facts and ideas we encounter as further confirmation, and to discount or ignore any piece of evidence that seems to support an alternate view
  • ...48 more annotations...
  • Confirmation bias shows up most blatantly in our current political divide, where each side seems unable to allow that the other side is right about anything.
  • The whole idea of cognitive biases and faulty heuristics—the shortcuts and rules of thumb by which we make judgments and predictions—was more or less invented in the 1970s by Amos Tversky and Daniel Kahneman
  • Tversky died in 1996. Kahneman won the 2002 Nobel Prize in Economics for the work the two men did together, which he summarized in his 2011 best seller, Thinking, Fast and Slow. Another best seller, last year’s The Undoing Project, by Michael Lewis, tells the story of the sometimes contentious collaboration between Tversky and Kahneman
  • Another key figure in the field is the University of Chicago economist Richard Thaler. One of the biases he’s most linked with is the endowment effect, which leads us to place an irrationally high value on our possessions.
  • In an experiment conducted by Thaler, Kahneman, and Jack L. Knetsch, half the participants were given a mug and then asked how much they would sell it for. The average answer was $5.78. The rest of the group said they would spend, on average, $2.21 for the same mug. This flew in the face of classic economic theory, which says that at a given time and among a certain population, an item has a market value that does not depend on whether one owns it or not. Thaler won the 2017 Nobel Prize in Economics.
  • “The question that is most often asked about cognitive illusions is whether they can be overcome. The message … is not encouraging.”
  • that’s not so easy in the real world, when we’re dealing with people and situations rather than lines. “Unfortunately, this sensible procedure is least likely to be applied when it is needed most,” Kahneman writes. “We would all like to have a warning bell that rings loudly whenever we are about to make a serious error, but no such bell is available.”
  • At least with the optical illusion, our slow-thinking, analytic mind—what Kahneman calls System 2—will recognize a Müller-Lyer situation and convince itself not to trust the fast-twitch System 1’s perception
  • Kahneman and others draw an analogy based on an understanding of the Müller-Lyer illusion, two parallel lines with arrows at each end. One line’s arrows point in; the other line’s arrows point out. Because of the direction of the arrows, the latter line appears shorter than the former, but in fact the two lines are the same length.
  • Because biases appear to be so hardwired and inalterable, most of the attention paid to countering them hasn’t dealt with the problematic thoughts, judgments, or predictions themselves
  • Is it really impossible, however, to shed or significantly mitigate one’s biases? Some studies have tentatively answered that question in the affirmative.
  • what if the person undergoing the de-biasing strategies was highly motivated and self-selected? In other words, what if it was me?
  • Over an apple pastry and tea with milk, he told me, “Temperament has a lot to do with my position. You won’t find anyone more pessimistic than I am.”
  • I met with Kahneman
  • “I see the picture as unequal lines,” he said. “The goal is not to trust what I think I see. To understand that I shouldn’t believe my lying eyes.” That’s doable with the optical illusion, he said, but extremely difficult with real-world cognitive biases.
  • In this context, his pessimism relates, first, to the impossibility of effecting any changes to System 1—the quick-thinking part of our brain and the one that makes mistaken judgments tantamount to the Müller-Lyer line illusion
  • The most effective check against them, as Kahneman says, is from the outside: Others can perceive our errors more readily than we can.
  • “slow-thinking organizations,” as he puts it, can institute policies that include the monitoring of individual decisions and predictions. They can also require procedures such as checklists and “premortems,”
  • A premortem attempts to counter optimism bias by requiring team members to imagine that a project has gone very, very badly and write a sentence or two describing how that happened. Conducting this exercise, it turns out, helps people think ahead.
  • “My position is that none of these things have any effect on System 1,” Kahneman said. “You can’t improve intuition.
  • Perhaps, with very long-term training, lots of talk, and exposure to behavioral economics, what you can do is cue reasoning, so you can engage System 2 to follow rules. Unfortunately, the world doesn’t provide cues. And for most people, in the heat of argument the rules go out the window.
  • Kahneman describes an even earlier Nisbett article that showed subjects’ disinclination to believe statistical and other general evidence, basing their judgments instead on individual examples and vivid anecdotes. (This bias is known as base-rate neglect.)
  • over the years, Nisbett had come to emphasize in his research and thinking the possibility of training people to overcome or avoid a number of pitfalls, including base-rate neglect, fundamental attribution error, and the sunk-cost fallacy.
  • Nisbett’s second-favorite example is that economists, who have absorbed the lessons of the sunk-cost fallacy, routinely walk out of bad movies and leave bad restaurant meals uneaten.
  • When Nisbett asks the same question of students who have completed the statistics course, about 70 percent give the right answer. He believes this result shows, pace Kahneman, that the law of large numbers can be absorbed into System 2—and maybe into System 1 as well, even when there are minimal cues.
  • about half give the right answer: the law of large numbers, which holds that outlier results are much more frequent when the sample size (at bats, in this case) is small. Over the course of the season, as the number of at bats increases, regression to the mean is inevitable.
  • When Nisbett has to give an example of his approach, he usually brings up the baseball-phenom survey. This involved telephoning University of Michigan students on the pretense of conducting a poll about sports, and asking them why there are always several Major League batters with .450 batting averages early in a season, yet no player has ever finished a season with an average that high.
  • we’ve tested Michigan students over four years, and they show a huge increase in ability to solve problems. Graduate students in psychology also show a huge gain.”
  • “I know from my own research on teaching people how to reason statistically that just a few examples in two or three domains are sufficient to improve people’s reasoning for an indefinitely large number of events.”
  • Nisbett suggested another factor: “You and Amos specialized in hard problems for which you were drawn to the wrong answer. I began to study easy problems, which you guys would never get wrong but untutored people routinely do … Then you can look at the effects of instruction on such easy problems, which turn out to be huge.”
  • Nisbett suggested that I take “Mindware: Critical Thinking for the Information Age,” an online Coursera course in which he goes over what he considers the most effective de-biasing skills and concepts. Then, to see how much I had learned, I would take a survey he gives to Michigan undergraduates. So I did.
  • The course consists of eight lessons by Nisbett—who comes across on-screen as the authoritative but approachable psych professor we all would like to have had—interspersed with some graphics and quizzes. I recommend it. He explains the availability heuristic this way: “People are surprised that suicides outnumber homicides, and drownings outnumber deaths by fire. People always think crime is increasing” even if it’s not.
  • When I finished the course, Nisbett sent me the survey he and colleagues administer to Michigan undergrads
  • It contains a few dozen problems meant to measure the subjects’ resistance to cognitive biases
  • I got it right. Indeed, when I emailed my completed test, Nisbett replied, “My guess is that very few if any UM seniors did as well as you. I’m sure at least some psych students, at least after 2 years in school, did as well. But note that you came fairly close to a perfect score.”
  • Nevertheless, I did not feel that reading Mindware and taking the Coursera course had necessarily rid me of my biases
  • For his part, Nisbett insisted that the results were meaningful. “If you’re doing better in a testing context,” he told me, “you’ll jolly well be doing better in the real world.”
  • The New York–based NeuroLeadership Institute offers organizations and individuals a variety of training sessions, webinars, and conferences that promise, among other things, to use brain science to teach participants to counter bias. This year’s two-day summit will be held in New York next month; for $2,845, you could learn, for example, “why are our brains so bad at thinking about the future, and how do we do it better?”
  • Philip E. Tetlock, a professor at the University of Pennsylvania’s Wharton School, and his wife and research partner, Barbara Mellers, have for years been studying what they call “superforecasters”: people who manage to sidestep cognitive biases and predict future events with far more accuracy than the pundits
  • One of the most important ingredients is what Tetlock calls “the outside view.” The inside view is a product of fundamental attribution error, base-rate neglect, and other biases that are constantly cajoling us into resting our judgments and predictions on good or vivid stories instead of on data and statistics
  • In 2006, seeking to prevent another mistake of that magnitude, the U.S. government created the Intelligence Advanced Research Projects Activity (IARPA), an agency designed to use cutting-edge research and technology to improve intelligence-gathering and analysis. In 2011, IARPA initiated a program, Sirius, to fund the development of “serious” video games that could combat or mitigate what were deemed to be the six most damaging biases: confirmation bias, fundamental attribution error, the bias blind spot (the feeling that one is less biased than the average person), the anchoring effect, the representativeness heuristic, and projection bias (the assumption that everybody else’s thinking is the same as one’s own).
  • most promising are a handful of video games. Their genesis was in the Iraq War
  • Together with collaborators who included staff from Creative Technologies, a company specializing in games and other simulations, and Leidos, a defense, intelligence, and health research company that does a lot of government work, Morewedge devised Missing. Some subjects played the game, which takes about three hours to complete, while others watched a video about cognitive bias. All were tested on bias-mitigation skills before the training, immediately afterward, and then finally after eight to 12 weeks had passed.
  • “The literature on training suggests books and classes are fine entertainment but largely ineffectual. But the game has very large effects. It surprised everyone.”
  • he said he saw the results as supporting the research and insights of Richard Nisbett. “Nisbett’s work was largely written off by the field, the assumption being that training can’t reduce bias,
  • even the positive results reminded me of something Daniel Kahneman had told me. “Pencil-and-paper doesn’t convince me,” he said. “A test can be given even a couple of years later. But the test cues the test-taker. It reminds him what it’s all about.”
  • Morewedge told me that some tentative real-world scenarios along the lines of Missing have shown “promising results,” but that it’s too soon to talk about them.
  • In the future, I will monitor my thoughts and reactions as best I can
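The law-of-large-numbers point behind Nisbett's baseball-phenom survey above can be checked with a short simulation: early-season .450 averages are common for an ordinary hitter, but vanish over a full season. A minimal sketch, assuming an illustrative .300 true hitting probability and round-number at-bat counts (neither is from the article).

```python
import random

random.seed(1)  # reproducible runs

def batting_average(true_prob, at_bats):
    """Simulate `at_bats` plate appearances and return the batting average."""
    hits = sum(random.random() < true_prob for _ in range(at_bats))
    return hits / at_bats

# A genuinely .300 hitter, simulated 10,000 times at two sample sizes.
TRUE_PROB = 0.300
early = [batting_average(TRUE_PROB, 20) for _ in range(10_000)]   # ~20 at bats: early season
full = [batting_average(TRUE_PROB, 500) for _ in range(10_000)]   # ~500 at bats: full season

# Outlier .450+ averages are common early but essentially vanish by season's end.
print(sum(a >= 0.450 for a in early) / len(early))  # a noticeable fraction
print(sum(a >= 0.450 for a in full) / len(full))    # effectively zero
```

With only 20 at bats, roughly one simulated hitter in ten sits at .450 or better by chance alone; at 500 at bats the averages have regressed tightly around .300, which is the answer the statistics-trained students gave.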
katedriscoll

What Is Confirmation Bias? | Psychology Today - 0 views

  • Confirmation bias occurs from the direct influence of desire on beliefs. When people would like a certain idea or concept to be true, they end up believing it to be true. They are motivated by wishful thinking. This error leads the individual to stop gathering information when the evidence gathered so far confirms the views or prejudices one would like to be true.
  • Confirmation bias can also be found in anxious individuals, who view the world as dangerous. For example, a person with low self-esteem is highly sensitive to being ignored by other people, and they constantly monitor for signs that people might not like them. Thus, if you are worried that someone is annoyed with you, you are biased toward all the negative information about how that person acts toward you. You interpret neutral behavior as indicative of something negative.
katedriscoll

Confirmation bias | BehavioralEconomics.com | The BE Hub - 0 views

  • Confirmation bias (Wason, 1960) occurs when people seek out or evaluate information in a way that fits with their existing thinking and preconceptions. The domain of science, where theories should advance based on both falsifying and supporting evidence, has not been immune to bias, which is often associated with people processing hypotheses in ways that end up confirming them (Oswald & Grosjean, 2004). Similarly, a consumer who likes a particular brand and researches a new purchase may be motivated to seek out customer reviews on the internet that favor that brand. Confirmation bias has also been related to unmotivated processes, including primacy effects and anchoring, evident in a reliance on information that is encountered early in a process (Nickerson, 1998).
katedriscoll

Frontiers | A Digital Nudge to Counter Confirmation Bias | Big Data - 1 views

  • Information disorder in current information ecosystems arises not only from the publication of “fake news,” but also from individuals' subjective reading of news and from their propagating news to others. Sometimes the difference between real and fake information is apparent. However, often a message is written to evoke certain emotions and opinions by taking partially true base stories and injecting false statements such that the information looks realistic. In addition, the perception of the trustworthiness of news is often influenced by confirmation bias. As a result, people often believe distorted or outright incorrect news and spread such misinformation further. For example, it was shown that in the months preceding the 2016 American presidential election, organizations from both Russia and Iran ran organized efforts to create such stories and spread them on Twitter and Facebook (Cohen, 2018). It is therefore important to raise internet users' awareness of such practices. Key to this is providing users with means to understand whether information should be trusted or not.
  • In this section, we discuss how social networks increase the spread of biased news and misinformation. We discuss confirmation bias, echo chambers and other factors that may subconsciously influence a person's opinion. We show how these processes can interact to form a vicious circle that favors the rise of untrustworthy sources. Often, when an individual thinks they know something, they are satisfied by an explanation that confirms their belief, without necessarily considering all possible other explanations, and regardless of the veracity of this information. This is confirmation bias in action. Nickerson (1998) defined it as the tendency of people to both seek and interpret evidence that supports an already-held belief.
Javier E

The Backfire Effect « You Are Not So Smart - 0 views

  • corrections tended to increase the strength of the participants’ misconceptions if those corrections contradicted their ideologies. People on opposing sides of the political spectrum read the same articles and then the same corrections, and when new evidence was interpreted as threatening to their beliefs, they doubled down. The corrections backfired.
  • Once something is added to your collection of beliefs, you protect it from harm. You do it instinctively and unconsciously when confronted with attitude-inconsistent information. Just as confirmation bias shields you when you actively seek information, the backfire effect defends you when the information seeks you, when it blindsides you. Coming or going, you stick to your beliefs instead of questioning them. When someone tries to correct you, tries to dilute your misconceptions, it backfires and strengthens them instead. Over time, the backfire effect helps make you less skeptical of those things which allow you to continue seeing your beliefs and attitudes as true and proper.
  • Psychologists call stories like these narrative scripts, stories that tell you what you want to hear, stories which confirm your beliefs and give you permission to continue feeling as you already do. If believing in welfare queens protects your ideology, you accept it and move on.
  • ...8 more annotations...
  • Contradictory evidence strengthens the position of the believer. It is seen as part of the conspiracy, and missing evidence is dismissed as part of the coverup.
  • Most online battles follow a similar pattern, each side launching attacks and pulling evidence from deep inside the web to back up their positions until, out of frustration, one party resorts to an all-out ad hominem nuclear strike
  • you can never win an argument online. When you start to pull out facts and figures, hyperlinks and quotes, you are actually making the opponent feel as though they are even more sure of their position than before you started the debate. As they match your fervor, the same thing happens in your skull. The backfire effect pushes both of you deeper into your original beliefs.
  • you spend much more time considering information you disagree with than you do information you accept. Information which lines up with what you already believe passes through the mind like a vapor, but when you come across something which threatens your beliefs, something which conflicts with your preconceived notions of how the world works, you seize up and take notice. Some psychologists speculate there is an evolutionary explanation. Your ancestors paid more attention and spent more time thinking about negative stimuli than positive because bad things required a response
  • when your beliefs are challenged, you pore over the data, picking it apart, searching for weakness. The cognitive dissonance locks up the gears of your mind until you deal with it. In the process you form more neural connections, build new memories and put out effort – once you finally move on, your original convictions are stronger than ever.
  • The backfire effect is constantly shaping your beliefs and memory, keeping you consistently leaning one way or the other through a process psychologists call biased assimilation.
  • They then separated subjects into two groups; one group said they believed homosexuality was a mental illness and one did not. Each group then read the fake studies full of pretend facts and figures suggesting their worldview was wrong. On either side of the issue, after reading studies which did not support their beliefs, most people didn’t report an epiphany, a realization they’ve been wrong all these years. Instead, they said the issue was something science couldn’t understand. When asked about other topics later on, like spanking or astrology, these same people said they no longer trusted research to determine the truth. Rather than shed their belief and face facts, they rejected science altogether.
  • As social media and advertising progresses, confirmation bias and the backfire effect will become more and more difficult to overcome. You will have more opportunities to pick and choose the kind of information which gets into your head along with the kinds of outlets you trust to give you that information. In addition, advertisers will continue to adapt, not only generating ads based on what they know about you, but creating advertising strategies on the fly based on what has and has not worked on you so far. The media of the future may be delivered based not only on your preferences, but on how you vote, where you grew up, your mood, the time of day or year – every element of you which can be quantified. In a world where everything comes to you on demand, your beliefs may never be challenged.
katedriscoll

Confirmation Bias - an overview | ScienceDirect Topics - 0 views

  • Confirmation bias is a ubiquitous phenomenon, the effects of which have been traced as far back as Pythagoras’ studies of harmonic relationships in the 6th century B.C. (Nickerson, 1998), and is referenced in the writings of William Shakespeare and Francis Bacon (Risinger, Saks, Thompson, & Rosenthal, 2002). It is also a problematic phenomenon, having been implicated in “a significant fraction of the disputes, altercations, and misunderstandings that occur among individuals, groups, and nations” throughout human history, including the witch trials of Western Europe and New England, and the perpetuation of inaccurate medical diagnoses, ineffective medical treatments, and erroneous scientific theories (Nickerson, 1998, p. 175).
  • For over a century, psychologists have observed that people naturally favor information that is consistent with their beliefs or desires, and ignore or discount evidence to the contrary. In an article titled “The Mind’s Eye,” Jastrow (1899) was among the first to explain how the mind plays an active role in information processing, such that two individuals with different mindsets might interpret the same information in entirely different ways (see also Boring, 1930). Since then, a wealth of empirical research has demonstrated that confirmation bias affects how we perceive visual stimuli (e.g., Bruner & Potter, 1964; Leeper, 1935), how we gather and evaluate evidence (e.g., Lord, Ross, & Lepper, 1979; Wason, 1960), and how we judge—and behave toward—other people (e.g., Asch, 1946; Rosenthal & Jacobson, 1966; Snyder & Swann, 1978).
katedriscoll

What Is the Function of Confirmation Bias? | SpringerLink - 1 views

  • Confirmation bias is one of the most widely discussed epistemically problematic cognitions, challenging reliable belief formation and the correction of inaccurate views. Given its problematic nature, it remains unclear why the bias evolved and is still with us today. To offer an explanation, several philosophers and scientists have argued that the bias is in fact adaptive. I critically discuss three recent proposals of this kind before developing a novel alternative, what I call the ‘reality-matching account’.
  • Confirmation bias is typically viewed as an epistemically pernicious tendency. For instance, Mercier and Sperber (2017: 215) maintain that the bias impedes the formation of well-founded beliefs, reduces people’s ability to correct their mistaken views, and makes them, when they reason on their own, “become overconfident” (Mercier 2016: 110).
katedriscoll

Confirmation Bias - Misinformation and Disinformation: Thinking Critically about Inform... - 1 views

  • Confirmation Bias: "subconscious tendency to seek and interpret information and other evidence in ways that affirm our existing beliefs, ideas, expectations, and/or hypotheses. Therefore, confirmation bias is both affected by and feeds our implicit biases. It can be most entrenched around beliefs and ideas that we are strongly attached to or that provoke a strong emotional response." Source: Facing History and Ourselves (This is a link to a lesson plan).
caelengrubb

The Reason We Reason | WIRED - 0 views

  • In the early 1980s, Amos Tversky and Thomas Gilovich began sifting through years of statistics from the Philadelphia 76ers. The psychologists looked at every single shot taken by every single player, and recorded whether or not that shot had been preceded by a string of hits or misses. All told, they analyzed thousands upon thousands of field goal attempts.
  • The larger question, of course, is why confirmation bias exists.
  • After analyzing all the shots of the 76ers, the psychologists discovered that there was absolutely no evidence of "the hot hand."
  • The 76ers were shocked by the evidence.
  • Why, then, do we believe in the hot hand? Confirmation bias is to blame
  • It's at this point that our faulty reasoning mechanisms kick in, as we start ignoring the misses and focusing on the makes
  • In other words, we seek out evidence that confirms our suspicions of streakiness. The end result is that a mental fiction dominates our perception of the game.
  • Why'd they do this? Tversky and Gilovich were interested in testing the "hot hand" phenomenon, which occurs when NBA players are convinced that they're hot, on a roll, in the zone.
  • This is the sort of mental mistake that seems ripe for fixing by natural selection, since it always leads to erroneous beliefs and faulty causal theories.
  • We'd be a hell of a lot smarter if we weren't only drawn to evidence that confirms what we already believe.
  • And this leads me to a fascinating and provocative new theory of reasoning put forth by Hugo Mercier and Dan Sperber. In essence, they argue that human reason has nothing to do with finding the truth, or locating the best alternative.
  • Instead, it’s all about being able to argue with others:
  • Needless to say, this new theory paints a rather bleak portrait of human nature
  • Instead, the function of reasoning is rooted in communication, in the act of trying to persuade other people that what we believe is true. We are social animals all the way down.
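The hot-hand null result above can be illustrated with a small simulation. This is a sketch, not Tversky and Gilovich's actual analysis: it assumes a purely random shooter with a fixed hit probability, then compares the overall hit rate with the hit rate immediately after three consecutive makes. For an independent shooter the two rates come out the same, which is exactly the pattern the psychologists found in the 76ers' data — the streaks fans perceive are the ordinary clustering of random sequences.

```python
import random

random.seed(42)

def simulate_shots(n, p_hit=0.5):
    """Simulate n independent shots, each hitting with probability p_hit."""
    return [random.random() < p_hit for _ in range(n)]

def hit_rate_after_streak(shots, streak_len=3):
    """Hit rate on shots that immediately follow streak_len consecutive hits."""
    follow = [shots[i] for i in range(streak_len, len(shots))
              if all(shots[i - streak_len:i])]
    return sum(follow) / len(follow) if follow else float("nan")

shots = simulate_shots(100_000, p_hit=0.5)
overall = sum(shots) / len(shots)
after_streak = hit_rate_after_streak(shots)

# Both rates hover around 0.5: being "hot" does not raise the next shot's odds.
print(f"overall hit rate:      {overall:.3f}")
print(f"hit rate after 3 hits: {after_streak:.3f}")
```

Confirmation bias enters when an observer counts only the streaks and forgets the equally frequent runs where a "hot" shooter missed.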
katherineharron

The world sacrificed its elderly in the race to protect hospitals. The result was a cat... - 0 views

  • Three months ago, as the novel coronavirus began to gain a foothold in countries across Europe, officials in the UK said they were still confident that the risk to the British public remained low.
  • but at the time there were just 13 confirmed cases and no deaths in the UK. While the government ordered hospitals to prepare for an influx of patients, its advice to some of the country's most vulnerable people -- elderly residents of care or nursing homes -- was that they were "very unlikely" to be infected.
  • By May 1, of the 33,365 total confirmed deaths in England and Wales, at least 12,526 -- or 38% -- were care home residents, according to the latest estimates from the Office for National Statistics (ONS).
  • The UK is not alone. Many other nations were slow to respond to the threat at care home facilities, and the consequences have been devastating.
  • Comparing death tolls can be difficult: some countries have separate data covering elderly care homes, while others include facilities for those with disabilities. Some countries do not include in their data those residents who die in hospitals, some have regional variation​, and some have no data at all.
  • There had been 1,661 coronavirus deaths among care home residents out of 3,395 total coronavirus deaths in Sweden by May 14, or 49%, according to LTCcovid's report.
  • 19 residents out of 110 had died in the past two months, but only five were confirmed Covid-19 deaths -- the rest were "undetermined," he said. He said he thought there had been "slight under-reporting" of deaths in the UK because of a lack of testing, and said the situation had been "harrowing."
  • A similar story played out in France, where coronavirus fatalities among care home residents in all settings make up more than half of all coronavirus deaths recorded as of May 18, according to health ministry data used in the LTCcovid report.
  • Elderly care sector professionals and care home workers published a letter to Health Minister Olivier Véran on March 20 expressing alarm and requesting 500,000 masks per week in affected areas, to which he agreed.
  • By March 24, the Spanish army was drafted in to help and found "abandoned" ​care home residents dead in their beds, according to Defense Minister Margarita Robles. The government said at its briefing the next day that the information had been passed to the public prosecutor, who was investigating. New care home guidelines called for extended isolation measures, but some homes said they would now have to send all staff home to comply.
  • She said the DHSC was prioritizing testing in care homes and had provided £3.2 billion ($3.9 billion) to local authorities to ease pressure on services including care homes, as well as an additional £600 million ($730 million) for homes last week. "Since the start of this pandemic we have worked to ensure our care homes and frontline care workforce get the support they need. Almost two thirds of care homes have not had an outbreak and deaths in all settings, including care homes, are falling."
  • Raffaele Antonelli Incalzi, head of the Italian geriatric society SIGG, said in a statement in early April that care homes were "biological time bombs," in part because overcrowded hospitals were moving elderly patients to unprepared homes.
  • The UK initially did not record care home deaths. While the latest official ONS data for England and Wales shows that 38% of coronavirus deaths occurred in care homes, LTCcovid said the figure could be far higher.
  • LTCcovid's report found that 3,890 of Canada's 4,740 coronavirus-linked deaths took place among care home residents as of May 8, or 82%, and Health Canada told CNN the percentage was nearly 80% on May 19. Canada's largest province, Ontario, has announced an independent inquiry.
  • Of the 247 total Covid-19 deaths in South Korea that had been confirmed as of April 30, 84 were care home residents -- a share of 34%. No large care home outbreaks have occurred since the measures were implemented.
tongoscar

Africa confirms first case of coronavirus | News | DW | 14.02.2020 - 0 views

shared by tongoscar on 19 Feb 20
  • Egypt on Friday confirmed its first case of coronavirus, making it the first known infection in Africa. A joint statement from Egypt's health and population ministries said the patient was not from Egypt. It did not say what nationality the person was or where they had been prior to arriving in Egypt.
  • Facebook cancels event in California
  • With over 120 new deaths reported in China, the virus death toll mounted to nearly 1,400 on Friday. Over 64,000 cases have been diagnosed globally. This weekend, a WHO-led joint mission will begin investigating the spread and severity of the outbreak in China, WHO chief Tedros Adhanom Ghebreyesus said.
  • Health authorities say it can be difficult to distinguish between the two illnesses. Currently, there are 15 confirmed cases of the coronavirus infection in the US. Some 600 people are currently under quarantine.
blythewallick

People show confirmation bias even about which way dots are moving -- ScienceDaily - 0 views

  • People have a tendency to interpret new information in a way that supports their pre-existing beliefs, a phenomenon known as confirmation bias.
  • Now, researchers reporting in Current Biology on September 13 have shown that people will do the same thing even when the decision they've made pertains to a choice that is rather less consequential: which direction a series of dots is moving and whether the average of a series of numbers is greater or less than 50.
  • "Confirmation biases have previously only been established in the domains of higher cognition or subjective preferences," for example in individuals' preferences for one consumer product or another, says Tobias Donner from University Medical Center Hamburg-Eppendorf (UKE), Germany. "It was rather striking for us to see that people displayed clear signs of confirmation bias when judging on sensory input that we expected to be subjectively neutral to them."
  • The experiments showed that participants, after making an initial call based on the first movie, were more likely to use subsequent evidence that was consistent with their initial choice to make a final judgment the second time around.
  • "Contrary to a common phrase, first impression does not have to be the last impression," Talluri says. "Such impressions, or choices, lead us to evaluate information in their favor. By acknowledging the fact that we selectively prioritize information agreeing with our previous choices, we could attempt to actively suppress this bias, at least in cases of critical significance, like evaluating job candidates or making policies that impact a large section of the society."
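The dot-motion finding above — that evidence consistent with an initial call is weighted more than inconsistent evidence — can be sketched with a toy model. This is an illustrative assumption, not the study's actual task or analysis: an observer makes an initial direction judgment from one noisy evidence stream, then, when integrating a second stream, down-weights samples that conflict with that initial choice. The biased observer ends up sticking with its first call more often than an unbiased integrator of the same evidence.

```python
import random

random.seed(0)

def noisy_evidence(direction, n, noise=2.5):
    """n noisy samples of a true direction (+1 or -1)."""
    return [direction + random.gauss(0, noise) for _ in range(n)]

def choices(first, second, down_weight):
    """Initial call from the first stream; samples in the second stream
    that conflict with that call are scaled by down_weight (1.0 = unbiased)."""
    initial = 1 if sum(first) > 0 else -1
    weighted = sum(s if (s > 0) == (initial > 0) else s * down_weight
                   for s in second)
    final = 1 if sum(first) + weighted > 0 else -1
    return initial, final

def p_stick(down_weight, trials=20_000):
    """Probability that the final judgment matches the initial call."""
    stick = 0
    for _ in range(trials):
        direction = random.choice([1, -1])
        first = noisy_evidence(direction, 5)
        second = noisy_evidence(direction, 5)
        initial, final = choices(first, second, down_weight)
        stick += initial == final
    return stick / trials

print(f"unbiased observer sticks with initial call:   {p_stick(1.0):.2f}")
print(f"biased observer (conflicting evidence halved): {p_stick(0.5):.2f}")
```

The noise level, sample counts, and 0.5 down-weighting factor are all hypothetical choices for illustration; the qualitative effect — selective weighting entrenches the first impression — is what the experiment demonstrated.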
Javier E

Why the Past 10 Years of American Life Have Been Uniquely Stupid - The Atlantic - 0 views

  • Social scientists have identified at least three major forces that collectively bind together successful democracies: social capital (extensive social networks with high levels of trust), strong institutions, and shared stories.
  • Social media has weakened all three.
  • gradually, social-media users became more comfortable sharing intimate details of their lives with strangers and corporations. As I wrote in a 2019 Atlantic article with Tobias Rose-Stockwell, they became more adept at putting on performances and managing their personal brand—activities that might impress others but that do not deepen friendships in the way that a private phone conversation will.
  • the stage was set for the major transformation, which began in 2009: the intensification of viral dynamics.
  • Before 2009, Facebook had given users a simple timeline––a never-ending stream of content generated by their friends and connections, with the newest posts at the top and the oldest ones at the bottom
  • That began to change in 2009, when Facebook offered users a way to publicly “like” posts with the click of a button. That same year, Twitter introduced something even more powerful: the “Retweet” button, which allowed users to publicly endorse a post while also sharing it with all of their followers.
  • “Like” and “Share” buttons quickly became standard features of most other platforms.
  • Facebook developed algorithms to bring each user the content most likely to generate a “like” or some other interaction, eventually including the “share” as well.
  • Later research showed that posts that trigger emotions––especially anger at out-groups––are the most likely to be shared.
  • By 2013, social media had become a new game, with dynamics unlike those in 2008. If you were skillful or lucky, you might create a post that would “go viral” and make you “internet famous”
  • If you blundered, you could find yourself buried in hateful comments. Your posts rode to fame or ignominy based on the clicks of thousands of strangers, and you in turn contributed thousands of clicks to the game.
  • This new game encouraged dishonesty and mob dynamics: Users were guided not just by their true preferences but by their past experiences of reward and punishment,
  • As a social psychologist who studies emotion, morality, and politics, I saw this happening too. The newly tweaked platforms were almost perfectly designed to bring out our most moralistic and least reflective selves. The volume of outrage was shocking.
  • It was just this kind of twitchy and explosive spread of anger that James Madison had tried to protect us from as he was drafting the U.S. Constitution.
  • The Framers of the Constitution were excellent social psychologists. They knew that democracy had an Achilles’ heel because it depended on the collective judgment of the people, and democratic communities are subject to “the turbulency and weakness of unruly passions.”
  • The key to designing a sustainable republic, therefore, was to build in mechanisms to slow things down, cool passions, require compromise, and give leaders some insulation from the mania of the moment while still holding them accountable to the people periodically, on Election Day.
  • The tech companies that enhanced virality from 2009 to 2012 brought us deep into Madison’s nightmare.
  • a less quoted yet equally important insight, about democracy’s vulnerability to triviality.
  • Madison notes that people are so prone to factionalism that “where no substantial occasion presents itself, the most frivolous and fanciful distinctions have been sufficient to kindle their unfriendly passions and excite their most violent conflicts.”
  • Social media has both magnified and weaponized the frivolous.
  • It’s not just the waste of time and scarce attention that matters; it’s the continual chipping-away of trust.
  • a democracy depends on widely internalized acceptance of the legitimacy of rules, norms, and institutions.
  • when citizens lose trust in elected leaders, health authorities, the courts, the police, universities, and the integrity of elections, then every decision becomes contested; every election becomes a life-and-death struggle to save the country from the other side
  • The most recent Edelman Trust Barometer (an international measure of citizens’ trust in government, business, media, and nongovernmental organizations) showed stable and competent autocracies (China and the United Arab Emirates) at the top of the list, while contentious democracies such as the United States, the United Kingdom, Spain, and South Korea scored near the bottom (albeit above Russia).
  • The literature is complex—some studies show benefits, particularly in less developed democracies—but the review found that, on balance, social media amplifies political polarization; foments populism, especially right-wing populism; and is associated with the spread of misinformation.
  • When people lose trust in institutions, they lose trust in the stories told by those institutions. That’s particularly true of the institutions entrusted with the education of children.
  • Facebook and Twitter make it possible for parents to become outraged every day over a new snippet from their children’s history lessons––and math lessons and literature selections, and any new pedagogical shifts anywhere in the country
  • The motives of teachers and administrators come into question, and overreaching laws or curricular reforms sometimes follow, dumbing down education and reducing trust in it further.
  • young people educated in the post-Babel era are less likely to arrive at a coherent story of who we are as a people, and less likely to share any such story with those who attended different schools or who were educated in a different decade.
  • former CIA analyst Martin Gurri predicted these fracturing effects in his 2014 book, The Revolt of the Public. Gurri’s analysis focused on the authority-subverting effects of information’s exponential growth, beginning with the internet in the 1990s. Writing nearly a decade ago, Gurri could already see the power of social media as a universal solvent, breaking down bonds and weakening institutions everywhere it reached.
  • he notes a constructive feature of the pre-digital era: a single “mass audience,” all consuming the same content, as if they were all looking into the same gigantic mirror at the reflection of their own society. I
  • The digital revolution has shattered that mirror, and now the public inhabits those broken pieces of glass. So the public isn’t one thing; it’s highly fragmented, and it’s basically mutually hostile
  • Facebook, Twitter, YouTube, and a few other large platforms unwittingly dissolved the mortar of trust, belief in institutions, and shared stories that had held a large and diverse secular democracy together.
  • I think we can date the fall of the tower to the years between 2011 (Gurri’s focal year of “nihilistic” protests) and 2015, a year marked by the “great awokening” on the left and the ascendancy of Donald Trump on the right.
  • Twitter can overpower all the newspapers in the country, and stories cannot be shared (or at least trusted) across more than a few adjacent fragments—so truth cannot achieve widespread adherence.
  • After Babel, nothing really means anything anymore––at least not in a way that is durable and on which people widely agree.
  • Politics After Babel
  • “Politics is the art of the possible,” the German statesman Otto von Bismarck said in 1867. In a post-Babel democracy, not much may be possible.
  • The ideological distance between the two parties began increasing faster in the 1990s. Fox News and the 1994 “Republican Revolution” converted the GOP into a more combative party.
  • So cross-party relationships were already strained before 2009. But the enhanced virality of social media thereafter made it more hazardous to be seen fraternizing with the enemy or even failing to attack the enemy with sufficient vigor.
  • What changed in the 2010s? Let’s revisit that Twitter engineer’s metaphor of handing a loaded gun to a 4-year-old. A mean tweet doesn’t kill anyone; it is an attempt to shame or punish someone publicly while broadcasting one’s own virtue, brilliance, or tribal loyalties. It’s more a dart than a bullet
  • from 2009 to 2012, Facebook and Twitter passed out roughly 1 billion dart guns globally. We’ve been shooting one another ever since.
  • “devoted conservatives,” comprised 6 percent of the U.S. population.
  • the warped “accountability” of social media has also brought injustice—and political dysfunction—in three ways.
  • First, the dart guns of social media give more power to trolls and provocateurs while silencing good citizens.
  • a small subset of people on social-media platforms are highly concerned with gaining status and are willing to use aggression to do so.
  • Across eight studies, Bor and Petersen found that being online did not make most people more aggressive or hostile; rather, it allowed a small number of aggressive people to attack a much larger set of victims. Even a small number of jerks were able to dominate discussion forums,
  • Additional research finds that women and Black people are harassed disproportionately, so the digital public square is less welcoming to their voices.
  • Second, the dart guns of social media give more power and voice to the political extremes while reducing the power and voice of the moderate majority.
  • The “Hidden Tribes” study, by the pro-democracy group More in Common, surveyed 8,000 Americans in 2017 and 2018 and identified seven groups that shared beliefs and behaviors.
  • Social media has given voice to some people who had little previously, and it has made it easier to hold powerful people accountable for their misdeeds
  • The group furthest to the left, the “progressive activists,” comprised 8 percent of the population. The progressive activists were by far the most prolific group on social media: 70 percent had shared political content over the previous year. The devoted conservatives followed, at 56 percent.
  • These two extreme groups are similar in surprising ways. They are the whitest and richest of the seven groups, which suggests that America is being torn apart by a battle between two subsets of the elite who are not representative of the broader society.
  • they are the two groups that show the greatest homogeneity in their moral and political attitudes.
  • likely a result of thought-policing on social media:
  • political extremists don’t just shoot darts at their enemies; they spend a lot of their ammunition targeting dissenters or nuanced thinkers on their own team.
  • Finally, by giving everyone a dart gun, social media deputizes everyone to administer justice with no due process. Platforms like Twitter devolve into the Wild West, with no accountability for vigilantes.
  • Enhanced-virality platforms thereby facilitate massive collective punishment for small or imagined offenses, with real-world consequences, including innocent people losing their jobs and being shamed into suicide
  • we don’t get justice and inclusion; we get a society that ignores context, proportionality, mercy, and truth.
  • Since the tower fell, debates of all kinds have grown more and more confused. The most pervasive obstacle to good thinking is confirmation bias, which refers to the human tendency to search only for evidence that confirms our preferred beliefs
  • search engines were supercharging confirmation bias, making it far easier for people to find evidence for absurd beliefs and conspiracy theories
  • The most reliable cure for confirmation bias is interaction with people who don’t share your beliefs. They confront you with counterevidence and counterargument.
  • In his book The Constitution of Knowledge, Jonathan Rauch describes the historical breakthrough in which Western societies developed an “epistemic operating system”—that is, a set of institutions for generating knowledge from the interactions of biased and cognitively flawed individuals
  • English law developed the adversarial system so that biased advocates could present both sides of a case to an impartial jury.
  • Newspapers full of lies evolved into professional journalistic enterprises, with norms that required seeking out multiple sides of a story, followed by editorial review, followed by fact-checking.
  • Universities evolved from cloistered medieval institutions into research powerhouses, creating a structure in which scholars put forth evidence-backed claims with the knowledge that other scholars around the world would be motivated to gain prestige by finding contrary evidence.
  • Part of America’s greatness in the 20th century came from having developed the most capable, vibrant, and productive network of knowledge-producing institutions in all of human history
  • But this arrangement, Rauch notes, “is not self-maintaining; it relies on an array of sometimes delicate social settings and understandings, and those need to be understood, affirmed, and protected.”
  • This, I believe, is what happened to many of America’s key institutions in the mid-to-late 2010s. They got stupider en masse because social media instilled in their members a chronic fear of getting darted
  • it was so pervasive that it established new behavioral norms backed by new policies seemingly overnight
  • Participants in our key institutions began self-censoring to an unhealthy degree, holding back critiques of policies and ideas—even those presented in class by their students—that they believed to be ill-supported or wrong.
  • The stupefying process plays out differently on the right and the left because their activist wings subscribe to different narratives with different sacred values.
  • The “Hidden Tribes” study tells us that the “devoted conservatives” score highest on beliefs related to authoritarianism. They share a narrative in which America is eternally under threat from enemies outside and subversives within; they see life as a battle between patriots and traitors.
  • they are psychologically different from the larger group of “traditional conservatives” (19 percent of the population), who emphasize order, decorum, and slow rather than radical change.
  • The traditional punishment for treason is death, hence the battle cry on January 6: “Hang Mike Pence.”
  • Right-wing death threats, many delivered by anonymous accounts, are proving effective in cowing traditional conservatives
  • The wave of threats delivered to dissenting Republican members of Congress has similarly pushed many of the remaining moderates to quit or go silent, giving us a party ever more divorced from the conservative tradition, constitutional responsibility, and reality.
  • The stupidity on the right is most visible in the many conspiracy theories spreading across right-wing media and now into Congress.
  • The Democrats have also been hit hard by structural stupidity, though in a different way. In the Democratic Party, the struggle between the progressive wing and the more moderate factions is open and ongoing, and often the moderates win.
  • The problem is that the left controls the commanding heights of the culture: universities, news organizations, Hollywood, art museums, advertising, much of Silicon Valley, and the teachers’ unions and teaching colleges that shape K–12 education. And in many of those institutions, dissent has been stifled:
  • Liberals in the late 20th century shared a belief that the sociologist Christian Smith called the “liberal progress” narrative, in which America used to be horrifically unjust and repressive, but, thanks to the struggles of activists and heroes, has made (and continues to make) progress toward realizing the noble promise of its founding.
  • It is also the view of the “traditional liberals” in the “Hidden Tribes” study (11 percent of the population), who have strong humanitarian values, are older than average, and are largely the people leading America’s cultural and intellectual institutions.
  • when the newly viralized social-media platforms gave everyone a dart gun, it was younger progressive activists who did the most shooting, and they aimed a disproportionate number of their darts at these older liberal leaders.
  • Confused and fearful, the leaders rarely challenged the activists or their nonliberal narrative in which life at every institution is an eternal battle among identity groups over a zero-sum pie, and the people on top got there by oppressing the people on the bottom. This new narrative is rigidly egalitarian––focused on equality of outcomes, not of rights or opportunities. It is unconcerned with individual rights.
  • The universal charge against people who disagree with this narrative is not “traitor”; it is “racist,” “transphobe,” “Karen,” or some related scarlet letter marking the perpetrator as one who hates or harms a marginalized group.
  • The punishment that feels right for such crimes is not execution; it is public shaming and social death.
  • anyone on Twitter had already seen dozens of examples teaching the basic lesson: Don’t question your own side’s beliefs, policies, or actions. And when traditional liberals go silent, as so many did in the summer of 2020, the progressive activists’ more radical narrative takes over as the governing narrative of an organization.
  • This is why so many epistemic institutions seemed to “go woke” in rapid succession that year and the next, beginning with a wave of controversies and resignations at The New York Times and other newspapers, and continuing on to social-justice pronouncements by groups of doctors and medical associations
  • The problem is structural. Thanks to enhanced-virality social media, dissent is punished within many of our institutions, which means that bad ideas get elevated into official policy.
  • In a 2018 interview, Steve Bannon, the former adviser to Donald Trump, said that the way to deal with the media is “to flood the zone with shit.” He was describing the “firehose of falsehood” tactic pioneered by Russian disinformation programs to keep Americans confused, disoriented, and angry.
  • artificial intelligence is close to enabling the limitless spread of highly believable disinformation. The AI program GPT-3 is already so good that you can give it a topic and a tone and it will spit out as many essays as you like, typically with perfect grammar and a surprising level of coherence.
  • Renée DiResta, the research manager at the Stanford Internet Observatory, explained that spreading falsehoods—whether through text, images, or deep-fake videos—will quickly become inconceivably easy. (She co-wrote the essay with GPT-3.)
  • American factions won’t be the only ones using AI and social media to generate attack content; our adversaries will too.
  • In the 20th century, America’s shared identity as the country leading the fight to make the world safe for democracy was a strong force that helped keep the culture and the polity together.
  • In the 21st century, America’s tech companies have rewired the world and created products that now appear to be corrosive to democracy, obstacles to shared understanding, and destroyers of the modern tower.
  • What changes are needed?
  • I can suggest three categories of reforms––three goals that must be achieved if democracy is to remain viable in the post-Babel era.
  • We must harden democratic institutions so that they can withstand chronic anger and mistrust, reform social media so that it becomes less socially corrosive, and better prepare the next generation for democratic citizenship in this new age.
  • Harden Democratic Institutions
  • We must reform key institutions so that they can continue to function even if levels of anger, misinformation, and violence increase far above those we have today.
  • Reforms should reduce the outsize influence of angry extremists and make legislators more responsive to the average voter in their district.
  • One example of such a reform is to end closed party primaries, replacing them with a single, nonpartisan, open primary from which the top several candidates advance to a general election that also uses ranked-choice voting.
  • A second way to harden democratic institutions is to reduce the power of either political party to game the system in its favor, for example by drawing its preferred electoral districts or selecting the officials who will supervise elections.
  • These jobs should all be done in a nonpartisan way.
  • Reform Social Media
  • Social media’s empowerment of the far left, the far right, domestic trolls, and foreign agents is creating a system that looks less like democracy and more like rule by the most aggressive.
  • It is within our power to reduce social media’s ability to dissolve trust and foment structural stupidity. Reforms should limit the platforms’ amplification of the aggressive fringes while giving more voice to what More in Common calls “the exhausted majority.”
  • The main problem with social media is not that some people post fake or toxic stuff; it’s that fake and outrage-inducing content can now attain a level of reach and influence that was not possible before.
  • Perhaps the biggest single change that would reduce the toxicity of existing platforms would be user verification as a precondition for gaining the algorithmic amplification that social media offers.
  • One of the first orders of business should be compelling the platforms to share their data and their algorithms with academic researchers.
  • Prepare the Next Generation
  • Childhood has become more tightly circumscribed in recent generations––with less opportunity for free, unstructured play; less unsupervised time outside; more time online. Whatever else the effects of these shifts, they have likely impeded the development of abilities needed for effective self-governance for many young adults
  • Depression makes people less likely to want to engage with new people, ideas, and experiences. Anxiety makes new things seem more threatening. As these conditions have risen and as the lessons on nuanced social behavior learned through free play have been delayed, tolerance for diverse viewpoints and the ability to work out disputes have diminished among many young people
  • Students did not just say that they disagreed with visiting speakers; some said that those lectures would be dangerous, emotionally devastating, a form of violence. Because rates of teen depression and anxiety have continued to rise into the 2020s, we should expect these views to continue in the generations to follow, and indeed to become more severe.
  • The most important change we can make to reduce the damaging effects of social media on children is to delay entry until they have passed through puberty.
  • The age should be raised to at least 16, and companies should be held responsible for enforcing it.
  • Let them out to play. Stop starving children of the experiences they most need to become good citizens: free play in mixed-age groups of children with minimal adult supervision.
  • While social media has eroded the art of association throughout society, it may be leaving its deepest and most enduring marks on adolescents. A surge in rates of anxiety, depression, and self-harm among American teens began suddenly in the early 2010s. (The same thing happened to Canadian and British teens, at the same time.) The cause is not known, but the timing points to social media as a substantial contributor—the surge began just as the large majority of American teens became daily users of the major platforms.
  • What would it be like to live in Babel in the days after its destruction? We know. It is a time of confusion and loss. But it is also a time to reflect, listen, and build.
  • In recent years, Americans have started hundreds of groups and organizations dedicated to building trust and friendship across the political divide, including BridgeUSA, Braver Angels (on whose board I serve), and many others listed at BridgeAlliance.us. We cannot expect Congress and the tech companies to save us. We must change ourselves and our communities.
  • When we look away from our dysfunctional federal government, disconnect from social media, and talk with our neighbors directly, things seem more hopeful. Most Americans in the More in Common report are members of the “exhausted majority,” which is tired of the fighting and is willing to listen to the other side and compromise. Most Americans now see that social media is having a negative impact on the country, and are becoming more aware of its damaging effects on children.