
Home/ TOK Friends/ Group items tagged heuristic


kaylynfreeman

Heuristics and Biases, Related But Not the Same | Psychology Today - 0 views

  • By treating them as the same, we miss nuances that are important for understanding human decision-making.
  • Biases—whether hardwired into us by evolution, learned through socialization or direct experience, or a function of genetically influenced traits—represent predispositions to favor a given conclusion over other conclusions. Biases might therefore be considered the leanings, priorities, and inclinations that influence our decisions[2].
  • Heuristics are mental shortcuts that allow us to make decisions more quickly, frugally, and/or accurately than if we considered additional information. They are derived from experience and formal learning and are open to continuous updates based on new experiences and information. Therefore, heuristics represent the strategies we employ to filter and attend to information[3].
  • ...6 more annotations...
  • This preference, which is perhaps a strong one, may have resulted in a bias to maintain the status quo. You rely on heuristics to help identify your deodorant (usually by sight) and you add it to your virtual cart and place your order. In this instance, your bias influenced your preference toward your current deodorant, and your heuristic helped you to identify it. Potential stinkiness crisis averted.
  • In that case, you will likely be motivated to make a purchasing decision consistent with your strong bias (i.e., look to purchase it from a different vendor, maintaining the status quo with your deodorant). Thus, in this scenario, you decide to look elsewhere.
  • At this step, the availability heuristic is likely to guide your decision, causing you to navigate to an alternative site that quickly comes to mind[6].
  • Your heuristics will help you select an alternative product that meets some criteria.
  • satisficing heuristic (opting for the first product that looks good enough), a similarity heuristic (opting for the product that looks closest to your current deodorant) or some other heuristic to help you select the product you decide to order.
  • The question, though, is often whether your biases and heuristics are aiding or inhibiting the ecological rationality of your decision, and that will vary from situation to situation.
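The satisficing and maximizing strategies mentioned in the annotations above can be sketched in code. This is an illustrative sketch, not anything from the article; the product data and threshold are made up:

```python
# Illustrative sketch: satisficing vs. exhaustive choice.
# A satisficer takes the first option that clears an aspiration threshold,
# ignoring everything after it; a maximizer scores every option.

def satisfice(options, is_good_enough):
    """Return the first option meeting the threshold, else None."""
    for option in options:
        if is_good_enough(option):
            return option
    return None

def maximize(options, score):
    """Score every option and return the best one."""
    return max(options, key=score)

# Hypothetical deodorant listings: (name, rating, price)
products = [
    ("Brand A", 3.9, 6.50),
    ("Brand B", 4.4, 5.00),   # first product rated >= 4.0
    ("Brand C", 4.8, 9.00),   # highest-rated overall
]

quick_pick = satisfice(products, lambda p: p[1] >= 4.0)
best_pick = maximize(products, lambda p: p[1])
print(quick_pick[0])  # Brand B: good enough, found with less search
print(best_pick[0])   # Brand C: best, but required scanning every option
```

The satisficer stops searching as soon as the threshold is met, which is exactly the "first product that looks good enough" behavior the article describes.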
katedriscoll

Frontiers | A Neural Network Framework for Cognitive Bias | Psychology - 0 views

  • Human decision-making shows systematic simplifications and deviations from the tenets of rationality (‘heuristics’) that may lead to suboptimal decisional outcomes (‘cognitive biases’). There are currently three prevailing theoretical perspectives on the origin of heuristics and cognitive biases: a cognitive-psychological, an ecological and an evolutionary perspective. However, these perspectives are mainly descriptive and none of them provides an overall explanatory framework for the underlying mechanisms of cognitive biases. To enhance our understanding of cognitive heuristics and biases we propose a neural network framework for cognitive biases, which explains why our brain systematically tends to default to heuristic (‘Type 1’) decision making. We argue that many cognitive biases arise from intrinsic brain mechanisms that are fundamental for the working of biological neural networks. To substantiate our viewpoint, we discern and explain four basic neural network principles: (1) Association, (2) Compatibility, (3) Retainment, and (4) Focus. These principles are inherent to (all) neural networks which were originally optimized to perform concrete biological, perceptual, and motor functions. They form the basis for our inclinations to associate and combine (unrelated) information, to prioritize information that is compatible with our present state (such as knowledge, opinions, and expectations), to retain given information that sometimes could better be ignored, and to focus on dominant information while ignoring relevant information that is not directly activated. The supposed mechanisms are complementary and not mutually exclusive. For different cognitive biases they may all contribute in varying degrees to distortion of information. The present viewpoint not only complements the earlier three viewpoints, but also provides a unifying and binding framework for many cognitive bias phenomena.
  • The cognitive-psychological (or heuristics and biases) perspective (Evans, 2008; Kahneman and Klein, 2009) attributes cognitive biases to limitations in the available data and in the human information processing capacity (Simon, 1955; Broadbent, 1958; Kahneman, 1973, 2003; Norman and Bobrow, 1975)
johnsonel7

Opinion | Do heuristics help us make good decisions in uncertain times? - 0 views

  • Do heuristics, the shortcuts that the brain takes, support efficient decision making, or do they impede it?
  • Humans have neither unlimited resources nor unlimited time to make decisions. So the brain has developed smart heuristics, shortcuts for making efficient decisions.
  • One other key thought put forward by Gigerenzer is that there is a big difference between risk and uncertainty. We are dealing with risk when we know all the alternatives, outcomes and their probabilities. We are dealing with uncertainty when we don’t know all the alternatives, outcomes or their probabilities.
  • ...1 more annotation...
  • It has been found that if doctors are trained how to translate conditional probabilities into natural frequencies, the ability of the doctors to communicate the risk to their patients goes up dramatically. Just imagine the huge difference this can make to customer satisfaction in the healthcare business.
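The conditional-probability-to-natural-frequency translation mentioned above can be sketched as follows. The test characteristics (prevalence, sensitivity, false-positive rate) are illustrative assumptions for a hypothetical screening test, not figures from the article:

```python
# Sketch of translating conditional probabilities into natural frequencies,
# the technique the annotation above attributes to training doctors.
# All numbers below are illustrative assumptions.

prevalence = 0.01           # 1% of patients have the disease
sensitivity = 0.90          # P(positive | disease)
false_positive_rate = 0.09  # P(positive | no disease)

# Natural-frequency framing: imagine 1,000 concrete patients.
n = 1000
with_disease = prevalence * n                               # 10 people
true_positives = sensitivity * with_disease                 # 9 test positive
false_positives = false_positive_rate * (n - with_disease)  # ~89 test positive

# The doctor can now say: "Of the ~98 people who test positive,
# only 9 actually have the disease."
ppv = true_positives / (true_positives + false_positives)
print(f"{true_positives:.0f} of {true_positives + false_positives:.0f} "
      f"positives are real (PPV about {ppv:.0%})")
```

The point of the reframing is that "9 out of 98 positives are real" is far easier for patients to grasp than a chain of conditional probabilities, even though the arithmetic is identical.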
dicindioha

The Value of Good Old Hard Work | Huffington Post - 2 views

  • It’s no knock on the kids that come here, per se. But I do think there’s a shift happening in our youth culture where “work” is almost a dirty word. They just don’t know how. Maybe it’s because everything in life has been reduced to a computer screen and a smart phone? Maybe it’s because we’ve created an environment where we see labor as something of a lower position in life? Maybe it’s just the nature of the cycle of students we hire from year to year?
  • They know they’ve dedicated a part of their own labor to a bigger mission, and it’s a valuable asset to an organization when all the members have a certain level of “buy in.”
  •  
    It is interesting that this writer says hard work may not be valued as much anymore. It reminded me of our discussion of the effort heuristic: we feel better about things that require greater effort to obtain, and we believe hard work will get us places. It is also interesting that some think our generation prefers finding the easier way to get things, and values that because it seems "smarter," which could itself be a kind of heuristic that shortcuts work.
Javier E

The Fallacy of the 'I Turned Out Fine' Argument - The New York Times - 0 views

  • Most of the messages centered on one single, repeated theme: “I was smacked as a child and I turned out just fine.”
  • It makes sense, doesn’t it? Many of us think, “If I had something happen to me and nothing went wrong, then surely it’s fine for everyone else.”
  • The “I turned out just fine” argument is popular. It means that based on our personal experience we know what works and what doesn’t. But the argument has fatal flaws.
  • ...12 more annotations...
  • It’s what’s known as an anecdotal fallacy. This fallacy, in simple terms, states that “I’m not negatively affected (as far as I can tell), so it must be O.K. for everyone.”
  • We are relying on a sample size of one. Ourselves, or someone we know. And we are applying that result to everyone
  • It relies on a decision-making shortcut known as the availability heuristic. Related to the anecdotal fallacy, it’s where we draw on information that is immediately available to us when we make a judgment call.
  • studies show that the availability heuristic is a cognitive bias that can cloud us from making accurate decisions utilizing all the information available. It blinds us to our own prejudices
  • It dismisses well-substantiated, scientific evidence. To say “I turned out fine” is an arrogant dismissal of an alternative evidence-based view
  • The statement closes off discourse and promotes a single perspective that is oblivious to alternatives that may be more enlightened. Anecdotal evidence often undermines scientific results, to our detriment.
  • It leads to entrenched attitudes.
  • Perhaps an inability to engage with views that run counter to our own suggests that we did not turn out quite so “fine.”
  • Where is the threshold for what constitutes having turned out fine? If it means we avoided prison, we may be setting the bar too low. Gainfully employed and have a family of our own? Still a pretty basic standard
  • It is as reasonable to say “I turned out fine because of this” as it is to say “I turned out fine in spite of this.”
  • To claim on this basis that spanking a child is fine means that we fall victim to anecdote, rely on our availability heuristic (thereby dismissing all broader data to the contrary), dismiss alternate views, and fail to learn and progress by engaging with a challenging idea.
  • We expect our children to embrace learning and to progress in their thinking as they grow older. They deserve to expect the same from us.
Cecilia Ergueta

What Does 'Cultural Appropriation' Mean? - The Atlantic - 1 views

  • Some people correctly perceive something like a frat party full of blackface as wrongheaded, file it under "cultural appropriation," and adopt the erroneous heuristic that any appropriation of a culture is wrongheaded. When the chef who staffs the dining hall at their college serves sushi, they see injustice where there is none. Conversely, other folks see a protest over sushi, perceive that it is absurd, see it filed under cultural appropriation, and adopt the bad heuristic that any grievance lodged under that heading is bullshit. Later, when their Facebook stream unearths a story about blackface headlined, "These Frat Boys Are Guilty of Cultural Appropriation," they erroneously conclude that nothing wrongheaded occurred. Perhaps they even ignorantly add a dismissive comment, exacerbating the canard that racial animus or dehumanization is a nonissue.
caelengrubb

Looking inward in an era of 'fake news': Addressing cognitive bias | YLAI Network - 0 views

  • In an era when everyone seems eager to point out instances of “fake news,” it is easy to forget that knowing how we make sense of the news is as important as knowing how to spot incorrect or biased content
  • While the ability to analyze the credibility of a source and the veracity of its content remains an essential and often-discussed aspect of news literacy, it is equally important to understand how we as news consumers engage with and react to the information we find online, in our feeds, and on our apps
  • People process information they receive from the news in the same way they process all information around them — in the shortest, quickest way possible
  • ...11 more annotations...
  • When we consider how we engage with the news, some shortcuts we may want to pay close attention to, and reflect carefully on, are cognitive biases.
  • These shortcuts, also called heuristics, streamline our problem-solving process and help us make relatively quick decisions.
  • In fact, without these heuristics, it would be impossible for us to process all the information we receive daily. However, the use of these shortcuts can lead to “blind spots,” or unintentional ways we respond to information that can have negative consequences for how we engage with, digest, and share the information we encounter
  • Confirmation bias is the tendency to seek out and value information that confirms our pre-existing beliefs while discarding information that proves our ideas wrong.
  • Cognitive biases are best described as glitches in how we process information
  • Echo chamber effect refers to a situation in which we are primarily exposed to information, people, events, and ideas that already align with our point of view.
  • Anchoring bias, also known as “anchoring,” refers to people’s tendency to consider the first piece of information they receive about a topic as the most reliable
  • The framing effect is what happens when we make decisions based on how information is presented or discussed, rather than its actual substance.
  • Fluency heuristic occurs when a piece of information is deemed more valuable because it is easier to process or recall
  • Everyone operates under one or more cognitive biases. So, when searching for and reading the news (or other information), it is important to be aware of how these biases might shape how we make sense of this information.
  • In conclusion, we may not be able to control the content of the news — whether it is fake, reliable, or somewhere in between — but we can learn to be aware of how we respond to it and adjust our evaluations of the news accordingly.
knudsenlu

New Neuroscience Study Reveals What Worry About Money Does To Your Brain - 0 views

  • These findings suggest more evidence for why heuristics, or rules of thumb, are both so effective and ubiquitous. Consider a few from the financial world: Pay yourself first. Sell in May and go away. Buy and hold. Only use the principal. Buy low, sell high. The list could go on and on.
  • It’s no surprise that financial stress is draining. Anyone who’s worried over a budget or fretted about making the wrong investment choice or major purchase can recognize the telltale signs of such anxiety. But what’s actually happening to your brain when you feel under such pressure?
  • New research conducted by neuroscientist Sam Barnett, co-founder of ThinkAlike Laboratories and sponsored by Northwestern Mutual, offers a rare glimpse at the brain while it is making financial decisions—and finds that anxiety actually impairs our ability to make sound decisions.
  • ...1 more annotation...
  • “Our brain was designed to make binary decisions—fight or flight—and avoid dangers,” Barnett says. But modern life with its ample choices and options doesn’t match how our brains naturally process information—explaining some ultimately self-defeating reactions such as impulsivity or inaction.
Javier E

Cognitive Biases and the Human Brain - The Atlantic - 1 views

  • Present bias shows up not just in experiments, of course, but in the real world. Especially in the United States, people egregiously undersave for retirement—even when they make enough money to not spend their whole paycheck on expenses, and even when they work for a company that will kick in additional funds to retirement plans when they contribute.
  • When people hear the word bias, many if not most will think of either racial prejudice or news organizations that slant their coverage to favor one political position over another. Present bias, by contrast, is an example of cognitive bias—the collection of faulty ways of thinking that is apparently hardwired into the human brain. The collection is large. Wikipedia’s “List of cognitive biases” contains 185 entries, from actor-observer bias (“the tendency for explanations of other individuals’ behaviors to overemphasize the influence of their personality and underemphasize the influence of their situation … and for explanations of one’s own behaviors to do the opposite”) to the Zeigarnik effect (“uncompleted or interrupted tasks are remembered better than completed ones”)
  • If I had to single out a particular bias as the most pervasive and damaging, it would probably be confirmation bias. That’s the effect that leads us to look for evidence confirming what we already think or suspect, to view facts and ideas we encounter as further confirmation, and to discount or ignore any piece of evidence that seems to support an alternate view
  • ...48 more annotations...
  • Confirmation bias shows up most blatantly in our current political divide, where each side seems unable to allow that the other side is right about anything.
  • The whole idea of cognitive biases and faulty heuristics—the shortcuts and rules of thumb by which we make judgments and predictions—was more or less invented in the 1970s by Amos Tversky and Daniel Kahneman
  • Tversky died in 1996. Kahneman won the 2002 Nobel Prize in Economics for the work the two men did together, which he summarized in his 2011 best seller, Thinking, Fast and Slow. Another best seller, last year’s The Undoing Project, by Michael Lewis, tells the story of the sometimes contentious collaboration between Tversky and Kahneman
  • Another key figure in the field is the University of Chicago economist Richard Thaler. One of the biases he’s most linked with is the endowment effect, which leads us to place an irrationally high value on our possessions.
  • In an experiment conducted by Thaler, Kahneman, and Jack L. Knetsch, half the participants were given a mug and then asked how much they would sell it for. The average answer was $5.78. The rest of the group said they would spend, on average, $2.21 for the same mug. This flew in the face of classic economic theory, which says that at a given time and among a certain population, an item has a market value that does not depend on whether one owns it or not. Thaler won the 2017 Nobel Prize in Economics.
  • “The question that is most often asked about cognitive illusions is whether they can be overcome. The message … is not encouraging.”
  • Kahneman and others draw an analogy based on an understanding of the Müller-Lyer illusion, two parallel lines with arrows at each end. One line’s arrows point in; the other line’s arrows point out. Because of the direction of the arrows, the latter line appears shorter than the former, but in fact the two lines are the same length.
  • At least with the optical illusion, our slow-thinking, analytic mind—what Kahneman calls System 2—will recognize a Müller-Lyer situation and convince itself not to trust the fast-twitch System 1’s perception
  • But that’s not so easy in the real world, when we’re dealing with people and situations rather than lines. “Unfortunately, this sensible procedure is least likely to be applied when it is needed most,” Kahneman writes. “We would all like to have a warning bell that rings loudly whenever we are about to make a serious error, but no such bell is available.”
  • Because biases appear to be so hardwired and inalterable, most of the attention paid to countering them hasn’t dealt with the problematic thoughts, judgments, or predictions themselves
  • Is it really impossible, however, to shed or significantly mitigate one’s biases? Some studies have tentatively answered that question in the affirmative.
  • what if the person undergoing the de-biasing strategies was highly motivated and self-selected? In other words, what if it was me?
  • I met with Kahneman over an apple pastry and tea with milk. He told me, “Temperament has a lot to do with my position. You won’t find anyone more pessimistic than I am.”
  • “I see the picture as unequal lines,” he said. “The goal is not to trust what I think I see. To understand that I shouldn’t believe my lying eyes.” That’s doable with the optical illusion, he said, but extremely difficult with real-world cognitive biases.
  • In this context, his pessimism relates, first, to the impossibility of effecting any changes to System 1—the quick-thinking part of our brain and the one that makes mistaken judgments tantamount to the Müller-Lyer line illusion
  • The most effective check against them, as Kahneman says, is from the outside: Others can perceive our errors more readily than we can.
  • “slow-thinking organizations,” as he puts it, can institute policies that include the monitoring of individual decisions and predictions. They can also require procedures such as checklists and “premortems,”
  • A premortem attempts to counter optimism bias by requiring team members to imagine that a project has gone very, very badly and write a sentence or two describing how that happened. Conducting this exercise, it turns out, helps people think ahead.
  • “My position is that none of these things have any effect on System 1,” Kahneman said. “You can’t improve intuition. Perhaps, with very long-term training, lots of talk, and exposure to behavioral economics, what you can do is cue reasoning, so you can engage System 2 to follow rules. Unfortunately, the world doesn’t provide cues. And for most people, in the heat of argument the rules go out the window.”
  • Kahneman describes an even earlier Nisbett article that showed subjects’ disinclination to believe statistical and other general evidence, basing their judgments instead on individual examples and vivid anecdotes. (This bias is known as base-rate neglect.)
  • over the years, Nisbett had come to emphasize in his research and thinking the possibility of training people to overcome or avoid a number of pitfalls, including base-rate neglect, fundamental attribution error, and the sunk-cost fallacy.
  • Nisbett’s second-favorite example is that economists, who have absorbed the lessons of the sunk-cost fallacy, routinely walk out of bad movies and leave bad restaurant meals uneaten.
  • When Nisbett has to give an example of his approach, he usually brings up the baseball-phenom survey. This involved telephoning University of Michigan students on the pretense of conducting a poll about sports, and asking them why there are always several Major League batters with .450 batting averages early in a season, yet no player has ever finished a season with an average that high.
  • About half give the right answer: the law of large numbers, which holds that outlier results are much more frequent when the sample size (at bats, in this case) is small. Over the course of the season, as the number of at bats increases, regression to the mean is inevitable.
  • When Nisbett asks the same question of students who have completed the statistics course, about 70 percent give the right answer. He believes this result shows, pace Kahneman, that the law of large numbers can be absorbed into System 2—and maybe into System 1 as well, even when there are minimal cues.
  • we’ve tested Michigan students over four years, and they show a huge increase in ability to solve problems. Graduate students in psychology also show a huge gain.”
  • , “I know from my own research on teaching people how to reason statistically that just a few examples in two or three domains are sufficient to improve people’s reasoning for an indefinitely large number of events.”
  • Nisbett suggested another factor: “You and Amos specialized in hard problems for which you were drawn to the wrong answer. I began to study easy problems, which you guys would never get wrong but untutored people routinely do … Then you can look at the effects of instruction on such easy problems, which turn out to be huge.”
  • Nisbett suggested that I take “Mindware: Critical Thinking for the Information Age,” an online Coursera course in which he goes over what he considers the most effective de-biasing skills and concepts. Then, to see how much I had learned, I would take a survey he gives to Michigan undergraduates. So I did.
  • The course consists of eight lessons by Nisbett—who comes across on-screen as the authoritative but approachable psych professor we all would like to have had—interspersed with some graphics and quizzes. I recommend it. He explains the availability heuristic this way: “People are surprised that suicides outnumber homicides, and drownings outnumber deaths by fire. People always think crime is increasing” even if it’s not.
  • When I finished the course, Nisbett sent me the survey he and colleagues administer to Michigan undergrads
  • It contains a few dozen problems meant to measure the subjects’ resistance to cognitive biases
  • I got it right. Indeed, when I emailed my completed test, Nisbett replied, “My guess is that very few if any UM seniors did as well as you. I’m sure at least some psych students, at least after 2 years in school, did as well. But note that you came fairly close to a perfect score.”
  • Nevertheless, I did not feel that reading Mindware and taking the Coursera course had necessarily rid me of my biases
  • For his part, Nisbett insisted that the results were meaningful. “If you’re doing better in a testing context,” he told me, “you’ll jolly well be doing better in the real world.”
  • The New York–based NeuroLeadership Institute offers organizations and individuals a variety of training sessions, webinars, and conferences that promise, among other things, to use brain science to teach participants to counter bias. This year’s two-day summit will be held in New York next month; for $2,845, you could learn, for example, “why are our brains so bad at thinking about the future, and how do we do it better?”
  • Philip E. Tetlock, a professor at the University of Pennsylvania’s Wharton School, and his wife and research partner, Barbara Mellers, have for years been studying what they call “superforecasters”: people who manage to sidestep cognitive biases and predict future events with far more accuracy than the pundits
  • One of the most important ingredients is what Tetlock calls “the outside view.” The inside view is a product of fundamental attribution error, base-rate neglect, and other biases that are constantly cajoling us into resting our judgments and predictions on good or vivid stories instead of on data and statistics
  • Most promising are a handful of video games. Their genesis was in the Iraq War.
  • In 2006, seeking to prevent another mistake of that magnitude, the U.S. government created the Intelligence Advanced Research Projects Activity (IARPA), an agency designed to use cutting-edge research and technology to improve intelligence-gathering and analysis. In 2011, IARPA initiated a program, Sirius, to fund the development of “serious” video games that could combat or mitigate what were deemed to be the six most damaging biases: confirmation bias, fundamental attribution error, the bias blind spot (the feeling that one is less biased than the average person), the anchoring effect, the representativeness heuristic, and projection bias (the assumption that everybody else’s thinking is the same as one’s own).
  • Together with collaborators who included staff from Creative Technologies, a company specializing in games and other simulations, and Leidos, a defense, intelligence, and health research company that does a lot of government work, Morewedge devised Missing. Some subjects played the game, which takes about three hours to complete, while others watched a video about cognitive bias. All were tested on bias-mitigation skills before the training, immediately afterward, and then finally after eight to 12 weeks had passed.
  • “The literature on training suggests books and classes are fine entertainment but largely ineffectual. But the game has very large effects. It surprised everyone.”
  • He said he saw the results as supporting the research and insights of Richard Nisbett. “Nisbett’s work was largely written off by the field, the assumption being that training can’t reduce bias.”
  • even the positive results reminded me of something Daniel Kahneman had told me. “Pencil-and-paper doesn’t convince me,” he said. “A test can be given even a couple of years later. But the test cues the test-taker. It reminds him what it’s all about.”
  • Morewedge told me that some tentative real-world scenarios along the lines of Missing have shown “promising results,” but that it’s too soon to talk about them.
  • In the future, I will monitor my thoughts and reactions as best I can
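The law-of-large-numbers point behind Nisbett's batting-average survey, quoted in the annotations above, can be checked with a quick simulation. The true batting average, at-bat counts, and number of simulated players are illustrative assumptions:

```python
# Simulation of the batting-average example: a true .300 hitter can look
# like a .450 hitter over a handful of at bats, but essentially never over
# a full season, because outlier averages require small samples.
import random

random.seed(1)
TRUE_AVG = 0.300

def batting_average(at_bats):
    """Simulate a .300 hitter's observed average over the given at bats."""
    hits = sum(random.random() < TRUE_AVG for _ in range(at_bats))
    return hits / at_bats

early = [batting_average(20) for _ in range(1000)]    # ~20 at bats early on
season = [batting_average(500) for _ in range(1000)]  # full-season at bats

early_share = sum(a >= 0.450 for a in early) / 1000
season_share = sum(a >= 0.450 for a in season) / 1000
print("share of .450+ averages after 20 at bats: ", early_share)
print("share of .450+ averages after 500 at bats:", season_share)
```

A noticeable fraction of the 20-at-bat samples clear .450 while essentially none of the 500-at-bat samples do, which is the answer Nisbett's statistics students are expected to give.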
julia rhodes

Q&A: Why It's Sometimes Rational to Be Irrational - Wired Science - 0 views

  • I think it’s safe to claim that magical thinking emerges from basic underlying cognitive mechanisms — shortcuts that we take, biases, heuristics.
  • A more controversial claim, which is very possible, is that magical thinking is an exaptation. An exaptation is some adaptation that emerged as a byproduct of something else, but became so useful that evolution started to select for aspects of that in addition to the initial thing.
  • Some people have argued that belief in god probably emerged from dualism, anthropomorphism and teleological reasoning. But then it became such a useful idea on its own that now we’re evolving to have a stronger belief in god, because belief in god is evolutionarily adaptive.
  • ...4 more annotations...
  • There are a lot of dangers to magical thinking.
  • It can lead to fatalism if you think your life is completely controlled by supernatural forces.
  • But if used carefully, magical thinking can have benefits, such as a sense of control or a sense of meaning in life. So I take this somewhat paradoxical stance of using irrationality rationally.
  • Feeling lucky is irrational because the charm or wish itself isn’t lucky. But feeling lucky gives you a sense of control, which increases your confidence and increases your performance in various challenges. So it’s rational to hold onto that irrational belief, on some level, because it can benefit you — even if the charm can’t.
caelengrubb

Believing in Overcoming Cognitive Biases | Journal of Ethics | American Medical Associa... - 0 views

  • Cognitive biases contribute significantly to diagnostic and treatment errors
  • A 2016 review of their roles in decision making lists 4 domains of concern for physicians: gathering evidence, interpreting evidence, taking action, and evaluating decisions.
  • Confirmation bias is the selective gathering and interpretation of evidence consistent with current beliefs and the neglect of evidence that contradicts them.
  • ...14 more annotations...
  • It can occur when a physician refuses to consider alternative diagnoses once an initial diagnosis has been established, despite contradicting data, such as lab results. This bias leads physicians to see what they want to see
  • Anchoring bias is closely related to confirmation bias and comes into play when interpreting evidence. It refers to physicians’ practices of prioritizing information and data that support their initial impressions, even when first impressions are wrong
  • When physicians move from deliberation to action, they are sometimes swayed by emotional reactions rather than rational deliberation about risks and benefits. This is called the affect heuristic, and, while heuristics can often serve as efficient approaches to problem solving, they can sometimes lead to bias
  • Further down the treatment pathway, outcomes bias can come into play. This bias refers to the practice of believing that good or bad results are always attributable to prior decisions, even when there is no valid reason to do so
  • The dual-process theory, a cognitive model of reasoning, can be particularly relevant in matters of clinical decision making
  • This theory is based on the argument that we use 2 different cognitive systems, intuitive and analytical, when reasoning. The former is quick and uses information that is readily available; the latter is slower and more deliberate.
  • Consideration should be given to the difficulty physicians face in employing analytical thinking exclusively. Beyond constraints of time, information, and resources, many physicians are also likely to be sleep deprived, work in an environment full of distractions, and be required to respond quickly while managing heavy cognitive loads
  • Simply increasing physicians’ familiarity with the many types of cognitive biases—and how to avoid them—may be one of the best strategies to decrease bias-related errors
  • The same review suggests that cognitive forcing strategies may also have some success in improving diagnostic outcomes
  • Afterwards, the resident physicians were debriefed on both case-specific details and on cognitive forcing strategies, interviewed, and asked to complete a written survey. The results suggested that resident physicians further along in their training (ie, postgraduate year three) gained more awareness of cognitive strategies than resident physicians in earlier years of training, suggesting that this tool could be more useful after a certain level of training has been completed
  • A 2013 study examined the effect of a 3-part, 1-year curriculum on recognition and knowledge of cognitive biases and debiasing strategies in second-year residents
  • Cognitive biases in clinical practice have a significant impact on care, often in negative ways. They sometimes manifest as physicians seeing what they want to see rather than what is actually there. Or they come into play when physicians make snap decisions and then prioritize evidence that supports their conclusions, as opposed to drawing conclusions from evidence
  • Fortunately, cognitive psychology provides insight into how to prevent biases. Guided reflection and cognitive forcing strategies deflect bias through close examination of our own thinking processes.
  • During medical education and consistently thereafter, we must provide physicians with a full appreciation of the cost of biases and the potential benefits of combatting them.
johnsonel7

Why first impressions matter even more for groups - Quartz at Work - 0 views

  • First impressions are powerful. We all remember being instantly drawn to or put off by someone we’ve just met.
  • imagine you see a video of a skateboarder landing an especially impressive trick. If she’s the first of a group of competitors, you’ll probably assume the rest of the group is equally skilled. But if you know this skater standout was the third to compete, it’s less likely to influence your impression of the rest of the group.
  • “Labeling someone or something ‘first’ can have a huge influence not only on your judgment of them, but on your judgment of others that are associated with them,”
  • “The sequence, no matter how arbitrary, is going to have some influence on your judgment,” Touré-Tillery says. “And this happens without you knowing that it’s having that kind of influence on your judgment.”
  • The researchers were struck by the power of the first-member heuristic. “It’s a very strong effect statistically,” she says, and one that seems to endure across contexts. So the researchers wanted to understand if there were any circumstances in which people would stop relying on this mental shortcut.
  • “This is something that we want to be mindful of, that thinking of someone as the first member of the group can influence what you then expect the rest of the group to do. And expectations are often self-fulfilling.”
blythewallick

Recognizing Strangers | Psychology Today - 0 views

  • Benoit Monin, assistant professor of psychology at Stanford University, showed college students 80 photos of faces, then asked them which ones they recognized from among the 40 they'd seen in an earlier session. The more attractive the photo (as rated by another group of students) the more likely it was to be recognized—regardless of whether the face had been seen before.
  • "The face's attractiveness actually changes your perception of your past," in this case, the perception of whether you've seen the face before. The shortcut may lead to errors, but it may also help us manage our busy lives, says Monin. "We tend to like familiar things, so it makes perfect sense that over time we would use liking as a clue to familiarity."
  • In what he calls the "warm-glow heuristic," people consider their affinity for a specific person or place as an indicator of familiarity. As with other mental shortcuts, people resort to this heuristic when they lack enough data on which to base their decisions.
  • In a second session, he showed them an entirely new set of words and asked which words were familiar from the earlier, bogus session. Subjects were more likely to think they'd seen positive words—such as "charm" and "glory"—than either negative or neutral words that appear with the same frequency in English.
Javier E

Why Baseball Is Obsessed With the Book 'Thinking, Fast and Slow' - The New York Times - 0 views

  • In Teaford’s case, the scouting evaluation was predisposed to a mental shortcut called the representativeness heuristic, which was first defined by the psychologists Daniel Kahneman and Amos Tversky. In such cases, an assessment is heavily influenced by what is believed to be the standard or the ideal.
  • Kahneman, a professor emeritus at Princeton University and a winner of the Nobel Prize in economics in 2002, later wrote “Thinking, Fast and Slow,” a book that has become essential among many of baseball’s front offices and coaching staffs.
  • “Pretty much wherever I go, I’m bothering people, ‘Have you read this?’” said Mejdal, now an assistant general manager with the Baltimore Orioles.
  • There aren’t many explicit references to baseball in “Thinking, Fast and Slow,” yet many executives swear by it
  • “From coaches to front office people, some get back to me and say this has changed their life. They never look at decisions the same way.
  • A few, though, swear by it. Andrew Friedman, the president of baseball operations for the Dodgers, recently cited the book as having “a real profound impact,” and said he reflects back on it when evaluating organizational processes. Keith Law, a former executive for the Toronto Blue Jays, wrote the book “Inside Game” — an examination of bias and decision-making in baseball — that was inspired by “Thinking, Fast and Slow.”
  • “As the decision tree in baseball has changed over time, this helps all of us better understand why it needed to change,” Mozeliak wrote in an email. He said that was especially true when “working in a business that many decisions are based on what we see, what we remember, and what is intuitive to our thinking.”
  • The central thesis of Kahneman’s book is the interplay between each mind’s System 1 and System 2, which he described as a “psychodrama with two characters.”
  • System 1 is a person’s instinctual response — one that can be enhanced by expertise but is automatic and rapid. It seeks coherence and will apply relevant memories to explain events.
  • System 2, meanwhile, is invoked for more complex, thoughtful reasoning — it is characterized by slower, more rational analysis but is prone to laziness and fatigue.
  • Kahneman wrote that when System 2 is overloaded, System 1 could make an impulse decision, often at the expense of self-control
  • No area of baseball is more susceptible to bias than scouting, in which organizations aggregate information from disparate sources:
  • “The independent opinion aspect is critical to avoid the groupthink and be aware of momentum,”
  • Matt Blood, the director of player development for the Orioles, first read “Thinking, Fast and Slow” as a Cardinals area scout nine years ago and said that he still consults it regularly. He collaborated with a Cardinals analyst to develop his own scouting algorithm as a tripwire to mitigate bias
  • Mejdal himself fell victim to the trap of the representativeness heuristic when he started with the Cardinals in 2005
mcginnisca

Donald Trump Just Called for Ending All Muslim Immigration to the US | VICE | United St... - 0 views

  • Monday afternoon, the Trump campaign issued a press release that, amid an increasingly Islamophobic climate in the US and abroad, called for a blanket ban on any Muslim immigration—a position so starkly bigoted that the two-paragraph statement went viral on Twitter in a matter of moments. (Some users even questioned whether it was real, but it's as real as everything in this universe.)
  • "Donald J. Trump is calling for a total and complete shutdown of Muslims entering the United States until our country's representatives can figure out what is going on," the release begins, leaving it unclear what exactly Trump thinks could possibly be "going on." An infiltration of the country by ISIS that the candidate has alluded to? A hostile population of American-born Muslims?
  • Trump goes on to discuss the "hatred" Muslims apparently have for Americans, or America, or something. "Where this hatred comes from and why we will have to determine," Trump says in the statement. "Until we are able to determine and understand this problem and the dangerous threat it poses, our country cannot be the victims of horrendous attacks by people that believe only in Jihad, and have no sense of reason or respect for human life." How the government could "determine" the source of this alleged hatred isn't explained, nor does Trump address how he or anyone else might put a stop to it.
  • The release cites a poll from something called the Center for Security Policy that claims 25 percent of Muslims surveyed said they were OK with violence against Americans and 51 percent "agreed that Muslims in America should have the choice of being governed according to Shariah." Those numbers sound too awful to be true, and there's evidence that they aren't—Georgetown's Bridge Initiative, which studies Islamophobia in America, has called the poll into question and noted that the CSP's founder Frank Gaffney once accused General David Petraeus, of all people, of "submission" to Islamic law.
  • the latest CNN poll had put The Donald in the lead in Iowa, a key early voting state, though another poll that used different sampling techniques showed Cruz ahead of Trump.
mcginnisca

Visiting an Anti-Muslim Hate Group at the Peak of America's Islamophobia | VICE | Unite... - 0 views

  • Who is the enemy and what is the enemy's 'Threat Doctrine?'"
  • "national security expert," where her advice tends to be "get rid of the Muslims."
  • They Must Be Stopped: Why We Must Defeat Radical Islam and How We Can Do It, and Because They Hate: A Survivor of Islamic Terror Warns
  • The purpose of the group is to "[promote] national security and [defeat] terrorism"—two goals which are, in the group's view, intrinsically threatened by the Islamic faith. The rhetoric of their meetings doesn't suggest that the problem is radical Islam, but Islam itself.
  • Just hours after the attacks in Paris, a mosque in St. Petersburg, Florida received a voicemail from a man who said he planned to shoot all Muslims, including children, in the head.
  • Act for America has flourished
  • the basic doctrine of the Quran, which is a personal, moral code for Muslims—as a threat to American security. It doesn't matter that the majority of American mass shootings are committed by young, white males (and often fanatical Christians), or that gun regulations play a role in terrorism-related events. Here, the group's mission is to "educate and effect change"—meaning, the Muslims have to go.
  • Yerushalmi claimed that 80 percent of mosques in the United States are "strictly Sharia," which he equated to Muslims following the same dogma as Al-Qaeda terrorists.
  • "the mythical 'moderate' Muslims who embrace traditional Islam but want a peaceful coexistence with the West is effectively non-existent."
  • By his logic, when a Muslim attends an American mosque, he is not only learning a violent doctrine but is also susceptible to be recruited by ISIS
  • Yerushalmi was suggesting further alienation. "Syrian immigrants, when they come here, where are they going to go? To those mosques; the reservoir of support is there."
  • "The threat is real!" Yerushalmi concluded. "Sharia is not a peaceful, feel-good Islam."
  • "If my Roman Catholic church was preaching death to Muslims, Jews, everyone else, I'm sure it would be closed down with a blink of an eye," she continued. "So this is where we have to stop the growth of mosques!"
  • some members believe that the White House "is controlled by Muslims."
  • "They don't want peace and prosperity. They want to live by Sharia," said an angry old man.
  • President Obama delivered a speech responding to the San Bernardino shootings, calling for religious tolerance. He harshly condemned those who wish to discriminate against others based on their religion and addressed the plight of Muslim Americans who are currently enduring the ongoing Islamophobic backlash
oliviaodon

A scientific revolution? - 0 views

  • Puzzle-solving science, according to Kuhn, can therefore trigger a scientific revolution as scientists struggle to explain these anomalies and develop a novel basic theory to incorporate them into the existing body of knowledge. After an extended period of upheaval, in which followers of the new theory storm the bastions of accepted dogma, the old paradigm is gradually replaced.
  • biology is heading towards a similar scientific revolution that may shatter one of its most central paradigms. The discovery of a few small proteins with anomalous behaviour is about to overcome a central tenet of molecular biology: that information flows unidirectionally from the gene to the protein to the phenotype. It started with the discovery that prions, a class of small proteins that can exist in different forms, cause a range of highly debilitating diseases. This sparked further research
  • Scientific revolutions are still rare in biology, given that the field, unlike astronomy or physics, is relatively young.
  • The idea that all living beings stem from a primordial cell dating back two billion years is, in my opinion, a true paradigm. It does not have a heuristic value, unlike paradigms in physics such as gravitation or Einstein's famous equation, but it has a fundamental aspect.
Javier E

The Book Bench: Is Self-Knowledge Overrated? : The New Yorker - 1 views

  • It’s impossible to overstate the influence of Kahneman and Tversky. Like Darwin, they helped to dismantle a longstanding myth of human exceptionalism. Although we’d always seen ourselves as rational creatures—this was our Promethean gift—it turns out that human reason is rather feeble, easily overwhelmed by ancient instincts and lazy biases. The mind is a deeply flawed machine.
  • there is a subtle optimism lurking in all of Kahneman’s work: it is the hope that self-awareness is a form of salvation, that if we know about our mental mistakes, we can avoid them. One day, we will learn to equally weigh losses and gains; science can help us escape from the cycle of human error. As Kahneman and Tversky noted in the final sentence of their classic 1974 paper, “A better understanding of these heuristics and of the biases to which they lead could improve judgments and decisions in situations of uncertainty.” Unfortunately, such hopes appear to be unfounded. Self-knowledge isn’t a cure for irrationality; even when we know why we stumble, we still find a way to fall.
  • self-knowledge is surprisingly useless. Teaching people about the hazards of multitasking doesn’t lead to less texting in the car; learning about the weakness of the will doesn’t increase the success of diets; knowing that most people are overconfident about the future doesn’t make us more realistic. The problem isn’t that we’re stupid—it’s that we’re so damn stubborn
  • Kahneman has given us a new set of labels for our shortcomings. But his greatest legacy, perhaps, is also his bleakest: By categorizing our cognitive flaws, documenting not just our errors but also their embarrassing predictability, he has revealed the hollowness of a very ancient aspiration. Knowing thyself is not enough. Not even close.
kushnerha

BBC - Future - Why does walking through doorways make us forget? - 0 views

  • We’ve all done it. Run upstairs to get your keys, but forget that it’s the keys you’re looking for once you reach the bedroom. Open the fridge door and reach for the middle shelf, only to realise you can’t remember why you opened the fridge in the first place. Or wait for a moment to interrupt a friend, only to find that the burning issue that made you want to interrupt has vanished from your mind.
  • It’s known as the “Doorway Effect”, and it reveals some important features of how our minds are organised. Understanding this might help us appreciate those temporary moments of forgetfulness as more than just an annoyance
  • “What are you doing today?” she asks the first. “I’m putting brick after sodding brick on top of another,” sighs the first. “What are you doing today?” she asks the second. “I’m building a wall,” is the simple reply. But the third builder swells with pride when asked, and replies: “I’m building a cathedral!”
  • Maybe you heard that story as encouragement to think of the big picture, but to the psychologist in you the important moral is that any action has to be thought of at multiple levels if you are going to carry it out successfully. The third builder might have the most inspiring view of their day-job, but nobody can build a cathedral without figuring out how to successfully put one brick on top of another like the first builder.
  • As we move through our days our attention shifts between these levels – from our goals and ambitions, to plans and strategies, and to the lowest levels, our concrete actions. When things are going well, often in familiar situations, we keep our attention on what we want and how we do it seems to take care of itself. If you’re a skilled driver then you manage the gears, indicators and wheel automatically, and your attention is probably caught up in the less routine business of navigating the traffic or talking to your passengers. When things are less routine we have to shift our attention to the details of what we’re doing, taking our minds off the bigger picture for a moment.
  • The way our attention moves up and down the hierarchy of action is what allows us to carry out complex behaviours, stitching together a coherent plan over multiple moments, in multiple places or requiring multiple actions.
  • The Doorway Effect occurs when our attention moves between levels, and it reflects the reliance of our memories – even memories for what we were about to do – on the environment we’re in.
  • Imagine that we’re going upstairs to get our keys and forget that it is the keys we came for as soon as we enter the bedroom. Psychologically, what has happened is that the plan (“Keys!”) has been forgotten even in the middle of implementing a necessary part of the strategy (“Go to bedroom!”). Probably the plan itself is part of a larger plan (“Get ready to leave the house!”), which is part of plans on a wider and wider scale (“Go to work!”, “Keep my job!”, “Be a productive and responsible citizen”, or whatever). Each scale requires attention at some point. Somewhere in navigating this complex hierarchy the need for keys popped into mind, and like a circus performer setting plates spinning on poles, your attention focussed on it long enough to construct a plan, but then moved on to the next plate
  • And sometimes spinning plates fall. Our memories, even for our goals, are embedded in webs of associations. That can be the physical environment in which we form them, which is why revisiting our childhood home can bring back a flood of previously forgotten memories, or it can be the mental environment – the set of things we were just thinking about when that thing popped into mind.
  • The Doorway Effect occurs because we change both the physical and mental environments, moving to a different room and thinking about different things. That hastily thought up goal, which was probably only one plate among the many we’re trying to spin, gets forgotten when the context changes.