Home/ TOK Friends/ Group items tagged kahneman

Javier E

Cognitive Biases and the Human Brain - The Atlantic - 1 views

  • Present bias shows up not just in experiments, of course, but in the real world. Especially in the United States, people egregiously undersave for retirement—even when they make enough money to not spend their whole paycheck on expenses, and even when they work for a company that will kick in additional funds to retirement plans when they contribute.
  • When people hear the word bias, many if not most will think of either racial prejudice or news organizations that slant their coverage to favor one political position over another. Present bias, by contrast, is an example of cognitive bias—the collection of faulty ways of thinking that is apparently hardwired into the human brain. The collection is large. Wikipedia’s “List of cognitive biases” contains 185 entries, from actor-observer bias (“the tendency for explanations of other individuals’ behaviors to overemphasize the influence of their personality and underemphasize the influence of their situation … and for explanations of one’s own behaviors to do the opposite”) to the Zeigarnik effect (“uncompleted or interrupted tasks are remembered better than completed ones”).
  • If I had to single out a particular bias as the most pervasive and damaging, it would probably be confirmation bias. That’s the effect that leads us to look for evidence confirming what we already think or suspect, to view facts and ideas we encounter as further confirmation, and to discount or ignore any piece of evidence that seems to support an alternate view
  • ...48 more annotations...
  • Confirmation bias shows up most blatantly in our current political divide, where each side seems unable to allow that the other side is right about anything.
  • The whole idea of cognitive biases and faulty heuristics—the shortcuts and rules of thumb by which we make judgments and predictions—was more or less invented in the 1970s by Amos Tversky and Daniel Kahneman
  • Tversky died in 1996. Kahneman won the 2002 Nobel Prize in Economics for the work the two men did together, which he summarized in his 2011 best seller, Thinking, Fast and Slow. Another best seller, last year’s The Undoing Project, by Michael Lewis, tells the story of the sometimes contentious collaboration between Tversky and Kahneman.
  • Another key figure in the field is the University of Chicago economist Richard Thaler. One of the biases he’s most linked with is the endowment effect, which leads us to place an irrationally high value on our possessions.
  • In an experiment conducted by Thaler, Kahneman, and Jack L. Knetsch, half the participants were given a mug and then asked how much they would sell it for. The average answer was $5.78. The rest of the group said they would spend, on average, $2.21 for the same mug. This flew in the face of classic economic theory, which says that at a given time and among a certain population, an item has a market value that does not depend on whether one owns it or not. Thaler won the 2017 Nobel Prize in Economics.
  • “The question that is most often asked about cognitive illusions is whether they can be overcome. The message … is not encouraging.”
  • that’s not so easy in the real world, when we’re dealing with people and situations rather than lines. “Unfortunately, this sensible procedure is least likely to be applied when it is needed most,” Kahneman writes. “We would all like to have a warning bell that rings loudly whenever we are about to make a serious error, but no such bell is available.”
  • At least with the optical illusion, our slow-thinking, analytic mind—what Kahneman calls System 2—will recognize a Müller-Lyer situation and convince itself not to trust the fast-twitch System 1’s perception
  • Kahneman and others draw an analogy based on an understanding of the Müller-Lyer illusion, two parallel lines with arrows at each end. One line’s arrows point in; the other line’s arrows point out. Because of the direction of the arrows, the latter line appears shorter than the former, but in fact the two lines are the same length.
  • Because biases appear to be so hardwired and inalterable, most of the attention paid to countering them hasn’t dealt with the problematic thoughts, judgments, or predictions themselves
  • Is it really impossible, however, to shed or significantly mitigate one’s biases? Some studies have tentatively answered that question in the affirmative.
  • what if the person undergoing the de-biasing strategies was highly motivated and self-selected? In other words, what if it was me?
  • Over an apple pastry and tea with milk, he told me, “Temperament has a lot to do with my position. You won’t find anyone more pessimistic than I am.”
  • I met with Kahneman
  • “I see the picture as unequal lines,” he said. “The goal is not to trust what I think I see. To understand that I shouldn’t believe my lying eyes.” That’s doable with the optical illusion, he said, but extremely difficult with real-world cognitive biases.
  • In this context, his pessimism relates, first, to the impossibility of effecting any changes to System 1—the quick-thinking part of our brain and the one that makes mistaken judgments akin to the Müller-Lyer line illusion.
  • The most effective check against them, as Kahneman says, comes from the outside: Others can perceive our errors more readily than we can.
  • “slow-thinking organizations,” as he puts it, can institute policies that include the monitoring of individual decisions and predictions. They can also require procedures such as checklists and “premortems,”
  • A premortem attempts to counter optimism bias by requiring team members to imagine that a project has gone very, very badly and write a sentence or two describing how that happened. Conducting this exercise, it turns out, helps people think ahead.
  • “My position is that none of these things have any effect on System 1,” Kahneman said. “You can’t improve intuition.”
  • “Perhaps, with very long-term training, lots of talk, and exposure to behavioral economics, what you can do is cue reasoning, so you can engage System 2 to follow rules. Unfortunately, the world doesn’t provide cues. And for most people, in the heat of argument the rules go out the window.”
  • Kahneman describes an even earlier Nisbett article that showed subjects’ disinclination to believe statistical and other general evidence, basing their judgments instead on individual examples and vivid anecdotes. (This bias is known as base-rate neglect.)
  • over the years, Nisbett had come to emphasize in his research and thinking the possibility of training people to overcome or avoid a number of pitfalls, including base-rate neglect, fundamental attribution error, and the sunk-cost fallacy.
  • Nisbett’s second-favorite example is that economists, who have absorbed the lessons of the sunk-cost fallacy, routinely walk out of bad movies and leave bad restaurant meals uneaten.
  • When Nisbett asks the same question of students who have completed the statistics course, about 70 percent give the right answer. He believes this result shows, pace Kahneman, that the law of large numbers can be absorbed into System 2—and maybe into System 1 as well, even when there are minimal cues.
  • About half give the right answer: the law of large numbers, which holds that outlier results are much more frequent when the sample size (at bats, in this case) is small. Over the course of the season, as the number of at bats increases, regression to the mean is inevitable.
  • When Nisbett has to give an example of his approach, he usually brings up the baseball-phenom survey. This involved telephoning University of Michigan students on the pretense of conducting a poll about sports, and asking them why there are always several Major League batters with .450 batting averages early in a season, yet no player has ever finished a season with an average that high.
  • we’ve tested Michigan students over four years, and they show a huge increase in ability to solve problems. Graduate students in psychology also show a huge gain.”
  • “I know from my own research on teaching people how to reason statistically that just a few examples in two or three domains are sufficient to improve people’s reasoning for an indefinitely large number of events.”
  • Nisbett suggested another factor: “You and Amos specialized in hard problems for which you were drawn to the wrong answer. I began to study easy problems, which you guys would never get wrong but untutored people routinely do … Then you can look at the effects of instruction on such easy problems, which turn out to be huge.”
  • Nisbett suggested that I take “Mindware: Critical Thinking for the Information Age,” an online Coursera course in which he goes over what he considers the most effective de-biasing skills and concepts. Then, to see how much I had learned, I would take a survey he gives to Michigan undergraduates. So I did.
  • The course consists of eight lessons by Nisbett—who comes across on-screen as the authoritative but approachable psych professor we all would like to have had—interspersed with some graphics and quizzes. I recommend it. He explains the availability heuristic this way: “People are surprised that suicides outnumber homicides, and drownings outnumber deaths by fire. People always think crime is increasing” even if it’s not.
  • When I finished the course, Nisbett sent me the survey he and colleagues administer to Michigan undergrads.
  • It contains a few dozen problems meant to measure the subjects’ resistance to cognitive biases.
  • I got it right. Indeed, when I emailed my completed test, Nisbett replied, “My guess is that very few if any UM seniors did as well as you. I’m sure at least some psych students, at least after 2 years in school, did as well. But note that you came fairly close to a perfect score.”
  • Nevertheless, I did not feel that reading Mindware and taking the Coursera course had necessarily rid me of my biases
  • For his part, Nisbett insisted that the results were meaningful. “If you’re doing better in a testing context,” he told me, “you’ll jolly well be doing better in the real world.”
  • The New York–based NeuroLeadership Institute offers organizations and individuals a variety of training sessions, webinars, and conferences that promise, among other things, to use brain science to teach participants to counter bias. This year’s two-day summit will be held in New York next month; for $2,845, you could learn, for example, “why are our brains so bad at thinking about the future, and how do we do it better?”
  • Philip E. Tetlock, a professor at the University of Pennsylvania’s Wharton School, and his wife and research partner, Barbara Mellers, have for years been studying what they call “superforecasters”: people who manage to sidestep cognitive biases and predict future events with far more accuracy than the pundits
  • One of the most important ingredients is what Tetlock calls “the outside view.” The inside view is a product of fundamental attribution error, base-rate neglect, and other biases that are constantly cajoling us into resting our judgments and predictions on good or vivid stories instead of on data and statistics
  • In 2006, seeking to prevent another mistake of that magnitude, the U.S. government created the Intelligence Advanced Research Projects Activity (IARPA), an agency designed to use cutting-edge research and technology to improve intelligence-gathering and analysis. In 2011, IARPA initiated a program, Sirius, to fund the development of “serious” video games that could combat or mitigate what were deemed to be the six most damaging biases: confirmation bias, fundamental attribution error, the bias blind spot (the feeling that one is less biased than the average person), the anchoring effect, the representativeness heuristic, and projection bias (the assumption that everybody else’s thinking is the same as one’s own).
  • Most promising are a handful of video games. Their genesis was in the Iraq War.
  • Together with collaborators who included staff from Creative Technologies, a company specializing in games and other simulations, and Leidos, a defense, intelligence, and health research company that does a lot of government work, Morewedge devised Missing. Some subjects played the game, which takes about three hours to complete, while others watched a video about cognitive bias. All were tested on bias-mitigation skills before the training, immediately afterward, and then finally after eight to 12 weeks had passed.
  • “The literature on training suggests books and classes are fine entertainment but largely ineffectual. But the game has very large effects. It surprised everyone.”
  • He said he saw the results as supporting the research and insights of Richard Nisbett. “Nisbett’s work was largely written off by the field, the assumption being that training can’t reduce bias.”
  • even the positive results reminded me of something Daniel Kahneman had told me. “Pencil-and-paper doesn’t convince me,” he said. “A test can be given even a couple of years later. But the test cues the test-taker. It reminds him what it’s all about.”
  • Morewedge told me that some tentative real-world scenarios along the lines of Missing have shown “promising results,” but that it’s too soon to talk about them.
  • In the future, I will monitor my thoughts and reactions as best I can.
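Nisbett’s batting-average puzzle can be checked with a quick simulation. This is a minimal sketch with assumed numbers (a .300 true-talent hitter, 20 early-season at bats versus 500 for a full season), not data from the survey itself:

```python
import random

def share_hitting_450(true_rate, at_bats, trials=2000, seed=0):
    """Fraction of simulated hitters whose batting average reaches .450,
    given a fixed per-at-bat hit probability (true_rate)."""
    rng = random.Random(seed)
    high = 0
    for _ in range(trials):
        hits = sum(rng.random() < true_rate for _ in range(at_bats))
        if hits / at_bats >= 0.450:
            high += 1
    return high / trials

# Small samples produce extreme averages; large samples regress to the mean.
early = share_hitting_450(0.300, at_bats=20)    # roughly one in ten
season = share_hitting_450(0.300, at_bats=500)  # essentially none
```

This is the law of large numbers in miniature: an average .300 hitter sits at .450 after twenty at bats fairly often, and virtually never after five hundred.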
dicindioha

Daniel Kahneman On Hiring Decisions - Business Insider - 0 views

  • Most hiring decisions come down to a gut decision. According to Nobel laureate Daniel Kahneman, however, this process is extremely flawed and there's a much better way.
    • dicindioha
       
      hiring comes down to 'gut feeling'
  • Kahneman asked interviewers to put aside personal judgments and limit interviews to a series of factual questions meant to generate a score on six separate personality traits. A few months later, it became clear that Kahneman's systematic approach was a vast improvement over gut decisions. It was so effective that the army would use his exact method for decades to come. This matters because the superior method can be copied by any organization — and really, by anyone facing a hard decision.
  • First, select a few traits that are prerequisites for success in this position (technical proficiency, engaging personality, reliability, and so on). Don't overdo it — six dimensions is a good number. The traits you choose should be as independent as possible from each other, and you should feel that you can assess them reliably by asking a few factual questions. Next, make a list of those questions for each trait and think about how you will score it, say on a 1-5 scale. You should have an idea of what you will call "very weak" or "very strong."
    • dicindioha
       
      WHAT YOU SHOULD DO IN AN INTERVIEW
  • ...2 more annotations...
  • Do not skip around. To evaluate each candidate, add up the six scores ... Firmly resolve that you will hire the candidate whose final score is the highest, even if there is another one whom you like better — try to resist your wish to invent broken legs to change the ranking.
  • This approach is far more likely to find the best candidate than if you do what people normally do in such situations, which is to go into the interview unprepared and to make choices by an overall intuitive judgment such as "I looked into his eyes and liked what I saw."
  • We cannot always rely on a 'gut feeling' from our so-called 'reasoning' and emotional response to make big decisions like hiring, which is what happens much of the time. This is a really interesting way to do it systematically: you still use your own perspective, but the questions asked will hopefully lead you to a better outcome.
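Kahneman's interview procedure is essentially a small algorithm, and a short sketch makes it concrete. The trait names, candidates, and scores below are hypothetical placeholders, not from the article:

```python
# Hypothetical traits (the method calls for about six, chosen per position).
TRAITS = ["technical proficiency", "engaging personality", "reliability",
          "work ethic", "communication", "judgment"]

def total_score(scores):
    """Add up the six per-trait scores; each must be on the 1-5 scale."""
    assert set(scores) == set(TRAITS), "score every trait, nothing else"
    assert all(1 <= s <= 5 for s in scores.values()), "use the 1-5 scale"
    return sum(scores.values())

def pick_candidate(candidates):
    """Commit to the highest total, resisting the urge to override it."""
    return max(candidates, key=lambda name: total_score(candidates[name]))

# Hypothetical candidates: A is solid across the board (6 x 4 = 24);
# B is charming but weaker elsewhere (5 + 5 x 3 = 20).
candidates = {
    "A": dict.fromkeys(TRAITS, 4),
    "B": {t: 5 if t == "engaging personality" else 3 for t in TRAITS},
}
print(pick_candidate(candidates))  # prints "A"
```

The point of summing independent scores is that it forces each trait to be judged on its own factual questions before any overall impression can form.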
Javier E

Who You Are - NYTimes.com - 1 views

  • Before Kahneman and Tversky, people who thought about social problems and human behavior tended to assume that we are mostly rational agents. They assumed that people have control over the most important parts of their own thinking. They assumed that people are basically sensible utility-maximizers and that when they depart from reason it’s because some passion like fear or love has distorted their judgment.
  • Kahneman and Tversky conducted experiments. They proved that actual human behavior often deviates from the old models and that the flaws are not just in the passions but in the machinery of cognition. They demonstrated that people rely on unconscious biases and rules of thumb to navigate the world, for good and ill. Many of these biases have become famous: priming, framing, loss-aversion.
  • We are dual process thinkers. We have two interrelated systems running in our heads. One is slow, deliberate and arduous (our conscious reasoning). The other is fast, associative, automatic and supple (our unconscious pattern recognition). There is now a complex debate over the relative strengths and weaknesses of these two systems. In popular terms, think of it as the debate between “Moneyball” (look at the data) and “Blink” (go with your intuition).
  • ...3 more annotations...
  • We are not blank slates. All humans seem to share similar sets of biases. There is such a thing as universal human nature. The trick is to understand the universals and how tightly or loosely they tie us down.
  • We are players in a game we don’t understand. Most of our own thinking is below awareness. Fifty years ago, people may have assumed we are captains of our own ships, but, in fact, our behavior is often aroused by context in ways we can’t see. Our biases frequently cause us to want the wrong things. Our perceptions and memories are slippery, especially about our own mental states. Our free will is bounded. We have much less control over ourselves than we thought.
  • They also figured out ways to navigate around our shortcomings. Kahneman champions the idea of “adversarial collaboration” — when studying something, work with people you disagree with. Tversky had a wise maxim: “Let us take what the terrain gives.” Don’t overreach. Understand what your circumstances offer.
Keiko E

Book Review: Thinking, Fast and Slow - WSJ.com - 0 views

  • Can our healthy selves predict how we will feel in unhealthy circumstances with enough certainty to choose whether we would want to live or die? Can our present selves, in general, make reliable choices for our future selves? How good are our decisions anyway, and how do we make them?
  • The "focusing illusion," according to Mr. Kahneman, happens when we call up a specific attribute of a thing or experience (e.g., climate) and use it to answer a broader and more difficult question (what makes life enjoyable, in California or anywhere else?).
  • One major effect of the work of Messrs. Kahneman and Tversky has been to overturn the assumption that human beings are rational decision-makers who weigh all the relevant factors logically before making choices. When the two men began their research, it was understood that, as a finite device with finite time, the brain had trouble calculating the costs and benefits of every possible course of action and that, separately, it was not very good at applying rules of logical inference to abstract situations. What Messrs. Kahneman and Tversky showed went far beyond this, however. They argued that, even when we have all the information that we need to arrive at a correct decision, and even when the logic is simple, we often get it drastically wrong.
  • ...1 more annotation...
  • we harbor two selves when it comes to happiness, too: one self that experiences pain and pleasure from moment to moment and another that remembers the emotions associated with complete events and episodes. The remembering self does not seem to care how long an experience was if it was getting better toward the end—so a longer colonoscopy that ended with decreasing pain will be seen later as preferable to a shorter procedure that involved less total pain but happened to end at a very painful point.
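The colonoscopy result reflects what Kahneman calls the peak-end rule: the remembering self scores an episode roughly by the average of its worst moment and its final moment, ignoring duration. A toy calculation with made-up pain ratings (0-10, one per interval; the real studies used patients' minute-by-minute reports) shows how a longer, objectively more painful procedure can be remembered as better:

```python
def total_pain(ratings):
    """What the experiencing self accumulates: duration-weighted pain."""
    return sum(ratings)

def remembered_pain(ratings):
    """Peak-end rule: average of the worst moment and the last moment."""
    return (max(ratings) + ratings[-1]) / 2

short_procedure = [4, 6, 8]        # ends at its most painful point
long_procedure = [4, 6, 8, 5, 2]   # more total pain, but a mild ending

print(total_pain(short_procedure), total_pain(long_procedure))            # 18 25
print(remembered_pain(short_procedure), remembered_pain(long_procedure))  # 8.0 5.0
```

The longer procedure delivers more total pain (25 vs. 18) yet earns the better memory (5.0 vs. 8.0), exactly the inversion the excerpt describes.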
Javier E

The Book Bench: Is Self-Knowledge Overrated? : The New Yorker - 1 views

  • It’s impossible to overstate the influence of Kahneman and Tversky. Like Darwin, they helped to dismantle a longstanding myth of human exceptionalism. Although we’d always seen ourselves as rational creatures—this was our Promethean gift—it turns out that human reason is rather feeble, easily overwhelmed by ancient instincts and lazy biases. The mind is a deeply flawed machine.
  • there is a subtle optimism lurking in all of Kahneman’s work: it is the hope that self-awareness is a form of salvation, that if we know about our mental mistakes, we can avoid them. One day, we will learn to equally weigh losses and gains; science can help us escape from the cycle of human error. As Kahneman and Tversky noted in the final sentence of their classic 1974 paper, “A better understanding of these heuristics and of the biases to which they lead could improve judgments and decisions in situations of uncertainty.” Unfortunately, such hopes appear to be unfounded. Self-knowledge isn’t a cure for irrationality; even when we know why we stumble, we still find a way to fall.
  • self-knowledge is surprisingly useless. Teaching people about the hazards of multitasking doesn’t lead to less texting in the car; learning about the weakness of the will doesn’t increase the success of diets; knowing that most people are overconfident about the future doesn’t make us more realistic. The problem isn’t that we’re stupid—it’s that we’re so damn stubborn
  • ...1 more annotation...
  • Kahneman has given us a new set of labels for our shortcomings. But his greatest legacy, perhaps, is also his bleakest: By categorizing our cognitive flaws, documenting not just our errors but also their embarrassing predictability, he has revealed the hollowness of a very ancient aspiration. Knowing thyself is not enough. Not even close.
Javier E

Why Baseball Is Obsessed With the Book 'Thinking, Fast and Slow' - The New York Times - 0 views

  • In Teaford’s case, the scouting evaluation was predisposed to a mental shortcut called the representativeness heuristic, which was first defined by the psychologists Daniel Kahneman and Amos Tversky. In such cases, an assessment is heavily influenced by what is believed to be the standard or the ideal.
  • Kahneman, a professor emeritus at Princeton University and a winner of the Nobel Prize in economics in 2002, later wrote “Thinking, Fast and Slow,” a book that has become essential among many of baseball’s front offices and coaching staffs.
  • “Pretty much wherever I go, I’m bothering people, ‘Have you read this?’” said Mejdal, now an assistant general manager with the Baltimore Orioles.
  • ...12 more annotations...
  • There aren’t many explicit references to baseball in “Thinking, Fast and Slow,” yet many executives swear by it
  • “From coaches to front office people, some get back to me and say this has changed their life. They never look at decisions the same way.
  • A few, though, swear by it. Andrew Friedman, the president of baseball operations for the Dodgers, recently cited the book as having “a real profound impact,” and said he reflects back on it when evaluating organizational processes. Keith Law, a former executive for the Toronto Blue Jays, wrote the book “Inside Game” — an examination of bias and decision-making in baseball — that was inspired by “Thinking, Fast and Slow.”
  • “As the decision tree in baseball has changed over time, this helps all of us better understand why it needed to change,” Mozeliak wrote in an email. He said that was especially true when “working in a business that many decisions are based on what we see, what we remember, and what is intuitive to our thinking.”
  • The central thesis of Kahneman’s book is the interplay between each mind’s System 1 and System 2, which he described as a “psychodrama with two characters.”
  • System 1 is a person’s instinctual response — one that can be enhanced by expertise but is automatic and rapid. It seeks coherence and will apply relevant memories to explain events.
  • System 2, meanwhile, is invoked for more complex, thoughtful reasoning — it is characterized by slower, more rational analysis but is prone to laziness and fatigue.
  • Kahneman wrote that when System 2 is overloaded, System 1 could make an impulse decision, often at the expense of self-control
  • No area of baseball is more susceptible to bias than scouting, in which organizations aggregate information from disparate sources:
  • “The independent opinion aspect is critical to avoid the groupthink and be aware of momentum,”
  • Matt Blood, the director of player development for the Orioles, first read “Thinking, Fast and Slow” as a Cardinals area scout nine years ago and said that he still consults it regularly. He collaborated with a Cardinals analyst to develop his own scouting algorithm as a tripwire to mitigate bias
  • Mejdal himself fell victim to the trap of the representativeness heuristic when he started with the Cardinals in 2005
Javier E

Daniel Kahneman | Profile on TED.com - 1 views

  • rather than stating the optimal, rational answer, as an economist of the time might have, they quantified how most real people, consistently, make a less-rational choice. Their work treated economics not as a perfect or self-correcting machine, but as a system prey to quirks of human perception. The field of behavioral economics was born.
  • Tversky and calls for a new form of academic cooperation, marked not by turf battles but by "adversarial collaboration," a good-faith effort by unlike minds to conduct joint research, critiquing each other in the service of an ideal of truth to which both can contribute.
Javier E

Daniel Kahneman on 'Emergent Weirdness' in Artifical Intelligences - Alexis Madrigal - ... - 0 views

  • Human brains take shortcuts in making decisions. Finding where those shortcuts lead us to dumb places is what his life work has been all about. Artificial intelligences, say, Google, also have to take shortcuts, but they are *not* the same ones that our brains use. So, when an AI ends up in a weird place by taking a shortcut, that bias strikes us as uncannily weird. Get ready, too, because AI bias is going to start replacing human cognitive bias more and more regularly.
Javier E

Research Shows That the Smarter People Are, the More Susceptible They Are to Cognitive ... - 0 views

  • While philosophers, economists, and social scientists had assumed for centuries that human beings are rational agents—reason was our Promethean gift—Kahneman, the late Amos Tversky, and others, including Shane Frederick (who developed the bat-and-ball question), demonstrated that we’re not nearly as rational as we like to believe.
  • When people face an uncertain situation, they don’t carefully evaluate the information or look up relevant statistics. Instead, their decisions depend on a long list of mental shortcuts, which often lead them to make foolish decisions.
  • in many instances, smarter people are more vulnerable to these thinking errors. Although we assume that intelligence is a buffer against bias—that’s why those with higher S.A.T. scores think they are less prone to these universal thinking mistakes—it can actually be a subtle curse.
  • ...7 more annotations...
  • they wanted to understand how these biases correlated with human intelligence.
  • self-awareness was not particularly useful: as the scientists note, “people who were aware of their own biases were not better able to overcome them.”
  • Perhaps our most dangerous bias is that we naturally assume that everyone else is more susceptible to thinking errors, a tendency known as the “bias blind spot.”
  • This “meta-bias” is rooted in our ability to spot systematic mistakes in the decisions of others—we excel at noticing the flaws of friends—and inability to spot those same mistakes in ourselves.
  • it applies to every single bias under consideration, from anchoring to so-called “framing effects.” In each instance, we readily forgive our own minds but look harshly upon the minds of other people.
  • intelligence seems to make things worse.
  • the driving forces behind biases—the root causes of our irrationality—are largely unconscious, which means they remain invisible to self-analysis and impermeable to intelligence. In fact, introspection can actually compound the error, blinding us to those primal processes responsible for many of our everyday failings. We spin eloquent stories, but these stories miss the point. The more we attempt to know ourselves, the less we actually understand.
Javier E

Owner of a Credit Card Processor Is Setting a New Minimum Wage: $70,000 a Year - NYTime... - 1 views

  • Mr. Price surprised his 120-person staff by announcing that he planned over the next three years to raise the salary of even the lowest-paid clerk, customer service representative and salesman to a minimum of $70,000.
  • Mr. Price, who started the Seattle-based credit-card payment processing firm in 2004 at the age of 19, said he would pay for the wage increases by cutting his own salary from nearly $1 million to $70,000 and using 75 to 80 percent of the company’s anticipated $2.2 million in profit this year.
  • his unusual proposal does speak to an economic issue that has captured national attention: The disparity between the soaring pay of chief executives and that of their employees.
  • ...7 more annotations...
  • The United States has one of the world’s largest pay gaps, with chief executives earning nearly 300 times what the average worker makes, according to some economists’ estimates. That is much higher than the 20-to-1 ratio recommended by Gilded Age magnates like J. Pierpont Morgan and the 20th century management visionary Peter Drucker.
  • “The market rate for me as a C.E.O. compared to a regular person is ridiculous, it’s absurd,” said Mr. Price, who said his main extravagances were snowboarding and picking up the bar bill. He drives a 12-year-old Audi
  • Under a financial overhaul passed by Congress in 2010, the Securities and Exchange Commission was supposed to require all publicly held companies to disclose the ratio of C.E.O. pay to the median pay of all other employees, but it has so far failed to put it in effect. Corporate executives have vigorously opposed the idea, complaining it would be cumbersome and costly to implement.
  • Of all the social issues that he felt he was in a position to do something about as a business leader, “that one seemed like a more worthy issue to go after.”
  • The happiness research behind Mr. Price’s announcement on Monday came from Angus Deaton and Daniel Kahneman, a Nobel Prize-winning psychologist. They found that what they called emotional well-being — defined as “the emotional quality of an individual’s everyday experience, the frequency and intensity of experiences of joy, stress, sadness, anger, and affection that make one’s life pleasant or unpleasant” — rises with income, but only to a point. And that point turns out to be about $75,000 a year.
  • Of course, money above that level brings pleasures — there’s no denying the delights of a Caribbean cruise or a pair of diamond earrings — but no further gains on the emotional well-being scale.
  • As Mr. Kahneman has explained it, income above the threshold doesn’t buy happiness, but a lack of money can deprive you of it.
Javier E

The Science of Snobbery: How We're Duped Into Thinking Fancy Things Are Better - The At... - 0 views

  • Expert judges and amateurs alike claim to judge classical musicians based on sound. But Tsay’s research suggests that the original judges, despite their experience and expertise, judged the competition (which they heard and watched live) based on visual information, just as amateurs do.
  • just like with classical music, we do not appraise wine in the way that we expect. 
  • Priceonomics revisited this seemingly damning research: the lack of correlation between wine enjoyment and price in blind tastings, the oenology students tricked by red food dye into describing a white wine like a red, a distribution of medals at tastings equivalent to what one would expect from pure chance, the grand crus described like cheap wines and vice-versa when the bottles are switched.
  • Taste does not simply equal your taste buds. It draws on information from all our senses as well as context. As a result, food is susceptible to the same trickery as wine. Adding yellow food dye to vanilla pudding leads people to experience a lemony taste. Diners eating in the dark at a chic concept restaurant confuse veal for tuna. Branding, packaging, and price tags are equally important to enjoyment. Cheap fish is routinely passed off as its pricier cousins at seafood and sushi restaurants. 
  • Just like with wine and classical music, we often judge food based on very different criteria than what we claim. The result is that our perceptions are easily skewed in ways we don’t anticipate. 
  • What does it mean for wine that presentation so easily trumps the quality imbued by being grown on premium Napa land or years of fruitful aging? Is it comforting that the same phenomenon is found in food and classical music, or is it a strike against the authenticity of our enjoyment of them as well? How common must these manipulations be until we concede that the influence of the price tag of a bottle of wine or the visual appearance of a pianist is not a trick but actually part of the quality?
  • To answer these questions, we need to investigate the underlying mechanism that leads us to judge wine, food, and music by criteria other than what we claim to value. And that mechanism seems to be the quick, intuitive judgments our minds unconsciously make
  • this unknowability also makes it easy to be led astray when our intuition makes a mistake. We may often be able to count on the price tag or packaging of food and wine for accurate information about quality. But as we believe that we’re judging based on just the product, we fail to recognize when presentation manipulates our snap judgments.
  • Participants were just as effective when watching 6 second video clips and when comparing their ratings to ratings of teacher effectiveness as measured by actual student test performance. 
  • The power of intuitive first impressions has been demonstrated in a variety of other contexts. One experiment found that people predicted the outcome of political elections remarkably well based on silent 10 second video clips of debates - significantly outperforming political pundits and predictions made based on economic indicators.
  • In a real world case, a number of art experts successfully identified a 6th century Greek statue as a fraud. Although the statue had survived a 14 month investigation by a respected museum that included the probings of a geologist, they instantly recognized something was off. They just couldn’t explain how they knew.
  • Cases like this represent the canon behind the idea of the “adaptive unconscious,” a concept made famous by journalist Malcolm Gladwell in his book Blink. The basic idea is that we constantly, quickly, and unconsciously do the equivalent of judging a book by its cover. After all, a cover provides a lot of relevant information in a world in which we don’t have time to read every page.
  • Gladwell describes the adaptive unconscious as “a kind of giant computer that quickly and quietly processes a lot of the data we need in order to keep functioning as human beings.”
  • In a famous experiment, psychologist Nalini Ambady provided participants in an academic study with 30 second silent video clips of a college professor teaching a class and asked them to rate the effectiveness of the professor.
  • In follow up experiments, Chia-Jung Tsay found that those judging musicians’ auditions based on visual cues were not giving preference to attractive performers. Rather, they seemed to look for visual signs of relevant characteristics like passion, creativity, and uniqueness. Seeing signs of passion is valuable information. But in differentiating between elite performers, it gives an edge to someone who looks passionate over someone whose play is passionate
  • Outside of these more eccentric examples, it’s our reliance on quick judgments, and ignorance of their workings, that cause people to act on ugly, unconscious biases
  • It’s also why - from a business perspective - packaging and presentation is just as important as the good or service on offer. Why marketing is just as important as product. 
  • Gladwell ends Blink optimistically. By paying closer attention to our powers of rapid cognition, he argues, we can avoid its pitfalls and harness its powers. We can blindly audition musicians behind a screen, look at a piece of art devoid of other context, and pay particular attention to possible unconscious bias in our performance reports.
  • But Gladwell’s success in demonstrating how the many calculations our adaptive unconscious performs without our awareness undermines his hopeful message of consciously harnessing its power.
  • As a former world-class tennis player and coach of over 50 years, Braden is a perfect example of the ideas behind thin slicing. But if he can’t figure out what his unconscious is up to when he recognizes double faults, why should anyone else expect to be up to the task?
  • flawed judgment in fields like medicine and investing has more serious consequences. The fact that expertise is so tricky leads psychologist Daniel Kahneman to assert that most experts should seek the assistance of statistics and algorithms in making decisions.
  • In his book Thinking, Fast and Slow, he describes our two modes of thought: System 1, like the adaptive unconscious, is our “fast, instinctive, and emotional” intuition. System 2 is our “slower, more deliberative, and more logical” conscious thought. Kahneman believes that we often leave decisions up to System 1 and generally place far “too much confidence in human judgment” due to the pitfalls of our intuition described above.
  • Not every judgment will be made in a field that is stable and regular enough for an algorithm to help us make judgments or predictions. But in those cases, he notes, “Hundreds of studies have shown that wherever we have sufficient information to build a model, it will perform better than most people.”
  • Experts can avoid the pitfalls of intuition more easily than laypeople. But they need help too, especially as our collective confidence in expertise leads us to overconfidence in their judgments. 
  • This article has referred to the influence of price tags and context on products and experiences like wine and classical music concerts as tricks that skew our perception. But maybe we should consider them a real, actual part of the quality.
  • Losing ourselves in a universe of relativism, however, will lead us to miss out on anything new or unique. Take the example of the song “Hey Ya!” by Outkast. When the music industry heard it, they felt sure it would be a hit. When it premiered on the radio, however, listeners changed the channel. The song sounded too dissimilar from songs people liked, so they responded negatively. 
  • It took time for people to get familiar with the song and realize that they enjoyed it. Eventually “Hey Ya!” became the hit of the summer.
  • Many boorish people talking about the ethereal qualities of great wine probably can't even identify cork taint because their impressions are dominated by the price tag and the wine label. But the classic defense of wine - that you need to study it to appreciate it - is also vindicated. The open question - which is both editorial and empiric - is what it means for the industry that constant vigilance and substantial study is needed to dependably appreciate wine for the product quality alone. But the question is relevant to the enjoyment of many other products and experiences that we enjoy in life.
  • Maybe the most important conclusion is to not only recognize the fallibility of our judgments and impressions, but to recognize when it matters, and when it doesn’t
katedriscoll

Frontiers | A Neural Network Framework for Cognitive Bias | Psychology - 0 views

  • Human decision-making shows systematic simplifications and deviations from the tenets of rationality (‘heuristics’) that may lead to suboptimal decisional outcomes (‘cognitive biases’). There are currently three prevailing theoretical perspectives on the origin of heuristics and cognitive biases: a cognitive-psychological, an ecological and an evolutionary perspective. However, these perspectives are mainly descriptive and none of them provides an overall explanatory framework for the underlying mechanisms of cognitive biases. To enhance our understanding of cognitive heuristics and biases we propose a neural network framework for cognitive biases, which explains why our brain systematically tends to default to heuristic (‘Type 1’) decision making. We argue that many cognitive biases arise from intrinsic brain mechanisms that are fundamental for the working of biological neural networks. To substantiate our viewpoint, we discern and explain four basic neural network principles: (1) Association, (2) Compatibility, (3) Retainment, and (4) Focus. These principles are inherent to (all) neural networks which were originally optimized to perform concrete biological, perceptual, and motor functions. They form the basis for our inclinations to associate and combine (unrelated) information, to prioritize information that is compatible with our present state (such as knowledge, opinions, and expectations), to retain given information that sometimes could better be ignored, and to focus on dominant information while ignoring relevant information that is not directly activated. The supposed mechanisms are complementary and not mutually exclusive. For different cognitive biases they may all contribute in varying degrees to distortion of information. The present viewpoint not only complements the earlier three viewpoints, but also provides a unifying and binding framework for many cognitive bias phenomena.
  • The cognitive-psychological (or heuristics and biases) perspective (Evans, 2008; Kahneman and Klein, 2009) attributes cognitive biases to limitations in the available data and in the human information processing capacity (Simon, 1955; Broadbent, 1958; Kahneman, 1973, 2003; Norman and Bobrow, 1975)
kaylynfreeman

Opinion | How Fear Distorts Our Thinking About the Coronavirus - The New York Times - 0 views

  • When it comes to making decisions that involve risks, we humans can be irrational in quite systematic ways — a fact that the psychologists Amos Tversky and Daniel Kahneman famously demonstrated with the help of a hypothetical situation, eerily apropos of today’s coronavirus epidemic, that has come to be known as the Asian disease problem.
  • This is irrational because the two questions don’t differ mathematically. In both cases, choosing the first option means accepting the certainty that 200 people live, and choosing the second means embracing a one-third chance that all could be saved with an accompanying two-thirds chance that all will die. Yet in our minds, Professors Tversky and Kahneman explained, losses loom larger than gains, and so when the options are framed in terms of deaths rather than cures, we’ll accept more risks to try to avoid deaths.
  • Our decision making is bad enough when the disease is hypothetical. But when the disease is real — when we see actual death tolls climbing daily, as we do with the coronavirus — another factor besides our sensitivity to losses comes into play: fear.
  • The brain states we call emotions exist for one reason: to help us decide what to do next. They reflect our mind’s predictions for what’s likely to happen in the world and therefore serve as an efficient way to prepare us for it. But when the emotions we feel aren’t correctly calibrated for the threat or when we’re making judgments in domains where we have little knowledge or relevant information, our feelings become more likely to lead us astray.
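The framing effect in the Asian disease problem rests on a simple piece of arithmetic — the two framings are mathematically identical. A minimal sketch checking the expected outcomes (600 at risk, the numbers from the excerpt above):

```python
# Expected survivors (out of 600) under each option, in both framings.

# Gain framing: 200 saved for sure, vs. a 1/3 chance all 600 are saved.
certain_gain = 200
risky_gain = (1/3) * 600 + (2/3) * 0

# Loss framing: 400 die for sure, vs. a 2/3 chance all 600 die.
certain_loss = 600 - 400
risky_loss = 600 - ((2/3) * 600 + (1/3) * 0)

# All four expected values are equal, yet people choose differently
# depending on whether the options are framed as lives saved or deaths.
print(certain_gain, risky_gain, certain_loss, risky_loss)
```

Loss aversion, not the expected values, drives the reversal: the same 200-survivor outcome is accepted as a sure gain but gambled against as a sure loss.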
Javier E

Jonathan Haidt: Reasons Do Matter - NYTimes.com - 0 views

  • I never said that reason plays no role in judgment. Rather, I urged that we be realistic about reasoning and recognize that reasons persuade others on moral and political issues only under very special circumstances.
  • two basic kinds of cognitive events are “seeing-that” and “reasoning-why.” (These terms correspond roughly to what the psychologist Daniel Kahneman and others call “System 1” and “System 2” and that I call the “elephant” and the “rider.”)
  • We effortlessly and intuitively “see that” something is true, and then we work to find justifications, or “reasons why,” which we can give to others.  Both processes are crucial for understanding belief and persuasion. Both are needed for the kind of democratic deliberation that Lynch (and I) want to promote.
  • as an intuitionist, I see hope in an approach to deliberative democracy that uses social psychology to calm the passions and fears that make horizontal movement so difficult.
  • if your opponent succeeds in defeating your reasons, you are unlikely to change your judgment. You’ve been dragged into the upper-left quadrant, but you still feel, intuitively, that it’s wrong
  • This, I suggest, is how moral arguments proceed when people have strong intuitions anchoring their beliefs. And intuitions are rarely stronger than when they are part of our partisan identities. So I’m not saying that reasons “play no role in moral judgment.” In fact, four of the six links in my Social Intuitionist Model are reasoning links. Most of what’s going on during an argument is reasoning
  • I’m saying that reason is far less powerful than intuition, so if you’re arguing (or deliberating) with a partner who lives on the other side of the political spectrum from you, and you approach issues such as abortion, gay marriage or income inequality with powerfully different intuitive reactions, you are unlikely to effect any persuasion no matter how good your arguments and no matter how much time you give your opponent to reflect upon your logic.
  • According to Margolis, people don’t change their minds unless they move along the horizontal dimension. Intuition is what most matters for belief. Yet a moral argument generally consists of round after round of reasoning. Each person tries to pull the other along the vertical dimension.
  • One of the issues I am most passionate about is political civility. I co-run a site at www.CivilPolitics.org where we define civility as “the ability to disagree with others while respecting their sincerity and decency.” We explain our goals like this: “We believe this ability [civility] is best fostered by indirect methods (changing contexts, payoffs and institutions) rather than by direct methods (such as pleading with people to be more civil, or asking people to sign civility pledges).” In other words, we hope to open up space for civil disagreement by creating contexts in which elephants (automatic processes and intuitions) are calmer, rather than by asking riders (controlled processes, including reasoning) to try harder.
  • We are particularly interested in organizations that try to create a sense of community and camaraderie as a precondition for political discussions.
  • if you want to persuade someone, talk to the elephant first. Trigger the right intuitions first.
  • This is why there has been such rapid movement on gay marriage and gay rights. It’s not because good arguments have suddenly appeared, which nobody thought of in the 1990s
  • younger people, who grew up knowing gay people and seeing gay couples on television, have no such disgust. For them, the arguments are much more persuasive.
  • I love Aristotle’s emphasis on habit — and I had a long section on virtue ethics in Chapter 6 that got cut at the last minute, but which I have just now posted online here
  • philosophers have the best norms for good thinking that I have ever encountered. When my work is critiqued by a philosopher I can be certain that he or she has read me carefully, including the footnotes, and will not turn me into a straw man. More than any other subculture I know, the philosophical community embodies the kinds of normative pressures for reason-giving and responsiveness to reasons that Allan Gibbard describes in “Wise Choices, Apt Feelings.”
Javier E

Nate Silver, Artist of Uncertainty - 0 views

  • In 2008, Nate Silver correctly predicted the results of all 35 Senate races and the presidential results in 49 out of 50 states. Since then, his website, fivethirtyeight.com (now central to The New York Times’s political coverage), has become an essential source of rigorous, objective analysis of voter surveys to predict the Electoral College outcome of presidential campaigns. 
  • Political junkies, activists, strategists, and journalists will gain a deeper and more sobering sense of Silver’s methods in The Signal and the Noise: Why So Many Predictions Fail—But Some Don’t (Penguin Press). A brilliant analysis of forecasting in finance, geology, politics, sports, weather, and other domains, Silver’s book is also an original fusion of cognitive psychology and modern statistical theory.
  • Its most important message is that the first step toward improving our predictions is learning how to live with uncertainty.
  • he blends the best of modern statistical analysis with research on cognition biases pioneered by Princeton psychologist and Nobel laureate in economics  Daniel Kahneman and the late Stanford psychologist Amos Tversky. 
  • Silver’s background in sports and poker turns out to be invaluable. Successful analysts in gambling and sports are different from fans and partisans—far more aware that “sure things” are likely to be illusions,
  • The second step is starting to understand why it is that big data, super computers, and mathematical sophistication haven’t made us better at separating signals (information with true predictive value) from noise (misleading information). 
  • One of the biggest problems we have in separating signal from noise is that when we look too hard for certainty that isn’t there, we often end up attracted to noise, either because it is more prominent or because it confirms what we would like to believe.
  • In discipline after discipline, Silver shows in his book that when you look at even the best single forecast, the average of all independent forecasts is 15 to 20 percent more accurate. 
  • Silver has taken the next major step: constantly incorporating both state polls and national polls into Bayesian models that also incorporate economic data.
  • Silver explains why we will be misled if we only consider significance tests—i.e., statements that the margin of error for the results is, for example, plus or minus four points, meaning there is one chance in 20 that the percentages reported are off by more than four. Calculations like these assume the only source of error is sampling error—the irreducible error—while ignoring errors attributable to house effects, like the proportion of cell-phone users, one of the complex set of assumptions every pollster must make about who will actually vote. In other words, such an approach ignores context in order to avoid having to justify and defend judgments. 
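The "plus or minus four points" figure in that last excerpt comes from the standard sampling-error formula. A minimal sketch (assuming a 95% confidence level and the worst-case proportion p = 0.5, both conventional pollster defaults):

```python
import math

def sampling_margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error from sampling error alone (simple random sample)."""
    return z * math.sqrt(p * (1 - p) / n)

# A poll of roughly 600 respondents yields the familiar +/- 4 points.
moe = sampling_margin_of_error(600)
print(round(100 * moe, 1))  # ~4.0 percentage points
```

As the excerpt stresses, this captures only the irreducible sampling error; house effects, turnout models, and cell-phone coverage sit on top of it, which is why Silver treats the reported margin as a floor rather than the whole uncertainty.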
Javier E

The Choice Explosion - The New York Times - 0 views

  • the social psychologist Sheena Iyengar asked 100 American and Japanese college students to take a piece of paper. On one side, she had them write down the decisions in life they would like to make for themselves. On the other, they wrote the decisions they would like to pass on to others.
  • The Americans desired choice in four times more domains than the Japanese.
  • Americans now have more choices over more things than any other culture in human history. We can choose between a broader array of foods, media sources, lifestyles and identities. We have more freedom to live out our own sexual identities and more religious and nonreligious options to express our spiritual natures.
  • But making decisions well is incredibly difficult, even for highly educated professional decision makers. As Chip Heath and Dan Heath point out in their book “Decisive,” 83 percent of corporate mergers and acquisitions do not increase shareholder value, 40 percent of senior hires do not last 18 months in their new position, 44 percent of lawyers would recommend that a young person not follow them into the law.
  • It’s becoming incredibly important to learn to decide well, to develop the techniques of self-distancing to counteract the flaws in our own mental machinery. The Heath book is a very good compilation of those techniques.
  • assume positive intent. When in the midst of some conflict, start with the belief that others are well intentioned. It makes it easier to absorb information from people you’d rather not listen to.
  • Suzy Welch’s 10-10-10 rule. When you’re about to make a decision, ask yourself how you will feel about it 10 minutes from now, 10 months from now and 10 years from now. People are overly biased by the immediate pain of some choice, but they can put the short-term pain in long-term perspective by asking these questions.
  • An "explosion" that may also be a "dissolution" or "disintegration," in my view. Unlimited choices. Conduct without boundaries. All of which may be viewed as either "great" or "terrible." The poor suffer when they have no means to pursue choices, which is terrible. The rich seem only to want more and more, wealth without boundaries, which is great for those so able to do. Yes, we need a new decision-making tool, but perhaps one that is also very old: simplify, simplify, simplify by setting moral boundaries that apply to all and which define concisely what our life together ought to be.
  • our tendency to narrow-frame, to see every decision as a binary “whether or not” alternative. Whenever you find yourself asking “whether or not,” it’s best to step back and ask, “How can I widen my options?”
  • deliberate mistakes. A survey of new brides found that 20 percent were not initially attracted to the man they ended up marrying. Sometimes it’s useful to make a deliberate “mistake” — agreeing to dinner with a guy who is not your normal type. Sometimes you don’t really know what you want and the filters you apply are hurting you.
  • It makes you think that we should have explicit decision-making curriculums in all schools. Maybe there should be a common course publicizing the work of Daniel Kahneman, Cass Sunstein, Dan Ariely and others who study the way we mess up and the techniques we can adopt to prevent error.
  • The explosion of choice places extra burdens on the individual. Poorer Americans have fewer resources to master decision-making techniques, less social support to guide their decision-making and less of a safety net to catch them when they err.
  • the stress of scarcity itself can distort decision-making. Those who experienced stress as children often perceive threat more acutely and live more defensively.
  • The explosion of choice means we all need more help understanding the anatomy of decision-making.
  • living in an area of concentrated poverty can close down your perceived options, and comfortably “relieve you of the burden of choosing life.” It’s hard to maintain a feeling of agency when you see no chance of opportunity.
  • In this way the choice explosion has contributed to widening inequality.
  • The relentless all-hour reruns of "Law and Order" in 100 channel cable markets provide direct rebuff to the touted but hollow promise/premise of wider "choice." The small group of personalities debating a pre-framed trivial point of view, over and over, nightly/daily (in video clips), without data, global comparison, historic reference, regional content, or a deep commitment to truth or knowledge of facts has resulted in many choosing narrower limits: streaming music, coffee shops, Facebook--now a "choice" of 1.65 billion users.
  • It’s important to offer opportunity and incentives. But we also need lessons in self-awareness — on exactly how our decision-making tool is fundamentally flawed, and on mental frameworks we can adopt to avoid messing up even more than we do.
Javier E

The Amygdala Made Me Do It - NYTimes.com - 1 views

  • It’s the invasion of the Can’t-Help-Yourself books. Unlike most pop self-help books, these are about life as we know it — the one you can change, but only a little, and with a ton of work. Professor Kahneman, who won the Nobel Prize in economic science a decade ago, has synthesized a lifetime’s research in neurobiology, economics and psychology. “Thinking, Fast and Slow” goes to the heart of the matter: How aware are we of the invisible forces of brain chemistry, social cues and temperament that determine how we think and act?
  • The choices we make in day-to-day life are prompted by impulses lodged deep within the nervous system. Not only are we not masters of our fate; we are captives of biological determinism. Once we enter the portals of the strange neuronal world known as the brain, we discover that — to put the matter plainly — we have no idea what we’re doing.
  • Mr. Duhigg’s thesis is that we can’t change our habits, we can only acquire new ones. Alcoholics can’t stop drinking through willpower alone: they need to alter behavior — going to A.A. meetings instead of bars, for instance — that triggers the impulse to drink.
  • they’re full of stories about people who accomplished amazing things in life by, in effect, rewiring themselves
markfrankel18

Why Waiting in Line Is Torture - NYTimes.com - 1 views

  • the experience of waiting, whether for luggage or groceries, is defined only partly by the objective length of the wait. “Often the psychology of queuing is more important than the statistics of the wait itself,” notes the M.I.T. operations researcher Richard Larson, widely considered to be the world’s foremost expert on lines.
  • This is also why one finds mirrors next to elevators. The idea was born during the post-World War II boom, when the spread of high-rises led to complaints about elevator delays. The rationale behind the mirrors was similar to the one used at the Houston airport: give people something to occupy their time, and the wait will feel shorter.
  • Professors Carmon and Kahneman have also found that we are more concerned with how long a line is than how fast it’s moving. Given a choice between a slow-moving short line and a fast-moving long one, we will often opt for the former, even if the waits are identical. (This is why Disney hides the lengths of its lines by wrapping them around buildings and using serpentine queues.)
  • Surveys show that many people will wait twice as long for fast food, provided the establishment uses a first-come-first-served, single-queue ordering system as opposed to a multi-queue setup. Anyone who’s ever had to choose a line at a grocery store knows how unfair multiple queues can seem; invariably, you wind up kicking yourself for not choosing the line next to you moving twice as fast. But there’s a curious cognitive asymmetry at work here. While losing to the line at our left drives us to despair, winning the race against the one to our right does little to lift our spirits. Indeed, in a system of multiple queues, customers almost always fixate on the line they’re losing to and rarely the one they’re beating.
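The single-queue preference in that last excerpt has a statistical basis as well as a psychological one: with random line-picking and no jockeying, a shared queue also produces a shorter average wait. A toy simulation illustrating this — the parameters (3 servers, ~90% utilization, exponential arrivals and service times) are illustrative assumptions, not from the article:

```python
import heapq
import random

def waits_single_queue(arrivals, services, c):
    """One shared FIFO line feeding c servers."""
    free_at = [0.0] * c            # next time each server is available
    heapq.heapify(free_at)
    waits = []
    for t, s in zip(arrivals, services):
        start = max(t, heapq.heappop(free_at))  # earliest-free server
        waits.append(start - t)
        heapq.heappush(free_at, start + s)
    return waits

def waits_multi_queue(arrivals, services, c, rng):
    """c separate lines; each customer picks one at random and stays put."""
    free_at = [0.0] * c
    waits = []
    for t, s in zip(arrivals, services):
        q = rng.randrange(c)
        start = max(t, free_at[q])
        waits.append(start - t)
        free_at[q] = start + s
    return waits

rng = random.Random(0)
c, n = 3, 20000
t, arrivals = 0.0, []
for _ in range(n):                 # Poisson arrivals at ~90% utilization
    t += rng.expovariate(2.7)
    arrivals.append(t)
services = [rng.expovariate(1.0) for _ in range(n)]

single = sum(waits_single_queue(arrivals, services, c)) / n
multi = sum(waits_multi_queue(arrivals, services, c, rng)) / n
print(single < multi)  # the shared line gives a shorter average wait
```

The same customers and service times are fed to both systems, so the gap comes purely from queue discipline — which is why banks and airports pool their lines even though, as the excerpt notes, a single long line *looks* worse.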
anonymous

Michael Lewis on the King of Human Error | Business | Vanity Fair - 0 views

  • Between 1971 and 1984, Kahneman and Tversky had published a series of quirky papers exploring the ways human judgment may be distorted when we are making decisions in conditions of uncertainty.