
TOK@ISPrague: Group items tagged Kahneman


markfrankel18

The Irrationality Myth | Big Think | Praxis

  • There is indeed ample data that institutions and individuals can wreak havoc when they deviate from certain principles of logic and objectivity. There is also a good deal of dispiriting evidence about the rationality of voters that spurs questions about the effectiveness—and even the legitimacy—of democratic government. Yet much of the work of cognitive scientists—even fascinating Nobel Prize-winning research by Daniel Kahneman—leaves me edified but not alarmed. The human capacity for reason may be fragile and partial but it is not belied by studies in which large percentages of subjects answer a few tricky questions incorrectly.
  • My hypothesis is that while we love reading about humanity’s tendency toward the irrational, we take offense when light is thrown on our own individual incompetencies. 
markfrankel18

How Our Minds Mislead Us: The Marvels and Flaws of Our Intuition | Brain Pickings

  • One of the most fascinating examples of heuristics and biases is what we call intuition — a complex cluster of cognitive processes, sometimes helpful but often misleading. Kahneman notes that thoughts come to mind in one of two ways: Either by “orderly computation,” which involves a series of stages of remembering rules and then applying them, or by perception, an evolutionary function that allows us to predict outcomes based on what we’re perceiving.
  • Coherence means that you’re going to adopt one interpretation in general. Ambiguity tends to be suppressed. This is part of the mechanism that you have here that ideas activate other ideas and the more coherent they are, the more likely they are to activate each other. Other things that don’t fit fall away by the wayside. We’re enforcing coherent interpretations. We see the world as much more coherent than it is.
  • There is no sharp line between intuition and perception.
  • The confidence people have in their beliefs is not a measure of the quality of evidence [but] of the coherence of the story that the mind has managed to construct.
markfrankel18

Daniel Kahneman: 'What would I eliminate if I had a magic wand? Overconfidence'

  • Not even he believes that the various flaws that bedevil decision-making can be successfully corrected. The most damaging of these is overconfidence: the kind of optimism that leads governments to believe that wars are quickly winnable and capital projects will come in on budget despite statistics predicting exactly the opposite. It is the bias he says he would most like to eliminate if he had a magic wand. But it “is built so deeply into the structure of the mind that you couldn’t change it without changing many other things”.
markfrankel18

How Our Minds Mislead Us: The Marvels and Flaws of Our Intuition | Brain Pickings

  • There is no sharp line between intuition and perception. … Perception is predictive. . . . If you want to understand intuition, it is very useful to understand perception, because so many of the rules that apply to perception apply as well to intuitive thinking. Intuitive thinking is quite different from perception. Intuitive thinking has language. Intuitive thinking has a lot of word knowledge organized in different ways more than mere perception. But some very basic characteristics [of] perception are extended almost directly to intuitive thinking.
  • What’s interesting is that many a time people have intuitions that they’re equally confident about except they’re wrong. That happens through the mechanism I call “the mechanism of substitution.” You have been asked a question, and instead you answer another question, but that answer comes by itself with complete confidence, and you’re not aware that you’re doing something that you’re not an expert on because you have one answer. Subjectively, whether it’s right or wrong, it feels exactly the same. Whether it’s based on a lot of information, or a little information, this is something that you may step back and have a look at. But the subjective sense of confidence can be the same for intuition that arrives from expertise, and for intuitions that arise from heuristics. . . .
  • The Marvels and Flaws of Intuition (from the Brain Pickings blog)
markfrankel18

Book Review: The Half-Life of Facts - WSJ.com

  • Knowledge, then, is less a canon than a consensus in a state of constant disruption. Part of the disruption has to do with error and its correction, but another part with simple newness—outright discoveries or new modes of classification and analysis, often enabled by technology.
  • More commonly, however, changes in scientific facts reflect the way that science is done. Mr. Arbesman describes the "Decline Effect"—the tendency of an original scientific publication to present results that seem far more compelling than those of later studies. Such a tendency has been documented in the medical literature over the past decade by John Ioannidis, a researcher at Stanford, in areas as diverse as HIV therapy, angioplasty and stroke treatment. The cause of the decline may well be a potent combination of random chance (generating an excessively impressive result) and publication bias (leading positive results to get preferentially published). If shaky claims enter the realm of science too quickly, firmer ones often meet resistance. As Mr. Arbesman notes, scientists struggle to let go of long-held beliefs, something that Daniel Kahneman has described as "theory-induced blindness." Had the Austrian medical community in the 1840s accepted the controversial conclusions of Dr. Ignaz Semmelweis that physicians were responsible for the spread of childbed fever—and heeded his hand-washing recommendations—a devastating outbreak of the disease might have been averted.
markfrankel18

Why Smart People Are Stupid - The New Yorker

  • When people face an uncertain situation, they don’t carefully evaluate the information or look up relevant statistics. Instead, their decisions depend on a long list of mental shortcuts, which often lead them to make foolish decisions.
  • Perhaps our most dangerous bias is that we naturally assume that everyone else is more susceptible to thinking errors, a tendency known as the “bias blind spot.” This “meta-bias” is rooted in our ability to spot systematic mistakes in the decisions of others—we excel at noticing the flaws of friends—and inability to spot those same mistakes in ourselves.
Lawrence Hrubes

How Do Experiences Become Memories? : NPR

  • Nobel laureate and founder of behavioral economics Daniel Kahneman goes through a series of examples of things we might remember, from vacations to colonoscopies. He explains how our "experiencing selves" and our "remembering selves" perceive happiness differently.
  • Note: See Daniel Kahneman's TED Talk
markfrankel18

We are more rational than those who nudge us - Steven Poole - Aeon

  • We are told that we are an irrational tangle of biases, to be nudged any which way. Does this claim stand to reason?
  • A culture that believes its citizens are not reliably competent thinkers will treat those citizens differently to one that respects their reflective autonomy. Which kind of culture do we want to be? And we do have a choice. Because it turns out that the modern vision of compromised rationality is more open to challenge than many of its followers accept.
  • Modern skepticism about rationality is largely motivated by years of experiments on cognitive bias.
  • The thorny question is whether these widespread departures from the economic definition of ‘rationality’ should be taken to show that we are irrational, or whether they merely show that the economic definition of rationality is defective.
  • There has been some controversy over the correct statistical interpretations of some studies, and several experiments that ostensibly demonstrate ‘priming’ effects, in particular, have notoriously proven difficult to replicate. But more fundamentally, the extent to which such findings can show that we are acting irrationally often depends on what we agree should count as ‘rational’ in the first place.
  • During the development of game theory and decision theory in the mid-20th century, a ‘rational’ person in economic terms became defined as a lone individual whose decisions were calculated to maximise self-interest, and whose preferences were (logically or mathematically) consistent in combination and over time. It turns out that people are not in fact ‘rational’ in this homo economicus way …
  • if we want to understand others, we can always ask what is making their behaviour ‘rational’ from their point of view. If, on the other hand, we just assume they are irrational, no further conversation can take place.
  • And so there is less reason than many think to doubt humans’ ability to be reasonable. The dissenting critiques of the cognitive-bias literature argue that people are not, in fact, as individually irrational as the present cultural climate assumes. And proponents of debiasing argue that we can each become more rational with practice. But even if we each acted as irrationally as often as the most pessimistic picture implies, that would be no cause to flatten democratic deliberation into the weighted engineering of consumer choices, as nudge politics seeks to do. On the contrary, public reason is our best hope for survival.