
TOK@ISPrague: Group items tagged "belief"


Lawrence Hrubes

Why Do People Persist in Believing Things That Just Aren't True? : The New Yorker - 1 views

  • Last month, Brendan Nyhan, a professor of political science at Dartmouth, published the results of a study that he and a team of pediatricians and political scientists had been working on for three years. They had followed a group of almost two thousand parents, all of whom had at least one child under the age of seventeen, to test a simple relationship: Could various pro-vaccination campaigns change parental attitudes toward vaccines? Each household received one of four messages: a leaflet from the Centers for Disease Control and Prevention stating that there had been no evidence linking the measles, mumps, and rubella (M.M.R.) vaccine and autism; a leaflet from the Vaccine Information Statement on the dangers of the diseases that the M.M.R. vaccine prevents; photographs of children who had suffered from the diseases; and a dramatic story from the Centers for Disease Control and Prevention about an infant who almost died of measles. A control group did not receive any information at all. The goal was to test whether facts, science, emotions, or stories could make people change their minds. The result was dramatic: a whole lot of nothing. None of the interventions worked.
  • Until recently, attempts to correct false beliefs haven’t had much success. Stephan Lewandowsky, a psychologist at the University of Bristol whose research into misinformation began around the same time as Nyhan’s, conducted a review of misperception literature through 2012. He found much speculation, but, apart from his own work and the studies that Nyhan was conducting, there was little empirical research. In the past few years, Nyhan has tried to address this gap by using real-life scenarios and news in his studies: the controversy surrounding weapons of mass destruction in Iraq, the questioning of Obama’s birth certificate, and anti-G.M.O. activism. Traditional work in this area has focussed on fictional stories told in laboratory settings, but Nyhan believes that looking at real debates is the best way to learn how persistently incorrect views of the world can be corrected.
  • One thing he learned early on is that not all errors are created equal. Not all false information goes on to become a false belief—that is, a more lasting state of incorrect knowledge—and not all false beliefs are difficult to correct. Take astronomy. If someone asked you to explain the relationship between the Earth and the sun, you might say something wrong: perhaps that the sun rotates around the Earth, rising in the east and setting in the west. A friend who understands astronomy may correct you. It’s no big deal; you simply change your belief. But imagine living in the time of Galileo, when understandings of the Earth-sun relationship were completely different, and when that view was tied closely to ideas of the nature of the world, the self, and religion. What would happen if Galileo tried to correct your belief? The process isn’t nearly as simple. The crucial difference between then and now, of course, is the importance of the misperception. When there’s no immediate threat to our understanding of the world, we change our beliefs. It’s when that change contradicts something we’ve long held as important that problems occur.
markfrankel18

Can Scientific Belief Go Too Far? : 13.7: Cosmos And Culture : NPR - 3 views

  • Do some scientists hold on to a belief longer than they should? Or, more provocatively phrased, when does a scientific belief become an article of faith?
  • This kind of posture, when there is a persistent holding on to a belief that is continually contradicted by facts, can only be called faith. In the quantum case, it's faith in an ordered, rational nature, even if it reveals itself through random behavior. "God doesn't play dice," wrote Einstein to his colleague Max Born. His conviction led him and others to look for theories that could explain the quantum probabilities as manifestations of a deeper order. And they failed. (And we now know that this randomness will not go away, being the very essence of quantum phenomena.) There is, however, an essential difference between religious faith and scientific faith: dogma. In science, dogma is untenable. Sooner or later, even the deepest ingrained ideas — if proven wrong — must collapse under the weight of evidence. A scientist who holds on to an incorrect theory or hypothesis makes for a sad figure. In religion, given that evidence is either elusive or irrelevant, faith is always viable.
Lawrence Hrubes

Modern psychology's God problem | GarethCook - 7 views

  • "Modern psychology has a serious God problem. America is a deeply spiritual country. More than half of Americans say religion is "very important" to them, and more than 90 percent profess a belief in a higher power. Yet psychology, as a scientific endeavor, has done almost nothing to understand how spiritual beliefs shape psychological problems, or affect treatment."
markfrankel18

BBC - Future - How to debunk falsehoods - 1 views

  • We all resist changing our beliefs about the world, but what happens when some of those beliefs are based on misinformation? Is there a right way to correct someone when they believe something that's wrong?
  • Too often, argue Lewandowsky and Cook, communicators assume a 'deficit model' in their interactions with the misinformed. This is the idea that we have the right information, and all we need to do to make people believe is to somehow "fill in" the deficit in other people's understanding. Just telling people the evidence for the truth will be enough to replace their false beliefs. Beliefs don't work like that.
markfrankel18

Fake news and gut instincts: Why millions of Americans believe things that aren't true ... - 2 views

  • "There are many individual qualities that seem like they should promote accuracy, but don't. Valuing evidence, however, appears to be an exception. The bigger the role evidence plays in shaping a person's beliefs, the more accurate that person tends to be. We aren't the only ones who have observed a pattern like this. Another recent study shows that people who exhibit higher scientific curiosity also tend to adopt more accurate beliefs about politically charged science topics, such as fracking and global warming."
markfrankel18

Science Is Not Religion | Jeff Schweitzer - 3 views

  • Author Christine Ma-Kellams recently told HuffPost Science that, "In many ways, science seems like a 21st Century religion. It's a belief system that many wholeheartedly defend and revolve their lives around, sometimes as much as the devoutest of religious folk." Nothing could be further from the truth. Science is not a "belief system" but a process and methodology for seeking an objective reality. Of course, because scientific exploration is a human endeavor, it comes with all the flaws of humanity: ego, short-sightedness, corruption and greed. But unlike a "belief system" such as religion, untethered to an objective truth, science is, over time, self-policing; competing scientists have a strong incentive to corroborate and build on the findings of others and, equally, to prove other scientists wrong by means that can be duplicated by others.
markfrankel18

Theory of mind and the belief in God. - Slate Magazine - 0 views

  • As a direct consequence of the evolution of the human social brain, and owing to the importance of our theory-of-mind skills in that process, we sometimes can't help but see intentions, desires, and beliefs in things that haven't even a smidgeon of a neural system. In particular, when inanimate objects do unexpected things, we sometimes reason about them just as we do for oddly behaving—or misbehaving—people. More than a few of us have kicked our broken-down vehicles in the sides and verbally abused our incompetent computers. Most of us stop short of actually believing these objects possess mental states—indeed, we would likely be hauled away to an asylum if we genuinely believed that they held malicious intent—but our emotions and behaviors toward such objects seem to betray our primitive, unconscious thinking: we act as though they're morally culpable for their actions.
markfrankel18

The Certainty of Donald Rumsfeld (Part 4) - NYTimes.com - 0 views

  • What do I take from this? To me, progress hinges on our ability to discriminate knowledge from belief, fact from fantasy, on the basis of evidence. It’s not the known unknown from the known known, or the unknown unknown from the known unknown, that is crucial to progress. It’s what evidence do you have for X, Y or Z? What is the justification for your beliefs? When confronted with such a question, Rumsfeld was never, ever able to come up with an answer.
  • The history of the Iraq war is replete with false assumptions, misinterpreted evidence, errors in judgment. Mistakes can be made. We all make them. But Rumsfeld created a climate where mistakes could be made with little or no way to correct them. Basic questions about evidence for W.M.D. were replaced with equivocations and obfuscations. A hall of mirrors. An infinite regress to nowhere. What do I know I know? What do I know I know I know? What do I know I don’t know I don’t know? Ad infinitum. Absence of evidence could be evidence of absence or evidence of presence. Take your pick. An obscurantist’s dream. There’s a quotation I have never liked. It comes from F. Scott Fitzgerald’s The Crack-Up. “The test of a first-rate intelligence is the ability to hold two opposed ideas in the mind at the same time, and still retain the ability to function.” Not really. The test of a first-rate intelligence is the ability to hold two opposed ideas in the mind at the same time and know they are opposed.
  • Rumsfeld, too, may believe what he is saying. But believing something does not make it true. The question is why he believed what he believed. On the basis of what evidence? Mere belief is not enough.
Lawrence Hrubes

Teaching Doubt - The New Yorker - 0 views

  • “Non-overlapping magisteria” has a nice ring to it. The problem is that there are many religious claims that not only “overlap” with empirical data but are incompatible with it. As a scientist who also spends a fair amount of time in the public arena, if I am asked if our understanding of the Big Bang conflicts with the idea of a six-thousand-year-old universe, I face a choice: I can betray my scientific values, or encourage that person to doubt his or her own beliefs. More often than you might think, teaching science is inseparable from teaching doubt.
  • Doubt about one’s most cherished beliefs is, of course, central to science: the physicist Richard Feynman stressed that the easiest person to fool is oneself. But doubt is also important to non-scientists. It’s good to be skeptical, especially about ideas you learn from perceived authority figures. Recent studies even suggest that being taught to doubt at a young age could make people better lifelong learners. That, in turn, means that doubters—people who base their views on evidence, rather than faith—are likely to be better citizens.
  • Science class isn’t the only place where students can learn to be skeptical. A provocative novel that presents a completely foreign world view, or a history lesson exploring the vastly different mores of the past, can push you to skeptically reassess your inherited view of the universe. But science is a place where such confrontation is explicit and accessible. It didn’t take more than a simple experiment for Galileo to overturn the wisdom of Aristotle. Informed doubt is the very essence of science.
  • Some teachers shy away from confronting religious beliefs because they worry that planting the seeds of doubt will cause some students to question or abandon their own faith or the faith of their parents. But is that really such a bad thing? It offers some young people the chance to escape the guilt imposed upon them simply for questioning what they’re told. Last year, I received an e-mail from a twenty-seven-year-old man who is now studying in the United States after growing up in Saudi Arabia. His father was executed by family members after converting to Christianity. He says that it’s learning about science that has finally liberated him from the spectre of religious fundamentalism.
Lawrence Hrubes

How a Gay-Marriage Study Went Wrong - The New Yorker - 1 views

  • Last December, Science published a provocative paper about political persuasion. Persuasion is famously difficult: study after study—not to mention much of world history—has shown that, when it comes to controversial subjects, people rarely change their minds, especially if those subjects are important to them. You may think that you’ve made a convincing argument about gun control, but your crabby uncle isn’t likely to switch sides in the debate. Beliefs are sticky, and hardly any approach, no matter how logical it may be, can change that. The Science study, “When contact changes minds: An experiment on transmission of support for gay equality,” seemed to offer a method that could work.
  • In the document, “Irregularities in LaCour (2014),” Broockman, along with a fellow graduate student, Joshua Kalla, and a professor at Yale, Peter Aronow, argued that the survey data in the study showed multiple statistical irregularities and was likely “not collected as described.”
  • If, in the end, the data do turn out to be fraudulent, does that say anything about social science as a whole? On some level, the case would be a statistical fluke. Despite what news headlines would have you believe, outright fraud is incredibly rare; almost no one commits it, and almost no one experiences it firsthand. As a result, innocence is presumed, and the mindset is one of trust.
  • There’s another issue at play: the nature of belief. As I’ve written before, we are far quicker to believe things that mesh with our view of how life should be. Green is a firm supporter of gay marriage, and that may have made him especially pleased about the study. (Did it have a similar effect on liberally minded reviewers at Science? We know that studies confirming liberal thinking sometimes get a pass where ones challenging those ideas might get killed in review; the same effect may have made journalists more excited about covering the results.)
  • In short, confirmation bias—which is especially powerful when we think about social issues—may have made the study’s shakiness easier to overlook.
Lawrence Hrubes

How Cold Weather Makes You Forget About Global Warming : The New Yorker - 2 views

  • "A number of other researchers have since produced similar findings: temperatures that deviate from the norm affect people's beliefs in climate change. In one study, subjects placed in a heated cubicle believed more acutely in global warming than people placed in non-heated ones."
Lawrence Hrubes

Pondering Miracles, Medical and Religious - The New York Times - 0 views

  • The tribunal that questioned me was not juridical, but ecclesiastical. I was not asked about my faith. (For the record, I’m an atheist.) I was not asked if it was a miracle. I was asked if I could explain it scientifically. I could not, though I had come armed for my testimony with the most up-to-date hematological literature, which showed that long survivals following relapses were not seen.
  • When, at the end, the Vatican committee asked if I had anything more to say, I blurted out that as much as her survival, thus far, was remarkable, I fully expected her to relapse some day sooner or later. What would the Vatican do then, revoke the canonization? The clerics recorded my doubts. But the case went forward and d’Youville was canonized on Dec. 9, 1990.
  • Respect for our religious patients demands understanding and tolerance; their beliefs are as true for them as the “facts” may be for physicians. Now almost 40 years later, that mystery woman is still alive and I still cannot explain why. Along with the Vatican, she calls it a miracle. Why should my inability to offer an explanation trump her belief? However they are interpreted, miracles exist, because that is how they are lived in our world.
markfrankel18

Mathematicians dispute claims that the 'golden ratio' is a natural blueprint for beauty... - 0 views

  • But the widespread belief that the golden ratio is the natural blueprint for beauty is pseudo-scientific “hocus-pocus” and a “myth that refuses to go away”, according to leading mathematicians.
  • Dr Devlin, who campaigns against myths associated with the golden ratio, pointed to “considerable evidence” that people do not find golden rectangles more appealing than others. On the contrary, they tend to favour aspect ratios they are familiar with, such as an A4 piece of paper or a computer screen.
  • Theories that the Parthenon in Athens and Great Pyramid in Egypt were built according to the golden ratio have also been disproved, he said. “The golden ratio stuff is in the realm of religious belief. People will argue it is true because they believe it, but it’s just not fact.”
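Devlin's point that people favor familiar proportions over the golden rectangle can be made concrete with quick arithmetic (this sketch is not from the article; the choice of "familiar" ratios — A4 paper, a 16:9 screen, a 3:2 photo — is illustrative):

```python
import math

# The golden ratio: phi = (1 + sqrt(5)) / 2, approximately 1.618
phi = (1 + math.sqrt(5)) / 2

# Aspect ratios people actually encounter (long side / short side)
familiar = {
    "A4 paper (sqrt(2):1)": math.sqrt(2),  # ~1.414
    "HD screen (16:9)": 16 / 9,            # ~1.778
    "classic photo (3:2)": 3 / 2,          # 1.5
}

print(f"golden ratio: {phi:.3f}")
for name, ratio in familiar.items():
    print(f"{name}: {ratio:.3f} (differs from phi by {abs(ratio - phi):.3f})")
```

None of these everyday rectangles is close to golden, yet they are the shapes people report preferring, which is Devlin's "familiarity, not phi" argument in miniature.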
markfrankel18

Is the Field of Psychology Biased Against Conservatives? - 0 views

  • Perhaps even more potentially problematic than negative personal experience is the possibility that bias may influence research quality: its design, execution, evaluation, and interpretation. In 1975, Stephen Abramowitz and his colleagues sent a fake manuscript to eight hundred reviewers from the American Psychological Association—four hundred more liberal ones (fellows of the Society for the Psychological Study of Social Issues and editors of the Journal of Social Issues) and four hundred less liberal (social and personality psychologists who didn’t fit either of the other criteria). The paper detailed the psychological well-being of student protesters who had occupied a college administration building and compared them to their non-activist classmates. In one version, the study found that the protesters were more psychologically healthy. In another, it was the more passive group that emerged as mentally healthier. The rest of the paper was identical. And yet, the two papers were not evaluated identically. A strong favorable reaction was three times more likely when the paper echoed one’s political beliefs—that is, when the more liberal reviewers read the version that portrayed the protesters as healthier.
  • All these studies and analyses are classic examples of confirmation bias: when it comes to questions of subjective belief, we more easily believe the things that mesh with our general world view. When something clashes with our vision of how things should be, we look immediately for the flaws.
markfrankel18

To Understand Religion, Think Football - Issue 17: Big Bangs - Nautilus - 5 views

  • The invention of religion is a big bang in human history. Gods and spirits helped explain the unexplainable, and religious belief gave meaning and purpose to people struggling to survive. But what if everything we thought we knew about religion was wrong? What if belief in the supernatural is window dressing on what really matters—elaborate rituals that foster group cohesion, creating personal bonds that people are willing to die for. Anthropologist Harvey Whitehouse thinks too much talk about religion is based on loose conjecture and simplistic explanations. Whitehouse directs the Institute of Cognitive and Evolutionary Anthropology at Oxford University. For years he’s been collaborating with scholars around the world to build a massive body of data that grounds the study of religion in science. Whitehouse draws on an array of disciplines—archeology, ethnography, history, evolutionary psychology, cognitive science—to construct a profile of religious practices.
  • I suppose people do try to fill in the gaps in their knowledge by invoking supernatural explanations. But many other situations prompt supernatural explanations. Perhaps the most common one is thinking there’s a ritual that can help us when we’re doing something with a high risk of failure. Lots of people go to football matches wearing their lucky pants or lucky shirt. And you get players doing all sorts of rituals when there’s a high-risk situation like taking a penalty kick.
  • We tend to take a few bits and pieces of the most familiar religions and see them as emblematic of what’s ancient and pan-human. But those things that are ancient and pan-human are actually ubiquitous and not really part of world religions. Again, it really depends on what we mean by “religion.” I think the best way to answer that question is to try and figure out which cognitive capacities came first.
markfrankel18

The Price of Denialism - NYTimes.com - 1 views

  • In other words, we need to be able to tell when we believe or disbelieve in something based on high standards of evidence and when we are just engaging in a bit of motivated reasoning and letting our opinions take over. When we withhold belief because the evidence does not live up to the standards of science, we are skeptical. When we refuse to believe something, even in the face of what most others would take to be compelling evidence, we are engaging in denial. In most cases, we do this because at some level it upsets us to think that the theory is true.
  • So how to tell a fact from an opinion? By the time we sit down to evaluate the evidence for a scientific theory, it is probably too late. If we take the easy path in our thinking, it eventually becomes a habit. If we lie to others, sooner or later we may believe the lie ourselves. The real battle comes in training ourselves to embrace the right attitudes about belief formation in the first place, and for this we need to do a little philosophy.
markfrankel18

Book Review: The Half-Life of Facts - WSJ.com - 0 views

  • Knowledge, then, is less a canon than a consensus in a state of constant disruption. Part of the disruption has to do with error and its correction, but another part with simple newness—outright discoveries or new modes of classification and analysis, often enabled by technology.
  • More commonly, however, changes in scientific facts reflect the way that science is done. Mr. Arbesman describes the "Decline Effect"—the tendency of an original scientific publication to present results that seem far more compelling than those of later studies. Such a tendency has been documented in the medical literature over the past decade by John Ioannidis, a researcher at Stanford, in areas as diverse as HIV therapy, angioplasty and stroke treatment. The cause of the decline may well be a potent combination of random chance (generating an excessively impressive result) and publication bias (leading positive results to get preferentially published). If shaky claims enter the realm of science too quickly, firmer ones often meet resistance. As Mr. Arbesman notes, scientists struggle to let go of long-held beliefs, something that Daniel Kahneman has described as "theory-induced blindness." Had the Austrian medical community in the 1840s accepted the controversial conclusions of Dr. Ignaz Semmelweis that physicians were responsible for the spread of childbed fever—and heeded his hand-washing recommendations—a devastating outbreak of the disease might have been averted.
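The mechanism behind the decline effect — chance inflating early results plus publication bias filtering out the unimpressive ones — can be sketched with a small simulation. The numbers here (a true effect of 0.2, studies of 20 participants, a rough p < .05 cutoff) are hypothetical choices for illustration, not figures from the book:

```python
import random
import statistics

random.seed(42)

TRUE_EFFECT = 0.2      # the real, modest effect size (in SD units)
N = 20                 # participants per small early study
SE = 1 / N ** 0.5      # standard error of each study's estimate
THRESHOLD = 1.96 * SE  # rough two-sided p < .05 significance cutoff

published = []
for _ in range(20_000):
    # Each study's estimate is the true effect plus sampling noise
    estimate = random.gauss(TRUE_EFFECT, SE)
    # Publication bias: only "significant" positive results see print
    if estimate > THRESHOLD:
        published.append(estimate)

print(f"true effect:           {TRUE_EFFECT:.2f}")
print(f"mean published effect: {statistics.mean(published):.2f}")
```

The published studies substantially overstate the true effect, so later, larger replications — which estimate the effect without the filter — inevitably look "weaker," producing a decline with no change in the underlying phenomenon.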
markfrankel18

Why Americans Are the Weirdest People in the World - 1 views

  • Henrich’s work with the ultimatum game was an example of a small but growing countertrend in the social sciences, one in which researchers look straight at the question of how deeply culture shapes human cognition. His new colleagues in the psychology department, Heine and Norenzayan, were also part of this trend. Heine focused on the different ways people in Western and Eastern cultures perceived the world, reasoned, and understood themselves in relationship to others. Norenzayan’s research focused on the ways religious belief influenced bonding and behavior. The three began to compile examples of cross-cultural research that, like Henrich’s work with the Machiguenga, challenged long-held assumptions of human psychological universality.
  • As Heine, Norenzayan, and Henrich furthered their search, they began to find research suggesting wide cultural differences almost everywhere they looked: in spatial reasoning, the way we infer the motivations of others, categorization, moral reasoning, the boundaries between the self and others, and other arenas. These differences, they believed, were not genetic. The distinct ways Americans and Machiguengans played the ultimatum game, for instance, weren’t because they had differently evolved brains. Rather, Americans, without fully realizing it, were manifesting a psychological tendency shared with people in other industrialized countries that had been refined and handed down through thousands of generations in ever more complex market economies. When people are constantly doing business with strangers, it helps when they have the desire to go out of their way (with a lawsuit, a call to the Better Business Bureau, or a bad Yelp review) when they feel cheated. Because Machiguengan culture had a different history, their gut feeling about what was fair was distinctly their own. In the small-scale societies with a strong culture of gift-giving, yet another conception of fairness prevailed. There, generous financial offers were turned down because people’s minds had been shaped by a cultural norm that taught them that the acceptance of generous gifts brought burdensome obligations. Our economies hadn’t been shaped by our sense of fairness; it was the other way around.
  • Studies show that Western urban children grow up so closed off in man-made environments that their brains never form a deep or complex connection to the natural world. While studying children from the U.S., researchers have suggested a developmental timeline for what is called “folkbiological reasoning.” These studies posit that it is not until children are around 7 years old that they stop projecting human qualities onto animals and begin to understand that humans are one animal among many. Compared to Yucatec Maya communities in Mexico, however, Western urban children appear to be developmentally delayed in this regard. Children who grow up constantly interacting with the natural world are much less likely to anthropomorphize other living things into late childhood.
markfrankel18

Gamblers, Scientists and the Mysterious Hot Hand - The New York Times - 0 views

  • The opposite of that is the hot-hand fallacy — the belief that winning streaks, whether in basketball or coin tossing, have a tendency to continue, as if propelled by their own momentum. Both misconceptions are reflections of the brain’s wired-in rejection of the power that randomness holds over our lives. Look deep enough, we instinctively believe, and we may uncover a hidden order.
  • A working paper published this summer has caused a stir by proposing that a classic body of research disproving the existence of the hot hand in basketball is flawed by a subtle misperception about randomness. If the analysis is correct, the possibility remains that the hot hand is real.
  • Taken to extremes, seeing connections that don’t exist can be a symptom of a psychiatric condition called apophenia. In less pathological forms, the brain’s hunger for pattern gives rise to superstitions (astrology, numerology) and is a driving factor in what has been called a replication crisis in science — a growing number of papers that cannot be confirmed by other laboratories.
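The "subtle misperception about randomness" behind the working paper mentioned above is a streak-selection bias: in short sequences of fair coin flips, the proportion of heads immediately following a head, averaged across sequences, falls below 50 percent, so a hot-hand analysis built on that intuition is biased against finding streaks. A minimal simulation, assuming fair coins and sequences of length four for illustration:

```python
import random

random.seed(0)

def prop_heads_after_heads(flips):
    """Proportion of flips immediately following an H that are also H.
    Returns None if no flip in the sequence ever follows an H."""
    after_h = [flips[i + 1] for i in range(len(flips) - 1) if flips[i] == "H"]
    return after_h.count("H") / len(after_h) if after_h else None

props = []
for _ in range(100_000):
    seq = [random.choice("HT") for _ in range(4)]  # short sequence, fair coin
    p = prop_heads_after_heads(seq)
    if p is not None:  # skip sequences with no H before the last flip
        props.append(p)

# Intuition says this average should be 0.5; for short sequences it is not.
print(f"average proportion of H after H: {sum(props) / len(props):.3f}")
```

The average comes out noticeably below one half even though every flip is fair, which is the sense in which the classic hot-hand studies may have set their baseline in the wrong place.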