
TOK Friends: Group items tagged VACCINE

Javier E

The varieties of denialism | Scientia Salon - 1 views

  • a stimulating conference at Clark University about “Manufacturing Denial,” which brought together scholars from wildly divergent disciplines — from genocide studies to political science to philosophy — to explore the idea that “denialism” may be a sufficiently coherent phenomenon underlying the willful disregard of factual evidence by ideologically motivated groups or individuals.
  • the Oxford dictionary defines a denialist as “a person who refuses to admit the truth of a concept or proposition that is supported by the majority of scientific or historical evidence,” which represents a whole different level of cognitive bias or rationalization. Think of it as bias on steroids.
  • First, as a scientist: it’s just not about the facts, indeed — as Brendan showed data in hand during his presentation — insisting on facts may have counterproductive effects, leading the denialist to double down on his belief.
  • if I think that simply explaining the facts to the other side is going to change their mind, then I’m in for a rude awakening.
  • As a philosopher, I found to be somewhat more disturbing the idea that denialism isn’t even about critical thinking.
  • what the large variety of denialisms have in common is a very strong, overwhelming, ideological commitment that helps define the denialist identity in a core manner. This commitment can be religious, ethnic or political in nature, but in all cases it fundamentally shapes the personal identity of the people involved, thus generating a strong emotional attachment, as well as an equally strong emotional backlash against critics.
  • To begin with, of course, they think of themselves as “skeptics,” thus attempting to appropriate a word with a venerable philosophical pedigree and which is supposed to indicate a cautiously rational approach to a given problem. As David Hume put it, a wise person (i.e., a proper skeptic) will proportion her beliefs to the evidence. But there is nothing of the Humean attitude in people who are “skeptical” of evolution, climate change, vaccines, and so forth.
  • Denialists have even begun to appropriate the technical language of informal logic: when told that a majority of climate scientists agree that the planet is warming up, they are all too happy to yell “argument from authority!” When they are told that they should distrust statements coming from the oil industry and from “think tanks” in their pockets they retort “genetic fallacy!” And so on. Never mind that informal fallacies are such only against certain background information, and that it is eminently sensible and rational to trust certain authorities (at the least provisionally), as well as to be suspicious of large organizations with deep pockets and an obvious degree of self-interest.
  • What commonalities can we uncover across instances of denialism that may allow us to tackle the problem beyond facts and elementary logic?
  • the evidence from the literature is overwhelming that denialists have learned to use the vocabulary of critical thinking against their opponents.
  • Another important issue to understand is that denialists exploit the inherently tentative nature of scientific or historical findings to seek refuge for their doctrines.
  • Scientists have been wrong before, and doubtlessly will be again in the future, many times. But the issue is rather one of where it is most rational to place your bets as a Bayesian updater: with the scientific community or with Faux News?
  • Science should be portrayed as a human story of failure and discovery, not as a body of barely comprehensible facts arrived at by epistemic priests.
  • Is there anything that can be done in this respect? I personally like the idea of teaching “science appreciation” classes in high school and college [2], as opposed to more traditional (usually rather boring, both as a student and as a teacher) science instruction.
  • Denialists also exploit the media’s self-imposed “balanced” approach to presenting facts, which leads to the false impression that there really are two approximately equal sides to every debate.
  • This is a rather recent phenomenon, and it is likely the result of a number of factors affecting the media industry. One, of course, is the onset of the 24-hr media cycle, with its pernicious reliance on punditry. Another is the increasing blurring of the once rather sharp line between reporting and editorializing.
  • The problem with the media is of course made far worse by the ongoing crisis in contemporary journalism, with newspapers, magazines and even television channels constantly facing an uncertain future of revenues.
  • The push back against denialism, in all its varied incarnations, is likely to be more successful if we shift the focus from persuading individual members of the public to making political and media elites accountable.
  • This is a major result coming out of Brendan’s research. He showed data set after data set demonstrating two fundamental things: first, large sections of the general public do not respond to the presentation of even highly compelling facts, indeed — as mentioned above — are actually more likely to entrench further into their positions.
  • Second, whenever one can put pressure on either politicians or the media, they do change their tune, becoming more reasonable and presenting things in a truly (as opposed to artificially) balanced way.
  • Third, and most crucially, there is plenty of evidence from political science studies that the public does quickly rally behind a unified political leadership. This, as much as it is hard to fathom now, has happened a number of times even in somewhat recent times
  • when leaders really do lead, the people follow. It’s just that of late the extreme partisan bickering in Washington has made the two major parties entirely incapable of working together on the common ground that they have demonstrably had in the past.
  • Another thing we can do about denialism: we should learn from the detailed study of successful cases and see what worked and how it can be applied to other instances.
  • Yet another thing we can do: seek allies. In the case of evolution denial — for which I have the most first-hand experience — it has been increasingly obvious to me that it is utterly counterproductive for a strident atheist like Dawkins (or even a relatively good humored one like yours truly) to engage creationists directly. It is far more effective when we have clergy (Barry Lynn of Americans United for the Separation of Church and State [6] comes to mind) and religious scientists
  • Make no mistake about it: denialism in its various forms is a pernicious social phenomenon, with potentially catastrophic consequences for our society. It requires a rallying call for all serious public intellectuals, academic or not, who have the expertise and the stamina to join the fray to make this an even marginally better world for us all. It’s most definitely worth the fight.
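The “Bayesian updater” point above, about where it is most rational to place your bets, can be made concrete with a minimal sketch. The reliability numbers below are invented purely for illustration, not measured properties of any real source:

```python
def posterior(prior, p_true_positive, p_false_positive):
    """Bayes' rule: probability a claim is true after a source asserts it.

    p_true_positive  = P(source asserts the claim | claim is true)
    p_false_positive = P(source asserts the claim | claim is false)
    """
    numerator = prior * p_true_positive
    return numerator / (numerator + (1 - prior) * p_false_positive)

# Start agnostic (prior = 0.5) and hear "the planet is warming" from two sources.
# Reliabilities are hypothetical, chosen only to illustrate the asymmetry.
consensus = posterior(0.5, 0.95, 0.05)   # scientific community: rarely asserts falsehoods
pundit    = posterior(0.5, 0.55, 0.45)   # partisan punditry: barely better than a coin flip
# consensus -> 0.95, pundit -> 0.55
```

The point of the sketch is not the specific numbers but the structure: trusting a source “at least provisionally,” as the article puts it, is simply weighting its track record, which is the opposite of an argument-from-authority fallacy.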
aliciathompson1

How Dr. Ben Carson Ruined His Legacy - The Daily Beast - 0 views

  • This leads to the salient question of whether being a neurosurgeon is in any way a relevant qualification to seek the highest elected office in the nation.
    • aliciathompson1
       
      What qualifies someone to be the president?
  • specifically for the purposes of harvesting that tissue
  • Dr. Carson the scientist would laugh someone off the stage at a conference if they presented such sloppy thinking for review.
  • When asked about whether as president he would allow waterboarding, he refused to condemn it and blithely dismissed fighting “politically correct wars.”
  • The sad reality is that the party that gave America Todd Akin and has a bloviating vaccine truther as its front-runner sorely needs the voice Dr. Carson could be lending to its dialogue.
proudsa

CDC Director Calls It 'Shameful' This Curable Disease Still Kills Millions - 0 views

  • "Drug-resistant tuberculosis threatens to reverse the gains that we've made. It's not just the threat overseas, it's the threat here,"
  • Drug-resistant tuberculosis knows no borders, and we risk turning the clock back on antibiotics and making it very difficult for us to stop tuberculosis from spreading around the world and in this country if we don't improve our control efforts.
  • We do prioritize addressing MDR-TB. We have done that for more than 20 years; that's why we've been able to drastically reduce U.S. cases of MDR-TB.
  • How will you and the CDC help ensure a well-executed program?
  • Right, and one of the things that will help tuberculosis control is the test and treat approach, and increasing the proportion of HIV-positive people that are treated. The majority of TB cases currently occur before people are started on anti-HIV medicines.
  • Partly it's the characteristic of the bacteria: You require long-term treatment for many months, we don't have a vaccine as we do with measles or polio, but partly it's the nature of the control program.
  • Fundamentally, what happens with tuberculosis will depend on two things. First, how well we implement what we know today, and second, how quickly we get better tools to stop tuberculosis.
  • If we could figure out which of them are going to develop active TB and provide shorter, more effective treatments for them, then we might really be able to knock down tuberculosis cases.
kushnerha

Why People Are Confused About What Experts Really Think - The New York Times - 2 views

  • GIVEN the complexities of the modern world, we all have to rely on expert opinion. Are G.M.O. foods safe? Is global warming real? Should children be vaccinated for measles? We don’t have the time or the training to adjudicate these questions ourselves. We defer to the professionals.
  • And to find out what the experts think, we typically rely on the news media. This creates a challenge for journalists: There are many issues on which a large majority of experts agree but a small number hold a dissenting view. Is it possible to give voice to experts on both sides — standard journalistic practice — without distorting the public’s perception of the level of disagreement?
  • This can be hard to do. Indeed, critics argue that journalists too often generate “false balance,” creating an impression of disagreement when there is, in fact, a high level of consensus. One solution, adopted by news organizations such as the BBC, is “weight of evidence” reporting, in which the presentation of conflicting views is supplemented by an indication of where the bulk of expert opinion lies.
  • Both studies suggest that “weight of evidence” reporting is an imperfect remedy. It turns out that hearing from experts on both sides of an issue distorts our perception of consensus — even when we have all the information we need to correct that misperception.
  • In one study, all the participants were presented with a numerical summary, drawn from a panel of experts convened by the University of Chicago, of the range of expert opinion on certain economic issues.
  • One group of participants, however, was presented not only with the numerical summary of expert opinion but also with an excerpted comment from one expert on either side of an issue.
  • Then, all the participants were asked to rate their perception of the extent to which the experts agreed with one another on each issue. Even though both had a precise count of the number of experts on either side, the participants who also read the comments of the opposing experts gave ratings that did not distinguish as sharply between the high-consensus and the low-consensus issues. In other words, being exposed to the conflicting comments made it more difficult for participants to distinguish the issues most experts agreed on (such as carbon tax) from those for which there was substantial disagreement (such as minimum wage).
  • This distorting influence affected not only the participants’ perception of the degree of consensus, but also their judgments of whether there was sufficient consensus to use it to guide public policy.
  • What explains this cognitive glitch? One possibility is that when we are presented with comments from experts on either side of an issue, we produce a mental representation of the disagreement that takes the form of one person on either side, which somehow contaminates our impression of the distribution of opinions in the larger population of experts.
  • Another possibility is that we may just have difficulty discounting the weight of a plausible argument, even when we know it comes from an expert whose opinion is held by only a small fraction of his or her peers.
  • It’s also possible that the mere presence of conflict (in the form of contradictory expert comments) triggers a general sense of uncertainty in our minds, which in turn colors our perceptions of the accuracy of current expert understanding of an issue.
  • the implications are worrisome. Government action is guided in part by public opinion. Public opinion is guided in part by perceptions of what experts think. But public opinion may — and often does — deviate from expert opinion, not simply, it seems, because the public refuses to acknowledge the legitimacy of experts, but also because the public may not be able to tell where the majority of expert opinion lies.
Javier E

'The Death of Expertise' Explores How Ignorance Became a Virtue - The New York Times - 1 views

  • a larger wave of anti-rationalism that has been accelerating for years — manifested in the growing ascendance of emotion over reason in public debates, the blurring of lines among fact and opinion and lies, and denialism in the face of scientific findings about climate change and vaccination.
  • “Americans have reached a point where ignorance, especially of anything related to public policy, is an actual virtue,”
  • “To reject the advice of experts is to assert autonomy, a way for Americans to insulate their increasingly fragile egos from ever being told they’re wrong about anything. It is a new Declaration of Independence: No longer do we hold these truths to be self-evident, we hold all truths to be self-evident, even the ones that aren’t true. All things are knowable and every opinion on any subject is as good as any other.”
  • iterating arguments explored in more depth in books like Al Gore’s “The Assault on Reason,” Susan Jacoby’s “The Age of American Unreason,” Robert Hughes’s “Culture of Complaint” and, of course, Richard Hofstadter’s 1963 classic, “Anti-Intellectualism in American Life.” Nichols’s source notes are one of the highlights of the volume, pointing the reader to more illuminating books and articles.
  • “resistance to intellectual authority” naturally took root in a country dedicated to the principles of liberty and egalitarianism, and how American culture tends to fuel “romantic notions about the wisdom of the common person or the gumption of the self-educated genius.”
  • the “protective swaddling environment of the modern university infantilizes students,”
  • today’s populism has magnified disdain for elites and experts of all sorts, be they in foreign policy, economics, even science.
  • Trump won the 2016 election, Nichols writes, because “he connected with a particular kind of voter who believes that knowing about things like America’s nuclear deterrent is just so much pointy-headed claptrap.” Worse, he goes on, some of these voters “not only didn’t care that Trump is ignorant or wrong, they likely were unable to recognize his ignorance or errors,” thanks to their own lack of knowledge.
  • While the internet has allowed more people more access to more information than ever before, it has also given them the illusion of knowledge when in fact they are drowning in data and cherry-picking what they choose to read
  • it becomes easy for one to succumb to “confirmation bias” — the tendency, as Nichols puts it, “to look for information that only confirms what we believe, to accept facts that only strengthen our preferred explanations, and to dismiss data that challenge what we accept as truth.”
  • When confronted with hard evidence that they are wrong, many will simply double down on their original assertions. “This is the ‘backfire effect,’” Nichols writes, “in which people redouble their efforts to keep their own internal narrative consistent, no matter how clear the indications that they’re wrong.” As a result, extreme views are amplified online, just as fake news and propaganda easily go viral.
  • Today, all these factors have combined to create a maelstrom of unreason that’s not just killing respect for expertise, but also undermining institutions, thwarting rational debate and spreading an epidemic of misinformation. These developments, in turn, threaten to weaken the very foundations of our democracy.
  • “Laypeople complain about the rule of experts and they demand greater involvement in complicated national questions, but many of them only express their anger and make these demands after abdicating their own important role in the process: namely, to stay informed and politically literate enough to choose representatives who can act on their behalf.”
Javier E

Gender-Reveal Parties and Cultural Despair : The New Yorker - 1 views

  • the nature of manufactured customs and instant traditions. They emerge from an atomized society in order to fill a perceived void where real ceremonies used to be, and they end by reflecting that society’s narcissism
  • At bottom, the invented rituals that proliferate in our culture signify a disenchantment with modernity. If, like millions of Americans, you’re secular and the traditions of a church or temple have no hold on you, or if you’re assimilated and ethnic identity has faded away, then what is there to sustain you on the lonely path through a turbulent, rootless, uncertain world?
  • Science might not be enough, which is why so many educated people have turned against it and adopted hostile theories about childhood vaccination. This is the same disenchantment that has produced religious revivalism through much of the world.
Javier E

The Irrational Risk of Thinking We Can Be Rational About Risk | Risk: Reason and Realit... - 0 views

  • in the most precise sense of the word, facts are meaningless…just disconnected ones and zeroes in the computer until we run them through the software of how those facts feel
  • Of all the building evidence about human cognition that suggests we ought to be a little more humble about our ability to reason, no other finding has more significance, because Elliott teaches us that no matter how smart we like to think we are, our perceptions are inescapably a blend of reason and gut reaction, intellect and instinct, facts and feelings.
  • many people, particularly intellectuals and academics and policy makers, maintain a stubborn post-Enlightenment confidence in the supreme power of rationality. They continue to believe that we can make the ‘right’ choices about risk based on the facts, that with enough ‘sound science’ evidence from toxicology and epidemiology and cost-benefit analysis, the facts will reveal THE TRUTH. At best this confidence is hopeful naivete. At worst, it is intellectual arrogance that denies all we’ve learned about the realities of human cognition. In either case, it’s dangerous
  • There are more than a dozen of these risk perception factors (see Ch. 3 of “How Risky Is It, Really? Why Our Fears Don’t Match the Facts,” available online free)
  • Because our perceptions rely as much as or more on feelings than simply on the facts, we sometimes get risk wrong. We’re more afraid of some risks than we need to be (child abduction, vaccines), and not as afraid of some as we ought to be (climate change, particulate air pollution), and that “Perception Gap” can be a risk in and of itself
  • We must understand that instinct and intellect are interwoven components of a single system that helps us perceive the world and make our judgments and choices, a system that worked fine when the risks we faced were simpler but which can make dangerous mistakes as we try to figure out some of the more complex dangers posed in our modern world.
  • What we can do to avoid the dangers that arise when our fears don’t match the facts—the most rational thing to do—is, first, to recognize that our risk perceptions can never be purely objectively perfectly 'rational', and that our subjective perceptions are prone to potentially dangerous mistakes.
  • Then we can begin to apply all the details we've discovered of how our risk perception system works, and use that knowledge and self-awareness to make wiser, more informed, healthier choices
Javier E

Our Dangerous Inability to Agree on What is TRUE | Risk: Reason and Reality | Big Think - 1 views

  • Given that human cognition is never the product of pure dispassionate reason, but a subjective interpretation of the facts based on our feelings and biases and instincts, when can we ever say that we know who is right and who is wrong, about anything? When can we declare a fact so established that it’s fair to say, without being called arrogant, that those who deny this truth don’t just disagree…that they’re just plain wrong
  • This isn’t about matters of faith, or questions of ultimately unknowable things which by definition can not be established by fact. This is a question about what is knowable, and provable by careful objective scientific inquiry, a process which includes challenging skepticism rigorously applied precisely to establish what, beyond any reasonable doubt, is in fact true. The way evolution has been established
  • With enough careful investigation and scrupulously challenged evidence, we can establish knowable truths that are not just the product of our subjective motivated reasoning. We can apply our powers of reason and our ability to objectively analyze the facts and get beyond the point where what we 'know' is just an interpretation of the evidence through the subconscious filters of who we trust and our biases and instincts. We can get to the point where if someone wants to continue to believe that the sun revolves around the earth, or that vaccines cause autism, or that evolution is a deceit, it is no longer arrogant - though it may still be provocative - to call those people wrong.
  • here is a truth with which I hope we can all agree. Our subjective system of cognition can be dangerous. It can produce perceptions that conflict with the evidence, what I call The Perception Gap, which can in turn produce profound harm.
  • The Perception Gap can lead to disagreements that create destructive and violent social conflict, to dangerous personal choices that feel safe but aren’t, and to policies more consistent with how we feel than what is in fact in our best interest. The Perception Gap may in fact be potentially more dangerous than any individual risk we face.
  • We need to recognize the greater threat that our subjective system of cognition can pose, and in the name of our own safety and the welfare of the society on which we depend, do our very best to rise above it or, when we can’t, account for this very real danger in the policies we adopt.
  • we have an obligation to confront our own ideological priors. We have an obligation to challenge ourselves, to push ourselves, to be suspicious of conclusions that are too convenient, to be sure that we're getting it right.
  • subjective cognition is built-in, subconscious, beyond free will, and unavoidably leads to different interpretations of the same facts.
  • Views that have more to do with competing tribal biases than objective interpretations of the evidence create destructive and violent conflict.
Javier E

The Dangers of Pseudoscience - NYTimes.com - 0 views

  • the “demarcation problem,” the issue of what separates good science from bad science and pseudoscience (and everything in between). The problem is relevant for at least three reasons.
  • The first is philosophical: Demarcation is crucial to our pursuit of knowledge; its issues go to the core of debates on epistemology and of the nature of truth and discovery.
  • The second reason is civic: our society spends billions of tax dollars on scientific research, so it is important that we also have a good grasp of what constitutes money well spent in this regard.
  • Third, as an ethical matter, pseudoscience is not — contrary to popular belief — merely a harmless pastime of the gullible; it often threatens people’s welfare,
  • It is precisely in the area of medical treatments that the science-pseudoscience divide is most critical, and where the role of philosophers in clarifying things may be most relevant.
  • some traditional Chinese remedies (like drinking fresh turtle blood to alleviate cold symptoms) may in fact work
  • There is no question that some folk remedies do work. The active ingredient of aspirin, for example, is derived from willow bark, which had been known to have beneficial effects since the time of Hippocrates. There is also no mystery about how this happens: people have more or less randomly tried solutions to their health problems for millennia, sometimes stumbling upon something useful
  • What makes the use of aspirin “scientific,” however, is that we have validated its effectiveness through properly controlled trials, isolated the active ingredient, and understood the biochemical pathways through which it has its effects
  • In terms of empirical results, there are strong indications that acupuncture is effective for reducing chronic pain and nausea, but sham therapy, where needles are applied at random places, or are not even pierced through the skin, turns out to be equally effective (see for instance this recent study on the effect of acupuncture on post-chemotherapy chronic fatigue), thus seriously undermining talk of meridians and Qi lines
  • Asma at one point compares the current inaccessibility of Qi energy to the previous (until this year) inaccessibility of the famous Higgs boson.
  • But the analogy does not hold. The existence of the Higgs had been predicted on the basis of a very successful physical theory known as the Standard Model. This theory is not only exceedingly mathematically sophisticated, but it has been verified experimentally over and over again. The notion of Qi, again, is not really a theory in any meaningful sense of the word. It is just an evocative word to label a mysterious force
  • Philosophers of science have long recognized that there is nothing wrong with positing unobservable entities per se, it’s a question of what work such entities actually do within a given theoretical-empirical framework. Qi and meridians don’t seem to do any, and that doesn’t seem to bother supporters and practitioners of Chinese medicine. But it ought to.
  • what’s the harm in believing in Qi and related notions, if in fact the proposed remedies seem to help?
  • we can incorporate whatever serendipitous discoveries from folk medicine into modern scientific practice, as in the case of the willow bark turned aspirin. In this sense, there is no such thing as “alternative” medicine, there’s only stuff that works and stuff that doesn’t.
  • Second, if we are positing Qi and similar concepts, we are attempting to provide explanations for why some things work and others don’t. If these explanations are wrong, or unfounded as in the case of vacuous concepts like Qi, then we ought to correct or abandon them.
  • pseudo-medical treatments often do not work, or are even positively harmful. If you take folk herbal “remedies,” for instance, while your body is fighting a serious infection, you may suffer severe, even fatal, consequences.
  • Indulging in a bit of pseudoscience in some instances may be relatively innocuous, but the problem is that doing so lowers your defenses against more dangerous delusions that are based on similar confusions and fallacies. For instance, you may expose yourself and your loved ones to harm because your pseudoscientific proclivities lead you to accept notions that have been scientifically disproved, like the increasingly (and worryingly) popular idea that vaccines cause autism.
  • Philosophers nowadays recognize that there is no sharp line dividing sense from nonsense, and moreover that doctrines starting out in one camp may over time evolve into the other. For example, alchemy was a (somewhat) legitimate science in the times of Newton and Boyle, but it is now firmly pseudoscientific (movements in the opposite direction, from full-blown pseudoscience to genuine science, are notably rare).
  • The verdict by philosopher Larry Laudan, echoed by Asma, that the demarcation problem is dead and buried, is not shared by most contemporary philosophers who have studied the subject.
  • the criterion of falsifiability, for example, is still a useful benchmark for distinguishing science and pseudoscience, as a first approximation. Asma’s own counterexample inadvertently shows this: the “cleverness” of astrologers in cherry-picking what counts as a confirmation of their theory, is hardly a problem for the criterion of falsifiability, but rather a nice illustration of Popper’s basic insight: the bad habit of creative fudging and finagling with empirical data ultimately makes a theory impervious to refutation. And all pseudoscientists do it, from parapsychologists to creationists and 9/11 Truthers.
  • The borderlines between genuine science and pseudoscience may be fuzzy, but this should be even more of a call for careful distinctions, based on systematic facts and sound reasoning. To try a modicum of turtle blood here and a little aspirin there is not the hallmark of wisdom and even-mindedness. It is a dangerous gateway to superstition and irrationality.
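The “properly controlled trials” invoked above, from aspirin to the sham-acupuncture comparisons, ultimately ask one question: could the observed treatment-versus-control difference plausibly have arisen by chance? A permutation test is one minimal way to make that question concrete. The outcome scores below are invented for illustration, not data from any actual trial:

```python
import random

def permutation_p_value(treated, control, n_perm=10_000, seed=0):
    """P-value for the observed difference in group means, by permutation.

    Repeatedly reshuffle all outcomes between the two groups; the p-value is
    the fraction of shuffles whose mean difference is at least as extreme as
    the one actually observed.
    """
    rng = random.Random(seed)
    observed = sum(treated) / len(treated) - sum(control) / len(control)
    pooled = list(treated) + list(control)
    n = len(treated)
    extreme = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        diff = sum(pooled[:n]) / n - sum(pooled[n:]) / (len(pooled) - n)
        if abs(diff) >= abs(observed):
            extreme += 1
    return extreme / n_perm

# Invented pain-relief scores: real treatment vs. sham.
real = [3, 4, 5, 4, 3, 5, 4, 4, 5, 3]
sham = [1, 2, 1, 2, 1, 2, 1, 2, 1, 2]
p = permutation_p_value(real, sham)  # a small p suggests the effect is not chance
```

This is exactly the logic that lets the acupuncture studies cited above cut against Qi: when the real and sham groups produce similar scores, the permutation distribution swallows the observed difference and no effect specific to meridians remains.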
Javier E

Thomas Piketty Tours U.S. for His New Book - NYTimes.com - 0 views

  • The response from fellow economists, so far mainly from the liberal side of the spectrum, has verged on the rapturous. Mr. Krugman, a columnist for The New York Times, predicted in The New York Review of Books that Mr. Piketty’s book would “change both the way we think about society and the way we do economics.”
  • Mr. Piketty’s dedication to data has long made him a star among economists, who credit his work on income inequality (with Emmanuel Saez and others) for diving deep into seemingly dull tax archives to bring an unprecedented historical perspective to the subject.
  • Six years after the financial crisis, “people are looking for a bible of sorts,” said Julia Ott, an assistant professor of the history of capitalism at the New School, who appeared on a panel with Mr. Piketty at New York University on Thursday. “He’s speaking to a real feeling out there that things haven’t been fixed, that we need to take stock, that we need big ideas, big proposals, big global solutions.”
  • At the book’s center is Mr. Piketty’s contention — contrary to the influential theory developed by Simon Kuznets in the 1950s and ’60s — that mature capitalist economies do not inevitably evolve toward greater economic equality. Instead, Mr. Piketty contends, the data reveals a deeper historical tendency for the rate of return on capital to outstrip the overall rate of economic growth, leading to greater and greater concentrations of wealth at the very top.
  • Mr. Piketty rejected any economic determinism. “It all depends on what the political system decides,” he said.
  • Mr. Piketty, who writes in the book that the collapse of Communism in 1989 left him “vaccinated for life” against the “lazy rhetoric of anticapitalism,” is no Marxian revolutionary. “I believe in private property,” he said in the interview. “But capitalism and markets should be the slave of democracy and not the opposite.”
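Piketty’s central tendency, the rate of return on capital outstripping overall growth, compounds mechanically: whenever r exceeds g, the capital-to-income ratio rises without bound. A minimal sketch, using illustrative round numbers rather than Piketty’s actual estimates:

```python
def capital_to_income_ratio(years, r=0.05, g=0.02, capital=4.0, income=1.0):
    """Compound capital at rate r and national income at rate g."""
    for _ in range(years):
        capital *= 1 + r   # returns on capital are reinvested
        income *= 1 + g    # the economy as a whole grows more slowly
    return capital / income

start = capital_to_income_ratio(0)    # 4.0
later = capital_to_income_ratio(50)   # 4.0 * (1.05 / 1.02)**50, roughly 17
```

The widening ratio is the book’s mechanical engine; as the annotation above notes, whether it actually plays out “all depends on what the political system decides.”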
nolan_delaney

How to be good at stress | ideas.ted.com - 0 views

  • He dedicated his career to identifying what distinguishes people who thrive under stress from those who are defeated by it. The ones who thrive, he concluded, are those who view stress as inevitable, and rather than try to avoid it, they look for ways to engage with it, adapt to it, and learn from it.
  • what is new is how psychology and neuroscience have begun to examine this truism. Research is beginning to reveal not only why stress helps us learn and grow, but also what makes some people more likely to experience these benefits.
  • But the stress response doesn’t end when your heart stops pounding. Other stress hormones are released to help you recover from the challenge. These stress-recovery hormones include DHEA and nerve growth factor, both of which increase neuroplasticity. In other words, they help your brain learn from experience.
  • DHEA is classified as a neurosteroid; in the same way that steroids help your body grow stronger from physical exercise, DHEA helps your brain grow stronger from psychological challenges. For several hours after you have a strong stress response, the brain is rewiring itself to remember and learn from the experience. Stress leaves an imprint on your brain that prepares you to handle similar stress the next time you encounter it.
  • Psychologists call the process of learning and growing from a difficult experience stress inoculation. Going through the experience gives your brain and body a kind of stress vaccine. This is why putting people through practice stress is a key training technique for NASA astronauts, Navy SEALS, emergency responders and elite athletes, and others who have to thrive under high levels of stress.
  • (This is part of what makes the science of stress so fascinating, and also so puzzling.)
  • Higher levels of cortisol have been associated with worse outcomes, such as impaired immune function and depression. In contrast, higher levels of DHEA—the neurosteroid—have been linked to reduced risk of anxiety, depression, heart disease, neurodegeneration and other diseases we typically think of as stress-related.
  • An important question, then, is: How do you influence your own — or somebody else’s — growth index?
  • This mindset can actually shift your stress physiology toward a state that makes such a positive outcome more likely, for example by increasing your growth index and reducing harmful side effects of stress such as inflammation.
  • Other studies confirm that viewing a stressful situation as an opportunity to improve your skills, knowledge or strengths makes it more likely that you will experience stress inoculation or stress-related growth. Once you appreciate that going through stress makes you better at it, it gets easier to face each new challenge. And the expectation of growth sends a signal to your brain and body: get ready to learn something, because you can handle this.
  •  
    Good timing for an article about stress considering we are taking exams this week. New physiology studies suggest that your brain releases a growth hormone after a stressful experience (like a steroid for the brain) that temporarily increases your ability to learn. Interesting to think just how this trait/hormone evolved...
grayton downing

In Syria, Doctors Risk Life and Juggle Ethics - NYTimes.com - 1 views

  • Majid, who gave only his first name to protect his safety, collected hair and urine samples, clothing, tree leaves, soil and even a dead bird. He shared it with the Syrian American Medical Society, a humanitarian group that had been delivering such samples to American intelligence officials, as proof of possible chemical attacks.
  • United Nations inspectors have taken the first steps to destroy Syria’s chemical stockpile.
  • Many Syrian doctors have fled; those who remain describe dire conditions where even the most basic care is not available.
  • Mothers are desperate to have their children vaccinated; patients with chronic conditions like heart disease and diabetes struggle to get medicine; and there is “huge anxiety in the population,”
  • On Aug. 21, the group got word from some of its “silent partner” hospitals of a flood of patients with “neurotoxic symptoms” — roughly 3,600 in a period of three hours, including 355 who died.
  • The debate over whether doctors should expose human rights abuses has long been “one of these inside baseball arguments within the humanitarian community,” said Len Rubenstein, an expert on human rights and medical ethics at Johns Hopkins University. While Doctors Without Borders has a culture of “bearing witness,” he said, not all humanitarian organizations do.
  • The International Committee of the Red Cross, for instance, adheres to a strict code of political neutrality;
Javier E

The Science of Why We Don't Believe Science | Mother Jones - 2 views

  • an array of new discoveries in psychology and neuroscience has further demonstrated how our preexisting beliefs, far more than any new facts, can skew our thoughts and even color what we consider our most dispassionate and logical conclusions. This tendency toward so-called "motivated reasoning" helps explain why we find groups so polarized over matters where the evidence is so unequivocal: climate change, vaccines, "death panels," the birthplace and religion of the president (PDF), and much else. It would seem that expecting people to be convinced by the facts flies in the face of, you know, the facts.
  • The theory of motivated reasoning builds on a key insight of modern neuroscience (PDF): Reasoning is actually suffused with emotion (or what researchers often call "affect"). Not only are the two inseparable, but our positive or negative feelings about people, things, and ideas arise much more rapidly than our conscious thoughts, in a matter of milliseconds—fast enough to detect with an EEG device, but long before we're aware of it. That shouldn't be surprising: Evolution required us to react very quickly to stimuli in our environment. It's a "basic human survival skill," explains political scientist Arthur Lupia of the University of Michigan. We push threatening information away; we pull friendly information close. We apply fight-or-flight reflexes not only to predators, but to data itself.
  • reasoning comes later, works slower—and even then, it doesn't take place in an emotional vacuum. Rather, our quick-fire emotions can set us on a course of thinking that's highly biased, especially on topics we care a great deal about.
  • Consider a person who has heard about a scientific discovery that deeply challenges her belief in divine creation—a new hominid, say, that confirms our evolutionary origins. What happens next, explains political scientist Charles Taber of Stony Brook University, is a subconscious negative response to the new information—and that response, in turn, guides the type of memories and associations formed in the conscious mind. "They retrieve thoughts that are consistent with their previous beliefs," says Taber, "and that will lead them to build an argument and challenge what they're hearing."
  • In other words, when we think we're reasoning, we may instead be rationalizing. Or to use an analogy offered by University of Virginia psychologist Jonathan Haidt: We may think we're being scientists, but we're actually being lawyers
  • Our "reasoning" is a means to a predetermined end—winning our "case"—and is shot through with biases. They include "confirmation bias," in which we give greater heed to evidence and arguments that bolster our beliefs, and "disconfirmation bias," in which we expend disproportionate energy trying to debunk or refute views and arguments that we find uncongenial.
  • That's not to suggest that we aren't also motivated to perceive the world accurately—we are. Or that we never change our minds—we do. It's just that we have other important goals b
  • esides accuracy—including identity affirmation and protecting one's sense of self—and often those make us highly resistant to changing our beliefs when the facts say we should.
Javier E

The Widening World of Hand-Picked Truths - The New York Times - 0 views

  • it’s not just organized religions that are insisting on their own alternate truths. On one front after another, the hard-won consensus of science is also expected to accommodate personal beliefs, religious or otherwise, about the safety of vaccines, G.M.O. crops, fluoridation or cellphone radio waves, along with the validity of global climate change.
  • But presenting people with the best available science doesn’t seem to change many minds. In a kind of psychological immune response, they reject ideas they consider harmful.
  • Viewed from afar, the world seems almost on the brink of conceding that there are no truths, only competing ideologies — narratives fighting narratives. In this epistemological warfare, those with the most power are accused of imposing their version of reality — the “dominant paradigm” — on the rest, leaving the weaker to fight back with formulations of their own. Everything becomes a version.
  • I heard from young anthropologists, speaking the language of postmodernism, who consider science to be just another tool with which Western colonialism further extends its “cultural hegemony” by marginalizing the dispossessed and privileging its own worldview.
  • Science, through this lens, doesn’t discover knowledge, it “manufactures” it, along with other marketable goods.
  • The widening gyre of beliefs is accelerated by the otherwise liberating Internet. At the same time it expands the reach of every mind, it channels debate into clashing memes, often no longer than 140 characters, that force people to extremes and trap them in self-reinforcing bubbles of thought.
Javier E

Science Confirms: Politics Wrecks Your Ability to Do Math | Mother Jones - 1 views

  • According to a new psychology paper, our political passions can even undermine our very basic reasoning skills. More specifically, the study finds that people who are otherwise very good at math may totally flunk a problem that they would otherwise probably be able to solve, simply because giving the right answer goes against their political beliefs.
  • Survey respondents performed wildly differently on what was in essence the same basic problem, simply depending upon whether they had been told that it involved guns or whether they had been told that it involved a new skin cream.
  • What's more, it turns out that highly numerate liberals and conservatives were even more—not less—susceptible to letting politics skew their reasoning than were those with less mathematical ability.
  • Not surprisingly, Kahan's study found that the more numerate you are, the more likely you are to get the answer to this "skin cream" problem right. Moreover, it found no substantial difference between highly numerate Democrats and highly numerate Republicans in this regard. The better members of both political groups were at math, the better they were at solving the skin cream problem.
  • So how did people fare on the handgun version of the problem? They performed quite differently than on the skin cream version, and strong political patterns emerged in the results—especially among people who are good at mathematical reasoning. Most strikingly, highly numerate liberal Democrats did almost perfectly when the right answer was that the concealed weapons ban does indeed work to decrease crime (version C of the experiment)—an outcome that favors their pro-gun-control predilections. But they did much worse when the correct answer was that crime increases in cities that enact the ban (version D of the experiment).
  • The opposite was true for highly numerate conservative Republicans
  • these results are a fairly strong refutation of what is called the "deficit model" in the field of science and technology studies—the idea that if people just had more knowledge, or more reasoning ability, then they would be better able to come to consensus with scientists and experts on issues like climate change, evolution, the safety of vaccines, and pretty much anything else involving science or data
  • Kahan's data suggest the opposite—that political biases skew our reasoning abilities, and this problem seems to be worse for people with advanced capacities like scientific literacy and numeracy.
  • What's happening when highly numerate liberals and conservatives actually get it wrong? Either they're intuiting an incorrect answer that is politically convenient and feels right to them, leading them to inquire no further—or else they're stopping to calculate the correct answer, but then refusing to accept it and coming up with some elaborate reason why 1 + 1 doesn't equal 2 in this particular instance. (Kahan suspects it's mostly the former, rather than the latter.)
  • This new study is just one out of many in this respect, but it provides perhaps the most striking demonstration yet of just how motivated, just how biased, reasoning can be—especially about politics.
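Kahan’s “skin cream” task boils down to a simple rate comparison that raw counts obscure. A minimal sketch of the correct reasoning (the counts below are illustrative, not necessarily the exact figures from Kahan’s stimulus):

```python
# Illustrative counts for the two-by-two "skin cream" problem.
treated = {"improved": 223, "worsened": 75}
control = {"improved": 107, "worsened": 21}

def improvement_rate(group):
    """Fraction of the group that improved."""
    return group["improved"] / (group["improved"] + group["worsened"])

treated_rate = improvement_rate(treated)
control_rate = improvement_rate(control)

# The intuitive-but-wrong move is to compare raw counts (223 > 107).
# The correct move is to compare rates: here the untreated group
# actually did better, so the cream looks ineffective.
print(f"treated: {treated_rate:.2f}, control: {control_rate:.2f}")
print("cream helps" if treated_rate > control_rate else "cream does not help")
```

The politically charged versions of the problem swap the labels (cities that banned concealed carry versus cities that did not) while leaving this arithmetic unchanged.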
maddieireland334

Flu shot only 19% effective this winter - 0 views

  •  
    Flu vaccines were only 19% effective in preventing doctor visits for influenza this season, one of the lowest rates in the past decade, the Centers for Disease Control and Prevention said Thursday. Health officials had predicted flu shots would be less protective than usual this year.
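The 19% figure is a vaccine-effectiveness estimate; in the CDC’s test-negative design it is roughly 1 minus the odds ratio of vaccination among flu-positive versus flu-negative patients. A minimal, unadjusted sketch with hypothetical counts (the CDC’s actual estimate comes from adjusted models):

```python
# Hypothetical patient counts from a test-negative study.
vaccinated_flu_pos, vaccinated_flu_neg = 321, 915
unvaccinated_flu_pos, unvaccinated_flu_neg = 601, 1388

# Odds of being vaccinated among cases vs. controls.
odds_ratio = (vaccinated_flu_pos / vaccinated_flu_neg) / (
    unvaccinated_flu_pos / unvaccinated_flu_neg
)

# Vaccine effectiveness as a percentage.
effectiveness = (1 - odds_ratio) * 100
print(f"Estimated VE: {effectiveness:.0f}%")  # prints "Estimated VE: 19%"
```

An effectiveness near 19% means vaccinated patients were only modestly less likely than unvaccinated ones to show up at the doctor with confirmed flu.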
Javier E

The Politics of Fraudulent Dietary Supplements - NYTimes.com - 0 views

  • One pill makes you smarter. One pill makes you thin. One pill makes you happy. Another keeps you energized. And so what if tests conducted by scientists in New York and Canada have found that the substances behind these miracle enhancements may contain nothing more than powdered rice or houseplants. If enough people believe they’ll be healthier, well, it’s a nice racket.
  • Nice, to the tune of $13 billion a year in sales. And here in Utah, which is to the dietary supplement business what Northern California is to marijuana, a huge industry has taken hold,
  • We’re not talking drugs, or even, in many cases, food here. Drugs have to undergo rigorous testing and review by the federal government. Dietary supplements do not. Drugs have to prove to be effective. Dietary supplements do not.
  • These are the Frankenstein remedies — botanicals, herbs, minerals, enzymes, amino acids, dried stuff. They’re “natural.” They’re not cheap. And Americans pop them like Skittles, despite recent studies showing that nearly a third of all herbal supplements on the market may be outright frauds.
  • To understand how we got here, you have to go back to 1994, when Senator Orrin G. Hatch of Utah midwifed through Congress a new industry protected from all but minimal regulation. It is also an industry that would make many of his closest associates and family members rich. In turn, they’ve rewarded him with sizable campaign contributions.
  • Even though serious illnesses and some deaths are on the rise from misuse of these supplements, Hatch is determined to keep regulators at bay. “I am committed to protect this industry and the integrity of its products,” he told a gathering of potency pill-pushers and the like in Utah last fall.
  • what about the medical implications? These pills and powders can’t, by law, make specific claims to cure anything. So they claim to make you healthier. The consumer is left playing doctor, reading questionable assertions that course through the unfiltered garbage of the Internet.
  • there was this finding reported in the authoritative Annals of Internal Medicine: “Enough is enough: Stop wasting money on vitamin and mineral supplements.”
  • So, the industry keeps growing, with 65,000 dietary supplements now on the market, consumed by nearly half of all Americans. The larger issue is mistrust of authority, a willful ignorance that knows no political side. Thus, right-wing libertarians promote a freewheeling market of quack products, while left-wing conspiracy theorists disdain modern medicine in favor of anything sold as “natural” or vaguely countercultural. These are some of the same people who will not vaccinate their children.
maddieireland334

Protection Without a Vaccine - 0 views

  •  
    Last month, a team of scientists announced what could prove to be an enormous step forward in the fight against H.I.V. Scientists at Scripps Research Institute said they had developed an artificial antibody that, once in the blood, grabbed hold of the virus and inactivated it. The molecule can eliminate H.I.V.
Javier E

Faith vs. Facts - NYTimes.com - 0 views

  • a broad group of scholars is beginning to demonstrate that religious belief and factual belief are indeed different kinds of mental creatures.
  • People process evidence differently when they think with a factual mind-set rather than with a religious mind-set
  • Even what they count as evidence is different
  • And they are motivated differently, based on what they conclude.
  • On what grounds do scholars make such claims?
  • the very language people use changes when they talk about religious beings, and the changes mean that they think about their realness differently.
  • to say, “I believe that Jesus Christ is alive” signals that you know that other people might not think so. It also asserts reverence and piety
  • We seem to regard religious beliefs and factual beliefs with what the philosopher Neil Van Leeuwen calls different “cognitive attitudes.”
  • when people consider the truth of a religious belief, what the belief does for their lives matters more than, well, the facts.
  • We evaluate religious beliefs more with our sense of destiny, purpose and the way we think the world should be
  • religious and factual beliefs play different roles in interpreting the same events. Religious beliefs explain why, rather than how
  • people’s reliance on supernatural explanations increases as they age.
  • It’s the young kids who seem skeptical when researchers ask them about gods and ancestors, and the adults who seem clear and firm. It seems that supernatural ideas do things for adults they do not yet do for children.
  • people don’t use rational, instrumental reasoning when they deal with religious beliefs
  • sacred values are immune to the normal cost-benefit trade-offs that govern other dimensions of our lives.
  • The danger point seems to be when people feel themselves to be completely fused with a group defined by its sacred value.
  • One of the interesting things about sacred values, however, is that they are both general (“I am a true Christian”) and particular (“I believe that abortion is murder”)
  • It is possible that this is the key to effective negotiation, because the ambiguity allows the sacred value to be reframed without losing its essential truth
  • these new ideas about religious belief should shape the way people negotiate about ownership of the land, just as they should shape the way we think about climate change deniers and vaccine avoiders. People aren’t dumb in not recognizing the facts. They are using a reasoning process that responds to moral arguments more than scientific ones, and we should understand that when we engage.
kushnerha

BBC - Future - The surprising downsides of being clever - 0 views

  • If ignorance is bliss, does a high IQ equal misery? Popular opinion would have it so. We tend to think of geniuses as being plagued by existential angst, frustration, and loneliness. Think of Virginia Woolf, Alan Turing, or Lisa Simpson – lone stars, isolated even as they burn their brightest. As Ernest Hemingway wrote: “Happiness in intelligent people is the rarest thing I know.”
  • Combing California’s schools for the crème de la crème, he selected 1,500 pupils with an IQ of 140 or more – 80 of whom had IQs above 170. Together, they became known as the “Termites”, and the highs and lows of their lives are still being studied to this day.
  • Termites’ average salary was twice that of the average white-collar job. But not all the group met Terman’s expectations – there were many who pursued more “humble” professions such as police officers, seafarers, and typists. For this reason, Terman concluded that “intellect and achievement are far from perfectly correlated”. Nor did their smarts endow personal happiness. Over the course of their lives, levels of divorce, alcoholism and suicide were about the same as the national average.
  • One possibility is that knowledge of your talents becomes something of a ball and chain. Indeed, during the 1990s, the surviving Termites were asked to look back at the events in their 80-year lifespan. Rather than basking in their successes, many reported that they had been plagued by the sense that they had somehow failed to live up to their youthful expectations.
  • The most notable, and sad, case concerns the maths prodigy Sufiah Yusof. Enrolled at Oxford University aged 12, she dropped out of her course before taking her finals and started waitressing. She later worked as a call girl, entertaining clients with her ability to recite equations during sexual acts.
  • Another common complaint, often heard in student bars and internet forums, is that smarter people somehow have a clearer vision of the world’s failings. Whereas the rest of us are blinkered from existential angst, smarter people lay awake agonising over the human condition or other people’s folly.
  • MacEwan University in Canada found that those with the higher IQ did indeed feel more anxiety throughout the day. Interestingly, most worries were mundane, day-to-day concerns, though; the high-IQ students were far more likely to be replaying an awkward conversation, than asking the “big questions”. “It’s not that their worries were more profound, but they are just worrying more often about more things,” says Penney. “If something negative happened, they thought about it more.”
  • seemed to correlate with verbal intelligence – the kind tested by word games in IQ tests, compared to prowess at spatial puzzles (which, in fact, seemed to reduce the risk of anxiety). He speculates that greater eloquence might also make you more likely to verbalise anxieties and ruminate over them. It’s not necessarily a disadvantage, though. “Maybe they were problem-solving a bit more than most people,” he says – which might help them to learn from their mistakes.
  • The harsh truth, however, is that greater intelligence does not equate to wiser decisions; in fact, in some cases it might make your choices a little more foolish.
  • we need to turn our minds to an age-old concept: “wisdom”. His approach is more scientific than it might at first sound. “The concept of wisdom has an ethereal quality to it,” he admits. “But if you look at the lay definition of wisdom, many people would agree it’s the idea of someone who can make good unbiased judgement.”
  • “my-side bias” – our tendency to be highly selective in the information we collect so that it reinforces our previous attitudes. The more enlightened approach would be to leave your assumptions at the door as you build your argument – but Stanovich found that smarter people are almost no more likely to do so than people with distinctly average IQs.
  • People who ace standard cognitive tests are in fact slightly more likely to have a “bias blind spot”. That is, they are less able to see their own flaws, even though they are quite capable of criticising the foibles of others. And they have a greater tendency to fall for the “gambler’s fallacy”
  • A tendency to rely on gut instincts rather than rational thought might also explain why a surprisingly high number of Mensa members believe in the paranormal; or why someone with an IQ of 140 is about twice as likely to max out their credit card.
  • “The people pushing the anti-vaccination meme on parents and spreading misinformation on websites are generally of more than average intelligence and education.” Clearly, clever people can be dangerously, and foolishly, misguided.
  • spent the last decade building tests for rationality, and he has found that fair, unbiased decision-making is largely independent of IQ.
  • Crucially, Grossmann found that IQ was not related to any of these measures, and certainly didn’t predict greater wisdom. “People who are very sharp may generate, very quickly, arguments [for] why their claims are the correct ones – but may do it in a very biased fashion.”
  • employers may well begin to start testing these abilities in place of IQ; Google has already announced that it plans to screen candidates for qualities like intellectual humility, rather than sheer cognitive prowess.
  • He points out that we often find it easier to leave our biases behind when we consider other people, rather than ourselves. Along these lines, he has found that simply talking through your problems in the third person (“he” or “she”, rather than “I”) helps create the necessary emotional distance, reducing your prejudices and leading to wiser arguments.
  • If you’ve been able to rest on the laurels of your intelligence all your life, it could be very hard to accept that it has been blinding your judgement. As Socrates had it: the wisest person really may be the one who can admit he knows nothing.
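The “gambler’s fallacy” mentioned in the excerpts above is the feeling that tails is “due” after a run of heads. A quick simulation (an illustrative sketch, not from the article) shows why that intuition fails:

```python
import random

random.seed(0)
flips = [random.random() < 0.5 for _ in range(1_000_000)]  # True = heads

# Find every flip that immediately follows a streak of 5 heads.
after_streak = [flips[i + 5] for i in range(len(flips) - 5)
                if all(flips[i:i + 5])]

# The streak tells you nothing: heads still comes up about half the time.
rate = sum(after_streak) / len(after_streak)
print(f"P(heads after 5 heads) is about {rate:.3f}")
```

Each flip is independent, so conditioning on the previous five changes nothing; believing otherwise is exactly the bias the article says high-IQ subjects were slightly more prone to.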