Home/ TOK@ISPrague/ Group items tagged experiments

markfrankel18

No Higgs Boson of Hitler: Ron Rosenbaum Explains 'Explaining Hitler' | VICE United States - 0 views

  • In 1998 the journalist Ron Rosenbaum published Explaining Hitler. Contrary to what the title might suggest, it is not an explanation of Hitler, per se, but rather a 500-page meta-analysis of different theories intended to explain Hitler. Ron Rosenbaum traveled from the ruins of Hitler's Austrian birthplace to meet the historians, psychologists, and Nazi-hunters who have promoted different explanations for Hitler's evil. Whether the bases of the theories are plausible (Hitler's Jewish ancestry) or extremely unlikely (Hitler's penis was bitten off while he attempted to pee into the mouth of a billy goat), they are all presented with a relentless skepticism that makes reading Explaining Hitler a unique and destabilizing experience.
markfrankel18

Why can't the world's greatest minds solve the mystery of consciousness? | Oliver Burke... - 0 views

  • There was only one truly hard problem of consciousness, Chalmers said. It was a puzzle so bewildering that, in the months after his talk, people started dignifying it with capital letters – the Hard Problem of Consciousness – and it’s this: why on earth should all those complicated brain processes feel like anything from the inside? Why aren’t we just brilliant robots, capable of retaining information, of responding to noises and smells and hot saucepans, but dark inside, lacking an inner life? And how does the brain manage it? How could the 1.4kg lump of moist, pinkish-beige tissue inside your skull give rise to something as mysterious as the experience of being that pinkish-beige lump, and the body to which it is attached?
  • in recent years, a handful of neuroscientists have come to believe that it may finally be about to be solved – but only if we are willing to accept the profoundly unsettling conclusion that computers or the internet might soon become conscious, too.
markfrankel18

How your eyes trick your mind - 0 views

  • Visual, or optical, illusions show us that our minds tend to make assumptions about the world – and what you think you see is often not the truth. Throughout history, curious minds have questioned why our eyes are so easily fooled by these simple drawings. Illusions, we have found, can reveal everything from how we process time and space to our experience of consciousness.
markfrankel18

Letting People Simulate Blindness Actually Worsens Attitudes Toward Blindness - 0 views

  • "When people think about what it would be like to be blind, they take from their own brief and relatively superficial experience and imagine it would be really, really terrible and that they wouldn't be able to function well," said Arielle Silverman, a postdoctoral researcher at the University of Washington in Seattle, who is lead author of the paper and also blind. Silverman became interested in studying the effects of blindness simulations in part because of her own interactions with strangers enthusiastically wanting to help her navigate her way across a street, for example. "I noticed and wondered why people who've never met a blind person before seem to intuitively have good attitudes toward blind people and people who tell me they have interacted with a blind person before tend to seem more condescending," she said.
markfrankel18

Why Cambodians Never Get 'Depressed' : Goats and Soda : NPR - 0 views

  • People in Cambodia experience what we Americans call depression. But there's no direct translation for the word "depression" in the Cambodian Khmer language. Instead, people may say thelea tdeuk ceut, which literally means "the water in my heart has fallen." Anxious or depressed Haitians, on the other hand, may use the phrase reflechi twop, which means "thinking too much." And in parts of Nepal and India, people use the English word "tension."
  • But just as words for depression and anxiety get lost in translation, so can treatments.
Lawrence Hrubes

The Power of Touch - The New Yorker - 0 views

  • At a home in the Romanian city of Iași, Carlson measured cortisol levels in a group of children ranging from two months to three years old. The caregiver-to-child ratio was twenty to one, and most of the children had experienced severe neglect and sensory deprivation. Multiple times a day, Carlson took saliva samples, tracking how cortisol levels fluctuated in response to stressful events. The children, she found, were hormonally off kilter. Under normal conditions, cortisol peaks just before we wake up and then tapers off; in the leagăne infants, it peaked in the afternoon and remained elevated. Those levels, in turn, correlated with faltering performance on numerous cognitive and physical assessments. Then Carlson tried an intervention modelled on the work of Joseph Sparling, a child-development specialist, and the outcomes changed. When half of the orphans received more touching from more caregivers—an increase in hugs, holding, and the making of small adjustments to clothes and hair—their performance markedly improved. They grew bigger, stronger, and more responsive, both cognitively and emotionally, and they reacted better to stress.
  • Touch is the first of the senses to develop in the human infant, and it remains perhaps the most emotionally central throughout our lives. While many researchers have appreciated its power, others have been more circumspect. Writing in 1928, John B. Watson, one of the originators of the behaviorist school of psychology, urged parents to maintain a physical boundary between themselves and their children: “Never hug and kiss them, never let them sit on your lap. If you must, kiss them once on the forehead when they say goodnight. Shake hands with them in the morning. Give them a pat on the head if they have made an extraordinarily good job on a difficult task.” Watson acknowledged that children must be bathed, clothed, and cared for, but he believed that excessive touching—that is, caressing—would create “mawkish” adults. An untouched child, he argued, “enters manhood so bulwarked with stable work and emotional habits that no adversity can quite overwhelm him.” Now we know that, to attain that result, he should have suggested the opposite: touch, as frequent and as caring as possible.
  • And yet touch is rarely purely physical. Field’s more recent work has shown that the brain is very good at distinguishing an emotional touch from a similar, but non-emotional, one. A massage chair is not a masseuse. Certain touch receptors exist solely to convey emotion to the brain, rather than sensory information about the external environment. A recent study shows that we can identify other people’s basic emotions based on how they touch us, even when they are separated from us by a curtain. And the emotions that are communicated by touch can go on to shape our behavior. One recent review found that, even if we have no conscious memory of a touch—a hand on the shoulder, say—we may be more likely to agree to a request, respond more (or less) positively to a person or product, or form closer bonds with someone.
markfrankel18

Facebook psychology: How social media changes the way we mourn. - 0 views

  • Facebook allows us to be public about our sorrows, but to be heard they must sound some joyous note. Research suggests that social media is uncommonly good at making us miserable. The irony may be that Facebook itself is also making us more reluctant to voice that misery.
markfrankel18

Why Are Certain Smells So Hard to Identify? - The New Yorker - 0 views

  • A recent paper in the journal Cognition, for instance, quipped that if people were as bad at naming sights as they are at naming scents, “they would be diagnosed as aphasic and sent for medical help.” The paper quoted scattershot attempts by participants in a previous study to label the smell of lemon: “air freshener,” “bathroom freshener,” “magic marker,” “candy,” “lemon-fresh Pledge,” “some kind of fruit.” This sort of difficulty seems to have very little to do, however, with the nose’s actual capabilities. Last spring, an article in the journal Science reported that we are capable of discriminating more than a trillion different odors. (A biologist at Caltech subsequently disputed the finding, arguing that it contained mathematical errors, though he acknowledged the “richness of human olfactory experience.”) Whence, then, our bumbling translation of scent into speech?
  • That question was the subject, two weekends ago, of an American Association for the Advancement of Science symposium at the San Jose Convention Center (which smelled, pleasantly but nonspecifically, of clean carpet). The preëminence of eye over nose was apparent even in the symposium abstract, which touted data that “shed new light” and opened up “yet new vistas.” (Reading it over during a phone interview, Jonathan Reinarz, a professor at the University of Birmingham, in England, and the author of “Past Scents: Historical Perspectives on Smell,” asked me, “What’s wrong with a little bit of inscent?”) Nevertheless, the people on the panel were decidedly pro-smell. “One thing that everyone at this symposium will agree on is that human olfactory discriminatory power is quite excellent, if you give it a chance,” Jay Gottfried, a Northwestern University neuroscientist, told me. Noam Sobel, of the Weizmann Institute of Science, used a stark hypothetical to drive home the ways in which smell can shape behavior: “If I offer you a beautiful mate, of the gender of your choice, who smells of sewage, versus a less attractive mate who smells of sweet spice, with whom would you mate?”
  • But difficulty with talking about smell is not universal. Asifa Majid, a psycholinguist at Radboud University Nijmegen, in the Netherlands, and the organizer of the A.A.A.S. symposium, studies a group of around a thousand hunter-gatherers in northern Malaysia and southern Thailand who speak a language called Jahai. In one analysis, Majid and her colleague Niclas Burenhult found that speakers of Jahai were as good at classifying scratch-and-sniff cards as they were at classifying color chips; their English-speaking counterparts, meanwhile, tended to give meandering and disparate descriptions of scents. At the symposium, Majid presented new research involving around thirty Jahai and thirty Dutch people. In that study, the Jahai named smells in an average of two seconds, whereas the Dutch took thirteen—“and this is just to say, ‘Uh, I don’t know,’ ” Majid joked onstage.
  • Olfaction experts each have their pet theories as to why our scent lexicon is so lacking. Jonathan Reinarz blames the lingering effects of the Enlightenment, which, he says, placed a special emphasis on vision. Jay Gottfried, who is something of a nasal prodigy—he once guessed, on the basis of perfume residue, that one of his grad students had gotten back together with an ex-girlfriend—blames physiology. Whereas visual information is subject to elaborate processing in many areas of the brain, his research suggests, odor information is parsed in a much less intricate way, notably by the limbic system, which is associated with emotion and memory formation. This area, Gottfried said, takes “a more crude and unpolished approach to the process of naming,” and the brain’s language centers can have trouble making use of such unrefined input. Meanwhile, Donald A. Wilson, a neuroscientist at New York University School of Medicine, blames biases acquired in childhood.
feliepdissenha

BBC News - Why good memories are less likely to fade - 0 views

  • It was 80 years ago that the idea of negative memories fading faster was first proposed. Back in the 1930s psychologists collected recollections about life events like people's holidays - marking them as pleasant or unpleasant. Weeks later an unannounced request came from the researchers to recall their memories. Of the unpleasant experiences nearly 60% were forgotten - but only 42% of the pleasant memories had faded.
  • In all, 2,400 autobiographical memories were included, from 562 individuals in 10 countries.
Lawrence Hrubes

BBC - Earth - Why does time always run forwards and never backwards? - 0 views

  • Newton's laws are astonishingly successful at describing the world. They explain why apples fall from trees and why the Earth orbits the Sun. But they have an odd feature: they work just as well backwards as forwards. If an egg can break, then Newton's laws say it can un-break. This is obviously wrong, but nearly every theory that physicists have discovered since Newton has the same problem. The laws of physics simply don't care whether time runs forwards or backwards, any more than they care about whether you're left-handed or right-handed. But we certainly do. In our experience, time has an arrow, always pointing into the future. "You might mix up east and west, but you would not mix up yesterday and tomorrow," says Sean Carroll, a physicist at the California Institute of Technology in Pasadena. "But the fundamental laws of physics don't distinguish between past and future."
Lawrence Hrubes

Teaching Doubt - The New Yorker - 0 views

  • “Non-overlapping magisteria” has a nice ring to it. The problem is that there are many religious claims that not only “overlap” with empirical data but are incompatible with it. As a scientist who also spends a fair amount of time in the public arena, if I am asked if our understanding of the Big Bang conflicts with the idea of a six-thousand-year-old universe, I face a choice: I can betray my scientific values, or encourage that person to doubt his or her own beliefs. More often than you might think, teaching science is inseparable from teaching doubt.
  • Doubt about one’s most cherished beliefs is, of course, central to science: the physicist Richard Feynman stressed that the easiest person to fool is oneself. But doubt is also important to non-scientists. It’s good to be skeptical, especially about ideas you learn from perceived authority figures. Recent studies even suggest that being taught to doubt at a young age could make people better lifelong learners. That, in turn, means that doubters—people who base their views on evidence, rather than faith—are likely to be better citizens.
  • Science class isn’t the only place where students can learn to be skeptical. A provocative novel that presents a completely foreign world view, or a history lesson exploring the vastly different mores of the past, can push you to skeptically reassess your inherited view of the universe. But science is a place where such confrontation is explicit and accessible. It didn’t take more than a simple experiment for Galileo to overturn the wisdom of Aristotle. Informed doubt is the very essence of science.
  • Some teachers shy away from confronting religious beliefs because they worry that planting the seeds of doubt will cause some students to question or abandon their own faith or the faith of their parents. But is that really such a bad thing? It offers some young people the chance to escape the guilt imposed upon them simply for questioning what they’re told. Last year, I received an e-mail from a twenty-seven-year-old man who is now studying in the United States after growing up in Saudi Arabia. His father was executed by family members after converting to Christianity. He says that it’s learning about science that has finally liberated him from the spectre of religious fundamentalism.
Lawrence Hrubes

A Pioneer for Death With Dignity - NYTimes.com - 0 views

  • More than two decades before Brittany Maynard’s public advocacy for death with dignity inspired lawmakers in Washington, D.C., and at least 16 states to introduce legislation authorizing the medical practice of aid in dying for the terminally ill, Senator Frank Roberts of Oregon sponsored one of the nation’s first death-with-dignity bills.
  • Medical aid in dying has always had enormous public support. Recent polls by Gallup and Harris show that 69 to 74 percent of people believe terminally ill adults should have access to medical means to bring about a peaceful death. This belief is strong throughout the nation and across all demographic categories, including age, disability, religion and political party.
  • First, the phenomenon of Brittany Maynard has transformed the movement for end-of-life-choice into an unstoppable force. Ms. Maynard was the 29-year-old woman dying of brain cancer, who moved, with her family, from her home in California to establish residency in Oregon and gain access to aid in dying. As her pain and seizures escalated and as inevitable paralysis, blindness and stupor approached, she drank medication obtained under Oregon’s Death With Dignity Act and died quietly in a circle of her loved ones last fall. Her family vows to fulfill her legacy of legal reform in her native California and beyond. Young and old alike identify with Brittany Maynard. Her experience as a refugee for dignity sparks the “aha!” moment when people understand the grave injustice of government’s withholding from a competent, dying adult the elements of choice and control over suffering.
markfrankel18

The Right to Write - NYTimes.com - 1 views

  • I sat on a panel once with another novelist and a distinguished African-American critic, to discuss Harriet Beecher Stowe’s novel “Uncle Tom’s Cabin.” The critic said, “Of course, as a white woman, Stowe had no right to write the black experience.” The other novelist said lightly, “No, of course not. And I had no right to write about 14th-century Scandinavians. Which I did.”
  • Who owns the story, the person who lives it or the person who writes it?
Lawrence Hrubes

BBC - Future - The last unmapped places on Earth - 0 views

  • Today it is safe to say there are no unknown territories with dragons. However, it’s not quite true to say that every corner of the planet is charted. We may seem to have a map for everywhere, but that doesn’t mean they are complete, accurate or even trustworthy. For starters, all maps are biased toward their creator’s subjective view of the world. As Lewis Carroll famously pointed out, a perfectly objective and faithful 1:1 representation of the world would literally have to be the same size as the place it depicted. Therefore, mapmakers must make sensible design decisions in order to compress the physical world into a much smaller, flatter depiction. Those decisions inevitably introduce personal biases, however, such as our tendency to place ourselves at the centre of the world. “We always want to put ourselves on the map,” says Jerry Brotton, a professor of renaissance studies at Queen Mary University London, and author of A History of the World in 12 Maps. “Maps address an existential question as much as one that’s about orientation and coordinates. “We want to find ourselves on the map, but at the same time, we are also outside of the map, rising above the world and looking down as if we were god,” he continues. “It’s a transcendental experience.”
Lawrence Hrubes

My Great-Great-Aunt Discovered Francium. And It Killed Her. - NYTimes.com - 0 views

  • There is a common narrative in science of the tragic genius who suffers for a great reward, and the tale of Curie, who died from exposure to radiation as a result of her pioneering work, is one of the most famous. There is a sense of grandeur in the idea that paying heavily is a means of advancing knowledge. But in truth, you can’t control what it is that you find — whether you’ve sacrificed your health for it, or simply years of your time.
  • How quickly an element decayed and how it did so — meaning which of its component parts it shed — became the focus of researchers in radioactivity. Apart from purely scientific insights, there was a hope that radiation could lead to something marvelous. X-rays, a kind of radiation discovered by Wilhelm Roentgen and produced by accelerated electrons, had already been hailed as a major medical breakthrough and, in addition to showing doctors their patients’ insides, were being investigated as a treatment for skin lesions from tuberculosis and lupus. In her 1904 book “Investigations on Radioactive Substances,” Marie Curie wrote that radium had promise, too — diseased skin exposed to it later regrew in a healthy state. Radium’s curious ability to destroy tissue was being turned against cancer, with doctors sewing capsules of radium into the surgical wounds of cancer patients (including Henrietta Lacks, whose cells are used today in research). This enthusiasm for radioactivity was not confined to the doctor’s office. The element was in face creams, tonics, even candy. According to the Encyclopaedia Britannica article that Curie and her daughter wrote on radium in 1926, preliminary experiments suggested that radium could even improve the quality of soil.
  • Perhaps the most tragic demonstration of this involved workers at the United States Radium Corporation factory in Orange, N.J., which in 1917 began hiring young women to paint watch faces with glow-in-the-dark radium paint. The workers were told that the paint was harmless and were encouraged to lick the paintbrushes to make them pointy enough to inscribe small numbers. In the years that followed, the women began to suffer ghoulish physical deterioration. Their jaws melted and ballooned into masses of tumors larger than fists, and cancers riddled their bodies. They developed anemia and necrosis. The sensational court case started — and won — by the dying Radium Girls, as they were called, is a landmark in the history of occupational health. It was settled in June 1928, four months before Marguerite Perey arrived at the Radium Institute to begin a 30-year career of heavy exposure to radiation.
  • We know now that alpha and beta particles emitted in radiation attack DNA and that the mutations they cause can lead to cancer. Ingested radioactive elements can concentrate in the bones, where they continue their decay, in effect poisoning someone for as long as that person lives. By the time Perey made her discovery, she was already heavily contaminated. She spent the last 15 years of her life in treatment for a gruesome bone cancer that spread throughout her body, claiming her eyesight, pieces of her hand and most of the years in which she had planned to study francium. As the disease progressed, she warned her students of the horrible consequences of radiation exposure. Francis, my grandfather, says he recalls hearing that when she walked into labs with radiation counters in her later years, they would go off.
  • Over the years, historians have pondered what drove the Curies to throw caution so thoroughly to the wind. Perhaps it was inconceivable to them that the benefits of their research would not outweigh the risks to themselves and their employees. In a field in which groundbreaking discoveries were being made and the competition might arrive there first, speed was put above other concerns, Rona noted. But you almost get the impression that in the Curie lab, dedication to science was demonstrated by a willingness to poison yourself — as if what made a person’s research meaningful were the sacrifices made in the effort to learn something new.
Lawrence Hrubes

BBC World Service - The Why Factor, Sad Music - 0 views

  • Helena Merriman asks why people listen to sad music. A recent study has shown that sad music has become increasingly popular, but why do people choose to listen to it, and what goes on in the brain and the body when they do so? Helena speaks to Japanese pianist and music researcher Dr Ai Kawakami who has some surprising answers about some of the positive feelings people experience when they listen to sad music. American writer Amanda Stern tells Helena why she regularly listens (and cries) to sad music and British composer Debbie Wiseman, known for her moving TV and film scores, explains what makes a piece of music sound sad. You’ll also hear pieces of sad music suggested by BBC listeners from all over the world.
Lawrence Hrubes

Walter Mischel, The Marshmallow Test, and Self-Control - The New Yorker - 1 views

  • Mischel’s story isn’t surprising—nicotine is addictive, and quitting is difficult—except for one thing: Mischel is the creator of the marshmallow test, one of the most famous experiments in the history of psychology, which is often cited as evidence of the importance of self-control. In the original test, which was administered at the Bing Nursery School, at Stanford, in the nineteen-sixties, Mischel’s team would present a child with a treat (marshmallows were just one option) and tell her that she could either eat the one treat immediately or wait alone in the room for several minutes until the researcher returned, at which point she could have two treats. The promised treats were always visible and the child knew that all she had to do to stop the agonizing wait was ring a bell to call the experimenter back—although in that case, she wouldn’t get the second treat. The longer a child delayed gratification, Mischel found—that is, the longer she was able to wait—the better she would fare later in life at numerous measures of what we now call executive function. She would perform better academically, earn more money, and be healthier and happier. She would also be more likely to avoid a number of negative outcomes, including jail time, obesity, and drug use.
  • It was not until one day in the late nineteen-sixties, when he saw a man with metastasized lung cancer in the halls of Stanford’s medical school—chest exposed, head shaved, little green “x” marks all over his body, marking the points where radiation would go—that Mischel realized he was fooling himself. Finally, something clicked. From then on, each time he wanted a cigarette (approximately every three minutes, by his count) he would create a picture in his mind of the man in the hallway. As he described it to me, “I changed the objective value of the cigarette. It went from something I craved to something disgusting.” He hasn’t had a smoke since.
  • "Mischel, who is now eighty-four years old, has just published his first popular book, 'The Marshmallow Test: Mastering Self-Control.' It is part memoir, part scientific analysis, and part self-help guide. In the book, he describes the original impetus for the marshmallow study. At the time, his daughters, Judith, Rebecca, and Linda, were three, four, and five years old, respectively. 'I began to see this fascinating phenomenon where they morphed from being highly impulsive, immediate creatures who couldn't delay anything,' he told me. 'There were these amazingly rapid changes-everything around them was the same, but something inside them had changed. I realized I didn't have a clue what was going on in their heads.' He wondered what was it that had enabled them to go from deciding that they wanted to wait to actually being able to do so. He found the answer among their classmates at the Bing preschool."
Lawrence Hrubes

Executing Them Softly - NYTimes.com - 0 views

  • Since the late 19th century in the United States, critical responses to the spectacle of pain in executions have continued to spur ardent calls for the improvement of killing technology. One of the most prolific legal theorists of capital punishment, Austin Sarat, has concisely referred to this history: “The movement from hanging to electrocution, from electrocution to the gas chamber, from gas to lethal injection, reads like someone’s version of the triumph of progress, with each new technique enthusiastically embraced as the latest and best way to kill without imposing pain.” Recent debates over the administration of midazolam and pentobarbital, and in what dosage, seamlessly integrate themselves into Sarat’s grim progress narrative. The inexhaustible impulse to seek out less painful killing technologies puts a series of questions in sharp relief: What is, and should be, the role of pain in retributive justice? And how has the law come to rationalize the condemned’s experience of pain during an execution? While the Eighth Amendment stipulates the necessity of avoiding “cruel and unusual punishment,” in 1890 the Supreme Court decided this clause could mean that no method of execution should impose “something more than the mere extinguishment of life.” And then, in 1958, the court also determined that the amendment should reflect the “evolving standards of decency that mark the progress of a maturing society.” If we were to consider the “standard of decency” in our society today, we would be pushed to ask: By what moral order have we continued to establish the “extinguishment of life” as something “mere,” and the pain of the condemned as excessive? In other words, how has the pain experienced during an execution become considered cruel and unconstitutional but not the very act of killing itself? We should dial back to older histories of law to tap into pain’s perennially vexed role in retributive theories of justice.
markfrankel18

The Moral Instinct - New York Times - 3 views

  • It seems we may all be vulnerable to moral illusions the ethical equivalent of the bending lines that trick the eye on cereal boxes and in psychology textbooks. Illusions are a favorite tool of perception scientists for exposing the workings of the five senses, and of philosophers for shaking people out of the naïve belief that our minds give us a transparent window onto the world (since if our eyes can be fooled by an illusion, why should we trust them at other times?). Today, a new field is using illusions to unmask a sixth sense, the moral sense.
  • The first hallmark of moralization is that the rules it invokes are felt to be universal. Prohibitions of rape and murder, for example, are felt not to be matters of local custom but to be universally and objectively warranted. One can easily say, “I don’t like brussels sprouts, but I don’t care if you eat them,” but no one would say, “I don’t like killing, but I don’t care if you murder someone.”The other hallmark is that people feel that those who commit immoral acts deserve to be punished.
  • Until recently, it was understood that some people didn’t enjoy smoking or avoided it because it was hazardous to their health. But with the discovery of the harmful effects of secondhand smoke, smoking is now treated as immoral. Smokers are ostracized; images of people smoking are censored; and entities touched by smoke are felt to be contaminated (so hotels have not only nonsmoking rooms but nonsmoking floors). The desire for retribution has been visited on tobacco companies, who have been slapped with staggering “punitive damages.” At the same time, many behaviors have been amoralized, switched from moral failings to lifestyle choices.
  • But whether an activity flips our mental switches to the “moral” setting isn’t just a matter of how much harm it does. We don’t show contempt to the man who fails to change the batteries in his smoke alarms or takes his family on a driving vacation, both of which multiply the risk they will die in an accident. Driving a gas-guzzling Hummer is reprehensible, but driving a gas-guzzling old Volvo is not; eating a Big Mac is unconscionable, but not imported cheese or crème brûlée. The reason for these double standards is obvious: people tend to align their moralization with their own lifestyles.
  • People don’t generally engage in moral reasoning, Haidt argues, but moral rationalization: they begin with the conclusion, coughed up by an unconscious emotion, and then work backward to a plausible justification.
  • Together, the findings corroborate Greene’s theory that our nonutilitarian intuitions come from the victory of an emotional impulse over a cost-benefit analysis.
  • The psychologist Philip Tetlock has shown that the mentality of taboo — a conviction that some thoughts are sinful to think — is not just a superstition of Polynesians but a mind-set that can easily be triggered in college-educated Americans. Just ask them to think about applying the sphere of reciprocity to relationships customarily governed by community or authority. When Tetlock asked subjects for their opinions on whether adoption agencies should place children with the couples willing to pay the most, whether people should have the right to sell their organs and whether they should be able to buy their way out of jury duty, the subjects not only disagreed but felt personally insulted and were outraged that anyone would raise the question.
  • The moral sense, then, may be rooted in the design of the normal human brain. Yet for all the awe that may fill our minds when we reflect on an innate moral law within, the idea is at best incomplete. Consider this moral dilemma: A runaway trolley is about to kill a schoolteacher. You can divert the trolley onto a sidetrack, but the trolley would trip a switch sending a signal to a class of 6-year-olds, giving them permission to name a teddy bear Muhammad. Is it permissible to pull the lever? This is no joke. Last month a British woman teaching in a private school in Sudan allowed her class to name a teddy bear after the most popular boy in the class, who bore the name of the founder of Islam. She was jailed for blasphemy and threatened with a public flogging, while a mob outside the prison demanded her death. To the protesters, the woman’s life clearly had less value than maximizing the dignity of their religion, and their judgment on whether it is right to divert the hypothetical trolley would have differed from ours. Whatever grammar guides people’s moral judgments can’t be all that universal. Anyone who stayed awake through Anthropology 101 can offer many other examples.
  • The impulse to avoid harm, which gives trolley ponderers the willies when they consider throwing a man off a bridge, can also be found in rhesus monkeys, who go hungry rather than pull a chain that delivers food to them and a shock to another monkey. Respect for authority is clearly related to the pecking orders of dominance and appeasement that are widespread in the animal kingdom. The purity-defilement contrast taps the emotion of disgust that is triggered by potential disease vectors like bodily effluvia, decaying flesh and unconventional forms of meat, and by risky sexual practices like incest.
  • All this brings us to a theory of how the moral sense can be universal and variable at the same time. The five moral spheres are universal, a legacy of evolution. But how they are ranked in importance, and which is brought in to moralize which area of social life — sex, government, commerce, religion, diet and so on — depends on the culture.
  • By analogy, we are born with a universal moral grammar that forces us to analyze human action in terms of its moral structure, with just as little awareness. The idea that the moral sense is an innate part of human nature is not far-fetched. A list of human universals collected by the anthropologist Donald E. Brown includes many moral concepts and emotions, including a distinction between right and wrong; empathy; fairness; admiration of generosity; rights and obligations; proscription of murder, rape and other forms of violence; redress of wrongs; sanctions for wrongs against the community; shame; and taboos.
  • Here is the worry. The scientific outlook has taught us that some parts of our subjective experience are products of our biological makeup and have no objective counterpart in the world. The qualitative difference between red and green, the tastiness of fruit and foulness of carrion, the scariness of heights and prettiness of flowers are design features of our common nervous system, and if our species had evolved in a different ecosystem or if we were missing a few genes, our reactions could go the other way. Now, if the distinction between right and wrong is also a product of brain wiring, why should we believe it is any more real than the distinction between red and green? And if it is just a collective hallucination, how could we argue that evils like genocide and slavery are wrong for everyone, rather than just distasteful to us?
  • Putting God in charge of morality is one way to solve the problem, of course, but Plato made short work of it 2,400 years ago. Does God have a good reason for designating certain acts as moral and others as immoral? If not — if his dictates are divine whims — why should we take them seriously? Suppose that God commanded us to torture a child. Would that make it all right, or would some other standard give us reasons to resist? And if, on the other hand, God was forced by moral reasons to issue some dictates and not others — if a command to torture a child was never an option — then why not appeal to those reasons directly?
markfrankel18

We Didn't Eat the Marshmallow. The Marshmallow Ate Us. - NYTimes.com - 4 views

  • The marshmallow study captured the public imagination because it is a funny story, easily told, that appears to reduce the complex social and psychological question of why some people succeed in life to a simple, if ancient, formulation: Character is destiny. Except that in this case, the formulation isn’t coming from the Greek philosopher Heraclitus or from a minister preaching that “patience is a virtue” but from science, that most modern of popular religions.
  • But how our brains work is just one of many factors that drive the choices we make. Just last year, a study by researchers at the University of Rochester called the conclusions of the Stanford experiments into question, showing that some children were more likely to eat the first marshmallow when they had reason to doubt the researcher’s promise to come back with a second one. In the study, published in January 2013 in Cognition under the delectable title “Rational Snacking,” Celeste Kidd, Holly Palmeri and Richard N. Aslin wrote that for a child raised in an unstable environment, “the only guaranteed treats are the ones you have already swallowed,” while a child raised in a more stable environment, in which promises are routinely delivered upon, might be willing to wait a few more minutes, confident that he will get that second treat.
  • Willpower can do only so much for children facing domestic instability, poor physical health or intellectual deficits.