TOK@ISPrague: Group items matching "for" in title, tags, annotations or url

markfrankel18

The Moral Instinct - New York Times - 3 views

  • It seems we may all be vulnerable to moral illusions, the ethical equivalent of the bending lines that trick the eye on cereal boxes and in psychology textbooks. Illusions are a favorite tool of perception scientists for exposing the workings of the five senses, and of philosophers for shaking people out of the naïve belief that our minds give us a transparent window onto the world (since if our eyes can be fooled by an illusion, why should we trust them at other times?). Today, a new field is using illusions to unmask a sixth sense, the moral sense.
  • The first hallmark of moralization is that the rules it invokes are felt to be universal. Prohibitions of rape and murder, for example, are felt not to be matters of local custom but to be universally and objectively warranted. One can easily say, “I don’t like brussels sprouts, but I don’t care if you eat them,” but no one would say, “I don’t like killing, but I don’t care if you murder someone.” The other hallmark is that people feel that those who commit immoral acts deserve to be punished.
  • Until recently, it was understood that some people didn’t enjoy smoking or avoided it because it was hazardous to their health. But with the discovery of the harmful effects of secondhand smoke, smoking is now treated as immoral. Smokers are ostracized; images of people smoking are censored; and entities touched by smoke are felt to be contaminated (so hotels have not only nonsmoking rooms but nonsmoking floors). The desire for retribution has been visited on tobacco companies, who have been slapped with staggering “punitive damages.” At the same time, many behaviors have been amoralized, switched from moral failings to lifestyle choices.
  • ...10 more annotations...
  • But whether an activity flips our mental switches to the “moral” setting isn’t just a matter of how much harm it does. We don’t show contempt to the man who fails to change the batteries in his smoke alarms or takes his family on a driving vacation, both of which multiply the risk they will die in an accident. Driving a gas-guzzling Hummer is reprehensible, but driving a gas-guzzling old Volvo is not; eating a Big Mac is unconscionable, but not imported cheese or crème brûlée. The reason for these double standards is obvious: people tend to align their moralization with their own lifestyles.
  • People don’t generally engage in moral reasoning, Haidt argues, but moral rationalization: they begin with the conclusion, coughed up by an unconscious emotion, and then work backward to a plausible justification.
  • Together, the findings corroborate Greene’s theory that our nonutilitarian intuitions come from the victory of an emotional impulse over a cost-benefit analysis.
  • The psychologist Philip Tetlock has shown that the mentality of taboo — a conviction that some thoughts are sinful to think — is not just a superstition of Polynesians but a mind-set that can easily be triggered in college-educated Americans. Just ask them to think about applying the sphere of reciprocity to relationships customarily governed by community or authority. When Tetlock asked subjects for their opinions on whether adoption agencies should place children with the couples willing to pay the most, whether people should have the right to sell their organs and whether they should be able to buy their way out of jury duty, the subjects not only disagreed but felt personally insulted and were outraged that anyone would raise the question.
  • The moral sense, then, may be rooted in the design of the normal human brain. Yet for all the awe that may fill our minds when we reflect on an innate moral law within, the idea is at best incomplete. Consider this moral dilemma: A runaway trolley is about to kill a schoolteacher. You can divert the trolley onto a sidetrack, but the trolley would trip a switch sending a signal to a class of 6-year-olds, giving them permission to name a teddy bear Muhammad. Is it permissible to pull the lever? This is no joke. Last month a British woman teaching in a private school in Sudan allowed her class to name a teddy bear after the most popular boy in the class, who bore the name of the founder of Islam. She was jailed for blasphemy and threatened with a public flogging, while a mob outside the prison demanded her death. To the protesters, the woman’s life clearly had less value than maximizing the dignity of their religion, and their judgment on whether it is right to divert the hypothetical trolley would have differed from ours. Whatever grammar guides people’s moral judgments can’t be all that universal. Anyone who stayed awake through Anthropology 101 can offer many other examples.
  • The impulse to avoid harm, which gives trolley ponderers the willies when they consider throwing a man off a bridge, can also be found in rhesus monkeys, who go hungry rather than pull a chain that delivers food to them and a shock to another monkey. Respect for authority is clearly related to the pecking orders of dominance and appeasement that are widespread in the animal kingdom. The purity-defilement contrast taps the emotion of disgust that is triggered by potential disease vectors like bodily effluvia, decaying flesh and unconventional forms of meat, and by risky sexual practices like incest.
  • All this brings us to a theory of how the moral sense can be universal and variable at the same time. The five moral spheres are universal, a legacy of evolution. But how they are ranked in importance, and which is brought in to moralize which area of social life — sex, government, commerce, religion, diet and so on — depends on the culture.
  • By analogy, we are born with a universal moral grammar that forces us to analyze human action in terms of its moral structure, with just as little awareness. The idea that the moral sense is an innate part of human nature is not far-fetched. A list of human universals collected by the anthropologist Donald E. Brown includes many moral concepts and emotions, including a distinction between right and wrong; empathy; fairness; admiration of generosity; rights and obligations; proscription of murder, rape and other forms of violence; redress of wrongs; sanctions for wrongs against the community; shame; and taboos.
  • Here is the worry. The scientific outlook has taught us that some parts of our subjective experience are products of our biological makeup and have no objective counterpart in the world. The qualitative difference between red and green, the tastiness of fruit and foulness of carrion, the scariness of heights and prettiness of flowers are design features of our common nervous system, and if our species had evolved in a different ecosystem or if we were missing a few genes, our reactions could go the other way. Now, if the distinction between right and wrong is also a product of brain wiring, why should we believe it is any more real than the distinction between red and green? And if it is just a collective hallucination, how could we argue that evils like genocide and slavery are wrong for everyone, rather than just distasteful to us?
  • Putting God in charge of morality is one way to solve the problem, of course, but Plato made short work of it 2,400 years ago. Does God have a good reason for designating certain acts as moral and others as immoral? If not — if his dictates are divine whims — why should we take them seriously? Suppose that God commanded us to torture a child. Would that make it all right, or would some other standard give us reasons to resist? And if, on the other hand, God was forced by moral reasons to issue some dictates and not others — if a command to torture a child was never an option — then why not appeal to those reasons directly?
Lawrence Hrubes

St James Ethics Centre - What we're about - 0 views

  •  
    "St James Ethics Centre is a unique centre for applied ethics, the only one of its kind globally. Despite the fact that we have 'saint' and 'ethics' in our name, St James Ethics Centre is not a religious organisation and neither is it a sort of moral policeman. Working both in Australia and abroad for over twenty years, we're an independent not-for-profit organisation that provides an open forum for the promotion and exploration of ethical questions. We provide practical support to individuals and across organisations to help them to deal with the complex ethical questions that are part of everyday life."
markfrankel18

Why People Mistake Good Deals for Rip-Offs : The New Yorker - 5 views

  • Last Saturday, an elderly man set up a stall near Central Park and sold eight spray-painted canvases for less than one five-hundredth of their true value. The art works were worth more than two hundred and twenty-five thousand dollars, but the man walked away with just four hundred and twenty dollars. Each canvas was an original by the enigmatic British artist Banksy, who was approaching the midpoint of a monthlong residency in New York City. Banksy had asked the man to sell the works on his behalf. For several hours, hundreds of oblivious locals and tourists ignored the quiet salesman, along with the treasure he was hiding in plain sight. The day ended with thirty paintings left unsold. One Banksy aficionado, certain she could distinguish a fake from the real thing, quietly scolded the man for knocking off the artist’s work.
  • What makes Banksy’s subversive stunt so compelling is that it forces us to acknowledge how incoherently humans derive value. How can a person be willing to pay five hundred times more than another for the same art work born in the same artist’s studio?
  • Some concepts are easy to evaluate without a reference standard. You don’t need a yardstick, for example, when deciding whether you’re well-rested or exhausted, or hot or cold, because those states are “inherently evaluable”—they’re easy to measure in absolute terms because we have sensitive biological mechanisms that respond when our bodies demand rest, or when the temperature rises far above or falls far below seventy-two degrees. Everyone agrees that three days is too long a period without sleep, but art works satisfy far too abstract a need to attract a universal valuation. When you learn that your favorite abstract art work was actually painted by a child, its value declines precipitously (unless the child happens to be your prodigious four-year-old).
  • ...1 more annotation...
  • We’re swayed by all the wrong cues, and our valuation estimates are correspondingly incoherent. Banksy knew this when he asked an elderly man to sell his works in Central Park. It’s comforting to believe that we get what we pay for, but discerning true value is as difficult as spotting a genuine Banksy canvas in a city brimming with imitations.
markfrankel18

John Searle: The Philosopher in the World by Tim Crane | NYRblog | The New York Review of Books - 0 views

  • No, I’m not skeptical about the idea of universal human rights. I’m skeptical about what I call positive rights.
  • So I say that you can make a good case for universal human rights of a negative kind, but that you cannot make the comparable case for universal human rights of a positive kind.
  • As a professor in Berkeley I have certain rights, and certain obligations. But the idea of universal rights—that you have certain rights just in virtue of being a human being—is a fantastic idea. And I think, Why not extend the idea of universal rights to conscious animals? Just in virtue of being a conscious animal, you have certain rights. The fact that animals cannot undertake obligations does not imply that they cannot have rights against us who do have obligations. Babies have rights even before they are able to undertake obligations. Now I have to make a confession. I try not to think about animal rights because I fear I’d have to become a vegetarian if I worked it out consistently. But I think there is a very good case to be made for saying that if you grant the validity of universal human rights, then it looks like it would be some kind of special pleading if you said there’s no such thing as universal animal rights. I think there are animal rights.
  • ...1 more annotation...
  • For every right there’s an obligation. We’re under an obligation to treat animals as we arrogantly say, “humanely.” And I think that’s right. I think we are under an obligation to treat animals humanely. The sort of obligation is the sort that typically goes with rights. Animals have a right against us to be treated humanely. Now whether or not this gives us a right to slaughter animals for the sake of eating them, well, I’ve been eating them for so long that I’ve come to take it for granted. But I’m not sure that I could justify it if I was forced to.
Lawrence Hrubes

BBC - Culture - Eleven untranslatable words - 1 views

  • There is one clear difference, though: Iyer has not invented them. The definitions she illustrates – 60 so far, from 30 different languages – match The Meaning of Liff for absurdity, but all of them are real. There is komorebi, Japanese for ‘the sort of scattered dappled light effect that happens when sunlight shines in through trees’; or rire dans sa barbe, a French expression meaning ‘to laugh in your beard quietly while thinking about something that happened in the past’.
  • Iyer’s Found in Translation project will be published as a book later in 2014. She has been a polyglot since childhood. “My parents come from different parts of India, so I grew up learning five languages,” she says. “I’d always loved the word Fernweh, which is German for ‘longing for a place you’ve never been to’, and then one day I started collecting more.” Some are humorous, while others have definitions that read like poetry. “I love the German word Waldeinsamkeit, ‘the feeling of being alone in the woods’. It captures a sense of solitude and at the same time that feeling of oneness with nature.” Her favourite is the Inuit word Iktsuarpok, which means ‘the frustration of waiting for someone to turn up’, because “it holds so much meaning. It’s waiting, whether you are waiting for the bus to show up or for the love of your life. It perfectly describes that inner anguish associated with waiting.”
markfrankel18

Let's do some math on Ebola before we start quarantining people - Quartz - 0 views

  • The magnitude of false positives for medical tests—a positive test for a condition that a patient does not actually have—is something that is not well understood, even by members of the medical community. We should remember this, as the United States prepares to lock away potentially scores of individuals who test positive for Ebola. Many observers do not realize just how many people may spend some time in quarantine when they do not have the dreaded disease.
  • The cognitive psychologist Gerd Gigerenzer once asked 24 physicians the following hypothetical question, and only two got it right: A test for breast cancer is 90% accurate in identifying patients who actually have breast cancer, and 93% accurate in producing negative results for patients without breast cancer. The incidence of breast cancer in the population is 0.8%. What is the probability that a person who tests positive for breast cancer actually has the disease? Think you know the answer? Many of the physicians in Gigerenzer’s study said there was a 90% probability that a now-terrified patient flagged for breast cancer is an actual victim of the disease. However, the correct probability is less than 10%.
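The counterintuitive answer falls straight out of Bayes' theorem applied to the figures quoted above (90% sensitivity, 93% specificity, 0.8% base rate). A minimal sketch of the arithmetic:

```python
# Bayes' theorem applied to the screening figures quoted above.
# The false positives among the large healthy majority swamp the
# true positives among the tiny sick minority.

def posterior(sensitivity: float, specificity: float, base_rate: float) -> float:
    """Probability of actually having the disease given a positive test."""
    true_positives = sensitivity * base_rate            # sick and flagged
    false_positives = (1 - specificity) * (1 - base_rate)  # healthy but flagged
    return true_positives / (true_positives + false_positives)

p = posterior(sensitivity=0.90, specificity=0.93, base_rate=0.008)
print(f"P(cancer | positive test) = {p:.1%}")  # about 9.4%, i.e. under 10%
```

Out of 1,000 women, roughly 7 of the 8 with cancer test positive, but so do about 69 of the 992 without it; 7 out of 76 positives is under 10%, just as the article states.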
Lawrence Hrubes

An Artist with Amnesia - The New Yorker - 2 views

  • Lately, Johnson draws for pleasure, but for three decades she had a happily hectic career as an illustrator, sometimes presenting clients with dozens of sketches a day. Her playful watercolors once adorned packages of Lotus software; for a program called Magellan, she created a ship whose masts were tethered to billowing diskettes. She made a popular postcard of two red parachutes tied together, forming a heart; several other cards were sold for years at MOMA’s gift shop. Johnson produced half a dozen covers for this magazine, including one, from 1985, that presented a sunny vision of an artist’s life: a loft cluttered with pastel canvases, each of them depicting a fragment of the skyline that is framed by a picture window. It’s as if the paintings were jigsaw pieces, and the city a puzzle being solved. Now Johnson is obsessed with making puzzles. Many times a day, she uses her grids as foundations for elaborate arrangements of letters on a page—word searches by way of Mondrian. For all the dedication that goes into her puzzles, however, they are confounding creations: very few are complete. She is assembling one of the world’s largest bodies of unfinished art.
  • Nicholas Turk-Browne, a cognitive neuroscientist at Princeton, entered the lab and greeted Johnson in the insistently zippy manner of a kindergarten teacher: “Lonni Sue! We’re going to put you in a kind of space machine and take pictures of your brain!” A Canadian with droopy dark-brown hair, he typically speaks with mellow precision. Though they had met some thirty times before, Johnson continued to regard him as an amiable stranger. Turk-Browne is one of a dozen scientists, at Princeton and at Johns Hopkins, who have been studying her, with Aline and Maggi’s consent. Aline told me, “When we realized the magnitude of Lonni Sue’s illness, my mother and I promised each other to turn what could be a tragedy into something which could help others.” Cognitive science has often gained crucial insights by studying people with singular brains, and Johnson is the first person with profound amnesia to be examined extensively with an fMRI. Several papers have been published about Johnson, and the researchers say that she could fuel at least a dozen more.
Lawrence Hrubes

Esa-Pekka Salonen's Ad for Apple : The New Yorker - 1 views

  • For anyone who has endured clichéd, condescending, uncomprehending, or otherwise aggravating depictions of classical music in American TV ads—the snobs at the symphony, the sopranos screaming under Valkyrie helmets, the badly edited bowdlerizations of the “Ode to Joy”—a new ad for the Apple iPad featuring the conductor and composer Esa-Pekka Salonen may come as a pleasant shock. It is, first of all, a cool, elegant piece of work—not surprising, given Apple’s distinguished history of television propaganda. Salonen is shown receiving inspiration for a passage in his Violin Concerto and trying it out on his iPad; then, after a montage of scenes in London and Finland, we see the violinist Leila Josefowicz and the Philharmonia Orchestra, of London, digging in to the score. More notably, it is musical: the concerto dictates the rhythm of the editing, and the correlation between notation and sound is made excitingly clear.
  •  
    For discussion: Despite this ad's aesthetics, narrative, etc., why (arguably) is it not a piece of art? 
markfrankel18

Why We Need Answers: The Theory of Cognitive Closure : The New Yorker - 0 views

  • The human mind is incredibly averse to uncertainty and ambiguity; from an early age, we respond to uncertainty or lack of clarity by spontaneously generating plausible explanations. What’s more, we hold on to these invented explanations as having intrinsic value of their own. Once we have them, we don’t like to let them go.
  • Heightened need for cognitive closure can bias our choices, change our preferences, and influence our mood. In our rush for definition, we tend to produce fewer hypotheses and search less thoroughly for information. We become more likely to form judgments based on early cues (something known as impressional primacy), and as a result become more prone to anchoring and correspondence biases (using first impressions as anchors for our decisions and not accounting enough for situational variables). And, perversely, we may not even realize how much we are biasing our own judgments.
  • In 2010, Kruglanski and colleagues looked specifically at the need for cognitive closure as part of the response to terrorism.
  • ...1 more annotation...
  • It’s a self-reinforcing loop: we search energetically, but once we’ve seized onto an idea we remain crystallized at that point. And if we’ve externally committed ourselves to our position by tweeting or posting or speaking? We crystallize our judgment all the more, so as not to appear inconsistent. It’s why false rumors start—and why they die such hard deaths. It’s a dynamic that can have consequences far nastier than a minor media snafu.
Lawrence Hrubes

The Responsibility of Knowledge: Developing Holocaust Education for the Third Generation by Kelly Bunch, Matthew Canfield, Birte Schöler | Humanity in Action - 2 views

  • In a radio address in 1966 the prominent German philosopher, Theodor Adorno, declared his dissatisfaction with the state of Holocaust consciousness. He claimed that ignorance of the barbarity of the Holocaust is “itself a symptom of the continuing potential for its recurrence as far as peoples’ conscious and unconscious is concerned.” (Adorno, Education After Auschwitz). It is for this reason that he envisioned education as the institution which would be most responsible for instilling values in the masses so that they have the agency to oppose barbarism.  Adorno spoke not only of education in childhood, but “then the general enlightenment that provides an intellectual, cultural, and social climate in which a recurrence would no longer be possible.” Almost 40 years later, Holocaust education is still important, not only to combat another genocide but also to provide a consciousness of human rights necessary in a world where such standards are becoming commonplace. Holocaust education is in a state of constant evolution. As generations grow up and new ones are born, as distance from the Holocaust increases, it is necessary to reform the methods in which its history is taught. As survivors die and the third generation slowly drifts out of the Holocaust’s shadow, education must be buttressed with an understanding of the applicable lessons and principles that may derive from the Holocaust. For this education to have any meaning, those mechanisms that allowed the Holocaust to take place must be fully understood. History must empower pupils with the understanding of various choices they must make and their ultimate impact on society.
Lawrence Hrubes

'Son of Saul,' Kierkegaard and the Holocaust - The New York Times - 1 views

  • The spectacular success of science in the past 300 years has raised hopes that it also holds the key to guiding human beings towards a good life. Psychology and neuroscience have become a main source of life advice in the popular media. But philosophers have long held reservations about this scientific orientation to how to live life.
  • The 18th-century Scottish philosopher David Hume, for instance, famously pointed out that no amount of fact can legislate value, moral or otherwise. You cannot derive ought from is.
  • Science is the best method we have for approaching the world objectively. But in fact it is not science per se that is the problem, from the point of view of subjectivity. It is objectivizing, in any of its forms. One can frame a decision, for example, in objective terms. One might decide between career choices by weighing differences in workloads, prestige, pay and benefits between, say, working for an advanced technology company versus working for a studio in Hollywood. We are often encouraged to make choices by framing them in this way. Alternatively, one might try to frame the decision more in terms of what it might be like to work in either occupation; in this case, one needs to have the patience to dwell in experience long enough for one’s feelings about either alternative to emerge. In other words, one might deliberate subjectively.
  • ...1 more annotation...
  • Most commonly, we turn our back on subjectivity to escape from pain. Suffering, one’s own, or others’, might become bearable, one hopes, when one takes a step back and views it objectively, conceptually, abstractly. And when it comes to something as monumental as the Holocaust, one’s mind cannot help but be numbed by the sheer magnitude of it. How could one feel the pain of all those people, sympathize with millions? Instead one is left with the “facts,” the numbers.
markfrankel18

And the Word of the Year Is... Selfie! : The New Yorker - 0 views

  • Hold on to your monocles, friends—the Oxford Dictionaries Word of the Year for 2013 is “selfie.” It’s an informal noun (plural: selfies) defined as “a photograph that one has taken of oneself, typically one taken with a smartphone or webcam and uploaded to a social media website.” It was first used in 2002, in an Australian online forum (compare the Australian diminutives “barbie” for barbecue and “firie” for firefighter), and it first appeared as a hashtag, #selfie, on Flickr, in 2004.
  • The word “selfie” is not yet in the O.E.D., but it is currently being considered for future inclusion; whether the word makes it into the history books is truly for the teens to decide. As Ben Zimmer wrote at Language Log, “Youth slang is the obvious source for much of our lexical innovation, like it or not.” And despite its cloying tone, that Oxford Dictionaries blog post from August does allude to the increasingly important distinction between “acronym” and “initialism”—either of which may describe the expression “LOL,” depending on whether you pronounce it “lawl” or “ell-oh-ell.” The kids are going to be all right. Not “alright.” But all right.
Lawrence Hrubes

The Case for Banning Laptops in the Classroom : The New Yorker - 0 views

  • I banned laptops in the classroom after it became common practice to carry them to school. When I created my “electronic etiquette policy” (as I call it in my syllabus), I was acting on a gut feeling based on personal experience. I’d always figured that, for the kinds of computer-science and math classes that I generally teach, which can have a significant theoretical component, any advantage that might be gained by having a machine at the ready, or available for the primary goal of taking notes, was negligible at best. We still haven’t made it easy to type notation-laden sentences, so the potential benefits were low. Meanwhile, the temptation for distraction was high. I know that I have a hard time staying on task when the option to check out at any momentary lull is available; I assumed that this must be true for my students, as well. Over time, a wealth of studies on students’ use of computers in the classroom has accumulated to support this intuition. Among the most famous is a landmark Cornell University study from 2003 called “The Laptop and the Lecture,” wherein half of a class was allowed unfettered access to their computers during a lecture while the other half was asked to keep their laptops closed. The experiment showed that, regardless of the kind or duration of the computer use, the disconnected students performed better on a post-lecture quiz. The message of the study aligns pretty well with the evidence that multitasking degrades task performance across the board.
Lawrence Hrubes

A Pioneer for Death With Dignity - NYTimes.com - 0 views

  • More than two decades before Brittany Maynard’s public advocacy for death with dignity inspired lawmakers in Washington, D.C., and at least 16 states to introduce legislation authorizing the medical practice of aid in dying for the terminally ill, Senator Frank Roberts of Oregon sponsored one of the nation’s first death-with-dignity bills.
  • Medical aid in dying has always had enormous public support. Recent polls by Gallup and Harris show that 69 to 74 percent of people believe terminally ill adults should have access to medical means to bring about a peaceful death. This belief is strong throughout the nation and across all demographic categories, including age, disability, religion and political party.
  • First, the phenomenon of Brittany Maynard has transformed the movement for end-of-life-choice into an unstoppable force. Ms. Maynard was the 29-year-old woman dying of brain cancer, who moved, with her family, from her home in California to establish residency in Oregon and gain access to aid in dying. As her pain and seizures escalated and as inevitable paralysis, blindness and stupor approached, she drank medication obtained under Oregon’s Death With Dignity Act and died quietly in a circle of her loved ones last fall. Her family vows to fulfill her legacy of legal reform in her native California and beyond. Young and old alike identify with Brittany Maynard. Her experience as a refugee for dignity sparks the “aha!” moment when people understand the grave injustice of government’s withholding from a competent, dying adult the elements of choice and control over suffering.
markfrankel18

What's a Metaphor For? - The Chronicle Review - The Chronicle of Higher Education - 1 views

  • "Metaphorical thinking—our instinct not just for describing but for comprehending one thing in terms of another—shapes our view of the world, and is essential to how we communicate, learn, discover and invent. ... Our understanding of metaphor is in the midst of a metamorphosis. For centuries, metaphor has been seen as a kind of cognitive frill, a pleasant but essentially useless embellishment to 'normal' thought. Now, the frill is gone. New research in the social and cognitive sciences makes it increasingly plain that metaphorical thinking influences our attitudes, beliefs, and actions in surprising, hidden, and often oddball ways." Geary further unpacks metaphor's influence in his foreword: "Metaphor conditions our interpretations of the stock market and, through advertising, it surreptitiously infiltrates our purchasing decisions. In the mouths of politicians, metaphor subtly nudges public opinion; in the minds of businesspeople, it spurs creativity and innovation. In science, metaphor is the preferred nomenclature for new theories and new discoveries; in psychology, it is the natural language of human relationships and emotions."
  • The upshot of the boom in metaphor studies, Geary makes clear, is the overturning of that presumption toward literalism: Nowadays, it's believers in a literalism that goes all the way down (so to speak) who are on the defensive in intellectual life, and explorers of metaphor who are on the ascendant. As a result, Geary hardly feels a need to address literalism, devoting most of his book to how metaphor connects to etymology, money, mind, politics, pleasure, science, children, the brain, the body, and such literary forms as the proverb and aphorism.
Lawrence Hrubes

My Great-Great-Aunt Discovered Francium. And It Killed Her. - NYTimes.com - 0 views

  • There is a common narrative in science of the tragic genius who suffers for a great reward, and the tale of Curie, who died from exposure to radiation as a result of her pioneering work, is one of the most famous. There is a sense of grandeur in the idea that paying heavily is a means of advancing knowledge. But in truth, you can’t control what it is that you find — whether you’ve sacrificed your health for it, or simply years of your time.
  • How quickly an element decayed and how it did so — meaning which of its component parts it shed — became the focus of researchers in radioactivity. Apart from purely scientific insights, there was a hope that radiation could lead to something marvelous. X-rays, a kind of radiation discovered by Wilhelm Roentgen and produced by accelerated electrons, had already been hailed as a major medical breakthrough and, in addition to showing doctors their patients’ insides, were being investigated as a treatment for skin lesions from tuberculosis and lupus. In her 1904 book “Investigations on Radioactive Substances,” Marie Curie wrote that radium had promise, too — diseased skin exposed to it later regrew in a healthy state. Radium’s curious ability to destroy tissue was being turned against cancer, with doctors sewing capsules of radium into the surgical wounds of cancer patients (including Henrietta Lacks, whose cells are used today in research). This enthusiasm for radioactivity was not confined to the doctor’s office. The element was in face creams, tonics, even candy. According to the Encyclopaedia Britannica article that Curie and her daughter wrote on radium in 1926, preliminary experiments suggested that radium could even improve the quality of soil.
  • Perhaps the most tragic demonstration of this involved workers at the United States Radium Corporation factory in Orange, N.J., which in 1917 began hiring young women to paint watch faces with glow-in-the-dark radium paint. The workers were told that the paint was harmless and were encouraged to lick the paintbrushes to make them pointy enough to inscribe small numbers. In the years that followed, the women began to suffer ghoulish physical deterioration. Their jaws melted and ballooned into masses of tumors larger than fists, and cancers riddled their bodies. They developed anemia and necrosis. The sensational court case started — and won — by the dying Radium Girls, as they were called, is a landmark in the history of occupational health. It was settled in June 1928, four months before Marguerite Perey arrived at the Radium Institute to begin a 30-year career of heavy exposure to radiation.
  • We know now that alpha and beta particles emitted in radiation attack DNA and that the mutations they cause can lead to cancer. Ingested radioactive elements can concentrate in the bones, where they continue their decay, in effect poisoning someone for as long as that person lives. By the time Perey made her discovery, she was already heavily contaminated. She spent the last 15 years of her life in treatment for a gruesome bone cancer that spread throughout her body, claiming her eyesight, pieces of her hand and most of the years in which she had planned to study francium. As the disease progressed, she warned her students of the horrible consequences of radiation exposure. Francis, my grandfather, says he recalls hearing that when she walked into labs with radiation counters in her later years, they would go off.
  • Over the years, historians have pondered what drove the Curies to throw caution so thoroughly to the wind. Perhaps it was inconceivable to them that the benefits of their research would not outweigh the risks to themselves and their employees. In a field in which groundbreaking discoveries were being made and the competition might arrive there first, speed was put above other concerns, Rona noted. But you almost get the impression that in the Curie lab, dedication to science was demonstrated by a willingness to poison yourself — as if what made a person’s research meaningful were the sacrifices made in the effort to learn something new.
markfrankel18

Paul Bloom: The Case Against Empathy : The New Yorker - 0 views

  • Empathy research is thriving these days, as cognitive neuroscience undergoes what some call an “affective revolution.” There is increasing focus on the emotions, especially those involved in moral thought and action. We’ve learned, for instance, that some of the same neural systems that are active when we are in pain become engaged when we observe the suffering of others. Other researchers are exploring how empathy emerges in chimpanzee and other primates, how it flowers in young children, and the sort of circumstances that trigger it.
  • Rifkin calls for us to make the leap to “global empathic consciousness.” He sees this as the last best hope for saving the world from environmental destruction, and concludes with the plaintive question “Can we reach biosphere consciousness and global empathy in time to avoid planetary collapse?”
  • This enthusiasm may be misplaced, however. Empathy has some unfortunate features—it is parochial, narrow-minded, and innumerate. We’re often at our best when we’re smart enough not to rely on it.
  • This interest isn’t just theoretical. If we can figure out how empathy works, we might be able to produce more of it.
Lawrence Hrubes

Alyson McGregor: Why medicine often has dangerous side effects for women | TED Talk | TED.com - 0 views

  • For most of the past century, drugs approved and released to market have been tested only on male patients, leading to improper dosing and unacceptable side effects for women. The important physiological differences between men and women have only recently been taken into consideration in medical research. Emergency doctor Alyson McGregor studies these differences, and in this fascinating talk she discusses the history behind how the male model became our framework for medical research and how understanding differences between men and women can lead to more effective treatments for both sexes.
Lawrence Hrubes

Pondering Miracles, Medical and Religious - The New York Times - 0 views

  • The tribunal that questioned me was not juridical, but ecclesiastical. I was not asked about my faith. (For the record, I’m an atheist.) I was not asked if it was a miracle. I was asked if I could explain it scientifically. I could not, though I had come armed for my testimony with the most up-to-date hematological literature, which showed that long survivals following relapses were not seen.
  • When, at the end, the Vatican committee asked if I had anything more to say, I blurted out that as much as her survival, thus far, was remarkable, I fully expected her to relapse some day sooner or later. What would the Vatican do then, revoke the canonization? The clerics recorded my doubts. But the case went forward and d’Youville was canonized on Dec. 9, 1990.
  • Respect for our religious patients demands understanding and tolerance; their beliefs are as true for them as the “facts” may be for physicians. Now almost 40 years later, that mystery woman is still alive and I still cannot explain why. Along with the Vatican, she calls it a miracle. Why should my inability to offer an explanation trump her belief? However they are interpreted, miracles exist, because that is how they are lived in our world.
markfrankel18

On the Face of It: How We Vote : The New Yorker - 0 views

  • In 2003, the Princeton psychologist Alexander Todorov began to suspect that, except for those people who have hard-core political beliefs, the reasons we vote for particular candidates could have less to do with politics and more to do with basic cognitive processes—in particular, perception. When people are asked about their ideal leader, one of the single most important characteristics that they say they look for is competence—how qualified and capable a candidate is. Todorov wondered whether that judgment was made on the basis of intuitive responses to basic facial features rather than on any deep, rational calculus. It would make sense: in the past, extensive research has shown just how quickly we form impressions of people’s character traits, even before we’ve had a conversation with them. That impression then colors whatever else we learn about them, from their hobbies to, presumably, their political abilities. In other words, when we think that we are making rational political judgments, we could be, in fact, judging someone at least partly based on a fleeting impression of his or her face.
  • Starting that fall, and through the following spring, Todorov showed pairs of portraits to roughly a thousand people, and asked them to rate the competence of each person. Unbeknownst to the test subjects, they were looking at candidates for the House and Senate in 2000, 2002, and 2004. In study after study, participants’ responses to the question of whether someone looked competent predicted actual election outcomes at a rate much higher than chance—from sixty-six to seventy-three per cent of the time. Even looking at the faces for as little as one second, Todorov found, yielded the exact same result: a snap judgment that generally identified the winners and losers.