
Home/ New Media Ethics 2009 course/ Group items tagged Beauty


Weiye Loh

Miss Malaysia Toy Boy - 7 views

Yes, commodification has led to liberation. After all, capitalism is all about creating new markets for more production and consumption. Beauty has all along been commodified since the oldest trade...

Weiye Loh

The Great Beyond: Indian government competition for better skin whiteners draws fire - 0 views

  • In December 2010 the Department of Science and Technology (DST) launched a monthly competition in association with Cincinnati-based Procter & Gamble (P&G) to solicit innovative ideas from Indian researchers. Winners were promised a cash award of $1000 and possible commercialization of their ideas by P&G, which has a beauty business worth over US$10 billion in global sales. But the competition's first call - for skin whitening alternatives to hydroquinone, which is not approved for use in many places including the European Union - has prompted criticism from researchers who argue that such products help to propagate racist attitudes in the country. Meanwhile, the department's January challenge for cheaper alternatives to silicones in shampoos, lotions, fabric softeners, and other beauty products marketed by P&G has fared little better. The principal drawback of silicones is their expense and poor biodegradability, but some researchers argue that India has more pressing issues for its scientists to address.
  • However, the current DST secretary Thirumalachari Ramasami disagrees. The DST-P&G Challenge of The Month is only a small part of the department's overall activities, he says. “It is not a priority project but a very minor programme compared to larger issues of national importance that we are concerned with,” Ramasami told Nature, adding that his department has earmarked only Rs.50 million (US$1.1 million) in total for the project. He says it is absurd to accuse the DST of promoting beauty research at the expense of more important problems. “Tell me which challenging issue has been ignored by DST?” he asks.
Weiye Loh

Rationally Speaking: Are Intuitions Good Evidence? - 0 views

  • Is it legitimate to cite one’s intuitions as evidence in a philosophical argument?
  • appeals to intuitions are ubiquitous in philosophy. What are intuitions? Well, that’s part of the controversy, but most philosophers view them as intellectual “seemings.” George Bealer, perhaps the most prominent defender of intuitions-as-evidence, writes, “For you to have an intuition that A is just for it to seem to you that A… Of course, this kind of seeming is intellectual, not sensory or introspective (or imaginative).”2 Other philosophers have characterized them as “noninferential belief due neither to perception nor introspection”3 or alternatively as “applications of our ordinary capacities for judgment.”4
  • Philosophers may not agree on what, exactly, intuition is, but that doesn’t stop them from using it. “Intuitions often play the role that observation does in science – they are data that must be explained, confirmers or the falsifiers of theories,” Brian Talbot says.5 Typically, the way this works is that a philosopher challenges a theory by applying it to a real or hypothetical case and showing that it yields a result which offends his intuitions (and, he presumes, his readers’ as well).
  • For example, John Searle famously appealed to intuition to challenge the notion that a computer could ever understand language: “Imagine a native English speaker who knows no Chinese locked in a room full of boxes of Chinese symbols (a data base) together with a book of instructions for manipulating the symbols (the program). Imagine that people outside the room send in other Chinese symbols which, unknown to the person in the room, are questions in Chinese (the input). And imagine that by following the instructions in the program the man in the room is able to pass out Chinese symbols which are correct answers to the questions (the output)… If the man in the room does not understand Chinese on the basis of implementing the appropriate program for understanding Chinese then neither does any other digital computer solely on that basis because no computer, qua computer, has anything the man does not have.” Should we take Searle’s intuition that such a system would not constitute “understanding” as good evidence that it would not? Many critics of the Chinese Room argument argue that there is no reason to expect our intuitions about intelligence and understanding to be reliable.
  • Ethics leans especially heavily on appeals to intuition, with a whole school of ethicists (“intuitionists”) maintaining that a person can see the truth of general ethical principles not through reason, but because he “just sees without argument that they are and must be true.”6
  • Intuitions are also called upon to rebut ethical theories such as utilitarianism: maximizing overall utility would require you to kill one innocent person if, in so doing, you could harvest her organs and save five people in need of transplants. Such a conclusion is taken as a reductio ad absurdum, requiring utilitarianism to be either abandoned or radically revised – not because the conclusion is logically wrong, but because it strikes nearly everyone as intuitively wrong.
  • British philosopher G.E. Moore used intuition to argue that the existence of beauty is good irrespective of whether anyone ever gets to see and enjoy that beauty. Imagine two planets, he said, one full of stunning natural wonders – trees, sunsets, rivers, and so on – and the other full of filth. Now suppose that nobody will ever have the opportunity to glimpse either of those two worlds. Moore concluded, “Well, even so, supposing them quite apart from any possible contemplation by human beings; still, is it irrational to hold that it is better that the beautiful world should exist than the one which is ugly? Would it not be well, in any case, to do what we could to produce it rather than the other? Certainly I cannot help thinking that it would."7
  • Although similar appeals to intuition can be found throughout all the philosophical subfields, their validity as evidence has come under increasing scrutiny over the last two decades, from philosophers such as Hilary Kornblith, Robert Cummins, Stephen Stich, Jonathan Weinberg, and Jaakko Hintikka (links go to representative papers from each philosopher on this issue). The severity of their criticisms varies from Weinberg’s warning that “We simply do not know enough about how intuitions work,” to Cummins’ wholesale rejection of philosophical intuition as “epistemologically useless.”
  • One central concern for the critics is that a single question can inspire totally different, and mutually contradictory, intuitions in different people.
  • For example, I disagree with Moore’s intuition that it would be better for a beautiful planet to exist than an ugly one even if there were no one around to see it. I can’t understand what the words “better” and “worse,” let alone “beautiful” and “ugly,” could possibly mean outside the domain of the experiences of conscious beings
  • If we want to take philosophers’ intuitions as reason to believe a proposition, then the existence of opposing intuitions leaves us in the uncomfortable position of having reason to believe both a proposition and its opposite.
  • “I suspect there is overall less agreement than standard philosophical practice presupposes, because having the ‘right’ intuitions is the entry ticket to various subareas of philosophy,” Weinberg says.
  • But the problem that intuitions are often not universally shared is overshadowed by another problem: even if an intuition is universally shared, that doesn’t mean it’s accurate. For in fact there are many universal intuitions that are demonstrably false.
  • People who have not been taught otherwise typically assume that an object dropped out of a moving plane will fall straight down to earth, at exactly the same latitude and longitude from which it was dropped. What will actually happen is that, because the object begins its fall with the same forward momentum it had while it was on the plane, it will continue to travel forward, tracing out a curve as it falls and not a straight line. “Considering the inadequacies of ordinary physical intuitions, it is natural to wonder whether ordinary moral intuitions might be similarly inadequate,” Princeton’s Gilbert Harman has argued,9 and the same could be said for our intuitions about consciousness, metaphysics, and so on.
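The falling-object case above is simple enough to check directly: the object keeps the plane's forward velocity while gravity acts vertically, so its path is a curve rather than a vertical drop. A minimal sketch of the arithmetic (the 100 m/s release speed is an assumed value for illustration):

```python
# Projectile released from a moving plane: horizontal velocity is
# retained, vertical velocity grows under gravity, so the path curves.
g = 9.8                # gravitational acceleration, m/s^2
v_horizontal = 100.0   # plane's speed at release, m/s (assumed value)

def position(t):
    """Displacement t seconds after release (air resistance ignored)."""
    x = v_horizontal * t     # keeps moving forward at constant speed
    y = 0.5 * g * t ** 2     # falls an ever-increasing distance
    return x, y

for t in (1, 2, 3):
    x, y = position(t)
    print(f"t={t}s: {x:.0f} m forward, {y:.1f} m down")
```

After two seconds the object has travelled 200 m forward but fallen only about 20 m, which is exactly the curve the naive straight-down intuition misses.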
  • We can’t usually “check” the truth of our philosophical intuitions externally, with an experiment or a proof, the way we can in physics or math. But it’s not clear why we should expect intuitions to be true. If we have an innate tendency towards certain intuitive beliefs, it’s likely because they were useful to our ancestors.
  • But there’s no reason to expect that the intuitions which were true in the world of our ancestors would also be true in other, unfamiliar contexts
  • And for some useful intuitions, such as moral ones, “truth” may have been beside the point. It’s not hard to see how moral intuitions in favor of fairness and generosity would have been crucial to the survival of our ancestors’ tribes, as would the intuition to condemn tribe members who betrayed those reciprocal norms. If we can account for the presence of these moral intuitions by the fact that they were useful, is there any reason left to hypothesize that they are also “true”? The same question could be asked of the moral intuitions which Jonathan Haidt has classified as “purity-based” – an aversion to incest, for example, would clearly have been beneficial to our ancestors. Since that fact alone suffices to explain the (widespread) presence of the “incest is morally wrong” intuition, why should we take that intuition as evidence that “incest is morally wrong” is true?
  • The still-young debate over intuition will likely continue to rage, especially since it’s intertwined with a rapidly growing body of cognitive and social psychological research examining where our intuitions come from and how they vary across time and place.
  • its resolution bears on the work of literally every field of analytic philosophy, except perhaps logic. Can analytic philosophy survive without intuition? (If so, what would it look like?) And can the debate over the legitimacy of appeals to intuition be resolved with an appeal to intuition?
Weiye Loh

Balderdash - 0 views

  • Addendum: People have notified me that after almost 2 1/2 years, many of the pictures are now missing. I have created galleries with the pictures and hosted them on my homepage:
  • I have no problem at all with people who have plastic surgery. Unlike those who believe that while it is great if you are born pretty, having a surgically constructed or enhanced face is a big no-no (i.e. a version of the naturalistic fallacy), I have no problems with people getting tummy tucks, chin lifts, boob jobs or any other form of physical sculpting or enhancement. After all, she seems to have gotten quite a reception on Hottest Blogger.
  • Denying that you have gone under the knife and feigning, with a note of irritation, tired resignation about the accusations, however, is a very different matter. Considering that many sources know the truth about her plastic surgery, this is a most perilous assertion to make and I was riled enough to come up with this blog post. [Addendum: She also goes around online squashing accusations and allegations of surgery.]
  •  
    Two wrongs and two rights.
  •  
    Not exactly the most recent case, but still worth revisiting the ethical concerns behind it. It is easy to find more than one ethical question and problem in this case and it involves more than one technology. The dichotomies of lies versus truths, nature versus man-made, wrongs versus rights, beautiful versus ugly, and so on... So who is right and who is wrong in this case? Whose and what rights are invoked and/or violated? Can a right be wrong? Can a wrong be right? Do two wrongs make one right? What parts do the technologies play in this case?
  •  
    On a side note, given the internet's capability to dig up past issues and rehash them, is it ethical for us to open up old wounds in the name of academic freedom? Beyond research, with IRB and such, what about daily academic discourses and processes? What are the ethical concerns?
Weiye Loh

Balderdash - 0 views

  • I've interacted with more people who think a single counter-example is enough to prove a generalisation or stereotype wrong (in other words, those who confuse the concepts of range and mean [or, perhaps, confidence intervals]) than people who think that stereotypes always hold.
  • perhaps this reveals that the oft-mentioned person who cannot think beyond stereotypes is a classic bogeyman
  • the people I interact with on such issues are those who have a higher level of education, but then this would be proof that "education is a method whereby one acquires a higher grade of prejudices".
  • B: I dont classify this issue [of hiring women for sales jobs] under discrimination. If not, I can also say that hiring pretty girls for front line services is not only discrimination towards males BUT towards ugly girls too!
  • A: Maybe it is discrimination but that we are simply ignoring it? After all no one likes to look upon themselves in a bad light. When blacks were still slaves in America, no one wanted to think it was discrimination because majority felt it was right to enslave the blacks. Majority has always ruled society after all. Might and safety in numbers.
  • Noelle-Neumann's Spiral of Silence theory could apply not just to minorities but maybe in this scenario
  • It seems fine to portray beautiful people on the front line and indirectly discriminate against less appealing people and no one wants to speak out against it.
  • the "fat, ugly or even toothless" people who don't speak out. In this case, they constitute the "minority" section of the Spiral of Silence theory.
  • Me: We discriminate against the lazy and the stupid all the time.
    A: You uncharacteristic lack of support is disturbing. In addition to that, you make assumptous, sweeping statements that does not hold as much water as you seem to believe they do.
    Me: I thought it would be obvious that we discriminate against the lazy and the stupid all the time. There's something to be said about universal skepticism as a method for finding truth, but if you apply it in everyday life we'll never get anything done. Do you want me to support the assertion that Singapore is hot and humid as well, or will you again accuse me of making assumptous (sic), sweeping statements that does (sic) not hold as much water as I seem to believe they do?
  • A: It is clear and undeniable that Singapore is hot and humid, it is however, not clear that everyone discriminates against the lazy and stupid. Just because you don't know anyone who does not discriminate against them, doesn't mean everyone is the same.
    Me: *facepalm* Next you'll be saying that the US Declaration of Independence is invalid, because not everyone in the Thirteen Colonies held it to be "self-evident" that "all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty and the pursuit of Happiness".
  • using Noelle-Neumann's Spiral of Silence effect as an explanation, you must understand that the "minorities", that is mentioned in Noelle-Neumann's Spiral of Silence, refers to the smaller population who holds the controversial opinion. The "minorities" does not necessarily mean the "victims" in the topic, that is being discussed.
  • When you want to use Noelle-Neumann's Spiral of Silence effect in here, you must understand that the "minorities" can be any random person (the sexy, ugly, pretty, beautiful, thin, fat, etc.) who is out to advocate that "Car-show can hire FAT chicks".
  •  
    "People demand freedom of speech as a compensation for the freedom of thought which they seldom use." - Soren Kierkegaard
Weiye Loh

Camera prettifies subjects, even adds makeup | Reuters - 0 views

  •  
    The LUMIX FX77, released last Friday, has a "beauty re-touch" function that will whiten your teeth, increase the translucency of your skin, remove dark eye circles, make your face look smaller and even magnify the size of your eyes. For the final touch, it will apply rouge, lipstick and even eye shadow.
Weiye Loh

Rationally Speaking: Is modern moral philosophy still in thrall to religion? - 0 views

  • Recently I re-read Richard Taylor’s An Introduction to Virtue Ethics, a classic published by Prometheus
  • Taylor compares virtue ethics to the other two major approaches to moral philosophy: utilitarianism (a la John Stuart Mill) and deontology (a la Immanuel Kant). Utilitarianism, of course, is roughly the idea that ethics has to do with maximizing pleasure and minimizing pain; deontology is the idea that reason can tell us what we ought to do from first principles, as in Kant’s categorical imperative (e.g., something is right if you can agree that it could be elevated to a universally acceptable maxim).
  • Taylor argues that utilitarianism and deontology — despite being wildly different in a variety of respects — share one common feature: both philosophies assume that there is such a thing as moral right and wrong, and a duty to do right and avoid wrong. But, he says, on the face of it this is nonsensical. Duty isn’t something one can have in the abstract, duty is toward a law or a lawgiver, which begs the question of what could arguably provide us with a universal moral law, or who the lawgiver could possibly be.
  • His answer is that both utilitarianism and deontology inherited the ideas of right, wrong and duty from Christianity, but endeavored to do without Christianity’s own answers to those questions: the law is given by God and the duty is toward Him. Taylor says that Mill, Kant and the like simply absorbed the Christian concept of morality while rejecting its logical foundation (such as it was). As a result, utilitarians and deontologists alike keep talking about the right thing to do, or the good as if those concepts still make sense once we move to a secular worldview. Utilitarians substituted pain and pleasure for wrong and right respectively, and Kant thought that pure reason can arrive at moral universals. But of course neither utilitarians nor deontologist ever give us a reason why it would be irrational to simply decline to pursue actions that increase global pleasure and diminish global pain, or why it would be irrational for someone not to find the categorical imperative particularly compelling.
  • The situation — again according to Taylor — is dramatically different for virtue ethics. Yes, there too we find concepts like right and wrong and duty. But, for the ancient Greeks they had completely different meanings, which made perfect sense then and now, if we are not misled by the use of those words in a different context. For the Greeks, an action was right if it was approved by one’s society, wrong if it wasn’t, and duty was to one’s polis. And they understood perfectly well that what was right (or wrong) in Athens may or may not be right (or wrong) in Sparta. And that an Athenian had a duty to Athens, but not to Sparta, and vice versa for a Spartan.
  • But wait a minute. Does that mean that Taylor is saying that virtue ethics was founded on moral relativism? That would be an extraordinary claim indeed, and he does not, in fact, make it. His point is a bit more subtle. He suggests that for the ancient Greeks ethics was not (principally) about right, wrong and duty. It was about happiness, understood in the broad sense of eudaimonia, the good or fulfilling life. Aristotle in particular wrote in his Ethics about both aspects: the practical ethics of one’s duty to one’s polis, and the universal (for human beings) concept of ethics as the pursuit of the good life. And make no mistake about it: for Aristotle the first aspect was relatively trivial and understood by everyone; it was the second one that represented the real challenge for the philosopher.
  • For instance, the Ethics is famous for Aristotle’s list of the virtues (see Table), and his idea that the right thing to do is to steer a middle course between extreme behaviors. But this part of his work, according to Taylor, refers only to the practical ways of being a good Athenian, not to the universal pursuit of eudaimonia.

    Vice of Deficiency      Virtuous Mean       Vice of Excess
    Cowardice               Courage             Rashness
    Insensibility           Temperance          Intemperance
    Illiberality            Liberality          Prodigality
    Pettiness               Munificence         Vulgarity
    Humble-mindedness       High-mindedness     Vaingloriness
    Want of Ambition        Right Ambition      Over-ambition
    Spiritlessness          Good Temper         Irascibility
    Surliness               Friendly Civility   Obsequiousness
    Ironical Depreciation   Sincerity           Boastfulness
    Boorishness             Wittiness           Buffoonery
  • How, then, is one to embark on the more difficult task of figuring out how to live a good life? For Aristotle eudaimonia meant the best kind of existence that a human being can achieve, which in turn means that we need to ask what it is that makes humans different from all other species, because it is the pursuit of excellence in that something that provides for a eudaimonic life.
  • Now, Plato - writing before Aristotle - ended up construing the good life somewhat narrowly and in a self-serving fashion. He reckoned that the thing that distinguishes humanity from the rest of the biological world is our ability to use reason, so that is what we should be pursuing as our highest goal in life. And of course nobody is better equipped than a philosopher for such an enterprise... Which reminds me of Bertrand Russell’s quip that “A process which led from the amoeba to man appeared to the philosophers to be obviously a progress, though whether the amoeba would agree with this opinion is not known.”
  • But Aristotle's conception of "reason" was significantly broader, and here is where Taylor’s own update of virtue ethics begins to shine, particularly in Chapter 16 of the book, aptly entitled “Happiness.” Taylor argues that the proper way to understand virtue ethics is as the quest for the use of intelligence in the broadest possible sense, in the sense of creativity applied to all walks of life. He says: “Creative intelligence is exhibited by a dancer, by athletes, by a chess player, and indeed in virtually any activity guided by intelligence [including — but certainly not limited to — philosophy].” He continues: “The exercise of skill in a profession, or in business, or even in such things as gardening and farming, or the rearing of a beautiful family, all such things are displays of creative intelligence.”
  • what we have now is a sharp distinction between utilitarianism and deontology on the one hand and virtue ethics on the other, where the first two are (mistakenly, in Taylor’s assessment) concerned with the impossible question of what is right or wrong, and what our duties are — questions inherited from religion but that in fact make no sense outside of a religious framework. Virtue ethics, instead, focuses on the two things that really matter and to which we can find answers: the practical pursuit of a life within our polis, and the lifelong quest of eudaimonia understood as the best exercise of our creative faculties
  • > So if one's profession is that of assassin or torturer would being the best that you can be still be your duty and eudaimonic? And what about those poor blighters who end up with an ugly family? <
    Aristotle's philosophy is very much concerned with virtue, and being an assassin or a torturer is not a virtue, so the concept of a eudaimonic life for those characters is oxymoronic. As for ending up in an "ugly" family, Aristotle did write that eudaimonia is in part the result of luck, because it is affected by circumstances.
  • > So to the title question of this post: "Is modern moral philosophy still in thrall to religion?" one should say: Yes, for some residual forms of philosophy and for some philosophers <
    That misses Taylor's contention - which I find intriguing, though I have to give it more thought - that *all* modern moral philosophy, except virtue ethics, is in thrall to religion, without realizing it.
Weiye Loh

Designing Minds: Uncovered Video Profiles of Prominent Designers | Brain Pickings - 0 views

  • “My favorite quote about what is art and what is design and what might be the difference comes from Donald Judd: ‘Design has to work, art doesn’t.’ And these things all have to work. They have a function outside my desire for self-expression.” ~ Stefan Sagmeister

  • “When designers are given the opportunity to have a bigger role, real change, real transformation actually happens.” ~ Yves Behar

  •  
    In 2008, a now-defunct podcast program by Adobe called Designing Minds - not to be confused with frogdesign's excellent design mind magazine - did a series of video profiles of prominent artists and designers, including Stefan Sagmeister (whose Things I have learned in my life so far isn't merely one of the best-produced, most beautiful design books of the past decade, it's also a poignant piece of modern existential philosophy), Yves Behar (of One Laptop Per Child fame), Marian Bantjes (whose I Wonder remains my favorite typographic treasure) and many more, offering a rare glimpse of these remarkable creators' life stories, worldviews and the precious peculiarities that make them be who they are and create what they create
joanne ye

Measuring the effectiveness of online activism - 2 views

Reference: Krishnan, S. (2009, June 21). Measuring the effectiveness of online activism. The Hindu. Retrieved September 24, 2009, from Factiva. (Article can be found at bottom of the post) Summary...

online activism freedom control

started by joanne ye on 24 Sep 09 no follow-up yet
Weiye Loh

Titans of science: David Attenborough meets Richard Dawkins | Science | The Guardian - 0 views

  • What is the one bit of science from your field that you think everyone should know?
    David Attenborough: The unity of life.
    Richard Dawkins: The unity of life that comes about through evolution, since we're all descended from a single common ancestor. It's almost too good to be true, that on one planet this extraordinary complexity of life should have come about by what is pretty much an intelligible process. And we're the only species capable of understanding it.
  • RD: I know you're working on a programme about Cambrian and pre-Cambrian fossils, David. A lot of people might think, "These are very old animals, at the beginning of evolution; they weren't very good at what they did." I suspect that isn't the case?
    DA: They were just as good, but as generalists, most were ousted from the competition.
    RD: So it probably is true there's a progressive element to evolution in the short term but not in the long term – that when a lineage branches out, it gets better for about five million years but not 500 million years. You wouldn't see progressive improvement over that kind of time scale.
    DA: No, things get more and more specialised. Not necessarily better.
    RD: The "camera" eyes of any modern animal would be better than what had come before.
    DA: Certainly... but they don't elaborate beyond function. When I listen to a soprano sing a Handel aria with an astonishing coloratura from that particular larynx, I say to myself, there has to be a biological reason that was useful at some stage. The larynx of a human being did not evolve without having some function. And the only function I can see is sexual attraction.
    RD: Sexual selection is important and probably underrated.
    DA: What I like to think is that if I think the male bird of paradise is beautiful, my appreciation of it is precisely the same as a female bird of paradise.
    • Weiye Loh
       
      Is survivability really all about sex and reproduction of future generations?
  • People say Richard Feynman had one of these extraordinary minds that could grapple with ideas of which I have no concept. And you hear all the ancillary bits – like he was a good bongo player – that make him human. So I admire this man who could not only deal with string theory but also play the bongos. But he is beyond me. I have no idea what he was talking of.
  • RD: There does seem to be a sense in which physics has gone beyond what human intuition can understand. We shouldn't be too surprised about that because we're evolved to understand things that move at a medium pace at a medium scale. We can't cope with the very tiny scale of quantum physics or the very large scale of relativity.
  • DA: A physicist will tell me that this armchair is made of vibrations and that it's not really here at all. But when Samuel Johnson was asked to prove the material existence of reality, he just went up to a big stone and kicked it. I'm with him.
  • RD: It's intriguing that the chair is mostly empty space and the thing that stops you going through it is vibrations or energy fields. But it's also fascinating that, because we're animals that evolved to survive, what solidity is to most of us is something you can't walk through.
  • the science of the future may be vastly different from the science of today, and you have to have the humility to admit when you don't know. But instead of filling that vacuum with goblins or spirits, I think you should say, "Science is working on it."
  • DA: Yes, there was a letter in the paper [about Stephen Hawking's comments on the nonexistence of God] saying, "It's absolutely clear that the function of the world is to declare the glory of God." I thought, what does that sentence mean?!
  • What is the most difficult ethical dilemma facing science today?
    DA: How far do you go to preserve individual human life?
    RD: That's a good one, yes.
    DA: I mean, what are we to do with the NHS? How can you put a value in pounds, shillings and pence on an individual's life? There was a case with a bowel cancer drug – if you gave that drug, which costs several thousand pounds, it continued life for six weeks on. How can you make that decision?
  •  
    Of mind and matter: David Attenborough meets Richard Dawkins We paired up Britain's most celebrated scientists to chat about the big issues: the unity of life, ethics, energy, Handel - and the joy of riding a snowmobile
Weiye Loh

Rationally Speaking: Human, know thy place! - 0 views

  • I kicked off a recent episode of the Rationally Speaking podcast on the topic of transhumanism by defining it as “the idea that we should be pursuing science and technology to improve the human condition, modifying our bodies and our minds to make us smarter, healthier, happier, and potentially longer-lived.”
  • Massimo understandably expressed some skepticism about why there needs to be a transhumanist movement at all, given how incontestable their mission statement seems to be. As he rhetorically asked, “Is transhumanism more than just the idea that we should be using technologies to improve the human condition? Because that seems a pretty uncontroversial point.” Later in the episode, referring to things such as radical life extension and modifications of our minds and genomes, Massimo said, “I don't think these are things that one can necessarily have objections to in principle.”
  • There are a surprising number of people whose reaction, when they are presented with the possibility of making humanity much healthier, smarter and longer-lived, is not “That would be great,” nor “That would be great, but it's infeasible,” nor even “That would be great, but it's too risky.” Their reaction is, “That would be terrible.”
  • The people with this attitude aren't just fringe fundamentalists who are fearful of messing with God's Plan. Many of them are prestigious professors and authors whose arguments make no mention of religion. One of the most prominent examples is political theorist Francis Fukuyama, author of End of History, who published a book in 2003 called “Our Posthuman Future: Consequences of the Biotechnology Revolution.” In it he argues that we will lose our “essential” humanity by enhancing ourselves, and that the result will be a loss of respect for “human dignity” and a collapse of morality.
  • Fukuyama's reasoning represents a prominent strain of thought about human enhancement, and one that I find doubly fallacious. (Fukuyama is aware of the following criticisms, but neither I nor other reviewers were impressed by his attempt to defend himself against them.) The idea that the status quo represents some “essential” quality of humanity collapses when you zoom out and look at the steady change in the human condition over previous millennia. Our ancestors were less knowledgeable, more tribalistic, less healthy, shorter-lived; would Fukuyama have argued for the preservation of all those qualities on the grounds that, in their respective time, they constituted an “essential human nature”? And even if there were such a thing as a persistent “human nature,” why is it necessarily worth preserving? In other words, I would argue that Fukuyama is committing both the fallacy of essentialism (there exists a distinct thing that is “human nature”) and the appeal to nature (the way things naturally are is how they ought to be).
  • Writer Bill McKibben, who was called “probably the nation's leading environmentalist” by the Boston Globe this year, and “the world's best green journalist” by Time magazine, published a book in 2003 called “Enough: Staying Human in an Engineered Age.” In it he writes, “That is the choice... one that no human should have to make... To be launched into a future without bounds, where meaning may evaporate.” McKibben concludes that it is likely that “meaning and pain, meaning and transience are inextricably intertwined.” Or as one blogger tartly paraphrased: “If we all live long healthy happy lives, Bill’s favorite poetry will become obsolete.”
  • President George W. Bush's Council on Bioethics, which advised him from 2001-2009, was steeped in it. Harvard professor of political philosophy Michael J. Sandel served on the Council from 2002-2005 and penned an article in the Atlantic Monthly called “The Case Against Perfection,” in which he objected to genetic engineering on the grounds that, basically, it’s uppity. He argues that genetic engineering is “the ultimate expression of our resolve to see ourselves astride the world, the masters of our nature.” Better we should be bowing in submission than standing in mastery, Sandel feels. Mastery “threatens to banish our appreciation of life as a gift,” he warns, and submitting to forces outside our control “restrains our tendency toward hubris.”
  • If you like Sandel's “It's uppity” argument against human enhancement, you'll love his fellow Councilmember Dr. William Hurlbut's argument against life extension: “It's unmanly.” Hurlbut's exact words, delivered in a 2007 debate with Aubrey de Grey: “I actually find a preoccupation with anti-aging technologies to be, I think, somewhat spiritually immature and unmanly... I’m inclined to think that there’s something profound about aging and death.”
  • And Council chairman Dr. Leon Kass, a professor of bioethics from the University of Chicago who served from 2001-2005, was arguably the worst of all. Like McKibben, Kass has frequently argued against radical life extension on the grounds that life's transience is central to its meaningfulness. “Could the beauty of flowers depend on the fact that they will soon wither?” he once asked. “How deeply could one deathless ‘human’ being love another?”
  • Kass has also argued against human enhancements on the same grounds as Fukuyama, that we shouldn't deviate from our proper nature as human beings. “To turn a man into a cockroach— as we don’t need Kafka to show us —would be dehumanizing. To try to turn a man into more than a man might be so as well,” he said. And Kass completes the anti-transhumanist triad (it robs life of meaning; it's dehumanizing; it's hubris) by echoing Sandel's call for humility and gratitude, urging, “We need a particular regard and respect for the special gift that is our own given nature.”
  • By now you may have noticed a familiar ring to a lot of this language. The idea that it's virtuous to suffer, and to humbly surrender control of your own fate, is a cornerstone of Christian morality.
  • it's fairly representative of standard Christian tropes: surrendering to God, submitting to God, trusting that God has good reasons for your suffering.
  • I suppose I can understand that if you believe in an all-powerful entity who will become irate if he thinks you are ungrateful for anything, then this kind of groveling might seem like a smart strategic move. But what I can't understand is adopting these same attitudes in the absence of any religious context. When secular people chastise each other for the “hubris” of trying to improve the “gift” of life they've received, I want to ask them: just who, exactly, are you groveling to? Who, exactly, are you afraid of affronting if you dare to reach for better things?
  • This is why transhumanism is most needed, from my perspective – to counter the astoundingly widespread attitude that suffering and 80-year-lifespans are good things that are worth preserving. That attitude may make sense conditional on certain peculiarly masochistic theologies, but the rest of us have no need to defer to it. It also may have been a comforting thing to tell ourselves back when we had no hope of remedying our situation, but that's not necessarily the case anymore.
  • I think there is a separation between Transhumanism and what Massimo is referring to. Things like robotic arms and the like come from trying to deal with a specific defect, and thus are separate from Transhumanism. I would define transhumanism the same way you would (the achievement of a better human), but I would exclude the inventions of many life-altering devices from transhumanism. If we could invent a device that just made you smarter, then indeed that would be transhumanism, but if we invented a device that enabled someone who was mentally challenged to function normally, I would define this as modern medicine. I just want to make sure we separate advances in modern medicine from transhumanism. Modern medicine being the one that advances to deal with specific medical issues to improve quality of life (usually to restore it to normal conditions) and transhumanism being the one that can advance every single human (perhaps equally?).
    • Weiye Loh
       
      Assumes that "normal conditions" exist. 
  • I agree with all your points about why the arguments against transhumanism and for suffering are ridiculous. That being said, when I first heard about the ideas of Transhumanism, after the initial excitement wore off (since I'm a big tech nerd), my reaction was more or less the same as Massimo's. I don't particularly see the need for a philosophical movement for this.
  • if people believe that suffering is something God ordained for us, you're not going to convince them otherwise with philosophical arguments any more than you'll convince them there's no God at all. If the technologies do develop, acceptance of them will come as their use becomes more prevalent, not with arguments.
  •  
    Human, know thy place!
Weiye Loh

Is it a boy or a girl? You decide - Prospect Magazine « Prospect Magazine - 0 views

  • The only way to guarantee either a daughter or son is to undergo pre-implantation genetic diagnosis: a genetic analysis of an embryo before it is placed in the womb. This is illegal in Britain except for couples at risk of having a child with a life-threatening gender-linked disorder.
  • It’s also illegal for clinics to offer sex selection methods such as MicroSort, that sift the slightly larger X chromosome-bearing (female) sperm from their weedier Y chromosome-bearing (male) counterparts, and then use the preferred sperm in an IVF cycle. With a success rate hovering around 80-90 per cent, it’s better than Mother Nature’s odds of conception, but not immaculate.
  • Years ago I agreed with this ban on socially motivated sex selection. But I can’t defend that stance today. My opposition was based on two worries: the gender balance being skewed—look at China—and the perils of letting society think it’s acceptable to prize one sex more than the other. Unlike many politicians, however, I think it is only right and proper to perform an ideological U-turn when presented with convincing opposing evidence.
  • A 2003 survey published in the journal Human Reproduction showed that few British adults would be concerned enough about their baby’s gender to use the technology, and most adults wanted the same number of sons as daughters
  • Bioethics specialist Edgar Dahl of the University of Giessen found that 68 per cent of Britons craved an equal number of boys and girls; 6 per cent wanted more boys; 4 per cent more girls; 3 per cent only boys; and 2 per cent only girls. Fascinatingly, even if a baby’s sex could be decided by simply taking a blue pill or a pink pill, 90 per cent of British respondents said they wouldn’t take it.
  • What about the danger of stigmatising the unwanted sex if gender selection was allowed? According to experts on so-called “gender disappointment,” the unwanted sex would actually be male.
  • I may think it is old-fashioned to want a son so that he can inherit the family business, or a daughter to have someone to go shopping with. But how different is that from the other preferences and expectations we have for our children, such as hoping they will be gifted at mathematics, music or sport? We all nurture secret expectations for our children: I hope that mine will be clever, beautiful, witty and wise. Perhaps it is not the end of the world if we allow some parents to add “female” or “male” to the list.
  •  
    Is it a boy or a girl? You decide ANJANA AHUJA   28th April 2010  -  Issue 170 Choosing the sex of an unborn child is illegal, but would it harm society if it wasn't?
Weiye Loh

Science Warriors' Ego Trips - The Chronicle Review - The Chronicle of Higher Education - 0 views

  • By Carlin Romano Standing up for science excites some intellectuals the way beautiful actresses arouse Warren Beatty, or career liberals boil the blood of Glenn Beck and Rush Limbaugh. It's visceral.
  • A brave champion of beleaguered science in the modern age of pseudoscience, this Ayn Rand protagonist sarcastically derides the benighted irrationalists and glows with a self-anointed superiority. Who wouldn't want to feel that sense of power and rightness?
  • You hear the voice regularly—along with far more sensible stuff—in the latest of a now common genre of science patriotism, Nonsense on Stilts: How to Tell Science From Bunk (University of Chicago Press), by Massimo Pigliucci, a philosophy professor at the City University of New York.
  • it mixes eminent common sense and frequent good reporting with a cocksure hubris utterly inappropriate to the practice it apotheosizes.
  • According to Pigliucci, both Freudian psychoanalysis and Marxist theory of history "are too broad, too flexible with regard to observations, to actually tell us anything interesting." (That's right—not one "interesting" thing.) The idea of intelligent design in biology "has made no progress since its last serious articulation by natural theologian William Paley in 1802," and the empirical evidence for evolution is like that for "an open-and-shut murder case."
  • Pigliucci offers more hero sandwiches spiced with derision and certainty. Media coverage of science is "characterized by allegedly serious journalists who behave like comedians." Commenting on the highly publicized Dover, Pa., court case in which U.S. District Judge John E. Jones III ruled that intelligent-design theory is not science, Pigliucci labels the need for that judgment a "bizarre" consequence of the local school board's "inane" resolution. Noting the complaint of intelligent-design advocate William Buckingham that an approved science textbook didn't give creationism a fair shake, Pigliucci writes, "This is like complaining that a textbook in astronomy is too focused on the Copernican theory of the structure of the solar system and unfairly neglects the possibility that the Flying Spaghetti Monster is really pulling each planet's strings, unseen by the deluded scientists."
  • Or is it possible that the alternate view unfairly neglected could be more like that of Harvard scientist Owen Gingerich, who contends in God's Universe (Harvard University Press, 2006) that it is partly statistical arguments—the extraordinary unlikelihood eons ago of the physical conditions necessary for self-conscious life—that support his belief in a universe "congenially designed for the existence of intelligent, self-reflective life"?
  • Even if we agree that capital "I" and "D" intelligent-design of the scriptural sort—what Gingerich himself calls "primitive scriptural literalism"—is not scientifically credible, does that make Gingerich's assertion, "I believe in intelligent design, lowercase i and lowercase d," equivalent to Flying-Spaghetti-Monsterism? Tone matters. And sarcasm is not science.
  • The problem with polemicists like Pigliucci is that a chasm has opened up between two groups that might loosely be distinguished as "philosophers of science" and "science warriors."
  • Philosophers of science, often operating under the aegis of Thomas Kuhn, recognize that science is a diverse, social enterprise that has changed over time, developed different methodologies in different subsciences, and often advanced by taking putative pseudoscience seriously, as in debunking cold fusion
  • The science warriors, by contrast, often write as if our science of the moment is isomorphic with knowledge of an objective world-in-itself—Kant be damned!—and any form of inquiry that doesn't fit the writer's criteria of proper science must be banished as "bunk." Pigliucci, typically, hasn't much sympathy for radical philosophies of science. He calls the work of Paul Feyerabend "lunacy," deems Bruno Latour "a fool," and observes that "the great pronouncements of feminist science have fallen as flat as the similarly empty utterances of supporters of intelligent design."
  • It doesn't have to be this way. The noble enterprise of submitting nonscientific knowledge claims to critical scrutiny—an activity continuous with both philosophy and science—took off in an admirable way in the late 20th century when Paul Kurtz, of the University at Buffalo, established the Committee for the Scientific Investigation of Claims of the Paranormal (Csicop) in May 1976. Csicop soon after launched the marvelous journal Skeptical Inquirer
  • Although Pigliucci himself publishes in Skeptical Inquirer, his contributions there exhibit his signature smugness. For an antidote to Pigliucci's overweening scientism 'tude, it's refreshing to consult Kurtz's curtain-raising essay, "Science and the Public," in Science Under Siege (Prometheus Books, 2009, edited by Frazier)
  • Kurtz's commandment might be stated, "Don't mock or ridicule—investigate and explain." He writes: "We attempted to make it clear that we were interested in fair and impartial inquiry, that we were not dogmatic or closed-minded, and that skepticism did not imply a priori rejection of any reasonable claim. Indeed, I insisted that our skepticism was not totalistic or nihilistic about paranormal claims."
  • Kurtz combines the ethos of both critical investigator and philosopher of science. Describing modern science as a practice in which "hypotheses and theories are based upon rigorous methods of empirical investigation, experimental confirmation, and replication," he notes: "One must be prepared to overthrow an entire theoretical framework—and this has happened often in the history of science ... skeptical doubt is an integral part of the method of science, and scientists should be prepared to question received scientific doctrines and reject them in the light of new evidence."
  • Pigliucci, alas, allows his animus against the nonscientific to pull him away from sensitive distinctions among various sciences to sloppy arguments one didn't see in such earlier works of science patriotism as Carl Sagan's The Demon-Haunted World: Science as a Candle in the Dark (Random House, 1995). Indeed, he probably sets a world record for misuse of the word "fallacy."
  • To his credit, Pigliucci at times acknowledges the nondogmatic spine of science. He concedes that "science is characterized by a fuzzy borderline with other types of inquiry that may or may not one day become sciences." Science, he admits, "actually refers to a rather heterogeneous family of activities, not to a single and universal method." He rightly warns that some pseudoscience—for example, denial of HIV-AIDS causation—is dangerous and terrible.
  • But at other points, Pigliucci ferociously attacks opponents like the most unreflective science fanatic
  • He dismisses Feyerabend's view that "science is a religion" as simply "preposterous," even though he elsewhere admits that "methodological naturalism"—the commitment of all scientists to reject "supernatural" explanations—is itself not an empirically verifiable principle or fact, but rather an almost Kantian precondition of scientific knowledge. An article of faith, some cold-eyed Feyerabend fans might say.
  • He writes, "ID is not a scientific theory at all because there is no empirical observation that can possibly contradict it. Anything we observe in nature could, in principle, be attributed to an unspecified intelligent designer who works in mysterious ways." But earlier in the book, he correctly argues against Karl Popper that susceptibility to falsification cannot be the sole criterion of science, because science also confirms. It is, in principle, possible that an empirical observation could confirm intelligent design—i.e., that magic moment when the ultimate UFO lands with representatives of the intergalactic society that planted early life here, and we accept their evidence that they did it.
  • "As long as we do not venture to make hypotheses about who the designer is and why and how she operates," he writes, "there are no empirical constraints on the 'theory' at all. Anything goes, and therefore nothing holds, because a theory that 'explains' everything really explains nothing."
  • Here, Pigliucci again mixes up what's likely or provable with what's logically possible or rational. The creation stories of traditional religions and scriptures do, in effect, offer hypotheses, or claims, about who the designer is—e.g., see the Bible.
  • Far from explaining nothing because it explains everything, such an explanation explains a lot by explaining everything. It just doesn't explain it convincingly to a scientist with other evidentiary standards.
  • A sensible person can side with scientists on what's true, but not with Pigliucci on what's rational and possible. Pigliucci occasionally recognizes that. Late in his book, he concedes that "nonscientific claims may be true and still not qualify as science." But if that's so, and we care about truth, why exalt science to the degree he does? If there's really a heaven, and science can't (yet?) detect it, so much the worse for science.
  • Pigliucci quotes a line from Aristotle: "It is the mark of an educated mind to be able to entertain a thought without accepting it." Science warriors such as Pigliucci, or Michael Ruse in his recent clash with other philosophers in these pages, should reflect on a related modern sense of "entertain." One does not entertain a guest by mocking, deriding, and abusing the guest. Similarly, one does not entertain a thought or approach to knowledge by ridiculing it.
  • Long live Skeptical Inquirer! But can we deep-six the egomania and unearned arrogance of the science patriots? As Descartes, that immortal hero of scientists and skeptics everywhere, pointed out, true skepticism, like true charity, begins at home.
  • Carlin Romano, critic at large for The Chronicle Review, teaches philosophy and media theory at the University of Pennsylvania.
  •  
    April 25, 2010 Science Warriors' Ego Trips
Weiye Loh

Science, Strong Inference -- Proper Scientific Method - 0 views

  • Scientists these days tend to keep up a polite fiction that all science is equal. Except for the work of the misguided opponent whose arguments we happen to be refuting at the time, we speak as though every scientist's field and methods of study are as good as every other scientist's and perhaps a little better. This keeps us all cordial when it comes to recommending each other for government grants.
  • Why should there be such rapid advances in some fields and not in others? I think the usual explanations that we tend to think of - such as the tractability of the subject, or the quality or education of the men drawn into it, or the size of research contracts - are important but inadequate. I have begun to believe that the primary factor in scientific advance is an intellectual one. These rapidly moving fields are fields where a particular method of doing scientific research is systematically used and taught, an accumulative method of inductive inference that is so effective that I think it should be given the name of "strong inference." I believe it is important to examine this method, its use and history and rationale, and to see whether other groups and individuals might learn to adopt it profitably in their own scientific and intellectual work. In its separate elements, strong inference is just the simple and old-fashioned method of inductive inference that goes back to Francis Bacon. The steps are familiar to every college student and are practiced, off and on, by every scientist. The difference comes in their systematic application. Strong inference consists of applying the following steps to every problem in science, formally and explicitly and regularly: 1) Devising alternative hypotheses; 2) Devising a crucial experiment (or several of them), with alternative possible outcomes, each of which will, as nearly as possible, exclude one or more of the hypotheses; 3) Carrying out the experiment so as to get a clean result; 4) Recycling the procedure, making subhypotheses or sequential hypotheses to refine the possibilities that remain, and so on.
  • On any new problem, of course, inductive inference is not as simple and certain as deduction, because it involves reaching out into the unknown. Steps 1 and 2 require intellectual inventions, which must be cleverly chosen so that hypothesis, experiment, outcome, and exclusion will be related in a rigorous syllogism; and the question of how to generate such inventions is one which has been extensively discussed elsewhere (2, 3). What the formal schema reminds us to do is to try to make these inventions, to take the next step, to proceed to the next fork, without dawdling or getting tied up in irrelevancies.
  • It is clear why this makes for rapid and powerful progress. For exploring the unknown, there is no faster method; this is the minimum sequence of steps. Any conclusion that is not an exclusion is insecure and must be rechecked. Any delay in recycling to the next set of hypotheses is only a delay. Strong inference, and the logical tree it generates, are to inductive reasoning what the syllogism is to deductive reasoning in that it offers a regular method for reaching firm inductive conclusions one after the other as rapidly as possible.
  • "But what is so novel about this?" someone will say. This is the method of science and always has been, why give it a special name? The reason is that many of us have almost forgotten it. Science is now an everyday business. Equipment, calculations, lectures become ends in themselves. How many of us write down our alternatives and crucial experiments every day, focusing on the exclusion of a hypothesis? We may write our scientific papers so that it looks as if we had steps 1, 2, and 3 in mind all along. But in between, we do busywork. We become "method- oriented" rather than "problem-oriented." We say we prefer to "feel our way" toward generalizations. We fail to teach our students how to sharpen up their inductive inferences. And we do not realize the added power that the regular and explicit use of alternative hypothesis and sharp exclusion could give us at every step of our research.
  • A distinguished cell biologist rose and said, "No two cells give the same properties. Biology is the science of heterogeneous systems." And he added privately, "You know there are scientists, and there are people in science who are just working with these over-simplified model systems - DNA chains and in vitro systems - who are not doing science at all. We need their auxiliary work: they build apparatus, they make minor studies, but they are not scientists." To which Cy Levinthal replied: "Well, there are two kinds of biologists, those who are looking to see if there is one thing that can be understood and those who keep saying it is very complicated and that nothing can be understood. . . . You must study the simplest system you think has the properties you are interested in."
  • At the 1958 Conference on Biophysics, at Boulder, there was a dramatic confrontation between the two points of view. Leo Szilard said: "The problems of how enzymes are induced, of how proteins are synthesized, of how antibodies are formed, are closer to solution than is generally believed. If you do stupid experiments, and finish one a year, it can take 50 years. But if you stop doing experiments for a little while and think how proteins can possibly be synthesized, there are only about 5 different ways, not 50! And it will take only a few experiments to distinguish these." One of the young men added: "It is essentially the old question: How small and elegant an experiment can you perform?" These comments upset a number of those present. An electron microscopist said, "Gentlemen, this is off the track. This is philosophy of science." Szilard retorted, "I was not quarreling with third-rate scientists: I was quarreling with first-rate scientists."
  • Any criticism or challenge to consider changing our methods strikes of course at all our ego-defenses. But in this case the analytical method offers the possibility of such great increases in effectiveness that it is unfortunate that it cannot be regarded more often as a challenge to learning rather than as challenge to combat. Many of the recent triumphs in molecular biology have in fact been achieved on just such "oversimplified model systems," very much along the analytical lines laid down in the 1958 discussion. They have not fallen to the kind of men who justify themselves by saying "No two cells are alike," regardless of how true that may ultimately be. The triumphs are in fact triumphs of a new way of thinking.
  • the emphasis on strong inference
  • is also partly due to the nature of the fields themselves. Biology, with its vast informational detail and complexity, is a "high-information" field, where years and decades can easily be wasted on the usual type of "low-information" observations or experiments if one does not think carefully in advance about what the most important and conclusive experiments would be. And in high-energy physics, both the "information flux" of particles from the new accelerators and the million-dollar costs of operation have forced a similar analytical approach. It pays to have a top-notch group debate every experiment ahead of time; and the habit spreads throughout the field.
  • Historically, I think, there have been two main contributions to the development of a satisfactory strong-inference method. The first is that of Francis Bacon (13). He wanted a "surer method" of "finding out nature" than either the logic-chopping or all-inclusive theories of the time or the laudable but crude attempts to make inductions "by simple enumeration." He did not merely urge experiments as some suppose, he showed the fruitfulness of interconnecting theory and experiment so that the one checked the other. Of the many inductive procedures he suggested, the most important, I think, was the conditional inductive tree, which proceeded from alternative hypothesis (possible "causes," as he calls them), through crucial experiments ("Instances of the Fingerpost"), to exclusion of some alternatives and adoption of what is left ("establishing axioms"). His Instances of the Fingerpost are explicitly at the forks in the logical tree, the term being borrowed "from the fingerposts which are set up where roads part, to indicate the several directions."
  • Here was a method that could separate off the empty theories! Bacon said the inductive method could be learned by anybody, just like learning to "draw a straighter line or more perfect circle . . . with the help of a ruler or a pair of compasses." "My way of discovering sciences goes far to level men's wit and leaves but little to individual excellence, because it performs everything by the surest rules and demonstrations." Even occasional mistakes would not be fatal. "Truth will sooner come out from error than from confusion."
  • Nevertheless there is a difficulty with this method. As Bacon emphasizes, it is necessary to make "exclusions." He says, "The induction which is to be available for the discovery and demonstration of sciences and arts, must analyze nature by proper rejections and exclusions, and then, after a sufficient number of negatives come to a conclusion on the affirmative instances." "[To man] it is granted only to proceed at first by negatives, and at last to end in affirmatives after exclusion has been exhausted." Or, as the philosopher Karl Popper says today, there is no such thing as proof in science - because some later alternative explanation may be as good or better - so that science advances only by disproofs. There is no point in making hypotheses that are not falsifiable because such hypotheses do not say anything; "it must be possible for an empirical scientific system to be refuted by experience" (14).
  • The difficulty is that disproof is a hard doctrine. If you have a hypothesis and I have another hypothesis, evidently one of them must be eliminated. The scientist seems to have no choice but to be either soft-headed or disputatious. Perhaps this is why so many tend to resist the strong analytical approach and why some great scientists are so disputatious.
  • Fortunately, it seems to me, this difficulty can be removed by the use of a second great intellectual invention, the "method of multiple hypotheses," which is what was needed to round out the Baconian scheme. This is a method that was put forward by T.C. Chamberlin (15), a geologist at Chicago at the turn of the century, who is best known for his contribution to the Chamberlin-Moulton hypothesis of the origin of the solar system.
  • Chamberlin says our trouble is that when we make a single hypothesis, we become attached to it. "The moment one has offered an original explanation for a phenomenon which seems satisfactory, that moment affection for his intellectual child springs into existence, and as the explanation grows into a definite theory his parental affections cluster about his offspring and it grows more and more dear to him. . . . There springs up also unwittingly a pressing of the theory to make it fit the facts and a pressing of the facts to make them fit the theory..." "To avoid this grave danger, the method of multiple working hypotheses is urged. It differs from the simple working hypothesis in that it distributes the effort and divides the affections. . . . Each hypothesis suggests its own criteria, its own method of proof, its own method of developing the truth, and if a group of hypotheses encompass the subject on all sides, the total outcome of means and of methods is full and rich."
  • The conflict and exclusion of alternatives that is necessary to sharp inductive inference has been all too often a conflict between men, each with his single Ruling Theory. But whenever each man begins to have multiple working hypotheses, it becomes purely a conflict between ideas. It becomes much easier then for each of us to aim every day at conclusive disproofs - at strong inference - without either reluctance or combativeness. In fact, when there are multiple hypotheses, which are not anyone's "personal property," and when there are crucial experiments to test them, the daily life in the laboratory takes on an interest and excitement it never had, and the students can hardly wait to get to work to see how the detective story will come out. It seems to me that this is the reason for the development of those distinctive habits of mind and the "complex thought" that Chamberlin described, the reason for the sharpness, the excitement, the zeal, the teamwork - yes, even international teamwork - in molecular biology and high- energy physics today. What else could be so effective?
  • Unfortunately, I think, there are other areas of science today that are sick by comparison, because they have forgotten the necessity for alternative hypotheses and disproof. Each man has only one branch - or none - on the logical tree, and it twists at random without ever coming to the need for a crucial decision at any point. We can see from the external symptoms that there is something scientifically wrong: The Frozen Method, The Eternal Surveyor, The Never Finished, The Great Man With a Single Hypothesis, The Little Club of Dependents, The Vendetta, The All-Encompassing Theory Which Can Never Be Falsified.
  • A "theory" of this sort is not a theory at all, because it does not exclude anything. It predicts everything, and therefore does not predict anything. It becomes simply a verbal formula which the graduate student repeats and believes because the professor has said it so often. This is not science, but faith; not theory, but theology. Whether it is hand-waving or number-waving or equation-waving, a theory is not a theory unless it can be disproved - that is, unless it can be falsified by some possible experimental outcome.
  • The work methods of a number of scientists have been testimony to the power of strong inference. Is success not due in many cases to systematic use of Bacon's "surest rules and demonstrations" as much as to rare and unattainable intellectual power? Faraday's famous diary (16), or Fermi's notebooks (3, 17), show how these men believed in the effectiveness of daily steps in applying formal inductive methods to one problem after another.
  • Surveys, taxonomy, design of equipment, systematic measurements and tables, theoretical computations - all have their proper and honored place, provided they are parts of a chain of precise induction of how nature works. Unfortunately, all too often they become ends in themselves, mere time-serving from the point of view of real scientific advance, a hypertrophied methodology that justifies itself as a lore of respectability.
  • We speak piously of taking measurements and making small studies that will "add another brick to the temple of science." Most such bricks just lie around the brickyard (20). Tables of constants have their place and value, but the study of one spectrum after another, if not frequently re-evaluated, may become a substitute for thinking, a sad waste of intelligence in a research laboratory, and a mistraining whose crippling effects may last a lifetime.
  • Beware of the man of one method or one instrument, either experimental or theoretical. He tends to become method-oriented rather than problem-oriented. The method-oriented man is shackled; the problem-oriented man is at least reaching freely toward what is most important. Strong inference redirects a man to problem-orientation, but it requires him to be willing repeatedly to put aside his last methods and teach himself new ones.
  • Anyone who asks the question about scientific effectiveness will also conclude that much of the mathematizing in physics and chemistry today is irrelevant if not misleading. The great value of mathematical formulation is that when an experiment agrees with a calculation to five decimal places, a great many alternative hypotheses are pretty well excluded (though the Bohr theory and the Schrödinger theory both predict exactly the same Rydberg constant!). But when the fit is only to two decimal places, or one, it may be a trap for the unwary; it may be no better than any rule-of-thumb extrapolation, and some other kind of qualitative exclusion might be more rigorous for testing the assumptions and more important to scientific understanding than the quantitative fit.
  • Today we preach that science is not science unless it is quantitative. We substitute correlations for causal studies, and physical equations for organic reasoning. Measurements and equations are supposed to sharpen thinking, but, in my observation, they more often tend to make the thinking noncausal and fuzzy. They tend to become the object of scientific manipulation instead of auxiliary tests of crucial inferences.
  • Many - perhaps most - of the great issues of science are qualitative, not quantitative, even in physics and chemistry. Equations and measurements are useful when and only when they are related to proof; but proof or disproof comes first and is in fact strongest when it is absolutely convincing without any quantitative measurement.
  • You can catch phenomena in a logical box or in a mathematical box. The logical box is coarse but strong. The mathematical box is fine-grained but flimsy. The mathematical box is a beautiful way of wrapping up a problem, but it will not hold the phenomena unless they have been caught in a logical box to begin with.
  • Of course it is easy - and all too common - for one scientist to call the others unscientific. My point is not that my particular conclusions here are necessarily correct, but that we have long needed some absolute standard of possible scientific effectiveness by which to measure how well we are succeeding in various areas - a standard that many could agree on and one that would be undistorted by the scientific pressures and fashions of the times and the vested interests and busywork that they develop. It is not public evaluation I am interested in so much as a private measure by which to compare one's own scientific performance with what it might be. I believe that strong inference provides this kind of standard of what the maximum possible scientific effectiveness could be - as well as a recipe for reaching it.
  • The strong-inference point of view is so resolutely critical of methods of work and values in science that any attempt to compare specific cases is likely to sound smug and destructive. Mainly one should try to teach it by example and by exhorting to self-analysis and self-improvement only in general terms.
  • One severe but useful private test - a touchstone of strong inference - removes the necessity for third-person criticism, because it is a test that anyone can learn to carry with him for use as needed. It is our old friend the Baconian "exclusion," but I call it "The Question." Obviously it should be applied as much to one's own thinking as to others'. It consists of asking in your own mind, on hearing any scientific explanation or theory put forward, "But sir, what experiment could disprove your hypothesis?"; or, on hearing a scientific experiment described, "But sir, what hypothesis does your experiment disprove?"
  • It is not true that all science is equal; or that we cannot justly compare the effectiveness of scientists by any method other than a mutual-recommendation system. The man to watch, the man to put your money on, is not the man who wants to make "a survey" or a "more detailed study" but the man with the notebook, the man with the alternative hypotheses and the crucial experiments, the man who knows how to answer your Question of disproof and is already working on it.
  •  
    There is so much bad science and bad statistics in media reports, publications, and everyday conversation that I think it is important to understand facts, proofs, and the associated pitfalls.
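The Baconian "exclusion" that runs through the excerpts above can be sketched as a short procedure: hold several working hypotheses at once, and after each crucial observation keep only those whose predictions survive. The hypotheses and observations below are invented purely for illustration, not drawn from any real experiment.

```python
# Sketch of Chamberlin's "multiple working hypotheses" as Baconian
# exclusion: keep every hypothesis whose predictions survive each
# crucial observation, and discard the rest. Everything here is made up.

def exclude(hypotheses, observations):
    """Return the hypotheses not falsified by any observation."""
    surviving = list(hypotheses)
    for obs in observations:
        # A hypothesis survives only if its prediction matches the outcome.
        surviving = [h for h in surviving
                     if h["predict"](obs["input"]) == obs["outcome"]]
    return surviving

# Three rival (hypothetical) hypotheses about a dose-response relation.
hypotheses = [
    {"name": "linear",    "predict": lambda x: x * 2},
    {"name": "quadratic", "predict": lambda x: x * x},
    {"name": "constant",  "predict": lambda x: 4},
]

# Two crucial experiments; the second excludes two of the alternatives.
observations = [
    {"input": 2, "outcome": 4},   # all three fit: no exclusion yet
    {"input": 3, "outcome": 9},   # only the quadratic survives
]

survivors = exclude(hypotheses, observations)
print([h["name"] for h in survivors])  # ['quadratic']
```

Note that the procedure never "proves" the survivor; it only reports which alternatives have not yet been excluded, which is exactly Platt's point about science advancing by disproof.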
Weiye Loh

Steve Martin Swindled: German Art Forgery Scandal Reaches Hollywood - SPIEGEL ONLINE - ... - 0 views

  • Before the purchase a Campendonk expert had confirmed the painting's authenticity and identified the painter's signature on a label attached to the back. But 15 months later Martin, who would later publish a novel about the New York art scene called "An Object of Beauty," tried to re-sell the work. Art auction house Christie's finally auctioned it off in February 2006 to a Swiss businesswoman for €500,000 -- a loss of €200,000 from Martin's original purchase price.
  • Some forgeries of Max Ernst paintings were so convincing that even Werner Spies, an art historian and Ernst expert, gave them his seal of approval. When the true origin of the paintings emerged last year it caused a commotion in the art community, where trading works by classic 20th century artists is a lucrative business.
Weiye Loh

Christina Patterson: Prejudice and the pursuit of 'cool' - Christina Patterson, Comment... - 0 views

  • We're confused because we seem to think that to be black is to be "cool". We seem to think that being black has something to do with playing sport very well, or being very handsome, or very beautiful, or very sexy. We seem to think it has something to do with multi-millionaire musicians who make music that uses words like "nigger", "bitch" and "whore". We seem to think, or some people seem to think, that knowing a black person, or having had sex with a black person, is something to boast about to another black person, or even to a white person. Something that will make us look "cool".
Weiye Loh

Hamlet and the region of death - The Boston Globe - 0 views

  • To many readers — and to some of Moretti’s fellow academics — the very notion of quantitative literary studies can seem like an offense to that which made literature worth studying in the first place: its meaning and beauty. For Moretti, however, moving literary scholarship beyond reading is the key to producing new knowledge about old texts — even ones we’ve been studying for centuries.
  •  
    Franco Moretti, however, often doesn't read the books he studies. Instead, he analyzes them as data. Working with a small group of graduate students, the Stanford University English professor has fed thousands of digitized texts into databases and then mined the accumulated information for new answers to new questions. How far, on average, do characters in 19th-century English novels walk over the course of a book? How frequently are new genres of popular fiction invented? How many words does the average novel's protagonist speak? By posing these and other questions, Moretti has become the unofficial leader of a new, more quantitative kind of literary study.
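The kind of corpus counting Moretti's group does can be suggested in miniature. This sketch uses an invented three-novel "corpus" and a crude, made-up feature (words inside double quotes as a proxy for how much characters speak); it is not Moretti's actual method, tools, or data.

```python
# A toy version of "distant reading": instead of reading each text,
# count a measurable feature across a whole corpus. The corpus and the
# feature (quoted words as a proxy for character speech) are invented.

import re

corpus = {
    "Novel A": 'She said "I will go north tomorrow" and left.',
    "Novel B": 'He whispered "no". Then silence.',
    "Novel C": "Nothing was said aloud in the whole chapter.",
}

def quoted_word_count(text):
    """Count words appearing inside double quotes - a crude stand-in
    for how much a novel's characters actually speak."""
    return sum(len(m.split()) for m in re.findall(r'"([^"]*)"', text))

counts = {title: quoted_word_count(text) for title, text in corpus.items()}
print(counts)  # {'Novel A': 5, 'Novel B': 1, 'Novel C': 0}
```

The payoff of the approach is that the same one-line question can be put to thousands of digitized texts at once, which is what makes "new answers to new questions" possible without reading each book.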
Weiye Loh

Bloggers who get gifts or money may have to own up - 4 views

By Chua Hian Hou from Straits Times BLOGGERS and users of other new media may soon have to say so upfront if they receive gifts or money for their write-ups. The Media Development Authority (MDA)...

Regulations Blogs Subjectivity Ethics Transparency

started by Weiye Loh on 12 Oct 09 no follow-up yet
qiyi liao

Beauty in Nano - 4 views

started by qiyi liao on 28 Oct 09 no follow-up yet
Weiye Loh

Edge: HOW DOES OUR LANGUAGE SHAPE THE WAY WE THINK? By Lera Boroditsky - 0 views

  • Do the languages we speak shape the way we see the world, the way we think, and the way we live our lives? Do people who speak different languages think differently simply because they speak different languages? Does learning new languages change the way you think? Do polyglots think differently when speaking different languages?
  • For a long time, the idea that language might shape thought was considered at best untestable and more often simply wrong. Research in my labs at Stanford University and at MIT has helped reopen this question. We have collected data around the world: from China, Greece, Chile, Indonesia, Russia, and Aboriginal Australia.
  • What we have learned is that people who speak different languages do indeed think differently and that even flukes of grammar can profoundly affect how we see the world.
  • Suppose you want to say, "Bush read Chomsky's latest book." Let's focus on just the verb, "read." To say this sentence in English, we have to mark the verb for tense; in this case, we have to pronounce it like "red" and not like "reed." In Indonesian you need not (in fact, you can't) alter the verb to mark tense. In Russian you would have to alter the verb to indicate tense and gender. So if it was Laura Bush who did the reading, you'd use a different form of the verb than if it was George. In Russian you'd also have to include in the verb information about completion. If George read only part of the book, you'd use a different form of the verb than if he'd diligently plowed through the whole thing. In Turkish you'd have to include in the verb how you acquired this information: if you had witnessed this unlikely event with your own two eyes, you'd use one verb form, but if you had simply read or heard about it, or inferred it from something Bush said, you'd use a different verb form.
  • Clearly, languages require different things of their speakers. Does this mean that the speakers think differently about the world? Do English, Indonesian, Russian, and Turkish speakers end up attending to, partitioning, and remembering their experiences differently just because they speak different languages?
  • For some scholars, the answer to these questions has been an obvious yes. Just look at the way people talk, they might say. Certainly, speakers of different languages must attend to and encode strikingly different aspects of the world just so they can use their language properly. Scholars on the other side of the debate don't find the differences in how people talk convincing. All our linguistic utterances are sparse, encoding only a small part of the information we have available. Just because English speakers don't include the same information in their verbs that Russian and Turkish speakers do doesn't mean that English speakers aren't paying attention to the same things; all it means is that they're not talking about them. It's possible that everyone thinks the same way, notices the same things, but just talks differently.
  • Believers in cross-linguistic differences counter that everyone does not pay attention to the same things: if everyone did, one might think it would be easy to learn to speak other languages. Unfortunately, learning a new language (especially one not closely related to those you know) is never easy; it seems to require paying attention to a new set of distinctions. Whether it's distinguishing modes of being in Spanish, evidentiality in Turkish, or aspect in Russian, learning to speak these languages requires something more than just learning vocabulary: it requires paying attention to the right things in the world so that you have the correct information to include in what you say.
  • Follow me to Pormpuraaw, a small Aboriginal community on the western edge of Cape York, in northern Australia. I came here because of the way the locals, the Kuuk Thaayorre, talk about space. Instead of words like "right," "left," "forward," and "back," which, as commonly used in English, define space relative to an observer, the Kuuk Thaayorre, like many other Aboriginal groups, use cardinal-direction terms — north, south, east, and west — to define space.1 This is done at all scales, which means you have to say things like "There's an ant on your southeast leg" or "Move the cup to the north northwest a little bit." One obvious consequence of speaking such a language is that you have to stay oriented at all times, or else you cannot speak properly. The normal greeting in Kuuk Thaayorre is "Where are you going?" and the answer should be something like " Southsoutheast, in the middle distance." If you don't know which way you're facing, you can't even get past "Hello."
  • The result is a profound difference in navigational ability and spatial knowledge between speakers of languages that rely primarily on absolute reference frames (like Kuuk Thaayorre) and languages that rely on relative reference frames (like English).2 Simply put, speakers of languages like Kuuk Thaayorre are much better than English speakers at staying oriented and keeping track of where they are, even in unfamiliar landscapes or inside unfamiliar buildings. What enables them — in fact, forces them — to do this is their language. Having their attention trained in this way equips them to perform navigational feats once thought beyond human capabilities. Because space is such a fundamental domain of thought, differences in how people think about space don't end there. People rely on their spatial knowledge to build other, more complex, more abstract representations. Representations of such things as time, number, musical pitch, kinship relations, morality, and emotions have been shown to depend on how we think about space. So if the Kuuk Thaayorre think differently about space, do they also think differently about other things, like time? This is what my collaborator Alice Gaby and I came to Pormpuraaw to find out.
  • To test this idea, we gave people sets of pictures that showed some kind of temporal progression (e.g., pictures of a man aging, or a crocodile growing, or a banana being eaten). Their job was to arrange the shuffled photos on the ground to show the correct temporal order. We tested each person in two separate sittings, each time facing in a different cardinal direction. If you ask English speakers to do this, they'll arrange the cards so that time proceeds from left to right. Hebrew speakers will tend to lay out the cards from right to left, showing that writing direction in a language plays a role.3 So what about folks like the Kuuk Thaayorre, who don't use words like "left" and "right"? What will they do? The Kuuk Thaayorre did not arrange the cards more often from left to right than from right to left, nor more toward or away from the body. But their arrangements were not random: there was a pattern, just a different one from that of English speakers. Instead of arranging time from left to right, they arranged it from east to west. That is, when they were seated facing south, the cards went left to right. When they faced north, the cards went from right to left. When they faced east, the cards came toward the body and so on. This was true even though we never told any of our subjects which direction they faced. The Kuuk Thaayorre not only knew that already (usually much better than I did), but they also spontaneously used this spatial orientation to construct their representations of time.
  • I have described how languages shape the way we think about space, time, colors, and objects. Other studies have found effects of language on how people construe events, reason about causality, keep track of number, understand material substance, perceive and experience emotion, reason about other people's minds, choose to take risks, and even in the way they choose professions and spouses.8 Taken together, these results show that linguistic processes are pervasive in most fundamental domains of thought, unconsciously shaping us from the nuts and bolts of cognition and perception to our loftiest abstract notions and major life decisions. Language is central to our experience of being human, and the languages we speak profoundly shape the way we think, the way we see the world, the way we live our lives.
  • The fact that even quirks of grammar, such as grammatical gender, can affect our thinking is profound. Such quirks are pervasive in language; gender, for example, applies to all nouns, which means that it is affecting how people think about anything that can be designated by a noun.
  • How does an artist decide whether death, say, or time should be painted as a man or a woman? It turns out that in 85 percent of such personifications, whether a male or female figure is chosen is predicted by the grammatical gender of the word in the artist's native language. So, for example, German painters are more likely to paint death as a man, whereas Russian painters are more likely to paint death as a woman.
  • Does treating chairs as masculine and beds as feminine in the grammar make Russian speakers think of chairs as being more like men and beds as more like women in some way? It turns out that it does. In one study, we asked German and Spanish speakers to describe objects having opposite gender assignment in those two languages. The descriptions they gave differed in a way predicted by grammatical gender. For example, when asked to describe a "key" — a word that is masculine in German and feminine in Spanish — the German speakers were more likely to use words like "hard," "heavy," "jagged," "metal," "serrated," and "useful," whereas Spanish speakers were more likely to say "golden," "intricate," "little," "lovely," "shiny," and "tiny." To describe a "bridge," which is feminine in German and masculine in Spanish, the German speakers said "beautiful," "elegant," "fragile," "peaceful," "pretty," and "slender," and the Spanish speakers said "big," "dangerous," "long," "strong," "sturdy," and "towering." This was true even though all testing was done in English, a language without grammatical gender. The same pattern of results also emerged in entirely nonlinguistic tasks (e.g., rating similarity between pictures). And we can also show that it is aspects of language per se that shape how people think: teaching English speakers new grammatical gender systems influences mental representations of objects in the same way it does with German and Spanish speakers. Apparently even small flukes of grammar, like the seemingly arbitrary assignment of gender to a noun, can have an effect on people's ideas of concrete objects in the world.
  • Even basic aspects of time perception can be affected by language. For example, English speakers prefer to talk about duration in terms of length (e.g., "That was a short talk," "The meeting didn't take long"), while Spanish and Greek speakers prefer to talk about time in terms of amount, relying more on words like "much," "big," and "little" rather than "short" and "long." Our research into such basic cognitive abilities as estimating duration shows that speakers of different languages differ in ways predicted by the patterns of metaphors in their language. (For example, when asked to estimate duration, English speakers are more likely to be confused by distance information, estimating that a line of greater length remains on the test screen for a longer period of time, whereas Greek speakers are more likely to be confused by amount, estimating that a container that is fuller remains longer on the screen.)
  • An important question at this point is: Are these differences caused by language per se or by some other aspect of culture? Of course, the lives of English, Mandarin, Greek, Spanish, and Kuuk Thaayorre speakers differ in a myriad of ways. How do we know that it is language itself that creates these differences in thought and not some other aspect of their respective cultures? One way to answer this question is to teach people new ways of talking and see if that changes the way they think. In our lab, we've taught English speakers different ways of talking about time. In one such study, English speakers were taught to use size metaphors (as in Greek) to describe duration (e.g., a movie is larger than a sneeze), or vertical metaphors (as in Mandarin) to describe event order. Once the English speakers had learned to talk about time in these new ways, their cognitive performance began to resemble that of Greek or Mandarin speakers. This suggests that patterns in a language can indeed play a causal role in constructing how we think.6 In practical terms, it means that when you're learning a new language, you're not simply learning a new way of talking, you are also inadvertently learning a new way of thinking. Beyond abstract or complex domains of thought like space and time, languages also meddle in basic aspects of visual perception — our ability to distinguish colors, for example. Different languages divide up the color continuum differently: some make many more distinctions between colors than others, and the boundaries often don't line up across languages.
  • To test whether differences in color language lead to differences in color perception, we compared Russian and English speakers' ability to discriminate shades of blue. In Russian there is no single word that covers all the colors that English speakers call "blue." Russian makes an obligatory distinction between light blue (goluboy) and dark blue (siniy). Does this distinction mean that siniy blues look more different from goluboy blues to Russian speakers? Indeed, the data say yes. Russian speakers are quicker to distinguish two shades of blue that are called by the different names in Russian (i.e., one being siniy and the other being goluboy) than if the two fall into the same category. For English speakers, all these shades are still designated by the same word, "blue," and there are no comparable differences in reaction time. Further, the Russian advantage disappears when subjects are asked to perform a verbal interference task (reciting a string of digits) while making color judgments but not when they're asked to perform an equally difficult spatial interference task (keeping a novel visual pattern in memory). The disappearance of the advantage when performing a verbal task shows that language is normally involved in even surprisingly basic perceptual judgments — and that it is language per se that creates this difference in perception between Russian and English speakers.
  • What it means for a language to have grammatical gender is that words belonging to different genders get treated differently grammatically and words belonging to the same grammatical gender get treated the same grammatically. Languages can require speakers to change pronouns, adjective and verb endings, possessives, numerals, and so on, depending on the noun's gender. For example, to say something like "my chair was old" in Russian (moy stul bil' stariy), you'd need to make every word in the sentence agree in gender with "chair" (stul), which is masculine in Russian. So you'd use the masculine form of "my," "was," and "old." These are the same forms you'd use in speaking of a biological male, as in "my grandfather was old." If, instead of speaking of a chair, you were speaking of a bed (krovat'), which is feminine in Russian, or about your grandmother, you would use the feminine form of "my," "was," and "old."
  •  
    For a long time, the idea that language might shape thought was considered at best untestable and more often simply wrong. Research in my labs at Stanford University and at MIT has helped reopen this question. We have collected data around the world: from China, Greece, Chile, Indonesia, Russia, and Aboriginal Australia. What we have learned is that people who speak different languages do indeed think differently and that even flukes of grammar can profoundly affect how we see the world. Language is a uniquely human gift, central to our experience of being human. Appreciating its role in constructing our mental lives brings us one step closer to understanding the very nature of humanity.
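The inference behind the Kuuk Thaayorre card experiment described above can be made mechanical: an egocentric layout ("left to right") plus the sitter's facing direction determines an absolute direction. The encoding below is an illustrative sketch of that logic, not the researchers' analysis code.

```python
# Translate an observed egocentric card layout into an absolute
# (cardinal) direction, given which way the sitter was facing.
# The direction encoding here is invented for illustration.

# Facing this way, your left-to-right axis points in this direction:
LEFT_TO_RIGHT_AXIS = {
    "north": "west-to-east",
    "south": "east-to-west",
    "east":  "north-to-south",
    "west":  "south-to-north",
}

def absolute_timeline(facing, egocentric_order):
    """Turn an observed layout ('left-to-right' or 'right-to-left')
    into the absolute direction in which time was arranged."""
    axis = LEFT_TO_RIGHT_AXIS[facing]
    if egocentric_order == "left-to-right":
        return axis
    start, _, end = axis.partition("-to-")
    return f"{end}-to-{start}"

# The observations reported above, in simplified form: the layout
# flipped with facing direction, so the absolute direction was constant.
trials = [("south", "left-to-right"), ("north", "right-to-left")]
results = {absolute_timeline(f, o) for f, o in trials}
print(results)  # {'east-to-west'} - the same absolute direction every time
```

An English speaker's data would look the opposite way: the egocentric order stays "left-to-right" across sittings, so the inferred absolute direction changes with each facing direction.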