Home/ New Media Ethics 2009 course/ Group items tagged Natural Language

Weiye Loh

Edge: HOW DOES OUR LANGUAGE SHAPE THE WAY WE THINK? By Lera Boroditsky

  • Do the languages we speak shape the way we see the world, the way we think, and the way we live our lives? Do people who speak different languages think differently simply because they speak different languages? Does learning new languages change the way you think? Do polyglots think differently when speaking different languages?
  • For a long time, the idea that language might shape thought was considered at best untestable and more often simply wrong. Research in my labs at Stanford University and at MIT has helped reopen this question. We have collected data around the world: from China, Greece, Chile, Indonesia, Russia, and Aboriginal Australia.
  • What we have learned is that people who speak different languages do indeed think differently and that even flukes of grammar can profoundly affect how we see the world.
  • ...15 more annotations...
  • Suppose you want to say, "Bush read Chomsky's latest book." Let's focus on just the verb, "read." To say this sentence in English, we have to mark the verb for tense; in this case, we have to pronounce it like "red" and not like "reed." In Indonesian you need not (in fact, you can't) alter the verb to mark tense. In Russian you would have to alter the verb to indicate tense and gender. So if it was Laura Bush who did the reading, you'd use a different form of the verb than if it was George. In Russian you'd also have to include in the verb information about completion. If George read only part of the book, you'd use a different form of the verb than if he'd diligently plowed through the whole thing. In Turkish you'd have to include in the verb how you acquired this information: if you had witnessed this unlikely event with your own two eyes, you'd use one verb form, but if you had simply read or heard about it, or inferred it from something Bush said, you'd use a different verb form.
  • Clearly, languages require different things of their speakers. Does this mean that the speakers think differently about the world? Do English, Indonesian, Russian, and Turkish speakers end up attending to, partitioning, and remembering their experiences differently just because they speak different languages?
  • For some scholars, the answer to these questions has been an obvious yes. Just look at the way people talk, they might say. Certainly, speakers of different languages must attend to and encode strikingly different aspects of the world just so they can use their language properly. Scholars on the other side of the debate don't find the differences in how people talk convincing. All our linguistic utterances are sparse, encoding only a small part of the information we have available. Just because English speakers don't include the same information in their verbs that Russian and Turkish speakers do doesn't mean that English speakers aren't paying attention to the same things; all it means is that they're not talking about them. It's possible that everyone thinks the same way, notices the same things, but just talks differently.
  • Believers in cross-linguistic differences counter that everyone does not pay attention to the same things: if everyone did, one might think it would be easy to learn to speak other languages. Unfortunately, learning a new language (especially one not closely related to those you know) is never easy; it seems to require paying attention to a new set of distinctions. Whether it's distinguishing modes of being in Spanish, evidentiality in Turkish, or aspect in Russian, learning to speak these languages requires something more than just learning vocabulary: it requires paying attention to the right things in the world so that you have the correct information to include in what you say.
  • Follow me to Pormpuraaw, a small Aboriginal community on the western edge of Cape York, in northern Australia. I came here because of the way the locals, the Kuuk Thaayorre, talk about space. Instead of words like "right," "left," "forward," and "back," which, as commonly used in English, define space relative to an observer, the Kuuk Thaayorre, like many other Aboriginal groups, use cardinal-direction terms — north, south, east, and west — to define space.1 This is done at all scales, which means you have to say things like "There's an ant on your southeast leg" or "Move the cup to the north-northwest a little bit." One obvious consequence of speaking such a language is that you have to stay oriented at all times, or else you cannot speak properly. The normal greeting in Kuuk Thaayorre is "Where are you going?" and the answer should be something like "South-southeast, in the middle distance." If you don't know which way you're facing, you can't even get past "Hello."
  • The result is a profound difference in navigational ability and spatial knowledge between speakers of languages that rely primarily on absolute reference frames (like Kuuk Thaayorre) and languages that rely on relative reference frames (like English).2 Simply put, speakers of languages like Kuuk Thaayorre are much better than English speakers at staying oriented and keeping track of where they are, even in unfamiliar landscapes or inside unfamiliar buildings. What enables them — in fact, forces them — to do this is their language. Having their attention trained in this way equips them to perform navigational feats once thought beyond human capabilities. Because space is such a fundamental domain of thought, differences in how people think about space don't end there. People rely on their spatial knowledge to build other, more complex, more abstract representations. Representations of such things as time, number, musical pitch, kinship relations, morality, and emotions have been shown to depend on how we think about space. So if the Kuuk Thaayorre think differently about space, do they also think differently about other things, like time? This is what my collaborator Alice Gaby and I came to Pormpuraaw to find out.
  • To test this idea, we gave people sets of pictures that showed some kind of temporal progression (e.g., pictures of a man aging, or a crocodile growing, or a banana being eaten). Their job was to arrange the shuffled photos on the ground to show the correct temporal order. We tested each person in two separate sittings, each time facing in a different cardinal direction. If you ask English speakers to do this, they'll arrange the cards so that time proceeds from left to right. Hebrew speakers will tend to lay out the cards from right to left, showing that writing direction in a language plays a role.3 So what about folks like the Kuuk Thaayorre, who don't use words like "left" and "right"? What will they do? The Kuuk Thaayorre did not arrange the cards more often from left to right than from right to left, nor more toward or away from the body. But their arrangements were not random: there was a pattern, just a different one from that of English speakers. Instead of arranging time from left to right, they arranged it from east to west. That is, when they were seated facing south, the cards went left to right. When they faced north, the cards went from right to left. When they faced east, the cards came toward the body, and so on. This was true even though we never told any of our subjects which direction they faced. The Kuuk Thaayorre not only knew that already (usually much better than I did), but they also spontaneously used this spatial orientation to construct their representations of time.
  • I have described how languages shape the way we think about space, time, colors, and objects. Other studies have found effects of language on how people construe events, reason about causality, keep track of number, understand material substance, perceive and experience emotion, reason about other people's minds, choose to take risks, and even in the way they choose professions and spouses.8 Taken together, these results show that linguistic processes are pervasive in most fundamental domains of thought, unconsciously shaping us from the nuts and bolts of cognition and perception to our loftiest abstract notions and major life decisions. Language is central to our experience of being human, and the languages we speak profoundly shape the way we think, the way we see the world, the way we live our lives.
  • The fact that even quirks of grammar, such as grammatical gender, can affect our thinking is profound. Such quirks are pervasive in language; gender, for example, applies to all nouns, which means that it is affecting how people think about anything that can be designated by a noun.
  • How does an artist decide whether death, say, or time should be painted as a man or a woman? It turns out that in 85 percent of such personifications, whether a male or female figure is chosen is predicted by the grammatical gender of the word in the artist's native language. So, for example, German painters are more likely to paint death as a man, whereas Russian painters are more likely to paint death as a woman.
  • Does treating chairs as masculine and beds as feminine in the grammar make Russian speakers think of chairs as being more like men and beds as more like women in some way? It turns out that it does. In one study, we asked German and Spanish speakers to describe objects having opposite gender assignment in those two languages. The descriptions they gave differed in a way predicted by grammatical gender. For example, when asked to describe a "key" — a word that is masculine in German and feminine in Spanish — the German speakers were more likely to use words like "hard," "heavy," "jagged," "metal," "serrated," and "useful," whereas Spanish speakers were more likely to say "golden," "intricate," "little," "lovely," "shiny," and "tiny." To describe a "bridge," which is feminine in German and masculine in Spanish, the German speakers said "beautiful," "elegant," "fragile," "peaceful," "pretty," and "slender," and the Spanish speakers said "big," "dangerous," "long," "strong," "sturdy," and "towering." This was true even though all testing was done in English, a language without grammatical gender. The same pattern of results also emerged in entirely nonlinguistic tasks (e.g., rating similarity between pictures). And we can also show that it is aspects of language per se that shape how people think: teaching English speakers new grammatical gender systems influences mental representations of objects in the same way it does with German and Spanish speakers. Apparently even small flukes of grammar, like the seemingly arbitrary assignment of gender to a noun, can have an effect on people's ideas of concrete objects in the world.
  • Even basic aspects of time perception can be affected by language. For example, English speakers prefer to talk about duration in terms of length (e.g., "That was a short talk," "The meeting didn't take long"), while Spanish and Greek speakers prefer to talk about time in terms of amount, relying more on words like "much," "big," and "little" rather than "short" and "long." Our research into such basic cognitive abilities as estimating duration shows that speakers of different languages differ in ways predicted by the patterns of metaphors in their language. (For example, when asked to estimate duration, English speakers are more likely to be confused by distance information, estimating that a line of greater length remains on the test screen for a longer period of time, whereas Greek speakers are more likely to be confused by amount, estimating that a container that is fuller remains longer on the screen.)
  • An important question at this point is: Are these differences caused by language per se or by some other aspect of culture? Of course, the lives of English, Mandarin, Greek, Spanish, and Kuuk Thaayorre speakers differ in a myriad of ways. How do we know that it is language itself that creates these differences in thought and not some other aspect of their respective cultures? One way to answer this question is to teach people new ways of talking and see if that changes the way they think. In our lab, we've taught English speakers different ways of talking about time. In one such study, English speakers were taught to use size metaphors (as in Greek) to describe duration (e.g., a movie is larger than a sneeze), or vertical metaphors (as in Mandarin) to describe event order. Once the English speakers had learned to talk about time in these new ways, their cognitive performance began to resemble that of Greek or Mandarin speakers. This suggests that patterns in a language can indeed play a causal role in constructing how we think.6 In practical terms, it means that when you're learning a new language, you're not simply learning a new way of talking, you are also inadvertently learning a new way of thinking. Beyond abstract or complex domains of thought like space and time, languages also meddle in basic aspects of visual perception — our ability to distinguish colors, for example. Different languages divide up the color continuum differently: some make many more distinctions between colors than others, and the boundaries often don't line up across languages.
  • To test whether differences in color language lead to differences in color perception, we compared Russian and English speakers' ability to discriminate shades of blue. In Russian there is no single word that covers all the colors that English speakers call "blue." Russian makes an obligatory distinction between light blue (goluboy) and dark blue (siniy). Does this distinction mean that siniy blues look more different from goluboy blues to Russian speakers? Indeed, the data say yes. Russian speakers are quicker to distinguish two shades of blue that are called by the different names in Russian (i.e., one being siniy and the other being goluboy) than if the two fall into the same category. For English speakers, all these shades are still designated by the same word, "blue," and there are no comparable differences in reaction time. Further, the Russian advantage disappears when subjects are asked to perform a verbal interference task (reciting a string of digits) while making color judgments but not when they're asked to perform an equally difficult spatial interference task (keeping a novel visual pattern in memory). The disappearance of the advantage when performing a verbal task shows that language is normally involved in even surprisingly basic perceptual judgments — and that it is language per se that creates this difference in perception between Russian and English speakers.
  • What it means for a language to have grammatical gender is that words belonging to different genders get treated differently grammatically and words belonging to the same grammatical gender get treated the same grammatically. Languages can require speakers to change pronouns, adjective and verb endings, possessives, numerals, and so on, depending on the noun's gender. For example, to say something like "my chair was old" in Russian (moy stul bil' stariy), you'd need to make every word in the sentence agree in gender with "chair" (stul), which is masculine in Russian. So you'd use the masculine form of "my," "was," and "old." These are the same forms you'd use in speaking of a biological male, as in "my grandfather was old." If, instead of speaking of a chair, you were speaking of a bed (krovat'), which is feminine in Russian, or about your grandmother, you would use the feminine form of "my," "was," and "old."
  •  
    Language is a uniquely human gift, central to our experience of being human. Appreciating its role in constructing our mental lives brings us one step closer to understanding the very nature of humanity.
Weiye Loh

CultureLab: Thoughts within thoughts make us human

  • Corballis reckons instead that the thought processes that made language possible were non-linguistic, but had recursive properties to which language adapted: "Where Chomsky views thought through the lens of language, I prefer to view language through the lens of thought." From this, says Corballis, follows a better understanding of how humans actually think - and a very different perspective on language and its evolution.
  • So how did recursion help ancient humans pull themselves up by their cognitive bootstraps? It allowed us to engage in mental time travel, says Corballis, the recursive operation whereby we recall past episodes into present consciousness and imagine future ones, and sometimes even insert fictions into reality.
  • theory of mind is uniquely highly developed in humans: I may know not only what you are thinking, says Corballis, but also that you know what I am thinking. Most - but not all - language depends on this capability.
  • ...3 more annotations...
  • Corballis's theories also help make sense of apparent anomalies such as linguist and anthropologist Daniel Everett's work on the Pirahã, an Amazonian people who hit the headlines because of debates over whether their language has any words for colours, and, crucially, numbers. Corballis now thinks that the Pirahã language may not be that unusual, and cites the example of other languages from oral cultures, such as the Iatmul language of New Guinea, which is also said to lack recursion.
  • The emerging point is that recursion developed in the mind and need not be expressed in a language. But, as Corballis is at pains to point out, although recursion was critical to the evolution of the human mind, it is not one of those "modules" much beloved of evolutionary psychologists, many of which are said to have evolved in the Pleistocene. Nor did it depend on some genetic mutation or the emergence of some new neuron or brain structure. Instead, he suggests it came of progressive increases in short-term memory and capacity for hierarchical organisation - all dependent in turn on incremental increases in brain size.
  • But as Corballis admits, this brain size increase was especially rapid in the Pleistocene. These incremental changes can lead to sudden more substantial jumps - think water boiling or balloons popping. In mathematics these shifts are called catastrophes. So, notes Corballis, wryly, "we may perhaps conclude that the emergence of the human mind was catastrophic". Let's hope that's not too prescient.
  •  
    His new book, The Recursive Mind: The origins of human language, thought, and civilization, is a fascinating and well-grounded exposition of the nature and power of recursion. In its ultra-reasonable way, this is quite a revolutionary book because it attacks key notions about language and thought. Most notably, it disputes the idea, argued especially by linguist Noam Chomsky, that thought is fundamentally linguistic - in other words, you need language before you can have thoughts.
Weiye Loh

Rationally Speaking: Human, know thy place!

  • I kicked off a recent episode of the Rationally Speaking podcast on the topic of transhumanism by defining it as “the idea that we should be pursuing science and technology to improve the human condition, modifying our bodies and our minds to make us smarter, healthier, happier, and potentially longer-lived.”
  • Massimo understandably expressed some skepticism about why there needs to be a transhumanist movement at all, given how incontestable their mission statement seems to be. As he rhetorically asked, “Is transhumanism more than just the idea that we should be using technologies to improve the human condition? Because that seems a pretty uncontroversial point.” Later in the episode, referring to things such as radical life extension and modifications of our minds and genomes, Massimo said, “I don't think these are things that one can necessarily have objections to in principle.”
  • There are a surprising number of people whose reaction, when they are presented with the possibility of making humanity much healthier, smarter and longer-lived, is not “That would be great,” nor “That would be great, but it's infeasible,” nor even “That would be great, but it's too risky.” Their reaction is, “That would be terrible.”
  • ...14 more annotations...
  • The people with this attitude aren't just fringe fundamentalists who are fearful of messing with God's Plan. Many of them are prestigious professors and authors whose arguments make no mention of religion. One of the most prominent examples is political theorist Francis Fukuyama, author of End of History, who published a book in 2003 called “Our Posthuman Future: Consequences of the Biotechnology Revolution.” In it he argues that we will lose our “essential” humanity by enhancing ourselves, and that the result will be a loss of respect for “human dignity” and a collapse of morality.
  • Fukuyama's reasoning represents a prominent strain of thought about human enhancement, and one that I find doubly fallacious. (Fukuyama is aware of the following criticisms, but neither I nor other reviewers were impressed by his attempt to defend himself against them.) The idea that the status quo represents some “essential” quality of humanity collapses when you zoom out and look at the steady change in the human condition over previous millennia. Our ancestors were less knowledgeable, more tribalistic, less healthy, shorter-lived; would Fukuyama have argued for the preservation of all those qualities on the grounds that, in their respective time, they constituted an “essential human nature”? And even if there were such a thing as a persistent “human nature,” why is it necessarily worth preserving? In other words, I would argue that Fukuyama is committing both the fallacy of essentialism (there exists a distinct thing that is “human nature”) and the appeal to nature (the way things naturally are is how they ought to be).
  • Writer Bill McKibben, who was called “probably the nation's leading environmentalist” by the Boston Globe this year, and “the world's best green journalist” by Time magazine, published a book in 2003 called “Enough: Staying Human in an Engineered Age.” In it he writes, “That is the choice... one that no human should have to make... To be launched into a future without bounds, where meaning may evaporate.” McKibben concludes that it is likely that “meaning and pain, meaning and transience are inextricably intertwined.” Or as one blogger tartly paraphrased: “If we all live long healthy happy lives, Bill’s favorite poetry will become obsolete.”
  • President George W. Bush's Council on Bioethics, which advised him from 2001-2009, was steeped in it. Harvard professor of political philosophy Michael J. Sandel served on the Council from 2002-2005 and penned an article in the Atlantic Monthly called “The Case Against Perfection,” in which he objected to genetic engineering on the grounds that, basically, it’s uppity. He argues that genetic engineering is “the ultimate expression of our resolve to see ourselves astride the world, the masters of our nature.” Better we should be bowing in submission than standing in mastery, Sandel feels. Mastery “threatens to banish our appreciation of life as a gift,” he warns, and submitting to forces outside our control “restrains our tendency toward hubris.”
  • If you like Sandel's “It's uppity” argument against human enhancement, you'll love his fellow Councilmember Dr. William Hurlbut's argument against life extension: “It's unmanly.” Hurlbut's exact words, delivered in a 2007 debate with Aubrey de Grey: “I actually find a preoccupation with anti-aging technologies to be, I think, somewhat spiritually immature and unmanly... I’m inclined to think that there’s something profound about aging and death.”
  • And Council chairman Dr. Leon Kass, a professor of bioethics from the University of Chicago who served from 2001-2005, was arguably the worst of all. Like McKibben, Kass has frequently argued against radical life extension on the grounds that life's transience is central to its meaningfulness. “Could the beauty of flowers depend on the fact that they will soon wither?” he once asked. “How deeply could one deathless ‘human’ being love another?”
  • Kass has also argued against human enhancements on the same grounds as Fukuyama, that we shouldn't deviate from our proper nature as human beings. “To turn a man into a cockroach— as we don’t need Kafka to show us —would be dehumanizing. To try to turn a man into more than a man might be so as well,” he said. And Kass completes the anti-transhumanist triad (it robs life of meaning; it's dehumanizing; it's hubris) by echoing Sandel's call for humility and gratitude, urging, “We need a particular regard and respect for the special gift that is our own given nature.”
  • By now you may have noticed a familiar ring to a lot of this language. The idea that it's virtuous to suffer, and to humbly surrender control of your own fate, is a cornerstone of Christian morality.
  • it's fairly representative of standard Christian tropes: surrendering to God, submitting to God, trusting that God has good reasons for your suffering.
  • I suppose I can understand that if you believe in an all-powerful entity who will become irate if he thinks you are ungrateful for anything, then this kind of groveling might seem like a smart strategic move. But what I can't understand is adopting these same attitudes in the absence of any religious context. When secular people chastise each other for the “hubris” of trying to improve the “gift” of life they've received, I want to ask them: just who, exactly, are you groveling to? Who, exactly, are you afraid of affronting if you dare to reach for better things?
  • This is why transhumanism is most needed, from my perspective – to counter the astoundingly widespread attitude that suffering and 80-year-lifespans are good things that are worth preserving. That attitude may make sense conditional on certain peculiarly masochistic theologies, but the rest of us have no need to defer to it. It also may have been a comforting thing to tell ourselves back when we had no hope of remedying our situation, but that's not necessarily the case anymore.
  • I think there is a separation between Transhumanism and what Massimo is referring to. Things like robotic arms and the like come from trying to deal with a specific defect, and that separates them from Transhumanism. I would define transhumanism the same way you would (the achievement of a better human), but I would exclude the invention of many life-altering devices from transhumanism. If we could invent a device that just made you smarter, then indeed that would be transhumanism, but if we invented a device that could make someone who was mentally challenged able to be normal, I would define this as modern medicine. I just want to make sure we separate advances in modern medicine from transhumanism. Modern medicine being the one that advances to deal with specific medical issues to improve quality of life (usually to restore it to normal conditions), and transhumanism being the one that can advance every single human (perhaps equally?).
    • Weiye Loh
       
      Assumes that "normal conditions" exist. 
  • I agree with all your points about why the arguments against transhumanism and for suffering are ridiculous. That being said, when I first heard about the ideas of Transhumanism, after the initial excitement wore off (since I'm a big tech nerd), my reaction was more or less the same as Massimo's. I don't particularly see the need for a philosophical movement for this.
  • if people believe that suffering is something God ordained for us, you're not going to convince them otherwise with philosophical arguments any more than you'll convince them there's no God at all. If the technologies do develop, acceptance of them will come as their use becomes more prevalent, not with arguments.
  •  
    Human, know thy place!
Weiye Loh

Religion as a catalyst of rationalization « The Immanent Frame

  • For Habermas, religion has been a continuous concern precisely because it is related to both the emergence of reason and the development of a public space of reason-giving. Religious ideas, according to Habermas, are never mere irrational speculation. Rather, they possess a form, a grammar or syntax, that unleashes rational insights, even arguments; they contain, not just specific semantic contents about God, but also a particular structure that catalyzes rational argumentation.
  • in his earliest, anthropological-philosophical stage, Habermas approaches religion from a predominantly philosophical perspective. But as he undertakes the task of “transforming historical materialism” that will culminate in his magnum opus, The Theory of Communicative Action, there is a shift from philosophy to sociology and, more generally, social theory. With this shift, religion is treated, not as a germinal for philosophical concepts, but instead as the source of the social order.
  • What is noteworthy about this juncture in Habermas’s writings is that secularization is explained as “pressure for rationalization” from “above,” which meets the force of rationalization from below, from the realm of technical and practical action oriented to instrumentalization. Additionally, secularization here is not simply the process of the profanation of the world—that is, the withdrawal of religious perspectives as worldviews and the privatization of belief—but, perhaps most importantly, religion itself becomes the means for the translation and appropriation of the rational impetus released by its secularization.
  • ...6 more annotations...
  • religion becomes its own secular catalyst, or, rather, secularization itself is the result of religion. This approach will mature in the most elaborate formulation of what Habermas calls the “linguistification of the sacred,” in volume two of The Theory of Communicative Action. There, basing himself on Durkheim and Mead, Habermas shows how ritual practices and religious worldviews release rational imperatives through the establishment of a communicative grammar that conditions how believers can and should interact with each other, and how they relate to the idea of a supreme being. Habermas writes: “worldviews function as a kind of drive belt that transforms the basic religious consensus into the energy of social solidarity and passes it on to social institutions, thus giving them a moral authority. [. . .] Whereas ritual actions take place at a pregrammatical level, religious worldviews are connected with full-fledged communicative actions.”
  • The thrust of Habermas’s argumentation in this section of The Theory of Communicative Action is to show that religion is the source of the normative binding power of ethical and moral commandments. Yet there is an ambiguity here. While the contents of worldviews may be sublimated into the normative, binding of social systems, it is not entirely clear that the structure, or the grammar, of religious worldviews is itself exhausted. Indeed, in “A Genealogical Analysis of the Cognitive Content of Morality,” Habermas resolves this ambiguity by claiming that the horizontal relationship among believers and the vertical relationship between each believer and God shape the structure of our moral relationship to our neighbour, but now under two corresponding aspects: that of solidarity and that of justice. Here, the grammar of one’s religious relationship to God and the corresponding community of believers are like the exoskeleton of a magnificent species, which, once the religious worldviews contained in them have desiccated under the impact of the forces of secularization, leave behind a casing to be used as a structuring shape for other contents.
  • Metaphysical thinking, which for Habermas has become untenable by the very logic of philosophical development, is characterized by three aspects: identity thinking, or the philosophy of origins that postulates the correspondence between being and thought; the doctrine of ideas, which becomes the foundation for idealism, which in turn postulates a tension between what is perceived and what can be conceptualized; and a concomitant strong concept of theory, where the bios theoretikos takes on a quasi-sacred character, and where philosophy becomes the path to salvation through dedication to a life of contemplation. By “postmetaphysical” Habermas means the new self-understanding of reason that we are able to obtain after the collapse of the Hegelian idealist system—the historicization of reason, or the de-substantivation that turns it into a procedural rationality, and, above all, its humbling. It is noteworthy that one of the main aspects of the new postmetaphysical constellation is that in the wake of the collapse of metaphysics, philosophy is forced to recognize that it must co-exist with religious practices and language: “Philosophy, even in its postmetaphysical form, will be able neither to replace nor to repress religion as long as religious language is the bearer of semantic content that is inspiring and even indispensable, for this content eludes (for the time being?) the explanatory force of philosophical language and continues to resist translation into reasoning discourses.”
  • metaphysical thinking either surrendered philosophy to religion or sought to eliminate religion altogether. In contrast, postmetaphysical thinking recognizes that philosophy can neither replace nor dismissively reject religion, for religion continues to articulate a language whose syntax and content elude philosophy, but from which philosophy continues to derive insights into the universal dimensions of human existence.
  • Habermas claims that even moral discourse cannot translate religious language without something being lost: “Secular languages which only eliminate the substance once intended leave irritations. When sin was converted to culpability, and the breaking of divine commands to an offence against human laws, something was lost.” Still, Habermas’s concern with religion is no longer solely philosophical, nor merely socio-theoretical, but has taken on political urgency. Indeed, he now asks whether modern rule of law and constitutional democracies can generate the motivational resources that nourish them and make them durable. In a series of essays, now gathered in Between Naturalism and Religion, as well as in his Europe: The Faltering Project, Habermas argues that as we have become members of a world society (Weltgesellschaft), we have also been forced to adopt a societal “post-secular self-consciousness.” By this term Habermas does not mean that secularization has come to an end, and even less that it has to be reversed. Instead, he now clarifies that secularization refers very specifically to the secularization of state power and to the general dissolution of metaphysical, overarching worldviews (among which religious views are to be counted). Additionally, as members of a world society that has, if not a fully operational, at least an incipient global public sphere, we have been forced to witness the endurance and vitality of religion. As members of this emergent global public sphere, we are also forced to recognize the plurality of forms of secularization. Secularization did not occur in one form, but in a variety of forms and according to different chronologies.
  • through a critical reading of Rawls, Habermas has begun to translate the postmetaphysical orientation of modern philosophy into a postsecular self-understanding of modern rule of law societies in such a way that religious citizens as well as secular citizens can co-exist, not just by force of a modus vivendi, but out of a sincere mutual respect. “Mutual recognition implies, among other things, that religious and secular citizens are willing to listen and to learn from each other in public debates. The political virtue of treating each other civilly is an expression of distinctive cognitive attitudes.” The cognitive attitudes Habermas is referring to here are the very cognitive competencies that are distinctive of modern, postconventional social agents. Habermas’s recent work on religion, then, is primarily concerned with rescuing for the modern liberal state those motivational and moral resources that it cannot generate or provide itself. At the same time, his recent work is concerned with foregrounding the kind of ethical and moral concerns, preoccupations, and values that can guide us between the Scylla of a society administered from above by the system imperatives of a global economy and political power and the Charybdis of a technological frenzy that places us on the slippery slope of a liberally sanctioned eugenics.
  •  
    Religion in the public sphere: Religion as a catalyst of rationalization posted by Eduardo Mendieta
Weiye Loh

How We Know by Freeman Dyson | The New York Review of Books - 0 views

  • Another example illustrating the central dogma is the French optical telegraph.
  • The telegraph was an optical communication system with stations consisting of large movable pointers mounted on the tops of sixty-foot towers. Each station was manned by an operator who could read a message transmitted by a neighboring station and transmit the same message to the next station in the transmission line.
  • The distance between neighbors was about seven miles. Along the transmission lines, optical messages in France could travel faster than drum messages in Africa. When Napoleon took charge of the French Republic in 1799, he ordered the completion of the optical telegraph system to link all the major cities of France from Calais and Paris to Toulon and onward to Milan. The telegraph became, as Claude Chappe had intended, an important instrument of national power. Napoleon made sure that it was not available to private users.
  • ...27 more annotations...
  • Unlike the drum language, which was based on spoken language, the optical telegraph was based on written French. Chappe invented an elaborate coding system to translate written messages into optical signals. Chappe had the opposite problem from the drummers. The drummers had a fast transmission system with ambiguous messages. They needed to slow down the transmission to make the messages unambiguous. Chappe had a painfully slow transmission system with redundant messages. The French language, like most alphabetic languages, is highly redundant, using many more letters than are needed to convey the meaning of a message. Chappe’s coding system allowed messages to be transmitted faster. Many common phrases and proper names were encoded by only two optical symbols, with a substantial gain in speed of transmission. The composer and the reader of the message had code books listing the message codes for eight thousand phrases and names. For Napoleon it was an advantage to have a code that was effectively cryptographic, keeping the content of the messages secret from citizens along the route.
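The code-book scheme described above can be sketched in a few lines. This is an illustrative reconstruction, not Chappe's actual book: the phrases and symbol numbers below are invented, and the real system used some eight thousand entries keyed to pointer positions.

```python
# A minimal sketch of Chappe-style code-book compression: common phrases
# map to a pair of optical symbols, so a long message needs far fewer
# signals than spelling it out letter by letter. Entries are invented.
CODE_BOOK = {
    "l'armee avance": (12, 7),
    "envoyez des renforts": (3, 44),
    "ordre de l'empereur": (1, 1),
}
# The reader's copy of the book inverts the mapping.
DECODE_BOOK = {pair: phrase for phrase, pair in CODE_BOOK.items()}

def encode(phrase):
    """Return the two-symbol code for a phrase, or None if unlisted."""
    return CODE_BOOK.get(phrase)

def decode(pair):
    """Look the symbol pair back up in the reader's code book."""
    return DECODE_BOOK.get(pair)

print(encode("envoyez des renforts"))  # two symbols instead of twenty letters
```

The secrecy Dyson mentions falls out for free: without the book, the pair (3, 44) is meaningless to an operator along the route.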
  • After these two historical examples of rapid communication in Africa and France, the rest of Gleick’s book is about the modern development of information technology.
  • The modern history is dominated by two Americans, Samuel Morse and Claude Shannon. Samuel Morse was the inventor of Morse Code. He was also one of the pioneers who built a telegraph system using electricity conducted through wires instead of optical pointers deployed on towers. Morse launched his electric telegraph in 1838 and perfected the code in 1844. His code used short and long pulses of electric current to represent letters of the alphabet.
  • Morse was ideologically at the opposite pole from Chappe. He was not interested in secrecy or in creating an instrument of government power. The Morse system was designed to be a profit-making enterprise, fast and cheap and available to everybody. At the beginning the price of a message was a quarter of a cent per letter. The most important users of the system were newspaper correspondents spreading news of local events to readers all over the world. Morse Code was simple enough that anyone could learn it. The system provided no secrecy to the users. If users wanted secrecy, they could invent their own secret codes and encipher their messages themselves. The price of a message in cipher was higher than the price of a message in plain text, because the telegraph operators could transcribe plain text faster. It was much easier to correct errors in plain text than in cipher.
  • Claude Shannon was the founding father of information theory. For a hundred years after the electric telegraph, other communication systems such as the telephone, radio, and television were invented and developed by engineers without any need for higher mathematics. Then Shannon supplied the theory to understand all of these systems together, defining information as an abstract quantity inherent in a telephone message or a television picture. Shannon brought higher mathematics into the game.
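Shannon's abstract quantity can be shown concretely. A minimal sketch of his entropy measure, in bits per symbol, illustrates what "information inherent in a message" means: a fair coin carries one bit per toss, while a biased coin, being more predictable, carries less.

```python
# Shannon entropy: H = -sum(p * log2(p)) over the symbol probabilities.
# This is the average number of bits needed per symbol from a source.
from math import log2

def entropy(probabilities):
    """Entropy in bits; terms with p == 0 contribute nothing."""
    return -sum(p * log2(p) for p in probabilities if p > 0)

print(entropy([0.5, 0.5]))  # 1.0 bit per toss for a fair coin
print(entropy([0.9, 0.1]))  # roughly 0.47 bits for a heavily biased coin
```

A source with no surprise at all, a coin that always lands heads, has entropy zero, which is the formal sense in which a perfectly predictable message carries no information.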
  • When Shannon was a boy growing up on a farm in Michigan, he built a homemade telegraph system using Morse Code. Messages were transmitted to friends on neighboring farms, using the barbed wire of their fences to conduct electric signals. When World War II began, Shannon became one of the pioneers of scientific cryptography, working on the high-level cryptographic telephone system that allowed Roosevelt and Churchill to talk to each other over a secure channel. Shannon’s friend Alan Turing was also working as a cryptographer at the same time, in the famous British Enigma project that successfully deciphered German military codes. The two pioneers met frequently when Turing visited New York in 1943, but they belonged to separate secret worlds and could not exchange ideas about cryptography.
  • In 1945 Shannon wrote a paper, “A Mathematical Theory of Cryptography,” which was stamped SECRET and never saw the light of day. He published in 1948 an expurgated version of the 1945 paper with the title “A Mathematical Theory of Communication.” The 1948 version appeared in the Bell System Technical Journal, the house journal of the Bell Telephone Laboratories, and became an instant classic. It is the founding document for the modern science of information. After Shannon, the technology of information raced ahead, with electronic computers, digital cameras, the Internet, and the World Wide Web.
  • According to Gleick, the impact of information on human affairs came in three installments: first the history, the thousands of years during which people created and exchanged information without the concept of measuring it; second the theory, first formulated by Shannon; third the flood, in which we now live.
  • The event that made the flood plainly visible occurred in 1965, when Gordon Moore stated Moore’s Law. Moore was an electrical engineer, co-founder of the Intel Corporation, a company that manufactured components for computers and other electronic gadgets. His law said that the price of electronic components would decrease and their numbers would increase by a factor of two every eighteen months. This implied that the price would decrease and the numbers would increase by a factor of a hundred every decade. Moore’s prediction of continued growth has turned out to be astonishingly accurate during the forty-five years since he announced it. In these four and a half decades, the price has decreased and the numbers have increased by a factor of a billion, nine powers of ten. Nine powers of ten are enough to turn a trickle into a flood.
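Dyson's arithmetic can be checked directly: doubling every eighteen months compounds to a factor of about a hundred per decade, and over forty-five years to exactly 2^30, roughly a billion.

```python
# Compounding Moore's Law: a doubling every 18 months means growth of
# 2 ** (months / 18) over any span of months.
def growth_factor(months, doubling_period=18):
    return 2 ** (months / doubling_period)

per_decade = growth_factor(120)     # 2**(120/18), about 101.6
over_45_years = growth_factor(540)  # 2**30 = 1,073,741,824
print(round(per_decade, 1), int(over_45_years))
```

The "nine powers of ten" in the text is this 2^30 figure rounded to the nearest power of ten.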
  • Gordon Moore was in the hardware business, making hardware components for electronic machines, and he stated his law as a law of growth for hardware. But the law applies also to the information that the hardware is designed to embody. The purpose of the hardware is to store and process information. The storage of information is called memory, and the processing of information is called computing. The consequence of Moore’s Law for information is that the price of memory and computing decreases and the available amount of memory and computing increases by a factor of a hundred every decade. The flood of hardware becomes a flood of information.
  • In 1949, one year after Shannon published the rules of information theory, he drew up a table of the various stores of memory that then existed. The biggest memory in his table was the US Library of Congress, which he estimated to contain one hundred trillion bits of information. That was at the time a fair guess at the sum total of recorded human knowledge. Today a memory disc drive storing that amount of information weighs a few pounds and can be bought for about a thousand dollars. Information, otherwise known as data, pours into memories of that size or larger, in government and business offices and scientific laboratories all over the world. Gleick quotes the computer scientist Jaron Lanier describing the effect of the flood: “It’s as if you kneel to plant the seed of a tree and it grows so fast that it swallows your whole town before you can even rise to your feet.”
  • On December 8, 2010, Gleick published on The New York Review’s blog an illuminating essay, “The Information Palace.” It was written too late to be included in his book. It describes the historical changes of meaning of the word “information,” as recorded in the latest quarterly online revision of the Oxford English Dictionary. The word first appears in 1386 in a parliamentary report with the meaning “denunciation.” The history ends with the modern usage, “information fatigue,” defined as “apathy, indifference or mental exhaustion arising from exposure to too much information.”
  • The consequences of the information flood are not all bad. One of the creative enterprises made possible by the flood is Wikipedia, started ten years ago by Jimmy Wales. Among my friends and acquaintances, everybody distrusts Wikipedia and everybody uses it. Distrust and productive use are not incompatible. Wikipedia is the ultimate open source repository of information. Everyone is free to read it and everyone is free to write it. It contains articles in 262 languages written by several million authors. The information that it contains is totally unreliable and surprisingly accurate. It is often unreliable because many of the authors are ignorant or careless. It is often accurate because the articles are edited and corrected by readers who are better informed than the authors.
  • Jimmy Wales hoped when he started Wikipedia that the combination of enthusiastic volunteer writers with open source information technology would cause a revolution in human access to knowledge. The rate of growth of Wikipedia exceeded his wildest dreams. Within ten years it has become the biggest storehouse of information on the planet and the noisiest battleground of conflicting opinions. It illustrates Shannon’s law of reliable communication. Shannon’s law says that accurate transmission of information is possible in a communication system with a high level of noise. Even in the noisiest system, errors can be reliably corrected and accurate information transmitted, provided that the transmission is sufficiently redundant. That is, in a nutshell, how Wikipedia works.
  • The information flood has also brought enormous benefits to science. The public has a distorted view of science, because children are taught in school that science is a collection of firmly established truths. In fact, science is not a collection of truths. It is a continuing exploration of mysteries. Wherever we go exploring in the world around us, we find mysteries. Our planet is covered by continents and oceans whose origin we cannot explain. Our atmosphere is constantly stirred by poorly understood disturbances that we call weather and climate. The visible matter in the universe is outweighed by a much larger quantity of dark invisible matter that we do not understand at all. The origin of life is a total mystery, and so is the existence of human consciousness. We have no clear idea how the electrical discharges occurring in nerve cells in our brains are connected with our feelings and desires and actions.
  • Even physics, the most exact and most firmly established branch of science, is still full of mysteries. We do not know how much of Shannon’s theory of information will remain valid when quantum devices replace classical electric circuits as the carriers of information. Quantum devices may be made of single atoms or microscopic magnetic circuits. All that we know for sure is that they can theoretically do certain jobs that are beyond the reach of classical devices. Quantum computing is still an unexplored mystery on the frontier of information theory. Science is the sum total of a great multitude of mysteries. It is an unending argument between a great multitude of voices. It resembles Wikipedia much more than it resembles the Encyclopaedia Britannica.
  • The rapid growth of the flood of information in the last ten years made Wikipedia possible, and the same flood made twenty-first-century science possible. Twenty-first-century science is dominated by huge stores of information that we call databases. The information flood has made it easy and cheap to build databases. One example of a twenty-first-century database is the collection of genome sequences of living creatures belonging to various species from microbes to humans. Each genome contains the complete genetic information that shaped the creature to which it belongs. The genome database is rapidly growing and is available for scientists all over the world to explore. Its origin can be traced to the year 1939, when Shannon wrote his Ph.D. thesis with the title “An Algebra for Theoretical Genetics.”
  • Shannon was then a graduate student in the mathematics department at MIT. He was only dimly aware of the possible physical embodiment of genetic information. The true physical embodiment of the genome is the double helix structure of DNA molecules, discovered by Francis Crick and James Watson fourteen years later. In 1939 Shannon understood that the basis of genetics must be information, and that the information must be coded in some abstract algebra independent of its physical embodiment. Without any knowledge of the double helix, he could not hope to guess the detailed structure of the genetic code. He could only imagine that in some distant future the genetic information would be decoded and collected in a giant database that would define the total diversity of living creatures. It took only sixty years for his dream to come true.
  • In the twentieth century, genomes of humans and other species were laboriously decoded and translated into sequences of letters in computer memories. The decoding and translation became cheaper and faster as time went on, the price decreasing and the speed increasing according to Moore’s Law. The first human genome took fifteen years to decode and cost about a billion dollars. Now a human genome can be decoded in a few weeks and costs a few thousand dollars. Around the year 2000, a turning point was reached, when it became cheaper to produce genetic information than to understand it. Now we can pass a piece of human DNA through a machine and rapidly read out the genetic information, but we cannot read out the meaning of the information. We shall not fully understand the information until we understand in detail the processes of embryonic development that the DNA orchestrated to make us what we are.
  • The explosive growth of information in our human society is a part of the slower growth of ordered structures in the evolution of life as a whole. Life has for billions of years been evolving with organisms and ecosystems embodying increasing amounts of information. The evolution of life is a part of the evolution of the universe, which also evolves with increasing amounts of information embodied in ordered structures, galaxies and stars and planetary systems. In the living and in the nonliving world, we see a growth of order, starting from the featureless and uniform gas of the early universe and producing the magnificent diversity of weird objects that we see in the sky and in the rain forest. Everywhere around us, wherever we look, we see evidence of increasing order and increasing information. The technology arising from Shannon’s discoveries is only a local acceleration of the natural growth of information.
  • Lord Kelvin, one of the leading physicists of that time, promoted the heat death dogma, predicting that the flow of heat from warmer to cooler objects will result in a decrease of temperature differences everywhere, until all temperatures ultimately become equal. Life needs temperature differences, to avoid being stifled by its waste heat. So life will disappear.
  • Thanks to the discoveries of astronomers in the twentieth century, we now know that the heat death is a myth. The heat death can never happen, and there is no paradox. The best popular account of the disappearance of the paradox is a chapter, “How Order Was Born of Chaos,” in the book Creation of the Universe, by Fang Lizhi and his wife Li Shuxian. Fang Lizhi is doubly famous as a leading Chinese astronomer and a leading political dissident. He is now pursuing his double career at the University of Arizona.
  • The belief in a heat death was based on an idea that I call the cooking rule. The cooking rule says that a piece of steak gets warmer when we put it on a hot grill. More generally, the rule says that any object gets warmer when it gains energy, and gets cooler when it loses energy. Humans have been cooking steaks for thousands of years, and nobody ever saw a steak get colder while cooking on a fire. The cooking rule is true for objects small enough for us to handle. If the cooking rule is always true, then Lord Kelvin’s argument for the heat death is correct.
  • the cooking rule is not true for objects of astronomical size, for which gravitation is the dominant form of energy. The sun is a familiar example. As the sun loses energy by radiation, it becomes hotter and not cooler. Since the sun is made of compressible gas squeezed by its own gravitation, loss of energy causes it to become smaller and denser, and the compression causes it to become hotter. For almost all astronomical objects, gravitation dominates, and they have the same unexpected behavior. Gravitation reverses the usual relation between energy and temperature. In the domain of astronomy, when heat flows from hotter to cooler objects, the hot objects get hotter and the cool objects get cooler. As a result, temperature differences in the astronomical universe tend to increase rather than decrease as time goes on. There is no final state of uniform temperature, and there is no heat death. Gravitation gives us a universe hospitable to life. Information and order can continue to grow for billions of years in the future, as they have evidently grown in the past.
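The reversal Dyson describes is the standard textbook result that self-gravitating systems have negative heat capacity; the following derivation from the virial theorem is a conventional sketch, not part of Dyson's text.

```latex
% For a self-gravitating system in equilibrium, the virial theorem
% relates kinetic energy K and gravitational potential energy U:
\begin{align}
  2K + U &= 0, \\
  E \;=\; K + U &= -K.
\end{align}
% Radiating energy away (dE/dt < 0) therefore raises the kinetic energy:
\begin{equation}
  \frac{dE}{dt} < 0 \;\Longrightarrow\; \frac{dK}{dt} > 0,
\end{equation}
% and since temperature tracks the kinetic energy of the particles, the
% star grows hotter as it loses energy: a negative heat capacity,
% impossible for a steak on a grill.
```

This is the precise sense in which the cooking rule fails: for the steak, E and temperature rise and fall together; for the sun, the virial relation E = -K makes them move in opposite directions.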
  • The vision of the future as an infinite playground, with an unending sequence of mysteries to be understood by an unending sequence of players exploring an unending supply of information, is a glorious vision for scientists. Scientists find the vision attractive, since it gives them a purpose for their existence and an unending supply of jobs. The vision is less attractive to artists and writers and ordinary people. Ordinary people are more interested in friends and family than in science. Ordinary people may not welcome a future spent swimming in an unending flood of information.
  • A darker view of the information-dominated universe was described in a famous story, “The Library of Babel,” by Jorge Luis Borges in 1941. Borges imagined his library, with an infinite array of books and shelves and mirrors, as a metaphor for the universe.
  • Gleick’s book has an epilogue entitled “The Return of Meaning,” expressing the concerns of people who feel alienated from the prevailing scientific culture. The enormous success of information theory came from Shannon’s decision to separate information from meaning. His central dogma, “Meaning is irrelevant,” declared that information could be handled with greater freedom if it was treated as a mathematical abstraction independent of meaning. The consequence of this freedom is the flood of information in which we are drowning. The immense size of modern databases gives us a feeling of meaninglessness. Information in such quantities reminds us of Borges’s library extending infinitely in all directions. It is our task as humans to bring meaning back into this wasteland. As finite creatures who think and feel, we can create islands of meaning in the sea of information. Gleick ends his book with Borges’s image of the human condition: “We walk the corridors, searching the shelves and rearranging them, looking for lines of meaning amid leagues of cacophony and incoherence, reading the history of the past and of the future, collecting our thoughts and collecting the thoughts of others, and every so often glimpsing mirrors, in which we may recognize creatures of the information.”
Weiye Loh

Google is funding a new software project that will automate writing local news - Recode - 0 views

  •  
    "Radar aims to automate local reporting with large public databases from government agencies or local law enforcement - basically roboticizing the work of reporters. Stories from the data will be penned using Natural Language Generation, which converts information gleaned from the data into words. The robotic reporters won't be working alone. The grant includes funds allocated to hire five journalists to identify datasets, as well as curate and edit the news articles generated from Radar. The project also aims to create automated ways to add images and video to robot-made stories."
Weiye Loh

Our ever-changing English | Alison Flood | Comment is free | guardian.co.uk - 0 views

  • Perhaps the Daily Mail should take a leaf out of Jonathan Swift's book and instead of blaming changes in English on "a tidal wave of mindless Americanisms", start calling those damned poets to book.
  • We've been whining on about the deterioration in English for years and years and years, and perhaps we need to get over ourselves. Looking at Swift's 300-year-old plea to keep things the same I'm minded to think that, actually, part of the glory of English, from Shakespeare's insults to Bombaugh's txt speak to the ever-expanding dictionaries of today, is its constantly changing nature, its adaptability, its responsiveness.
  •  
    Our ever-changing English: I get grumpy about crimes against language. But we Brits have been lamenting declining standards of English for centuries
Weiye Loh

The Ashtray: The Ultimatum (Part 1) - NYTimes.com - 0 views

  • “Under no circumstances are you to go to those lectures. Do you hear me?” Kuhn, the head of the Program in the History and Philosophy of Science at Princeton where I was a graduate student, had issued an ultimatum. It concerned the philosopher Saul Kripke’s lectures — later to be called “Naming and Necessity” — which he had originally given at Princeton in 1970 and planned to give again in the fall of 1972.
  • Whiggishness — in history of science, the tendency to evaluate and interpret past scientific theories not on their own terms, but in the context of current knowledge. The term comes from Herbert Butterfield’s “The Whig Interpretation of History,” written when Butterfield, a future Regius professor of history at Cambridge, was only 31 years old. Butterfield had complained about Whiggishness, describing it as “…the study of the past with direct and perpetual reference to the present” – the tendency to see all history as progressive, and in an extreme form, as an inexorable march to greater liberty and enlightenment. [3] For Butterfield, on the other hand, “…real historical understanding” can be achieved only by “attempting to see life with the eyes of another century than our own.” [4][5].
  • Kuhn had attacked my Whiggish use of the term “displacement current.” [6] I had failed, in his view, to put myself in the mindset of Maxwell’s first attempts at creating a theory of electricity and magnetism. I felt that Kuhn had misinterpreted my paper, and that he — not me — had provided a Whiggish interpretation of Maxwell. I said, “You refuse to look through my telescope.” And he said, “It’s not a telescope, Errol. It’s a kaleidoscope.” (In this respect, he was probably right.) [7].
  • ...9 more annotations...
  • I asked him, “If paradigms are really incommensurable, how is history of science possible? Wouldn’t we be merely interpreting the past in the light of the present? Wouldn’t the past be inaccessible to us? Wouldn’t it be ‘incommensurable?’ ” [8] ¶He started moaning. He put his head in his hands and was muttering, “He’s trying to kill me. He’s trying to kill me.” ¶And then I added, “…except for someone who imagines himself to be God.” ¶It was at this point that Kuhn threw the ashtray at me.
  • I call Kuhn’s reply “The Ashtray Argument.” If someone says something you don’t like, you throw something at him. Preferably something large, heavy, and with sharp edges. Perhaps we were engaged in a debate on the nature of language, meaning and truth. But maybe we just wanted to kill each other.
  • That's the problem with relativism: Who's to say who's right and who's wrong? Somehow I'm not surprised to hear Kuhn was an ashtray-hurler. In the end, what other argument could he make?
  • For us to have a conversation and come to an agreement about the meaning of some word without having to refer to some outside authority like a dictionary, we would of necessity have to be satisfied that our agreement was genuine and not just a polite acknowledgement of each other's right to their opinion, can you agree with that? If so, then let's see if we can agree on the meaning of the word 'know' because that may be the crux of the matter. When I use the word 'know' I mean more than the capacity to apprehend some aspect of the world through language or some other representational symbolism. Included in the word 'know' is the direct sensorial perception of some aspect of the world. For example, I sense the floor that my feet are now resting upon. I 'know' the floor is really there, I can sense it. Perhaps I don't 'know' what the floor is made of, who put it there, and other incidental facts one could know through the usual symbolism such as language as in a story someone tells me. Nevertheless, the reality I need to 'know' is that the floor, or whatever you may wish to call the solid - relative to my body - flat and level surface supported by more structure than the earth, is really there and reliably capable of supporting me. This is true and useful knowledge that goes directly from the floor itself to my knowing about it - via sensation - that has nothing to do with my interpretive system.
  • Now I am interested in 'knowing' my feet in the same way that my feet and the whole body they are connected to 'know' the floor. I sense my feet sensing the floor. My feet are as real as the floor and I know they are there, sensing the floor because I can sense them. Furthermore, now I 'know' that it is 'I' sensing my feet, sensing the floor. Do you see where I am going with this line of thought? I am including in the word 'know' more meaning than it is commonly given by everyday language. Perhaps it sounds as if I want to expand on the Cartesian formula of cogito ergo sum, and in truth I prefer to say I sense therefore I am. It is through my sensations of the world, first and foremost, that my awareness, such as it is, is actively engaged with reality. Now, any healthy normal animal senses the world but we can't 'know' if they experience reality as we do since we can't have a conversation with them to arrive at agreement. But we humans can have this conversation and possibly agree that we can 'know' the world through sensation. We can even know what is 'I' through sensation. In fact, there is no other way to know 'I' except through sensation. Thought is symbolic representation, not direct sensing, so even though the thoughtful modality of regarding the world may be a far more reliable modality than sensation in predicting what might happen next, its very capacity for such accurate prediction is its biggest weakness, which is its capacity for error.
  • Sensation cannot be 'wrong' unless it is used to predict outcomes. Thought can be wrong for both predicting outcomes and for 'knowing' reality. Sensation alone can 'know' reality even though it is relatively unreliable, useless even, for making predictions.
  • If we prioritize our interests by placing predictability over pure knowing through sensation, then of course we will not value the 'knowledge' to be gained through sensation. But if we can switch the priorities - out of sheer curiosity perhaps - then we can enter a realm of knowledge through sensation that is unbelievably spectacular. Our bodies are 'made of' reality, and by methodically exercising our nascent capacity for self-sensing, we can connect our knowing 'I' to reality directly. We will not be able to 'know' what it is that we are experiencing in the way we might wish, which is to be able to predict what will happen next or to represent to ourselves symbolically what we might experience when we turn our attention to that sensation. But we can arrive at a depth and breadth of 'knowing' that is utterly unprecedented in our lives by operating that modality.
  • One of the impressions that comes from a sustained practice of self-sensing is a clearer feeling for what "I" is and why we have a word for that self-referential phenomenon, seemingly located somewhere behind our eyes and between our ears. The thing we call "I" or "me," depending on the context, turns out to be a moving point, a convergence vector for a variety of images, feelings and sensations. It is a reference point into which certain impressions flow and out of which certain impulses to act diverge and which may or may not animate certain muscle groups into action. Following this tricky exercise in attention and sensation, we can quickly see for ourselves that attention is more like a focused beam and awareness is more like a diffuse cloud, but both are composed of energy, and like all energy they vibrate, they oscillate with a certain frequency. That's it for now.
  • I loved the writer's efforts to find a fixed definition of “Incommensurability;” there was of course never a concrete meaning behind the word. Smoke and mirrors.
Weiye Loh

Genome Biology | Full text | A Faustian bargain - 0 views

  • on October 1st, you announced that the departments of French, Italian, Classics, Russian and Theater Arts were being eliminated. You gave several reasons for your decision, including that 'there are comparatively fewer students enrolled in these degree programs.' Of course, your decision was also, perhaps chiefly, a cost-cutting measure - in fact, you stated that this decision might not have been necessary had the state legislature passed a bill that would have allowed your university to set its own tuition rates. Finally, you asserted that the humanities were a drain on the institution financially, as opposed to the sciences, which bring in money in the form of grants and contracts.
  • I'm sure that relatively few students take classes in these subjects nowadays, just as you say. There wouldn't have been many in my day, either, if universities hadn't required students to take a distribution of courses in many different parts of the academy: humanities, social sciences, the fine arts, the physical and natural sciences, and to attain minimal proficiency in at least one foreign language. You see, the reason that humanities classes have low enrollment is not because students these days are clamoring for more relevant courses; it's because administrators like you, and spineless faculty, have stopped setting distribution requirements and started allowing students to choose their own academic programs - something I feel is a complete abrogation of the duty of university faculty as teachers and mentors. You could fix the enrollment problem tomorrow by instituting a mandatory core curriculum that included a wide range of courses.
  • the vast majority of humanity cannot handle freedom. In giving humans the freedom to choose, Christ has doomed humanity to a life of suffering.
  • ...7 more annotations...
  • in Dostoyevsky's parable of the Grand Inquisitor, which is told in Chapter Five of his great novel, The Brothers Karamazov. In the parable, Christ comes back to earth in Seville at the time of the Spanish Inquisition. He performs several miracles but is arrested by Inquisition leaders and sentenced to be burned at the stake. The Grand Inquisitor visits Him in his cell to tell Him that the Church no longer needs Him. The main portion of the text is the Inquisitor explaining why. The Inquisitor says that Jesus rejected the three temptations of Satan in the desert in favor of freedom, but he believes that Jesus has misjudged human nature.
  • I'm sure the budgetary problems you have to deal with are serious. They certainly are at Brandeis University, where I work. And we, too, faced critical strategic decisions because our income was no longer enough to meet our expenses. But we eschewed your draconian - and authoritarian - solution, and a team of faculty, with input from all parts of the university, came up with a plan to do more with fewer resources. I'm not saying that all the specifics of our solution would fit your institution, but the process sure would have. You did call a town meeting, but it was to discuss your plan, not let the university craft its own. And you called that meeting for Friday afternoon on October 1st, when few of your students or faculty would be around to attend. In your defense, you called the timing 'unfortunate', but pleaded that there was a 'limited availability of appropriate large venue options.' I find that rather surprising. If the President of Brandeis needed a lecture hall on short notice, he would get one. I guess you don't have much clout at your university.
  • As for the argument that the humanities don't pay their own way, well, I guess that's true, but it seems to me that there's a fallacy in assuming that a university should be run like a business. I'm not saying it shouldn't be managed prudently, but the notion that every part of it needs to be self-supporting is simply at variance with what a university is all about.
  • You seem to value entrepreneurial programs and practical subjects that might generate intellectual property more than you do 'old-fashioned' courses of study. But universities aren't just about discovering and capitalizing on new knowledge; they are also about preserving knowledge from being lost over time, and that requires a financial investment.
  • what seems to be archaic today can become vital in the future. I'll give you two examples of that. The first is the science of virology, which in the 1970s was dying out because people felt that infectious diseases were no longer a serious health problem in the developed world and other subjects, such as molecular biology, were much sexier. Then, in the early 1990s, a little problem called AIDS became the world's number 1 health concern. The virus that causes AIDS was first isolated and characterized at the National Institutes of Health in the USA and the Institut Pasteur in France, because these were among the few institutions that still had thriving virology programs. My second example you will probably be more familiar with. Middle Eastern Studies, including the study of foreign languages such as Arabic and Persian, was hardly a hot subject on most campuses in the 1990s. Then came September 11, 2001. Suddenly we realized that we needed a lot more people who understood something about that part of the world, especially its Muslim culture. Those universities that had preserved their Middle Eastern Studies departments, even in the face of declining enrollment, suddenly became very important places. Those that hadn't - well, I'm sure you get the picture.
  • one of your arguments is that not every place should try to do everything. Let other institutions have great programs in classics or theater arts, you say; we will focus on preparing students for jobs in the real world. Well, I hope I've just shown you that the real world is pretty fickle about what it wants. The best way for people to be prepared for the inevitable shock of change is to be as broadly educated as possible, because today's backwater is often tomorrow's hot field. And interdisciplinary research, which is all the rage these days, is only possible if people aren't too narrowly trained. If none of that convinces you, then I'm willing to let you turn your institution into a place that focuses on the practical, but only if you stop calling it a university and yourself the President of one. You see, the word 'university' derives from the Latin 'universitas', meaning 'the whole'. You can't be a university without having a thriving humanities program. You will need to call SUNY Albany a trade school, or perhaps a vocational college, but not a university. Not anymore.
  • I started out as a classics major. I'm now Professor of Biochemistry and Chemistry. Of all the courses I took in college and graduate school, the ones that have benefited me the most in my career as a scientist are the courses in classics, art history, sociology, and English literature. These courses didn't just give me a much better appreciation for my own culture; they taught me how to think, to analyze, and to write clearly. None of my sciences courses did any of that.
Weiye Loh

MacIntyre on money « Prospect Magazine - 0 views

  • MacIntyre has often given the impression of a robe-ripping Savonarola. He has lambasted the heirs to the principal western ethical schools: John Locke’s social contract, Immanuel Kant’s categorical imperative, Jeremy Bentham’s utilitarian “the greatest happiness for the greatest number.” Yet his is not a lone voice in the wilderness. He can claim connections with a trio of 20th-century intellectual heavyweights: the late Elizabeth Anscombe, her surviving husband, Peter Geach, and the Canadian philosopher Charles Taylor, winner in 2007 of the Templeton prize. What all four have in common is their Catholic faith, enthusiasm for Aristotle’s telos (life goals), and promotion of Thomism, the philosophy of St Thomas Aquinas who married Christianity and Aristotle. Leo XIII (pope from 1878 to 1903), who revived Thomism while condemning communism and unfettered capitalism, is also an influence.
  • MacIntyre’s key moral and political idea is that to be human is to be an Aristotelian goal-driven, social animal. Being good, according to Aristotle, consists in a creature (whether plant, animal, or human) acting according to its nature—its telos, or purpose. The telos for human beings is to generate a communal life with others; and the good society is composed of many independent, self-reliant groups.
  • MacIntyre differs from all these influences and alliances, from Leo XIII onwards, in his residual respect for Marx’s critique of capitalism.
  • ...6 more annotations...
  • MacIntyre begins his Cambridge talk by asserting that the 2008 economic crisis was not due to a failure of business ethics.
  • he has argued that moral behaviour begins with the good practice of a profession, trade, or art: playing the violin, cutting hair, brick-laying, teaching philosophy.
  • In other words, the virtues necessary for human flourishing are not a result of the top-down application of abstract ethical principles, but the development of good character in everyday life.
  • After Virtue, which is in essence an attack on the failings of the Enlightenment, has in its sights a catalogue of modern assumptions of beneficence: liberalism, humanism, individualism, capitalism. MacIntyre yearns for a single, shared view of the good life as opposed to modern pluralism’s assumption that there can be many competing views of how to live well.
  • In philosophy he attacks consequentialism, the view that what matters about an action is its consequences, which is usually coupled with utilitarianism’s “greatest happiness” principle. He also rejects Kantianism—the identification of universal ethical maxims based on reason and applied to circumstances top down. MacIntyre’s critique routinely cites the contradictory moral principles adopted by the allies in the second world war. Britain invoked a Kantian reason for declaring war on Germany: that Hitler could not be allowed to invade his neighbours. But the bombing of Dresden (which for a Kantian involved the treatment of people as a means to an end, something that should never be countenanced) was justified under consequentialist or utilitarian arguments: to bring the war to a swift end.
  • MacIntyre seeks to oppose utilitarianism on the grounds that people are called on by their very nature to be good, not merely to perform acts that can be interpreted as good. The most damaging consequence of the Enlightenment, for MacIntyre, is the decline of the idea of a tradition within which an individual’s desires are disciplined by virtue. And that means being guided by internal rather than external “goods.” So the point of being a good footballer is the internal good of playing beautifully and scoring lots of goals, not the external good of earning a lot of money. The trend away from an Aristotelian perspective has been inexorable: from the empiricism of David Hume, to Darwin’s account of nature driven forward without a purpose, to the sterile analytical philosophy of AJ Ayer and the “demolition of metaphysics” in his 1936 book Language, Truth and Logic.
  •  
    The influential moral philosopher Alasdair MacIntyre has long stood outside the mainstream. Has the financial crisis finally vindicated his critique of global capitalism?
Weiye Loh

Rethinking the gene » Scienceline - 0 views

  • Currently, the public views genes primarily as self-contained packets of information that come from parents and are distinct from the environment. “The popular notion of the gene is an attractive idea—it’s so magical,” said Mark Blumberg, a developmental biologist at the University of Iowa in Iowa City. But it ignores the growing scientific understanding of how genes and local environments interplay, he said.
  • With the rise of molecular biology in the 1930s and genomics (the study of entire genomes) in the 1970s, scientists have developed a much more dynamic and complex picture of this interplay. The simplistic notion of the gene has been replaced with gene-environment interactions and developmental influences—nature and nurture as completely intertwined.
  • But the public hasn’t quite kept up. There remains a “huge chasm” between the way scientists understand genetics and the way the public understands it, said David Shenk, an author who has written extensively on genetics and intelligence.
  • ...8 more annotations...
  • the public still thinks of genes as blueprints, providing precise instructions for each individual.
  • “The elegant simplicity of the idea is so powerful,” said Shenk. Unfortunately, it is also false. The blueprint metaphor is fundamentally deceptive, he said, and “leads people to believe that any difference they see can be tied back to specific genes.”
  • Instead, Shenk advocates the metaphor of a giant mixing board, in which genes are a multitude of knobs and switches that get turned on and off depending on various factors in their environment. Interaction is key, though it goes against how most people see genetics: the classic, but inaccurate, dichotomies of nature versus nurture, innate versus acquired and genes versus environment.
  • Belief in those dichotomies is hard to eliminate because people tend to understand genetics through the two separate “tracks” of genes and the environment, according to speech communication expert Celeste Condit from the University of Georgia in Athens. Condit suggests that, whenever possible, explanations of genetics—by scientists, authors, journalists, or doctors—should draw connections between the two tracks, effectively merging them into one. “We need to link up the gene and environment tracks,” she said, “so that [people] can’t think of one without thinking of the other.”
  • Part of what makes these concepts so difficult lies in the language of genetics itself. A recent study by Condit in the September issue of Clinical Genetics found that when people hear the word genetics, they primarily think of heredity, or the quality of being heritable (passed from one generation to the next). Unfortunately, the terms heredity and heritable are often confused with heritability, which has a very different meaning.
  • heritability has single-handedly muddled the discourse of genetics to such a degree that even experts can’t keep it straight, argues historian of science Evelyn Fox Keller at the Massachusetts Institute of Technology in her recent book, The Mirage of a Space Between Nature and Nurture. Keller describes how heritability (in the technical literature) refers to how much of the variation in a trait is due to genetic explanation. But the term has seeped out into the general public and is, understandably, taken to mean heritable, or ability to be inherited. These concepts are fundamentally different, but often hard to grasp.
  • For example, let’s say that in a population with people of different heights, 60 percent of the variation in height is attributable to genes (as opposed to nutrition). The heritability of height is 60 percent. This does not mean, however, that 60 percent of an individual’s height comes from her genes, and 40 percent from what she ate growing up. Heritability refers to causes of variations (between people), not to causes of traits themselves (in each particular individual). The conflation of crucially different terms like traits and variations has wreaked havoc on the public understanding of genetics.
  • The stakes are high. Condit emphasizes how important a solid understanding of genetics is for making health decisions. Whether people see diabetes or lung cancer as determined by family history or responsive to changes in behavior depends greatly on how they understand genetics. Policy decisions about education, childcare, or the workplace are all informed by a proper understanding of the dynamic interplay of genes and the environment, and this means looking beyond the Mendelian lens of heredity. According to Shenk, everyone in the business of communicating these issues “needs to bend over backwards to help people understand.”
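The height example above can be made concrete with a small simulation (a hypothetical sketch: the 60 percent figure from the article is built in as an assumption, and the simulated spreads are illustrative, not data from the study). It shows that heritability is a statement about variation across a population, not about the composition of any one person's height:

```python
import random

random.seed(0)

# Simulate a population where each person's height is a baseline plus
# independent genetic and environmental (e.g. nutritional) deviations.
# The spreads are chosen so genes account for roughly 60% of variance,
# matching the article's illustrative figure.
n = 10_000
genetic = [random.gauss(0, 6.0) for _ in range(n)]       # cm deviations
environment = [random.gauss(0, 4.9) for _ in range(n)]   # cm deviations
heights = [170.0 + g + e for g, e in zip(genetic, environment)]

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

# Heritability: share of between-person variance attributable to genes.
h2 = variance(genetic) / (variance(genetic) + variance(environment))
print(f"heritability of height: {h2:.2f}")  # close to 0.60 by construction

# Note what h2 does NOT mean: for one individual, height is not
# "60% genes, 40% food" - the decomposition only exists across people.
```

A population of identically fed clones would have the same average height but a heritability near zero for the environmental component, which is why the statistic cannot be read back onto individuals.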
Weiye Loh

Hashtags, a New Way for Tweets - Cultural Studies - NYTimes.com - 0 views

  • hashtags have transcended the 140-characters-or-less microblogging platform, and have become a new cultural shorthand, finding their way into chat windows, e-mail and face-to-face conversations.
  • people began using hashtags to add humor, context and interior monologues to their messages — and everyday conversation. As Susan Orlean wrote in a New Yorker blog post titled “Hash,” the symbol can be “a more sophisticated, verbal version of the dread winking emoticon that tweens use to signify that they’re joking.”
  • “Because you have a hashtag embedded in a short message with real language, it starts exhibiting other characteristics of natural language, which means basically that people start playing with it and manipulating it,” said Jacob Eisenstein, a postdoctoral fellow at Carnegie Mellon University in computational linguistics. “You’ll see them used as humor, as sort of meta-commentary, where you’ll write a message and maybe you don’t really believe it, and what you really think is in the hashtag.”
  • ...2 more annotations...
  • Hashtags then began popping up outside of Twitter, in e-mails, chat windows and text messages.
  • Using a hashtag is also a way for someone to convey that they’re part of a certain scene.
Weiye Loh

Hayek, The Use of Knowledge in Society | Library of Economics and Liberty - 0 views

  • the "data" from which the economic calculus starts are never for the whole society "given" to a single mind which could work out the implications and can never be so given.
  • The peculiar character of the problem of a rational economic order is determined precisely by the fact that the knowledge of the circumstances of which we must make use never exists in concentrated or integrated form but solely as the dispersed bits of incomplete and frequently contradictory knowledge which all the separate individuals possess.
  • The economic problem of society
  • ...14 more annotations...
  • is a problem of the utilization of knowledge which is not given to anyone in its totality.
  • who is to do the planning. It is about this question that all the dispute about "economic planning" centers. This is not a dispute about whether planning is to be done or not. It is a dispute as to whether planning is to be done centrally, by one authority for the whole economic system, or is to be divided among many individuals. Planning in the specific sense in which the term is used in contemporary controversy necessarily means central planning—direction of the whole economic system according to one unified plan. Competition, on the other hand, means decentralized planning by many separate persons. The halfway house between the two, about which many people talk but which few like when they see it, is the
  • Which of these systems is likely to be more efficient depends mainly on the question under which of them we can expect that fuller use will be made of the existing knowledge.
  • It may be admitted that, as far as scientific knowledge is concerned, a body of suitably chosen experts may be in the best position to command all the best knowledge available—though this is of course merely shifting the difficulty to the problem of selecting the experts.
  • Today it is almost heresy to suggest that scientific knowledge is not the sum of all knowledge. But a little reflection will show that there is beyond question a body of very important but unorganized knowledge which cannot possibly be called scientific in the sense of knowledge of general rules: the knowledge of the particular circumstances of time and place. It is with respect to this that practically every individual has some advantage over all others because he possesses unique information of which beneficial use might be made, but of which use can be made only if the decisions depending on it are left to him or are made with his active coöperation.
  • the relative importance of the different kinds of knowledge; those more likely to be at the disposal of particular individuals and those which we should with greater confidence expect to find in the possession of an authority made up of suitably chosen experts. If it is today so widely assumed that the latter will be in a better position, this is because one kind of knowledge, namely, scientific knowledge, occupies now so prominent a place in public imagination that we tend to forget that it is not the only kind that is relevant.
  • It is a curious fact that this sort of knowledge should today be generally regarded with a kind of contempt and that anyone who by such knowledge gains an advantage over somebody better equipped with theoretical or technical knowledge is thought to have acted almost disreputably. To gain an advantage from better knowledge of facilities of communication or transport is sometimes regarded as almost dishonest, although it is quite as important that society make use of the best opportunities in this respect as in using the latest scientific discoveries.
  • The common idea now seems to be that all such knowledge should as a matter of course be readily at the command of everybody, and the reproach of irrationality leveled against the existing economic order is frequently based on the fact that it is not so available. This view disregards the fact that the method by which such knowledge can be made as widely available as possible is precisely the problem to which we have to find an answer.
  • One reason why economists are increasingly apt to forget about the constant small changes which make up the whole economic picture is probably their growing preoccupation with statistical aggregates, which show a very much greater stability than the movements of the detail. The comparative stability of the aggregates cannot, however, be accounted for—as the statisticians occasionally seem to be inclined to do—by the "law of large numbers" or the mutual compensation of random changes.
  • the sort of knowledge with which I have been concerned is knowledge of the kind which by its nature cannot enter into statistics and therefore cannot be conveyed to any central authority in statistical form. The statistics which such a central authority would have to use would have to be arrived at precisely by abstracting from minor differences between the things, by lumping together, as resources of one kind, items which differ as regards location, quality, and other particulars, in a way which may be very significant for the specific decision. It follows from this that central planning based on statistical information by its nature cannot take direct account of these circumstances of time and place and that the central planner will have to find some way or other in which the decisions depending on them can be left to the "man on the spot."
  • We need decentralization because only thus can we insure that the knowledge of the particular circumstances of time and place will be promptly used. But the "man on the spot" cannot decide solely on the basis of his limited but intimate knowledge of the facts of his immediate surroundings. There still remains the problem of communicating to him such further information as he needs to fit his decisions into the whole pattern of changes of the larger economic system.
  • The problem which we meet here is by no means peculiar to economics but arises in connection with nearly all truly social phenomena, with language and with most of our cultural inheritance, and constitutes really the central theoretical problem of all social science. As Alfred Whitehead has said in another connection, "It is a profoundly erroneous truism, repeated by all copy-books and by eminent people when they are making speeches, that we should cultivate the habit of thinking what we are doing. The precise opposite is the case. Civilization advances by extending the number of important operations which we can perform without thinking about them." This is of profound significance in the social field. We make constant use of formulas, symbols, and rules whose meaning we do not understand and through the use of which we avail ourselves of the assistance of knowledge which individually we do not possess. We have developed these practices and institutions by building upon habits and institutions which have proved successful in their own sphere and which have in turn become the foundation of the civilization we have built up.
  • To assume all the knowledge to be given to a single mind in the same manner in which we assume it to be given to us as the explaining economists is to assume the problem away and to disregard everything that is important and significant in the real world.
  • That an economist of Professor Schumpeter's standing should thus have fallen into a trap which the ambiguity of the term "datum" sets to the unwary can hardly be explained as a simple error. It suggests rather that there is something fundamentally wrong with an approach which habitually disregards an essential part of the phenomena with which we have to deal: the unavoidable imperfection of man's knowledge and the consequent need for a process by which knowledge is constantly communicated and acquired. Any approach, such as that of much of mathematical economics with its simultaneous equations, which in effect starts from the assumption that people's knowledge corresponds with the objective facts of the situation, systematically leaves out what is our main task to explain. I am far from denying that in our system equilibrium analysis has a useful function to perform. But when it comes to the point where it misleads some of our leading thinkers into believing that the situation which it describes has direct relevance to the solution of practical problems, it is high time that we remember that it does not deal with the social process at all and that it is no more than a useful preliminary to the study of the main problem.
  •  
    The Use of Knowledge in Society Hayek, Friedrich A.(1899-1992)
Weiye Loh

Climate Change: Study Says Dire Warnings Fuel Skepticism - TIME - 0 views

  • I had the chance to sift through TIME's decades of environment coverage. I came to two conclusions: First, we were writing stories about virtually the same subjects 40 years ago as we do now. (Air pollution, endangered species, the polluted oceans, dwindling natural resources.) Second, our coverage of climate change has been really scary — by which I mean, we've emphasized the catastrophic threats of global warming in dire language.
  • Scientists were telling us that global warming really had the potential to wreck the future of the planet, and we wanted to get that message across to readers — even if it meant scaring the hell out of them.
  • According to forthcoming research by the Berkeley psychologists Robb Willer and Matthew Feinberg, when people are shown scientific evidence or news stories on climate change that emphasize the most negative aspects of warming — extinguished species, melting ice caps, serial natural disasters — they are actually more likely to dismiss or deny what they're seeing. Far from scaring people into taking action on climate change, such messages seem to scare them straight into denial.
  • ...4 more annotations...
  • Willer and Feinberg tested participants' belief in global warming, and then their belief in what's called the just-world theory, which holds that life is generally fair and predictable. The subjects were then randomly assigned to read one of two newspaper-style articles. Both pieces were identical through the first four paragraphs, providing basic scientific information about climate change, but they differed in their conclusions, with one article detailing the possibly apocalyptic consequences of climate change, and the other ending with a more upbeat message about potential solutions to global warming.
  • participants given the doomsday articles came out more skeptical of climate change, while those who read the bright-side pieces came out less skeptical. The increase in skepticism was especially acute among subjects who'd scored high on the just-world scale, perhaps because the worst victims of global warming — the poor of the developing world, future generations, blameless polar bears — are the ones least responsible for it. Such unjust things couldn't possibly occur, and so the predictions can't be true. The results, Willer and Feinberg wrote, "demonstrate how dire messages warning of the severity of global warming and its presumed dangers can backfire ... by contradicting individuals' deeply held beliefs that the world is fundamentally just."
  • a climate scientist armed with data might argue that worldviews should be trumped by facts. But there's no denying that climate skepticism is on the rise
  • politicians — mostly on the right — have aggressively pushed the climate-change-is-a-hoax trope. The Climategate controversy of a year ago certainly might have played a role, too, though the steady decline in belief began well before those hacked e-mails were published. Still, the fact remains that if the point of the frightening images in global-warming documentaries like An Inconvenient Truth was to push audiences to act on climate change, they've been a failure theoretically and practically.
Weiye Loh

"Asian Values": a credible alternative to a universal conception of human rig... - 0 views

  • Singapore has not ratified the International Covenant on Civil and Political Rights, but as a member state of the United Nations is bound to respect “fundamental human rights”. But who decides these rights? Many commentators will argue that they are those enshrined in the Universal Declaration on Human Rights, in which Freedom of Expression is guaranteed by Article 19.
  • The United Nations Human Rights Committee has stressed that freedom of expression ensures the free political debate essential to democracy[ii] and has expressed concern that overbearing government controls of the media are incompatible with Freedom of Expression.
  • The Singapore government’s view is different. They have long asserted that human rights principles and conceptions are dominated by Western perceptions and argue for an “Asian Values” interpretation of human rights. This has been characterised as the assertion of the primacy of duty to the community over individual rights and the expectation of trust in authority and dominance of the state leaders.
  • ...4 more annotations...
  • The “Asian Values” hypothesis is equally suspect. The UDHR recognises the universal applicability of human rights and any nation party to this treaty is not permitted to restrict rights purely on cultural, religious or political grounds.
  • “Asian governments are justified in restricting civil and political rights in some circumstances in favour of social stability and economic growth. Civil and political rights are immaterial when people are destitute and society is unstable.  Accordingly, as luxuries to be enjoyed once there is social order, civil and political liberties must be temporarily suspended so as to not inhibit the government’s delivery of economic and social necessities and so as to not threaten or destroy future development plans.” Whilst this argument may have been slightly more palatable if Singapore’s citizens were, in fact, destitute, the reality is that Singapore is ranked as one of the world’s wealthiest countries and boasts a high life expectancy. Thus, in Singapore’s case, arguments made in favour of a “liberty trade-off” are rendered completely untenable.
  • these cultural and religious justifications for violating rights are as unacceptable as Singapore’s purported assertion of an “Asian Values” conception of human rights. Even though the Singapore government’s language is more subtle, their arguments amount to the same basic tenet: the purported justification of the denial of fundamental human rights by reference to culturally, religiously or politically specific norms. Speaking recently in New York, the UN Secretary-General, Ban Ki-moon, warned against such an interpretation of human rights:
  • “Yes, we recognize that social attitudes run deep.  Yes, social change often comes only with time.  Yet, let there be no confusion: where there is tension between cultural attitudes and universal human rights, universal human rights must carry the day. ” The universal and fundamental nature of human rights is the founding principle on which the United Nations was built: the right to freedom of expression must be guaranteed, “Asian Values” notwithstanding.
Weiye Loh

Rationally Speaking: Are Intuitions Good Evidence? - 0 views

  • Is it legitimate to cite one’s intuitions as evidence in a philosophical argument?
  • appeals to intuitions are ubiquitous in philosophy. What are intuitions? Well, that’s part of the controversy, but most philosophers view them as intellectual “seemings.” George Bealer, perhaps the most prominent defender of intuitions-as-evidence, writes, “For you to have an intuition that A is just for it to seem to you that A… Of course, this kind of seeming is intellectual, not sensory or introspective (or imaginative).”2 Other philosophers have characterized them as “noninferential belief due neither to perception nor introspection”3 or alternatively as “applications of our ordinary capacities for judgment.”4
  • Philosophers may not agree on what, exactly, intuition is, but that doesn’t stop them from using it. “Intuitions often play the role that observation does in science – they are data that must be explained, confirmers or the falsifiers of theories,” Brian Talbot says.5 Typically, the way this works is that a philosopher challenges a theory by applying it to a real or hypothetical case and showing that it yields a result which offends his intuitions (and, he presumes, his readers’ as well).
  • ...16 more annotations...
  • For example, John Searle famously appealed to intuition to challenge the notion that a computer could ever understand language: “Imagine a native English speaker who knows no Chinese locked in a room full of boxes of Chinese symbols (a data base) together with a book of instructions for manipulating the symbols (the program). Imagine that people outside the room send in other Chinese symbols which, unknown to the person in the room, are questions in Chinese (the input). And imagine that by following the instructions in the program the man in the room is able to pass out Chinese symbols which are correct answers to the questions (the output)… If the man in the room does not understand Chinese on the basis of implementing the appropriate program for understanding Chinese then neither does any other digital computer solely on that basis because no computer, qua computer, has anything the man does not have.” Should we take Searle’s intuition that such a system would not constitute “understanding” as good evidence that it would not? Many critics of the Chinese Room argument argue that there is no reason to expect our intuitions about intelligence and understanding to be reliable.
  • Ethics leans especially heavily on appeals to intuition, with a whole school of ethicists (“intuitionists”) maintaining that a person can see the truth of general ethical principles not through reason, but because he “just sees without argument that they are and must be true.”6
  • Intuitions are also called upon to rebut ethical theories such as utilitarianism: maximizing overall utility would require you to kill one innocent person if, in so doing, you could harvest her organs and save five people in need of transplants. Such a conclusion is taken as a reductio ad absurdum, requiring utilitarianism to be either abandoned or radically revised – not because the conclusion is logically wrong, but because it strikes nearly everyone as intuitively wrong.
  • British philosopher G.E. Moore used intuition to argue that the existence of beauty is good irrespective of whether anyone ever gets to see and enjoy that beauty. Imagine two planets, he said, one full of stunning natural wonders – trees, sunsets, rivers, and so on – and the other full of filth. Now suppose that nobody will ever have the opportunity to glimpse either of those two worlds. Moore concluded, “Well, even so, supposing them quite apart from any possible contemplation by human beings; still, is it irrational to hold that it is better that the beautiful world should exist than the one which is ugly? Would it not be well, in any case, to do what we could to produce it rather than the other? Certainly I cannot help thinking that it would."7
  • Although similar appeals to intuition can be found throughout all the philosophical subfields, their validity as evidence has come under increasing scrutiny over the last two decades, from philosophers such as Hilary Kornblith, Robert Cummins, Stephen Stich, Jonathan Weinberg, and Jaakko Hintikka (links go to representative papers from each philosopher on this issue). The severity of their criticisms vary from Weinberg’s warning that “We simply do not know enough about how intuitions work,” to Cummins’ wholesale rejection of philosophical intuition as “epistemologically useless.”
  • One central concern for the critics is that a single question can inspire totally different, and mutually contradictory, intuitions in different people.
  • For example, I disagree with Moore’s intuition that it would be better for a beautiful planet to exist than an ugly one even if there were no one around to see it. I can’t understand what the words “better” and “worse,” let alone “beautiful” and “ugly,” could possibly mean outside the domain of the experiences of conscious beings
  • If we want to take philosophers’ intuitions as reason to believe a proposition, then the existence of opposing intuitions leaves us in the uncomfortable position of having reason to believe both a proposition and its opposite.
  • “I suspect there is overall less agreement than standard philosophical practice presupposes, because having the ‘right’ intuitions is the entry ticket to various subareas of philosophy,” Weinberg says.
  • But the problem that intuitions are often not universally shared is overshadowed by another problem: even if an intuition is universally shared, that doesn’t mean it’s accurate. For in fact there are many universal intuitions that are demonstrably false.
  • People who have not been taught otherwise typically assume that an object dropped out of a moving plane will fall straight down to earth, at exactly the same latitude and longitude from which it was dropped. What will actually happen is that, because the object begins its fall with the same forward momentum it had while it was on the plane, it will continue to travel forward, tracing out a curve as it falls and not a straight line. “Considering the inadequacies of ordinary physical intuitions, it is natural to wonder whether ordinary moral intuitions might be similarly inadequate,” Princeton’s Gilbert Harman has argued,9 and the same could be said for our intuitions about consciousness, metaphysics, and so on.
  • We can’t usually “check” the truth of our philosophical intuitions externally, with an experiment or a proof, the way we can in physics or math. But it’s not clear why we should expect intuitions to be true. If we have an innate tendency towards certain intuitive beliefs, it’s likely because they were useful to our ancestors.
  • But there’s no reason to expect that the intuitions which were true in the world of our ancestors would also be true in other, unfamiliar contexts
  • And for some useful intuitions, such as moral ones, “truth” may have been beside the point. It’s not hard to see how moral intuitions in favor of fairness and generosity would have been crucial to the survival of our ancestors’ tribes, as would the intuition to condemn tribe members who betrayed those reciprocal norms. If we can account for the presence of these moral intuitions by the fact that they were useful, is there any reason left to hypothesize that they are also “true”? The same question could be asked of the moral intuitions which Jonathan Haidt has classified as “purity-based” – an aversion to incest, for example, would clearly have been beneficial to our ancestors. Since that fact alone suffices to explain the (widespread) presence of the “incest is morally wrong” intuition, why should we take that intuition as evidence that “incest is morally wrong” is true?
  • The still-young debate over intuition will likely continue to rage, especially since it’s intertwined with a rapidly growing body of cognitive and social psychological research examining where our intuitions come from and how they vary across time and place.
  • its resolution bears on the work of literally every field of analytic philosophy, except perhaps logic. Can analytic philosophy survive without intuition? (If so, what would it look like?) And can the debate over the legitimacy of appeals to intuition be resolved with an appeal to intuition?
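The falling-object example a few bullets above is one of the few claims here that can be checked directly. A minimal kinematics sketch (the plane speed and drop altitude are hypothetical values chosen for illustration, and air resistance is ignored):

```python
# Sketch of the "object dropped from a moving plane" intuition discussed above.
# Assumptions (not from the article): plane speed 100 m/s, drop altitude 1000 m,
# no air resistance, g = 9.8 m/s^2.

G = 9.8  # gravitational acceleration, m/s^2

def landing_offset(speed_mps: float, altitude_m: float) -> float:
    """Horizontal distance travelled between release and impact."""
    fall_time = (2 * altitude_m / G) ** 0.5  # from altitude = 0.5 * g * t^2
    return speed_mps * fall_time             # forward momentum is retained

offset = landing_offset(100.0, 1000.0)
print(round(offset, 1))  # roughly 1.4 km ahead of the release point
```

Because the object keeps the plane's forward velocity while it falls, it lands well over a kilometre ahead of the release point, tracing a parabola rather than the straight vertical drop the untutored intuition predicts.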
Weiye Loh

Rationally Speaking: Between scientists and citizens, part I - 0 views

  • The authors suggest that there are two publics for science communication, one that is liberal, educated and with a number of resources at its disposal; the other with less predictable and less-formed opinions. The authors explored empirically (via a survey of 108 Colorado citizens) the responses of liberal and educated people to scientific jargon by exposing them to two “treatments”: jargon-laden vs lay terminology news articles. The results found that scientists were considered the most credible sources in the specific area of environmental science (94.3% agreed), followed by activists (61.1%). The least credible were industry representatives, clergy and celebrities. (Remember, this is among liberal educated people.) Interestingly, the use of jargon per se did not increase acceptance of the news source or of the content of the story. So the presence of scientific expertise is important, not so the presence of actual scientific details in the story.
  • There is no complete account of the scientific method, and again one can choose certain methods rather than others, depending on what one is trying to accomplish (a choice that is itself informed by one’s values). And of course the Duhem-Quine thesis shows that there is no straightforward way to falsify scientific theories (contra Popper). If there were supernatural causes that interact with (or override) the causes being studied by science, but are themselves undiscoverable, this would lead to false conclusions and bad predictions. Which means that the truth is discoverable empirically only if such supernatural causes are not active. Science cannot answer the question of whether such factors are present, which raises the question of whether we ought to proceed as if they were not (i.e., methodological naturalism).
  • Expertise is often thought of in terms of skills, but within the context of science communication it really refers to authority and credibility. Expertise is communicated at least in part through the use of jargon, with which of course most journalists are not familiar. Jargon provides an air of authority, but at the same time the concepts referred to become inaccessible to non-specialists. Interestingly, journalists prefer sources that limit the use of jargon, but they themselves deploy jargon to demonstrate scientific proficiency.
Weiye Loh

The Fake Scandal of Climategate - 0 views

  • The most comprehensive inquiry was the Independent Climate Change Email Review led by Sir Muir Russell, commissioned by UEA to examine the behaviour of the CRU scientists (but not the scientific validity of their work). It published its final report in July 2010
  • It focused on what the CRU scientists did, not what they said, investigating the evidence for and against each allegation. It interviewed CRU and UEA staff, and took 111 submissions including one from CRU itself. And it also did something the media completely failed to do: it attempted to put the actions of CRU scientists into context.
    • Weiye Loh
       
      Data, in the form of email correspondence, requires context to be interpreted "objectively" and "accurately" =)
  • The Review went back to primary sources to see if CRU really was hiding or falsifying their data. It considered how much CRU’s actions influenced the IPCC’s conclusions about temperatures during the past millennium. It commissioned a paper by Dr Richard Horton, editor of The Lancet, on the context of scientific peer review. And it asked IPCC Review Editors how much influence individuals could wield on writing groups.
  • ...16 more annotations...
  • Many of these are things any journalist could have done relatively easily, but few ever bothered to do.
  • the emergence of the blogosphere requires significantly more openness from scientists. However, providing the details necessary to validate large datasets can be difficult and time-consuming, and how FoI laws apply to research is still an evolving area. Meanwhile, the public needs to understand that science cannot and does not produce absolutely precise answers. Though the uncertainties may become smaller and better constrained over time, uncertainty in science is a fact of life which policymakers have to deal with. The chapter concludes: “the Review would urge all scientists to learn to communicate their work in ways that the public can access and understand”.
  • email is less formal than other forms of communication: “Extreme forms of language are frequently applied to quite normal situations by people who would never use it in other communication channels.” The CRU scientists assumed their emails to be private, so they used “slang, jargon and acronyms” which would have been more fully explained had they been talking to the public. And although some emails suggest CRU went out of their way to make life difficult for their critics, there are others which suggest they were bending over backwards to be honest. Therefore the Review found “the e-mails cannot always be relied upon as evidence of what actually occurred, nor indicative of actual behaviour that is extreme, exceptional or unprofessional.” [section 4.3]
  • when put into the proper context, what do these emails actually reveal about the behaviour of the CRU scientists? The report concluded (its emphasis):
  • we find that their rigour and honesty as scientists are not in doubt.
  • we did not find any evidence of behaviour that might undermine the conclusions of the IPCC assessments.
  • “But we do find that there has been a consistent pattern of failing to display the proper degree of openness, both on the part of the CRU scientists and on the part of the UEA, who failed to recognize not only the significance of statutory requirements but also the risk to the reputation of the University and indeed, to the credibility of UK climate science.” [1.3]
  • The argument that Climategate reveals an international climate science conspiracy is not really a very skeptical one. Sure, it is skeptical in the weak sense of questioning authority, but it stops there. Unlike true skepticism, it doesn’t go on to objectively examine all the evidence and draw a conclusion based on that evidence. Instead, it cherry-picks suggestive emails, seeing everything as incontrovertible evidence of a conspiracy, and concludes all of mainstream climate science is guilty by association. This is not skepticism; this is conspiracy theory.
    • Weiye Loh
       
      How then do we know that we have examined ALL the evidence? What about the context of evidence then? 
  • The media dropped the ball. There is a famous quotation attributed to Mark Twain: “A lie can travel halfway around the world while the truth is putting on its shoes.” This is more true in the internet age than it was when Mark Twain was alive. Unfortunately, it took months for the Climategate inquiries to put on their shoes, and by the time they reported, the damage had already been done. The media acted as an uncritical loudspeaker for the initial allegations, which will now continue to circulate around the world forever, then failed to give anywhere near the same amount of coverage to the inquiries clearing the scientists involved. For instance, Rupert Murdoch’s The Australian published no less than 85 stories about Climategate, but not one about the Muir Russell inquiry.
  • Even the Guardian, who have a relatively good track record on environmental reporting and were quick to criticize the worst excesses of climate conspiracy theorists, could not resist the lure of stolen emails. As George Monbiot writes, journalists see FoI requests and email hacking as a way of keeping people accountable, rather than the distraction from actual science which they are to scientists. In contrast, CRU director Phil Jones says: “I wish people would spend as much time reading my scientific papers as they do reading my e-mails.”
  • This is part of a broader problem with climate change reporting: the media holds scientists to far higher standards than it does contrarians. Climate scientists have to be right 100% of the time, but contrarians apparently can get away with being wrong nearly 100% of the time. The tiniest errors of climate scientists are nitpicked and blown out of all proportion, but contrarians get away with monstrous distortions and cherry-picking of evidence. Around the same time The Australian was bashing climate scientists, the same newspaper had no problem publishing Viscount Monckton’s blatant misrepresentations of IPCC projections (not to mention his demonstrably false conspiracy theory that the Copenhagen summit was a plot to establish a world government).
  • In the current model of environmental reporting, the contrarians do not lose anything by making baseless accusations. In fact, it is in their interests to throw as much mud at scientists as possible to increase the chance that some of it will stick in the public consciousness. But there is untold damage to the reputation of the scientists against whom the accusations are being made. We can only hope that in future the media will be less quick to jump to conclusions. If only editors and producers would stop and think for a moment about what they’re doing: they are playing with the future of the planet.
  • As worthy as this defense is, surely this is the kind of political bun-fight SkS has resolutely stayed away from since its inception. The debate can only become a quagmire of competing claims, because this is part of an adversarial process that does not depend on, or even require, scientific evidence. Only by sticking resolutely to the science and the advocacy of the scientific method can SkS continue to avoid being drowned in the kind of mud through which we are obliged to wade elsewhere.
  • I disagree with gp. It is past time we all got angry, very angry, at what these people have done and continue to do. Dispassionate science doesn't cut it with the denial industry or with the media (and that "or" really isn't there). It's time to fight back with everything we can throw back at them.
  • The fact that three quick-fire threads have been run on Climategate on this excellent blog in the last few days is an indication that Climategate (fairly or not) has done serious damage to the cause of AGW activism. Mass media always overshoots and exaggerates. The AGW alarmists had a very good run - here in Australia protagonists like Tim Flannery and our living science legend Robin Williams were talking catastrophe - the 10 year drought was definitely permanent climate change - rivers might never run again - Robin (100 metre sea level rise) Williams refused to even read the Climategate emails. Climategate swung the pendulum to the other extreme - the scientists (nearly all funded by you and me) were under the pump. Their socks rubbed harder on their sandals as they scrambled for clear air. Cries about criminal hackers funded by big oil, tobacco, rightist conspirators etc were heard. Pachauri cried 'voodoo science' as he denied ever knowing about objections to the preposterous 2035 claim. How things change in a year. The drought is broken over most of Australia - Tim Flannery has gone quiet and Robin Williams is airing a science journo who says that AGW scares have been exaggerated. Some balance might have been restored as the pendulum swung, and our hard-working, misunderstood scientist brethren will take more care with their emails in future.
  • "Perhaps a more precise description would be that a common pattern in global warming skeptic arguments is to focus on narrow pieces of evidence while ignoring other evidence that contradicts their argument." And this is the issue the article discusses, but in my opinion this article is guilty of this as well. It focuses on a narrow set of non-representative claims, claims which are indeed pure propaganda by some skeptics; however, the article also suggests guilt by association, and as such these propaganda claims then get attributed as the opinions of the entire skeptic camp. In doing so, the OP becomes guilty of the very same issue the OP tries to address. In other words, the issue I try to raise is not about the exact numbers or figures or any particular facts but the fact that the claim I quoted is obvious nonsense. It is nonsense because it is a sweeping statement with no specifics, and as such it is an empty statement and means nothing. A second point I have been thinking about when reading this article is why should scientists be granted immunity to dirty tricks/propaganda in a political debate? Is it because they speak under the name of science? If that is the case, why shall we not grant the same right to spokesmen for other organizations?
    • Weiye Loh
       
      The aspiration to examine ALL evidence is again called into question here. Is it really possible to examine ALL evidence? Even if we have examined them, can we fully represent our examination? From our lab, to the manuscript, to the journal paper, to the news article, to 140characters tweets?
Weiye Loh

Skepticblog » Throwing Cold Water on a Hot Topic - 0 views

  • I FIRST MET BJORN LOMBORG IN 2001 upon the publication of his Cambridge University Press book, The Skeptical Environmentalist, which I found to be a refreshing perspective on what had been the doom-and-gloom, end-of-the-world scenarios that I had been hearing since I was an undergraduate in the early 1970s. Back then we were told that overpopulation would lead to worldwide hunger and starvation, that there would be massive oil depletion, precious mineral exhaustion, and rainforest extinction by the 1990s. These predictions failed utterly. I felt I had been lied to for decades by the environmentalist movement that seemed to me to be little more than a political movement that raised money by raising fears.
  • Lomborg’s publicist thought that I might be interested in hosting him for the Skeptics Society’s public science lecture series at the California Institute of Technology that I organize and host. I was, but given the highly debatable nature of many of Lomborg’s claims I only agreed to host him if it could be a debate. Lomborg agreed at once to debate anyone, and this is where the trouble began. I could not find anyone to debate Lomborg. I contacted all of the top environmental organizations, and to a one they all refused to participate.
  • “There is no debate,” one told me. “We don’t want to dignify that book,” said another. I even called Paul Ehrlich, the author of the wildly popular bestselling book The Population Bomb — another apocalyptic prognostication that served as something of a catalyst in the 1970s for delimiting population growth — but he turned me down flat, warning me in no uncertain language that my reputation within the scientific community would be irreparably harmed if I went through with it.
Weiye Loh

Rationally Speaking: A different kind of moral relativism - 0 views

  • Prinz’s basic stance is that moral values stem from our cognitive hardware, upbringing, and social environment. These equip us with deep-seated moral emotions, but these emotions express themselves in a contingent way due to cultural circumstances. And while reason can help, it has limited influence and can only reshape our ethics up to a point; it cannot settle major differences between different value systems. Therefore, it is difficult, if not impossible, to construct an objective morality that transcends emotions and circumstance.
  • As Prinz writes, in part:“No amount of reasoning can engender a moral value, because all values are, at bottom, emotional attitudes. … Reason cannot tell us which facts are morally good. Reason is evaluatively neutral. At best, reason can tell us which of our values are inconsistent, and which actions will lead to fulfillment of our goals. But, given an inconsistency, reason cannot tell us which of our conflicting values to drop or which goals to follow. If my goals come into conflict with your goals, reason tells me that I must either thwart your goals, or give up caring about mine; but reason cannot tell me to favor one choice over the other. … Moral judgments are based on emotions, and reasoning normally contributes only by helping us extrapolate from our basic values to novel cases. Reasoning can also lead us to discover that our basic values are culturally inculcated, and that might impel us to search for alternative values, but reason alone cannot tell us which values to adopt, nor can it instill new values.”
  • This moral relativism is not the absolute moral relativism of, supposedly, bands of liberal intellectuals, or of postmodernist philosophers. It presents a more serious challenge to those who argue there can be objective morality. To be sure, there is much Prinz and I agree on. At the least, we agree that morality is largely constructed by our cognition, upbringing, and social environment; and that reason has the power to synthesize and clarify our worldviews, and help us plan for and react to life’s situations.
  • ...5 more annotations...
  • Suppose I concede to Prinz that reason cannot settle differences in moral values and sentiments. Difference of opinion doesn’t mean that there isn’t a true or rational answer. In fact, there are many reasons why our cognition, emotional reactions or previous values could be wrong or irrational — and why people would not pick up on their deficiencies. In his article, Prinz uses the case of sociopaths, who simply lack certain cognitive abilities. There are many reasons other than sociopathy why human beings can get things wrong, morally speaking, often and badly. It could be that people are unable to adopt a more objective morality because of their circumstances — from brain deficiencies to lack of access to relevant information. But, again, none of this amounts to an argument against the existence of objective morality.
  • As it turns out, Prinz’s conception of objective morality does not quite reflect the thinking of most people who believe in objective morality. He writes that: “Objectivism holds that there is one true morality binding upon all of us.” This is a particular strand of moral realism, but there are many. For instance, one can judge some moral precepts as better than others, yet remain open to the fact that there are probably many different ways to establish a good society. This is a pluralistic conception of objective morality which doesn’t assume one absolute moral truth. For all that has been said, Sam Harris’ idea of a moral landscape does help illustrate this concept. Thinking in terms of better and worse morality gets us out of relativism and into an objectivist approach. The important thing to note is that one need not go all the way to absolute objectivity to work toward a rational, non-arbitrary morality.
  • even Prinz admits that “Relativism does not entail that we should tolerate murderous tyranny. When someone threatens us or our way of life, we are strongly motivated to protect ourselves.” That is, there are such things as better and worse values: the worse ones kill us, the better ones don’t. This is a very broad criterion, but it is an objective standard. Prinz is arguing for a tighter moral relativism – a sort of stripped down objective morality that is constricted by nature, experience, and our (modest) reasoning abilities.
  • I proposed at the discussion that a more objective morality could be had with the help of a robust public discourse on the issues at hand. Prinz does not necessarily disagree. He wrote that “Many people have overlapping moral values, and one can settle debates by appeal to moral common ground.” But Prinz pointed out a couple of limitations on public discourse. For example, the agreements we reach on “moral common ground” are often exclusive of some, and abstract in content. Consider the United Nations Declaration of Human Rights, a seemingly good example of global moral agreement. Yet, it was ratified by a small sample of 48 countries, and it is based on suspiciously Western sounding language. Everyone has a right to education and health care, but — Prinz pointed out during the discussion — what level of education and health care? Still, the U.N. declaration was passed 48-0 with just 8 abstentions (Belarus, Czechoslovakia, Poland, Ukraine, USSR, Yugoslavia, South Africa and Saudi Arabia). It includes 30 articles of ethical standards agreed upon by 48 countries around the world. Such a document does give us more reason to think that public discourse can lead to significant agreement upon values.
  • Reason might not be able to arrive at moral truths, but it can push us to test and question the rationality of our values — a crucial part in the process that leads to the adoption of new, or modified values. The only way to reduce disputes about morality is to try to get people on the same page about their moral goals. Given the above, this will not be easy, and perhaps we shouldn’t be too optimistic in our ability to employ reason to figure things out. But reason is still the best, and even only, tool we can wield, and while it might not provide us with a truly objective morality, it’s enough to save us from complete moral relativism.