Group items tagged Narratives

Weiye Loh

Roger Pielke Jr.'s Blog: Flawed Food Narrative in the New York Times - 0 views

  • The article relies heavily on empty appeals to authority. For example, it makes an unsupported assertion about what "scientists believe": "Many of the failed harvests of the past decade were a consequence of weather disasters, like floods in the United States, drought in Australia and blistering heat waves in Europe and Russia. Scientists believe some, though not all, of those events were caused or worsened by human-induced global warming." Completely unmentioned are the many (most?) scientists who believe that evidence is lacking to connect recent floods and heat waves to "human-induced global warming."
  • Some important issues beyond carbon dioxide are raised in the article, but are presented as secondary to the carbon narrative. Other important issues are completely ignored -- for example, wheat rust goes unmentioned, and it probably poses a greater risk to food supplies in the short term than anything to do with carbon dioxide. The carbon dioxide-centric focus of the article provides a nice illustration of how an obsession with "global warming" can serve to distract attention from factors that actually matter more for issues of human and environmental concern.
  • The central thesis of the NYT article is the following statement: "The rapid growth in farm output that defined the late 20th century has slowed to the point that it is failing to keep up with the demand for food, driven by population increases and rising affluence in once-poor countries." But this claim of slowing output is shown to be completely false by the graphic that accompanies the article. Far from slowing, farm output has increased dramatically over the past half-century (left panel of the graphic) and on a per capita basis in 2009 was higher than at any point since the early 1980s (right panel).
  •  
    Today's New York Times has an article by Justin Gillis on global food production that strains itself to the breaking point to make a story fit a narrative. The narrative, of course, is that climate change "is helping to destabilize the food system." The problem with the article is that the data it presents don't support this narrative. Before proceeding, let me reiterate that human-caused climate change is a threat and one that we should be taking seriously. But taking climate change seriously does not mean shoehorning every global concern into that narrative, and especially not conflating concerns about the future with what has been observed in the past. The risk, of course, of putting a carbon-centric spin on every issue is that other important dimensions are neglected.
Weiye Loh

Rationally Speaking: Truth from fiction: truth or fiction? - 0 views

  • Literature teaches us about life. Literature helps us understand the world.
  • this belief in truth-from-fiction is the party line for those who champion the merits of literature. Eminent English professor and critic Harold Bloom proclaims, in his bestselling How to Read and Why, that one of the main reasons to read literature is because "we require knowledge, not just of self and others, but of the way things are."
  • why would we expect literature to be a reliable source of knowledge about "the way things are"? After all, the narratives which are the most gripping and satisfying to read are not the most representative of how the world actually works. They have dramatic resolutions, foreshadowing, conflict, climax, and surprise. People tend to get their comeuppance after they misbehave. People who pursue their dream passionately tend to succeed. Disaster tends to strike when you least expect it. These narratives are over-represented in literature because they're more gratifying to read; why would we expect to learn from them about "the way things are"?
  • even if authors were all trying to faithfully represent the world as they perceived it, why would we expect their perceptions to be any more universally true than anyone else's?
  • I can't see any reason to give any more weight to the implicit arguments of a novel than we would give to the explicit arguments of any individual person. And yet when we read a novel or study it in school, especially if it's a hallowed classic, we tend to treat its arguments as truths.
  •  
    Truth from fiction: truth or fiction? (Friday, June 18, 2010)
Weiye Loh

An insider's view of academic censorship in Singapore | Asian Correspondent - 0 views

  • Mark, who is now assistant professor of history at the University of Hong Kong, talks candidly about the censorship, both self-imposed and external, that guided his research and writing.
  • During my 6 years in the city, I definitely became ever more acutely aware of "political sensitivities". Thus, there were comments that came up in interviews with some of Singapore's former political detainees (interviews which are cited in the book) that were not included because they would have possibly resulted in libel actions. There were other things, such as the deviousness of LKY's political negotiations with the British in the late 50s and early 60s, which we could have gone into further (the details have been published) rather than just pointing to them in the footnotes. Was this the result of a subconscious self-censorship or a desire to move the story on? I'm still thinking about that one. But I do recall that, as a foreign academic working at the National Univ. of Singapore, you inevitably became careful about what sort of public criticism you directed at your paymasters. No doubt, this carefulness ultimately seeps into you (though I think good work can be done in Singapore, nevertheless, and many people in academia there continue to do it).
  • The decision to halt Singapore: a Biography in 1965, and in that sense narrow the narrative, was a very conscious one. I am still not comfortable tackling Singapore's political history after 1965, given the current political constraints in the Republic, and the official control of the archive. I have told publishers who have enquired about us extending the story or writing a sequel that this would involve a narrative far more critical of the ruling party. Repressive political measures that might have garnered a degree of popular support in the turbulent early-60s became, I believe, for many Singaporeans, less justifiable and more reprehensible in the 70s and 80s (culminating with the disgust that many people felt over the treatment of Catholic agitators involved in the so-called "Marxist conspiracy" of 1987).
  • As for the rise of the PAP, my personal view is that in the late 1950s the PAP was the only viable alternative to colonial rule, once Marshall had bailed - that is, in terms of getting Singapore out of its postwar social and economic predicament. As much as my heart is with the idealists who founded the Barisan, I'm not sure they would have achieved the same practical results as the PAP did in its first 5 years, had they got into power. There were already rifts in the Barisan prior to Operation Cold Store in 1963, and the more one looks into the party at this time, the more chaotic it appears. (Undoubtedly, this chaos was also a result of the pressures exerted upon it by the PAP.)
  • when the Barisan was systematically destroyed, hopeless though its leaders might have proved as technocrats, Singapore turned a corner. From 1963, economic success and political stability were won at the expense of freedom of expression and 'responsible dissent', generating a conformity, an intellectual sterility and a deep loss of historical identity that I hope the Epilogue to the book conveys. That's basically my take on the rise of the PAP. The party became something very different from 1963.
  •  
    An insider's view of academic censorship in Singapore
Weiye Loh

'There Is No Values-Free Form Of Education,' Says U.S. Philosopher - Radio Fr... - 0 views

  • from the earliest years, education should be based primarily on exploration, understanding in depth, and the development of logical, critical thinking. Such an emphasis, she says, not only produces a citizenry capable of recognizing and rooting out political jingoism and intolerance. It also produces people capable of questioning authority and perceived wisdom in ways that enhance innovation and economic competitiveness. Nussbaum warns against a narrow educational focus on technical competence.
  • a successful, long-term democracy depends on a citizenry with certain qualities that can be fostered by education.
  • The first is the capacity we associate in the Western tradition with Socrates, but it certainly appears in all traditions -- that is, the ability to think critically about proposals that are brought your way, to analyze an argument, to distinguish a good argument from a bad argument. And just in general, to lead what Socrates called “the examined life.” Now that’s, of course, important because we know that people are very prone to go along with authority, with fashion, with peer pressure. And this kind of critical enlivened citizenry is the only thing that can keep democracy vital.
  • it can be trained from very early in a child’s education. There’re ways that you can get quite young children to recognize what’s a good argument and what’s a bad argument. And as children grow older, it can be done in a more and more sophisticated form until by the time they’re undergraduates in universities they would be studying Plato’s dialogues for example and really looking at those tricky arguments and trying to figure out how to think. And this is important not just for the individual thinking about society, but it’s important for the way people talk to each other. In all too many public discussions people just throw out slogans and they throw out insults. And what democracy needs is listening. And respect. And so when people learn how to analyze an argument, then they look at what the other person’s saying differently. And they try to take it apart, and they think: “Well, do I share some of those views and where do I differ here?” and so on. And this really does produce a much more deliberative, respectful style of public interaction.
  • The second [quality] is what I call “the ability to think as a citizen of the whole world.” We’re all narrow and this is again something that we get from our animal heritage. Most non-human animals just think about the group. But, of course, in this world we need to think, first of all, about our whole nation -- its many different groups, minority and majority. And then we need to think outside the nation, about how problems involving, let’s say, the environment or global economy and so on need cooperative resolution that brings together people from many different nations.
  • That’s complicated and it requires learning a lot of history, and it means learning not just to parrot some facts about history but to think critically about how to assess historical evidence. It means learning how to think about the global economy. And then I think particularly important in this era, it means learning something about the major world religions. Learning complicated, nonstereotypical accounts of those religions because there’s so much fear that’s circulating around in every country that’s based usually on just inadequate stereotypes of what Muslims are or whatever. So knowledge can at least begin to address that.
  • the third thing, which I think goes very closely with the other two, is what I call “the narrative imagination,” which is the ability to put yourself in the shoes of another person to have some understanding of how the world looks from that point of view. And to really have that kind of educated sympathy with the lives of others. Now again this is something we come into the world with. Psychologists have now found that babies less than a year old are able to take up the perspective of another person and do things, see things from that perspective. But it’s very narrow and usually people learn how to think about what their parents are thinking and maybe other family members but we need to extend that and develop it, and learn how the world looks from the point of view of minorities in our own culture, people outside our culture, and so on.
  • since we can’t go to all the places that we need to understand -- it’s accomplished by reading narratives, reading literature, drama, participating through the arts in the thought processes of another culture. So literature and the arts are the major ways we would develop and extend that capacity.
  • For many years, the leading model of development ... used by economists and international agencies measuring welfare was simply that for a country to develop means to increase [its] gross domestic product per capita. Now, in recent years, there has been a backlash to that because people feel that it just doesn’t ask enough about what goods are really doing for people, what can people really do and be.
  • so since the 1990s the United Nations’ development program has produced annually what’s called a “Human Development Report” that looks at things like access to education, access to health care. In other words, a much richer menu of human chances and opportunities that people have. And at the theoretical end I’ve worked for about 20 years now with economist Amartya Sen, who won the Nobel Prize in 1998 for economics. And we’ve developed this account: for us, what it is for a country to do better is to enhance the set of capabilities, meaning substantial opportunities that people have to lead meaningful, fruitful lives. And then I go on to focus on a certain core group of those capabilities that I think ought to be protected by constitutional law in every country.
  • Life; health; bodily integrity; the development of senses, imagination, and thought; the development of practical reason; opportunities to have meaningful affiliations both friendly and political with other people; the ability to have emotional health -- not to be in other words dominated by overwhelming fear and so on; the ability to have a productive relationship with the environment and the world of nature; the ability to play and have leisure time, which is something that I think people don’t think enough about; and then, finally, control over one’s material and social environment, some measure of control. Now of course, each of these is very abstract, and I specify them further. Although I also think that each country needs to finally specify them with its own particular circumstances in view.
  • when kids learn in a classroom that just makes them sit in a chair, well, they can take in something in their heads, but it doesn’t make them competent at negotiating in the world. And so starting, at least, with Jean Jacques Rousseau in the 18th century, people thought: “Well, if we really want people to be independent citizens in a democracy that means that we can’t have whole classes of people who don’t know how to do anything, who are just simply sitting there waiting to be waited on in practical matters.” And so the idea that children should participate in their practical environment came out of the initial democratizing tendencies that went running through the 18th century.
  • even countries that absolutely do not want that kind of engaged citizenry see that for the success of business these abilities are pretty important. Both Singapore and China have conducted mass education reforms over the last five years because they realized that their business cultures don’t have enough imagination and they also don’t have enough critical thinking, because you can have an awfully corrupt business culture if no one is willing to say the unpleasant word or make a criticism.
  • So they have striven to introduce more critical thinking and more imagination into their curricula. But, of course, for them, they want to cordon it off -- they want to do it in the science classroom, in the business classroom, but not in the politics classroom. Well, we’ll see -- can they do that? Can they segment it that way? I think democratic thinking is awfully hard to segment as current events in the Middle East are showing us. It does have the tendency to spread.
  • so maybe the people in Singapore and China will not like the end result of what they tried to do or maybe the reform will just fail, which is equally likely -- I mean the educational reform.
  • if you really don’t want democracy, this is not the education for you. It had its origins in the ancient Athenian democracy which was a very, very strong participatory democracy and it is most at home in really true democracy, where our whole goal is to get each and every person involved and to get them thinking about things. So, of course, if politicians have ambivalence about that goal they may well not want this kind of education.
  • when we bring up children in the family or in the school, we are always engineering. I mean, there is no values-free form of education in the world. Even an education that just teaches you a list of facts has values built into it. Namely, it gives a negative value to imagination and to the critical faculties and a very high value to a kind of rote, technical competence. So, you can't avoid shaping children.
  • Increasingly the child should be in control and should become free. And that's what the critical thinking is all about -- it's about promoting freedom as the child goes on. So, the end product should be an adult who is really thinking for him- or herself about the direction of society. But you don't get freedom just by saying, "Oh, you are free." Progressive educators who simply stopped teaching found out very quickly that that didn't produce freedom. Even in some of the very extreme forms of progressive school, where children were just allowed to say every day what it was they wanted to learn, they found that didn't give the child the kind of mastery of self and of the world that you really need to be a free person.
Weiye Loh

Would Society Benefit from Good Digital Hoaxes? | The Utopianist - Think Bigger - 0 views

  •  
    can such hoaxes be beneficial? If a Western audience was in fact impelled to learn more about the social woes in Syria, is this a net gain for society in general? Should such well-intentioned projects be condoned, even perhaps emulated in certain ways if deemed an effective educational tool? Could we use this format - a narrative-driven account of important far-flung events that allows the audience a portal into such events - in a way that may be more engaging than typical AP newswire reportage? People tend to connect better to emotion-filled story arcs than to a recitation of facts, after all. Perhaps instead of merely piling on MacMaster, we can learn something from his communication strategy …
Weiye Loh

Greening the screen » Scienceline - 0 views

  • But not all documentaries take such a novel approach. Randy Olson, a marine biologist-turned-filmmaker at the University of Southern California, is a harsh critic of what he sees as a very literal-minded, information-heavy approach within the environmental film genre. Well-intentioned environmental documentary filmmakers are just “making their same, boring, linear, one-dimensional explorations of issues,” said Olson. “The public’s not buying it.”
  • The problem may run deeper than audience tallies — after all, An Inconvenient Truth currently ranks as the sixth-highest grossing documentary in the United States. However, a 2010 study by social psychologist Jessica Nolan found that while the film increased viewers’ concern about global warming, that concern didn’t translate into any substantial action a month later.
  • To move a larger audience to action, Olson advocates a shift from the literal-minded world of documentary into the imaginative world of narrative.
  • One organization using this approach is the Science and Entertainment Exchange, a program of the National Academy of Sciences. The Exchange puts writers, producers, and directors in touch with scientists and engineers who can answer specific questions or just brainstorm ideas. For example, writers for the TV show Fringe changed their original plot point of mind control through hypnosis to magnetic manipulation of brain waves after speaking with a neuroscientist at the Salk Institute for Biological Studies in La Jolla, California.
  • Hollywood, Health and Society (HHS), a program of the Centers for Disease Control and Prevention, takes a similar approach by providing free resources to the entertainment industry. HHS connects writers and producers — from prime time dramas like Law and Order and House to daytime soap operas – with experts who can provide accurate health information for their scripts.
  • HHS Director Sandra Buffington admits that environmental issues, especially climate change, pose particular challenges for communicators because at first glance, they are not as immediately relevant as personal health issues. However, she believes that by focusing on real, human stories — climate refugees displaced by rising water levels, farmers unable to grow food because of drought, children sick because of outbreaks of malaria — the issues of the planet will crystallize into something tangible. All scientists need to do is provide the information, and the professional creative storytellers will do the rest, she says.
  • Olson also takes a cue from television. He points to the rise of reality TV shows as a clear indication of where the general public interest lies. If environmentalists want to capture that interest, Olson thinks they need to start experimenting with these innovative types of unscripted forms. “That’s where the cutting edge exists,” he said.
  •  
    For environmentalists trying to use entertainment to shape broad public attitudes and behaviors, nothing could be more important than understanding how to reach these hard-to-get people. Something that will speak to them, something that will change their minds, and most importantly, something that will incite them to action. A documentary might not be that something.
Weiye Loh

Open data, democracy and public sector reform - 0 views

  •  
    Governments are increasingly making their data available online in standard formats and under licenses that permit the free re-use of data. The justifications advanced for this include claims regarding the economic potential of open government data (OGD), the potential for OGD to promote transparency and accountability of government, and the role of OGD in supporting the reform and reshaping of public services. This paper takes a pragmatic mixed-methods approach to exploring uses of data from the UK national open government data portal, data.gov.uk, and identifies how the emerging practices of OGD use are developing. It sets out five 'processes' of data use, and describes a series of embedded cases of education OGD use, and use of public-spending OGD. Drawing upon quantitative and qualitative data it presents an outline account of the motivations driving different individuals to engage with open government data, and it identifies a range of connections between open government data use and processes of civic change. It argues that a "data for developers" narrative that assumes OGD use will primarily be mediated by technology developers is misplaced, and that whilst innovation-based routes to OGD-driven public sector reform are evident, the relationship between OGD and democracy is less clear. As strategic research it highlights a number of emerging policy issues for developing OGD provision and use, and makes a contribution towards theoretical understandings of OGD use in practice.
Weiye Loh

"Cancer by the Numbers" by John Allen Paulos | Project Syndicate - 0 views

  • The USPSTF recently issued an even sharper warning about the prostate-specific antigen test for prostate cancer, after concluding that the test’s harms outweigh its benefits. Chest X-rays for lung cancer and Pap tests for cervical cancer have received similar, albeit less definitive, criticism. The next step in the reevaluation of cancer screening was taken last year, when researchers at the Dartmouth Institute for Health Policy announced that the costs of screening for breast cancer were often minimized, and that the benefits were much exaggerated. Indeed, even a mammogram (almost 40 million are given annually in the US) that detects a cancer does not necessarily save a life. The Dartmouth researchers found that, of the estimated 138,000 breast cancers detected annually in the US, the test did not help 120,000-134,000 of the afflicted women. The cancers either were growing so slowly that they did not pose a problem, or they would have been treated successfully if discovered clinically later (or they were so aggressive that little could be done).
Weiye Loh

The Death of Postmodernism And Beyond | Philosophy Now - 0 views

  • Most of the undergraduates who will take ‘Postmodern Fictions’ this year will have been born in 1985 or after, and all but one of the module’s primary texts were written before their lifetime. Far from being ‘contemporary’, these texts were published in another world, before the students were born: The French Lieutenant’s Woman, Nights at the Circus, If on a Winter’s Night a Traveller, Do Androids Dream of Electric Sheep? (and Blade Runner), White Noise: this is Mum and Dad’s culture. Some of the texts (‘The Library of Babel’) were written even before their parents were born. Replace this cache with other postmodern stalwarts – Beloved, Flaubert’s Parrot, Waterland, The Crying of Lot 49, Pale Fire, Slaughterhouse 5, Lanark, Neuromancer, anything by B.S. Johnson – and the same applies. It’s all about as contemporary as The Smiths, as hip as shoulder pads, as happening as Betamax video recorders. These are texts which are just coming to grips with the existence of rock music and television; they mostly do not dream even of the possibility of the technology and communications media – mobile phones, email, the internet, computers in every house powerful enough to put a man on the moon – which today’s undergraduates take for granted.
  • somewhere in the late 1990s or early 2000s, the emergence of new technologies re-structured, violently and forever, the nature of the author, the reader and the text, and the relationships between them.
  • Postmodernism, like modernism and romanticism before it, fetishised [ie placed supreme importance on] the author, even when the author chose to indict or pretended to abolish him or herself. But the culture we have now fetishises the recipient of the text to the degree that they become a partial or whole author of it. Optimists may see this as the democratisation of culture; pessimists will point to the excruciating banality and vacuity of the cultural products thereby generated (at least so far).
  • Pseudo-modernism also encompasses contemporary news programmes, whose content increasingly consists of emails or text messages sent in commenting on the news items. The terminology of ‘interactivity’ is equally inappropriate here, since there is no exchange: instead, the viewer or listener enters – writes a segment of the programme – then departs, returning to a passive role. Pseudo-modernism also includes computer games, which similarly place the individual in a context where they invent the cultural content, within pre-delineated limits. The content of each individual act of playing the game varies according to the particular player.
  • The pseudo-modern cultural phenomenon par excellence is the internet. Its central act is that of the individual clicking on his/her mouse to move through pages in a way which cannot be duplicated, inventing a pathway through cultural products which has never existed before and never will again. This is a far more intense engagement with the cultural process than anything literature can offer, and gives the undeniable sense (or illusion) of the individual controlling, managing, running, making up his/her involvement with the cultural product. Internet pages are not ‘authored’ in the sense that anyone knows who wrote them, or cares. The majority either require the individual to make them work, like Streetmap or Route Planner, or permit him/her to add to them, like Wikipedia, or through feedback on, for instance, media websites. In all cases, it is intrinsic to the internet that you can easily make up pages yourself (eg blogs).
  • Where once special effects were supposed to make the impossible appear credible, CGI frequently [inadvertently] works to make the possible look artificial, as in much of Lord of the Rings or Gladiator. Battles involving thousands of individuals have really happened; pseudo-modern cinema makes them look as if they have only ever happened in cyberspace.
  • Similarly, television in the pseudo-modern age favours not only reality TV (yet another unapt term), but also shopping channels, and quizzes in which the viewer calls to guess the answer to riddles in the hope of winning money.
  • The purely ‘spectacular’ function of television, as with all the arts, has become a marginal one: what is central now is the busy, active, forging work of the individual who would once have been called its recipient. In all of this, the ‘viewer’ feels powerful and is indeed necessary; the ‘author’ as traditionally understood is either relegated to the status of the one who sets the parameters within which others operate, or becomes simply irrelevant, unknown, sidelined; and the ‘text’ is characterised both by its hyper-ephemerality and by its instability. It is made up by the ‘viewer’, if not in its content then in its sequence – you wouldn’t read Middlemarch by going from page 118 to 316 to 401 to 501, but you might well, and justifiably, read Ceefax that way.
  • A pseudo-modern text lasts an exceptionally brief time. Unlike, say, Fawlty Towers, reality TV programmes cannot be repeated in their original form, since the phone-ins cannot be reproduced, and without the possibility of phoning-in they become a different and far less attractive entity.
  • If scholars give the date they referenced an internet page, it is because the pages disappear or get radically re-cast so quickly. Text messages and emails are extremely difficult to keep in their original form; printing out emails does convert them into something more stable, like a letter, but only by destroying their essential, electronic state.
  • The cultural products of pseudo-modernism are also exceptionally banal
  • Much text messaging and emailing is vapid in comparison with what people of all educational levels used to put into letters.
  • A triteness, a shallowness dominates all.
  • In music, the pseudo-modern superseding of the artist-dominated album as monolithic text by the downloading and mix-and-matching of individual tracks on to an iPod, selected by the listener, was certainly prefigured by the music fan’s creation of compilation tapes a generation ago. But a shift has occurred, in that what was a marginal pastime of the fan has become the dominant and definitive way of consuming music, rendering the idea of the album as a coherent work of art, a body of integrated meaning, obsolete.
  • To a degree, pseudo-modernism is no more than a technologically motivated shift to the cultural centre of something which has always existed (similarly, metafiction has always existed, but was never so fetishised as it was by postmodernism). Television has always used audience participation, just as theatre and other performing arts did before it; but as an option, not as a necessity: pseudo-modern TV programmes have participation built into them.
  • Whereas postmodernism called ‘reality’ into question, pseudo-modernism defines the real implicitly as myself, now, ‘interacting’ with its texts. Thus, pseudo-modernism suggests that whatever it does or makes is what is reality, and a pseudo-modern text may flourish the apparently real in an uncomplicated form: the docu-soap with its hand-held cameras (which, by displaying individuals aware of being regarded, give the viewer the illusion of participation); The Office and The Blair Witch Project, interactive pornography and reality TV; the essayistic cinema of Michael Moore or Morgan Spurlock.
  • whereas postmodernism favoured the ironic, the knowing and the playful, with their allusions to knowledge, history and ambivalence, pseudo-modernism’s typical intellectual states are ignorance, fanaticism and anxiety
  • pseudo-modernism lashes fantastically sophisticated technology to the pursuit of medieval barbarism – as in the uploading of videos of beheadings onto the internet, or the use of mobile phones to film torture in prisons. Beyond this, the destiny of everyone else is to suffer the anxiety of getting hit in the cross-fire. But this fatalistic anxiety extends far beyond geopolitics, into every aspect of contemporary life; from a general fear of social breakdown and identity loss, to a deep unease about diet and health; from anguish about the destructiveness of climate change, to the effects of a new personal ineptitude and helplessness, which yield TV programmes about how to clean your house, bring up your children or remain solvent.
  • Pseudo-modernism belongs to a world pervaded by the encounter between a religiously fanatical segment of the United States, a largely secular but definitionally hyper-religious Israel, and a fanatical sub-section of Muslims scattered across the planet: pseudo-modernism was not born on 11 September 2001, but postmodernism was interred in its rubble.
  • The pseudo-modernist communicates constantly with the other side of the planet, yet needs to be told to eat vegetables to be healthy, a fact self-evident in the Bronze Age. He or she can direct the course of national television programmes, but does not know how to make him or herself something to eat – a characteristic fusion of the childish and the advanced, the powerful and the helpless. For varying reasons, these are people incapable of the “disbelief of Grand Narratives” which Lyotard argued typified postmodernists.
  •  
    Postmodern philosophy emphasises the elusiveness of meaning and knowledge. This is often expressed in postmodern art as a concern with representation and an ironic self-awareness. And the argument that postmodernism is over has already been made philosophically. There are people who have essentially asserted that for a while we believed in postmodern ideas, but not any more, and from now on we're going to believe in critical realism. The weakness in this analysis is that it centres on the academy, on the practices and suppositions of philosophers who may or may not be shifting ground or about to shift - and many academics will simply decide that, finally, they prefer to stay with Foucault [arch postmodernist] than go over to anything else. However, a far more compelling case can be made that postmodernism is dead by looking outside the academy at current cultural production.
Weiye Loh

Roger Pielke Jr.'s Blog: Mike Daisey and Higher Truths - 0 views

  • Real life is messy. And as a general rule, the more theatrical the story you hear, and the more it divides the world into goodies vs baddies, the less reliable that story is going to be.
  • some people do feel that certain issues are so important that there should be cause in political debates to overlook lies or misrepresentations in service of a "larger truth" (Yellow cake, anyone?). I have seen this attitude for years in the climate change debate (hey look, just today), and often condoned by scientists and journalists alike.
  • the "global warming: yes or no?" debate has become an obstacle to effective policy action related to climate. Several of these colleagues suggested that I should downplay the policy implications of my work showing that for a range of phenomena and places, future climate impacts depend much more on growing human vulnerability to climate than on projected changes in climate itself (under the assumptions of the Intergovernmental Panel on Climate Change). One colleague wrote, "I think we have a professional (or moral?) obligation to be very careful what we say and how we say it when the stakes are so high." In effect, some of these colleagues were intimating that ends justify means or, in other words, doing the "right thing" for the wrong reasons is OK.
  • When science is used (and misused) in political advocacy, there are frequent opportunities for such situations to arise.
  • I don't think you're being fair to Mike Lemonick. In the article by him that you cite, Mike's provocative question was framed in the context of an analogy he was making to the risks of smoking. For example, in that article, he also says: "So should the overall message be that nobody knows anything? I don’t think so. We would never want to pretend the uncertainty isn’t there, since that would be dishonest. But featuring it prominently is dishonest, too, just as trumpeting uncertainty in the smoking-cancer connection would have been." Thus, I think you're reading way too much into Mike's piece. That said, I do agree with you that there are implications of the Daisey case for climate communicators and climate journalism. My own related post is here: http://www.collide-a-scape.com/2012/03/19/the-seduction-of-narrative/
  • I don't want journalists shading the truth in a desire to be "effective" in some way. That is Daisey's tradeoff too.
  •  
    Recall that in the aftermath of initial revelations about Peter Gleick's phishing of the Heartland Institute, we heard defenses of his action that ranged from claims that he was only doing the same thing that journalists do, to appeals to the importance of looking beyond Gleick's misdeeds at the "larger truth." Consider also what was described in the UEA emails as "pressure to present a nice tidy story" related to climate science, as well as the IPCC's outright falsification related to disasters and climate change. Such shenanigans are so endemic in the climate change debate that when a journalist openly asks whether the media should tell the whole truth about climate change, no one even bats an eye.
Weiye Loh

The internet: is it changing the way we think? | Technology | The Observer - 0 views

  • American magazine the Atlantic lobs an intellectual grenade into our culture. In the summer of 1945, for example, it published an essay by the Massachusetts Institute of Technology (MIT) engineer Vannevar Bush entitled "As We May Think". It turned out to be the blueprint for what eventually emerged as the world wide web. Two summers ago, the Atlantic published an essay by Nicholas Carr, one of the blogosphere's most prominent (and thoughtful) contrarians, under the headline "Is Google Making Us Stupid?".
  • Carr wrote, "I've had an uncomfortable sense that someone, or something, has been tinkering with my brain, remapping the neural circuitry, reprogramming the memory. My mind isn't going – so far as I can tell – but it's changing. I'm not thinking the way I used to think. I can feel it most strongly when I'm reading. Immersing myself in a book or a lengthy article used to be easy. My mind would get caught up in the narrative or the turns of the argument and I'd spend hours strolling through long stretches of prose. That's rarely the case anymore. Now my concentration often starts to drift after two or three pages. I get fidgety, lose the thread, begin looking for something else to do. I feel as if I'm always dragging my wayward brain back to the text. The deep reading that used to come naturally has become a struggle."
  • Carr's target was not really the world's leading search engine, but the impact that ubiquitous, always-on networking is having on our cognitive processes. His argument was that our deepening dependence on networking technology is indeed changing not only the way we think, but also the structure of our brains.
  • Carr's article touched a nerve and has provoked a lively, ongoing debate on the net and in print (he has now expanded it into a book, The Shallows: What the Internet Is Doing to Our Brains). This is partly because he's an engaging writer who has vividly articulated the unease that many adults feel about the way their modi operandi have changed in response to ubiquitous networking.
  • Who bothers to write down or memorise detailed information any more, for example, when they know that Google will always retrieve it if it's needed again? The web has become, in a way, a global prosthesis for our collective memory.
  • easy to dismiss Carr's concern as just the latest episode of the moral panic that always accompanies the arrival of a new communications technology. People fretted about printing, photography, the telephone and television in analogous ways. It even bothered Plato, who argued that the technology of writing would destroy the art of remembering.
  • many commentators who accept the thrust of his argument seem not only untroubled by its far-reaching implications but are positively enthusiastic about them. When the Pew Research Centre's Internet & American Life project asked its panel of more than 370 internet experts for their reaction, 81% of them agreed with the proposition that "people's use of the internet has enhanced human intelligence".
  • As a writer, thinker, researcher and teacher, what I can attest to is that the internet is changing our habits of thinking, which isn't the same thing as changing our brains. The brain is like any other muscle – if you don't stretch it, it gets both stiff and flabby. But if you exercise it regularly, and cross-train, your brain will be flexible, quick, strong and versatile.
  • The internet is analogous to a weight-training machine for the brain, as compared with the free weights provided by libraries and books. Each method has its advantage, but used properly one works you harder. Weight machines are directive and enabling: they encourage you to think you've worked hard without necessarily challenging yourself. The internet can be the same: it often tells us what we think we know, spreading misinformation and nonsense while it's at it. It can substitute surface for depth, imitation for originality, and its passion for recycling would surpass the most committed environmentalist.
  • I've seen students' thinking habits change dramatically: if information is not immediately available via a Google search, students are often stymied. But of course what a Google search provides is not the best, wisest or most accurate answer, but the most popular one.
  • But knowledge is not the same thing as information, and there is no question to my mind that the access to raw information provided by the internet is unparalleled and democratising. Admittance to elite private university libraries and archives is no longer required, as they increasingly digitise their archives. We've all read the jeremiads that the internet sounds the death knell of reading, but people read online constantly – we just call it surfing now. What they are reading is changing, often for the worse; but it is also true that the internet increasingly provides a treasure trove of rare books, documents and images, and as long as we have free access to it, then the internet can certainly be a force for education and wisdom, and not just for lies, damned lies, and false statistics.
  • In the end, the medium is not the message, and the internet is just a medium, a repository and an archive. Its greatest virtue is also its greatest weakness: it is unselective. This means that it is undiscriminating, in both senses of the word. It is indiscriminate in its principles of inclusion: anything at all can get into it. But it also – at least so far – doesn't discriminate against anyone with access to it. This is changing rapidly, of course, as corporations and governments seek to exert control over it. Knowledge may not be the same thing as power, but it is unquestionably a means to power. The question is, will we use the internet's power for good, or for evil? The jury is very much out. The internet itself is disinterested: but what we use it for is not.
  •  
    The internet: is it changing the way we think? American writer Nicholas Carr's claim that the internet is not only shaping our lives but physically altering our brains has sparked a lively and ongoing debate, says John Naughton. Below, a selection of writers and experts offer their opinion
Weiye Loh

The overblown crisis in American education : The New Yorker - 0 views

  • it’s odd that a narrative of crisis, of a systemic failure, in American education is currently so persuasive. This back-to-school season, we have Davis Guggenheim’s documentary about the charter-school movement, “Waiting for ‘Superman’ ”; two short, dyspeptic books about colleges and universities, “Higher Education?,” by Andrew Hacker and Claudia Dreifus, and “Crisis on Campus,” by Mark C. Taylor; and a lot of positive attention to the school-reform movement in the national press. From any of these sources, it would be difficult to reach the conclusion that, over all, the American education system works quite well.
  • In higher education, the reform story isn’t so fully baked yet, but its main elements are emerging. The system is vast: hundreds of small liberal-arts colleges; a new and highly leveraged for-profit sector that offers degrees online; community colleges; state universities whose budgets are being cut because of the recession; and the big-name private universities, which get the most attention. You wouldn’t design a system this way—it’s filled with overlaps and competitive excess. Much of it strives toward an ideal that took shape in nineteenth-century Germany: the university as a small, élite center of pure scholarly research. Research is the rationale for low teaching loads, publication requirements, tenure, tight-knit academic disciplines, and other practices that take it on the chin from Taylor, Hacker, and Dreifus for being of little benefit to students or society.
  • Yet for a system that—according to Taylor, especially—is deeply in crisis, American higher education is not doing badly. The lines of people wanting to get into institutions that the authors say are just waiting to cheat them by overcharging and underteaching grow ever longer and more international, and the people waiting in those lines don’t seem deterred by price increases, even in a terrible recession.
  • There have been attempts in the past to make the system more rational and less redundant, and to shrink the portion of it that undertakes scholarly research, but they have not met with much success, and not just because of bureaucratic resistance by the interested parties. Large-scale, decentralized democratic societies are not very adept at generating neat, rational solutions to messy situations. The story line on education, at this ill-tempered moment in American life, expresses what might be called the Noah’s Ark view of life: a vast territory looks so impossibly corrupted that it must be washed away, so that we can begin its activities anew, on finer, higher, firmer principles. One should treat any perception that something so large is so completely awry with suspicion, and consider that it might not be true—especially before acting on it.
  •  
    mass higher education is one of the great achievements of American democracy. It embodies a faith in the capabilities of ordinary people that the Founders simply didn't have.
Weiye Loh

Twitter, Facebook Won't Make You Immoral - But TV News Might | Wired Science | Wired.com - 1 views

  • It’s too soon to say that Twitter and Facebook destroy the mental foundations of morality, but not too soon to ask what they’re doing.
  • In the paper, published Monday in the Proceedings of the National Academy of Sciences, 13 people were shown documentary-style multimedia narratives designed to arouse empathy. Researchers recorded their brain activity and found that empathy is as deeply rooted in the human psyche as fear and anger.
  • They also noticed that empathic brain systems took an average of six to eight seconds to start up. The researchers didn’t connect this to media consumption habits, but the study’s press release fueled speculation that the Facebook generation could turn into sociopaths.
  • Entitled "Can Twitter Make You Amoral? Rapid-fire Media May Confuse Your Moral Compass," it claimed that the research "raises questions about the emotional cost —particularly for the developing brain — of heavy reliance on a rapid stream of news snippets obtained through television, online feeds or social networks such as Twitter."
  • Compared to in-depth news coverage, first-person Tweets of on-the-ground events, such as the 2008 Mumbai bombings, are generally unmoving. But in those situations, Twitter’s primary use is in gathering useful, immediate facts, not storytelling.
  • Most people who read a handful of words about a friend’s heartache, or see a link to a tragic story, would likely follow it up. But following links to a video news story makes the possibility of a short-circuited neurobiology of compassion more real. Research suggests that people are far more empathic when stories are told in a linear way, without quick shot-to-shot edits. In a 1996 Empirical Studies of the Arts paper, researchers showed three versions of an ostensibly tear-jerking story to 120 test subjects. "Subjects had significantly more favorable impressions of the victimized female protagonist than of her male opponent only when the story structure was linear," they concluded.
  • A review of tabloid news formats in the Journal of Broadcasting & Electronic Media found that jarring, rapid-fire visual storytelling produced a physiological arousal that led to better recall of what was seen, but only if the original subject matter was dull. If it was already arousing, tabloid storytelling appeared to produce a cognitive overload that actually prevented stories from sinking in.
  • "Quick cuts will draw and retain a viewer’s focus even if the content is uninteresting," said freelance video producer Jill Bauerle. "MTV-like jump cuts, which have become the standard for many editors, serve as a sort of eye candy to keep eyeballs peeled to screen."
  • If compassion can only be activated by sustained attention, which is prevented by fast-cut editing, then the ability to be genuinely moved by another’s story could atrophy. It might even fail to properly develop in children, whose brains are being formed in ways that will last a lifetime. More research is clearly needed, including a replication of the original empathy findings, but the hypothesis is plausible.
  •  
    Twitter, Facebook Won't Make You Immoral - But TV News Might
Weiye Loh

Skepticblog » The Decline Effect - 0 views

  • The first group are those with an overly simplistic or naive sense of how science functions. This is a view of science similar to those films created in the 1950s and meant to be watched by students, with the jaunty music playing in the background. This view generally respects science, but has a significant underappreciation for the flaws and complexity of science as a human endeavor. Those with this view are easily scandalized by revelations of the messiness of science.
  • The second cluster is what I would call scientific skepticism – which combines a respect for science and empiricism as a method (really “the” method) for understanding the natural world, with a deep appreciation for all the myriad ways in which the endeavor of science can go wrong. Scientific skeptics, in fact, seek to formally understand the process of science as a human endeavor with all its flaws. It is therefore often skeptics pointing out phenomena such as publication bias, the placebo effect, the need for rigorous controls and blinding, and the many vagaries of statistical analysis. But at the end of the day, as complex and messy as the process of science is, a reliable picture of reality is slowly ground out.
  • The third group, often frustrating to scientific skeptics, are the science-deniers (for lack of a better term). They may take a postmodernist approach to science – science is just one narrative with no special relationship to the truth. Whatever you call it, what the science-deniers in essence do is describe all of the features of science that the skeptics do (sometimes annoyingly pretending that they are pointing these features out to skeptics) but then come to a different conclusion at the end – that science (essentially) does not work.
  • this third group – the science deniers – started out in the naive group, and then were so scandalized by the realization that science is a messy human endeavor that they leapt right to the nihilistic conclusion that science must therefore be bunk.
  • The article by Lehrer falls generally into this third category. He is discussing what has been called “the decline effect” – the fact that effect sizes in scientific studies tend to decrease over time, sometimes to nothing.
  • This term was first applied to the parapsychological literature, and was in fact proposed as a real phenomenon of ESP – that ESP effects literally decline over time. Skeptics have criticized this view as magical thinking and hopelessly naive – Occam’s razor favors the conclusion that it is the flawed measurement of ESP, not ESP itself, that is declining over time.
  • Lehrer, however, applies this idea to all of science, not just parapsychology. He writes: And this is why the decline effect is so troubling. Not because it reveals the human fallibility of science, in which data are tweaked and beliefs shape perceptions. (Such shortcomings aren’t surprising, at least for scientists.) And not because it reveals that many of our most exciting theories are fleeting fads and will soon be rejected. (That idea has been around since Thomas Kuhn.) The decline effect is troubling because it reminds us how difficult it is to prove anything. We like to pretend that our experiments define the truth for us. But that’s often not the case. Just because an idea is true doesn’t mean it can be proved. And just because an idea can be proved doesn’t mean it’s true. When the experiments are done, we still have to choose what to believe.
  • Lehrer is ultimately referring to aspects of science that skeptics have been pointing out for years (as a way of discerning science from pseudoscience), but Lehrer takes it to the nihilistic conclusion that it is difficult to prove anything, and that ultimately “we still have to choose what to believe.” Bollocks!
  • Lehrer is describing the cutting edge or the fringe of science, and then acting as if it applies all the way down to the core. I think the problem is that there is so much scientific knowledge that we take for granted – so much so that we forget it is knowledge that derived from the scientific method, and at one point was not known.
  • It is telling that Lehrer uses as his primary examples of the decline effect studies from medicine, psychology, and ecology – areas where the signal to noise ratio is lowest in the sciences, because of the highly variable and complex human element. We don’t see as much of a decline effect in physics, for example, where phenomena are more objective and concrete.
  • If the truth itself does not “wear off”, as the headline of Lehrer’s article provocatively states, then what is responsible for this decline effect?
  • it is no surprise that effect sizes in preliminary studies tend to be positive. This can be explained on the basis of experimenter bias – scientists want to find positive results, and initial experiments are often flawed or less than rigorous. It takes time to figure out how to rigorously study a question, and so early studies will tend not to control for all the necessary variables. There is further publication bias in which positive studies tend to be published more than negative studies.
  • Further, some preliminary research may be based upon chance observations – a false pattern based upon a quirky cluster of events. If these initial observations are used in the preliminary studies, then the statistical fluke will be carried forward. Later studies are then likely to exhibit a regression to the mean, or a return to more statistically likely results (which is exactly why you shouldn’t use initial data when replicating a result, but should use entirely fresh data – a mistake for which astrologers are infamous). (See the sketch after this list.)
  • Skeptics frequently caution against putting too much faith in new or preliminary scientific research. Don’t get excited by every new study touted in the lay press, or even by a university’s press release. Most new findings turn out to be wrong. In science, replication is king. Consensus and reliable conclusions are built upon multiple independent lines of evidence, replicated over time, all converging on one conclusion.
  • Lehrer does make some good points in his article, but they are points that skeptics are fond of making. In order to have a mature and functional appreciation for the process and findings of science, it is necessary to understand how science works in the real world, as practiced by flawed scientists and scientific institutions. This is the skeptical message.
  • But at the same time reliable findings in science are possible, and happen frequently – when results can be replicated and when they fit into the expanding intricate weave of the picture of the natural world being generated by scientific investigation.
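The two mechanisms above, publication bias and regression to the mean, are enough on their own to manufacture a decline effect even when the underlying truth never changes. Here is a minimal Python sketch of that claim; all numbers (the true effect, the noise, the publication threshold) are hypothetical choices of mine, not figures from Lehrer or the studies he cites.

```python
import random

random.seed(42)

TRUE_EFFECT = 0.1      # hypothetical small, constant real effect
NOISE_SD = 0.5         # hypothetical study-to-study measurement noise
PUBLICATION_BAR = 0.5  # early studies are "published" only above this observed effect

def run_study():
    """One simulated study: the true effect plus Gaussian noise."""
    return random.gauss(TRUE_EFFECT, NOISE_SD)

# Early literature: the file drawer swallows results below the bar,
# so only inflated estimates see print.
early = []
while len(early) < 20:
    result = run_study()
    if result > PUBLICATION_BAR:
        early.append(result)

# Later literature: replications are published whatever they find.
later = [run_study() for _ in range(20)]

print(f"mean effect in early published studies: {sum(early) / len(early):.2f}")
print(f"mean effect in later replications:      {sum(later) / len(later):.2f}")
# The early mean lands far above TRUE_EFFECT; the later mean regresses toward it.
# Nothing about the truth "wore off"; only the filter on publication changed.
```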
Weiye Loh

The Matthew Effect § SEEDMAGAZINE.COM - 0 views

  • For to all those who have, more will be given, and they will have an abundance; but from those who have nothing, even what they have will be taken away. —Matthew 25:29
  • Sociologist Robert K. Merton was the first to publish a paper on the similarity between this phrase in the Gospel of Matthew and the realities of how scientific research is rewarded.
  • Even if two researchers do similar work, the most eminent of the pair will get more acclaim, Merton observed—more praise within the community, more or better job offers, better opportunities. And it goes without saying that even if a graduate student publishes stellar work in a prestigious journal, their well-known advisor is likely to get more of the credit. 
  • Merton published his theory, called the “Matthew Effect,” in 1968. At that time, the average age of a biomedical researcher in the US receiving his or her first significant funding was 35 or younger. That meant that researchers who had little in terms of fame (at 35, they would have completed a PhD and a post-doc and would be just starting out on their own) could still get funded if they wrote interesting proposals. So Merton’s observation about getting credit for one’s work, however true in terms of prestige, wasn’t adversely affecting the funding of new ideas.
  • Over the last 40 years, the importance of fame in science has increased. The effect has compounded because famous researchers have gathered the smartest and most ambitious graduate students and post-docs around them, so that each notable paper from a high-wattage group bootstraps their collective power. The famous grow more famous, and the younger researchers in their coterie are able to use that fame to their benefit. The effect of this concentration of power has finally trickled down to the level of funding: The average age on first receipt of the most common “starter” grants at the NIH is now almost 42. This means younger researchers without the strength of a fame-based community are cut out of the funding process, and their ideas, separate from an older researcher’s sphere of influence, don’t get pursued. This causes a founder effect in modern science, where the prestigious few dictate the direction of research. It’s not only unfair—it’s also actively dangerous to science’s progress. (A toy simulation of this feedback loop follows this list.)
  • How can we fund science in a way that is fair? By judging researchers independently of their fame—in other words, not by how many times their papers have been cited. By judging them instead via new measures, measures that until recently have been too ephemeral to use.
  • Right now, the gold standard worldwide for measuring a scientist’s worth is the number of times his or her papers are cited, along with the importance of the journal where the papers were published. Decisions of funding, faculty positions, and eminence in the field all derive from a scientist’s citation history. But relying on these measures entrenches the Matthew Effect: Even when the lead author is a graduate student, the majority of the credit accrues to the much older principal investigator. And an influential lab can inflate its citations by referring to its own work in papers that themselves go on to be heavy-hitters.
  • What is most profoundly unbalanced about relying on citations is that the paper-based metric distorts the reality of the scientific enterprise. Scientists make data points, narratives, research tools, inventions, pictures, sounds, videos, and more. Journal articles are a compressed and heavily edited version of what happens in the lab.
  • We have the capacity to measure the quality of a scientist across multiple dimensions, not just in terms of papers and citations. Was the scientist’s data online? Was it comprehensible? Can I replicate the results? Run the code? Access the research tools? Use them to write a new paper? What ideas were examined and discarded along the way, so that I might know the reality of the research? What is the impact of the scientist as an individual, rather than the impact of the paper he or she wrote? When we can see the scientist as a whole, we’re less prone to relying on reputation alone to assess merit.
  • Multidimensionality is one of the only counters to the Matthew Effect we have available. In forums where this kind of meritocracy prevails over seniority, like Linux or Wikipedia, the Matthew Effect is much less pronounced. And we have the capacity to measure each of these individual factors of a scientist’s work, using the basic discourse of the Web: the blog, the wiki, the comment, the trackback. We can find out who is talented in a lab, not just who was smart enough to hire that talent. As we develop the ability to measure multiple dimensions of scientific knowledge creation, dissemination, and re-use, we open up a new way to recognize excellence. What we can measure, we can value.
  •  
    WHEN IT COMES TO SCIENTIFIC PUBLISHING AND FAME, THE RICH GET RICHER AND THE POOR GET POORER. HOW CAN WE BREAK THIS FEEDBACK LOOP?
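The rich-get-richer dynamic described above is what network theorists call preferential attachment, and it takes very little to reproduce. The sketch below is my own toy model, not anything from the Seed article: 100 otherwise identical papers, with each new citation going to a paper in proportion to the citations it already has.

```python
import random

random.seed(1)

NUM_PAPERS = 100
NUM_CITATIONS = 5000

# Every paper starts with one seed citation so that newcomers can be drawn at all.
citations = [1] * NUM_PAPERS

for _ in range(NUM_CITATIONS):
    # random.choices weights the draw by current counts:
    # the already-cited are the most likely to be cited again.
    winner = random.choices(range(NUM_PAPERS), weights=citations, k=1)[0]
    citations[winner] += 1

citations.sort(reverse=True)
total = sum(citations)
print(f"top 10% of papers hold {sum(citations[:10]) / total:.0%} of all citations")
print(f"bottom 10% of papers hold {sum(citations[-10:]) / total:.1%} of all citations")
# Despite identical "quality", the top tier typically holds several times its
# proportional share, while the bottom tier holds almost nothing.
```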
Weiye Loh

Meet the Ethical Placebo: A Story that Heals | NeuroTribes - 0 views

  • In modern medicine, placebos are associated with another form of deception — a kind that has long been thought essential for conducting randomized clinical trials of new drugs, the statistical rock upon which the global pharmaceutical industry was built. One group of volunteers in an RCT gets the novel medication; another group (the “control” group) gets pills or capsules that look identical to the allegedly active drug, but contain only an inert substance like milk sugar. These faux drugs are called placebos.
  • Inevitably, the health of some people in both groups improves, while the health of others grows worse. Symptoms of illness fluctuate for all sorts of reasons, including regression to the mean.
  • Since the goal of an RCT, from Big Pharma’s perspective, is to demonstrate the effectiveness of a new drug, the return to robust health of a volunteer in the control group is considered a statistical distraction. If too many people in the trial get better after downing sugar pills, the real drug will look worse by comparison — sometimes fatally so for the purpose of earning approval from the Food and Drug Administration.
  • For a complex and somewhat mysterious set of reasons, it is becoming increasingly difficult for experimental drugs to prove their superiority to sugar pills in RCTs.
  • Only in recent years, however, has it become obvious that the abatement of symptoms in control-group volunteers — the so-called placebo effect — is worthy of study outside the context of drug trials, and is in fact profoundly good news to anyone but investors in Pfizer, Roche, and GlaxoSmithKline.
  • The emerging field of placebo research has revealed that the body’s repertoire of resilience contains a powerful self-healing network that can help reduce pain and inflammation, lower the production of stress chemicals like cortisol, and even tame high blood pressure and the tremors of Parkinson’s disease.
  • More and more studies each year — by researchers like Fabrizio Benedetti at the University of Turin, author of a superb new book called The Patient’s Brain, and neuroscientist Tor Wager at the University of Colorado — demonstrate that the placebo effect might be useful in treating a wide range of ills. Then why aren’t doctors supposed to use it?
  • The medical establishment’s ethical problem with placebo treatment boils down to the notion that for fake drugs to be effective, doctors must lie to their patients. It has been widely assumed that if a patient discovers that he or she is taking a placebo, the mind/body password will no longer unlock the network, and the magic pills will cease to do their job.
  • For “Placebos Without Deception,” the researchers tracked the health of 80 volunteers with irritable bowel syndrome for three weeks as half of them took placebos and the other half didn’t.
  • In a previous study published in the British Medical Journal in 2008, Kaptchuk and Kirsch demonstrated that placebo treatment can be highly effective for alleviating the symptoms of IBS. This time, however, instead of the trial being “blinded,” it was “open.” That is, the volunteers in the placebo group knew that they were getting only inert pills — which they were instructed to take religiously, twice a day. They were also informed that, just as Ivan Pavlov trained his dogs to drool at the sound of a bell, the body could be trained to activate its own built-in healing network by the act of swallowing a pill.
  • In other words, in addition to the bogus medication, the volunteers were given a true story — the story of the placebo effect. They also received the care and attention of clinicians, which have been found in many other studies to be crucial for eliciting placebo effects. The combination of the story and a supportive clinical environment were enough to prevail over the knowledge that there was really nothing in the pills. People in the placebo arm of the trial got better — clinically, measurably, significantly better — on standard scales of symptom severity and overall quality of life. In fact, the volunteers in the placebo group experienced improvement comparable to patients taking a drug called alosetron, the standard of care for IBS. Meet the ethical placebo: a powerfully effective faux medication that meets all the standards of informed consent.
  • The study is hardly the last word on the subject, but more like one of the first. Its modest sample size and brief duration leave plenty of room for followup research. (What if “ethical” placebos wear off more quickly than deceptive ones? Does the fact that most of the volunteers in this study were women have any bearing on the outcome? Were any of the volunteers skeptical that the placebo effect is real, and did that affect their response to treatment?) Before some eager editor out there composes a tweet-baiting headline suggesting that placebos are about to drive Big Pharma out of business, he or she should appreciate the fact that the advent of AMA-approved placebo treatments would open numerous cans of fascinatingly tangled worms. For example, since the precise nature of placebo effects is shaped largely by patients’ expectations, would the advertised potency and side effects of theoretical products like Placebex and Therastim be subject to change by Internet rumors, requiring perpetual updating?
  • It’s common to use the word “placebo” as a synonym for “scam.” Economists talk about placebo solutions to our economic catastrophe (tax cuts for the rich, anyone?). Online skeptics mock the billion-dollar herbal-medicine industry by calling it Big Placebo. The fact that our brains and bodies respond vigorously to placebos given in warm and supportive clinical environments, however, turns out to be very real.
  • We’re also discovering that the power of narrative is embedded deeply in our physiology.
  • In the real world of doctoring, many physicians prescribe medications at dosages too low to have an effect on their own, hoping to tap into the body’s own healing resources — though this is mostly acknowledged only in whispers, as a kind of trade secret.
Weiye Loh

The hidden philosophy of David Foster Wallace - Salon.com Mobile - 0 views

  • Taylor's argument, which he himself found distasteful, was that certain logical and seemingly unarguable premises lead to the conclusion that even in matters of human choice, the future is as set in stone as the past. We may think we can affect it, but we can't.
  • The book also takes up a question of human responsibility that, with advances in neuroscience, is of increasing urgency in jurisprudence, social codes and personal conduct. And it shows a brilliant young man struggling against fatalism, performing exquisite exercises to convince others, and maybe himself, that what we choose to do is what determines the future, rather than the future more or less determining what we choose to do. This intellectual struggle on Wallace's part seems now a kind of emotional foreshadowing of his suicide. He was a victim of depression from an early age — even during his undergraduate years — and the future never looks more intractable than it does to someone who is depressed.
  • "Fate, Time, and Language" reminded me of how fond philosophers are of extreme situations in creating their thought experiments. In this book alone we find a naval battle, the gallows, a shotgun, poison, an accident that leads to paraplegia, somebody stabbed and killed, and so on. Why not say "I have a pretzel in my hand today. Tomorrow I will have eaten it or not eaten it" instead of "I have a gun in my hand and I will either shoot you through the heart and feast on your flesh or I won't"? Well, OK — the answer is easy: The extreme and violent scenarios catch our attention more forcefully than pretzels do. Also, philosophers, sequestered and meditative as they must be, may long for real action — beyond beekeeping.
  • Wallace, in his essay, at the very center of trying to show that we can indeed make meaningful choices, places a terrorist in the middle of Amherst's campus with his finger on the trigger mechanism of a nuclear weapon. It is by far the most narratively arresting moment in all of this material, and it says far more about the author's approaching antiestablishment explosions of prose and his extreme emotional makeup than it does about tweedy profs fantasizing about ordering their ships into battle. For, after all, who, besides everyone around him, would the terrorist have killed?
  •  
    In 1962, a philosopher (and world-famous beekeeper) named Richard Taylor published a soon-to-be-notorious essay called "Fatalism" in the Philosophical Review.
Weiye Loh

FT.com / FT Magazine - A disastrous truth - 0 views

  • Every time a disaster strikes, some environmentalists blame it on climate change. “It’s been such a part of the narrative of the public and political debate, particularly after Hurricane Katrina,” Roger Pielke Jr, an expert on the politics of climate change at the University of Colorado, told me.
  • But nothing in the scientific literature indicates that this is true. A host of recent peer-reviewed studies agree: there’s no evidence that climate change has increased the damage from natural disasters. Most likely, climate change will make disasters worse some day, but not yet.
  • Laurens Bouwer, of Amsterdam’s Vrije Universiteit, has recently reviewed 22 “disaster loss studies” and concludes: “Anthropogenic climate change so far has not had a significant impact on losses from natural disasters.”
  • Eric Neumayer and Fabian Barthel of the London School of Economics found likewise in their recent “global analysis” of natural disasters.
  • In his book The Climate Fix: What Scientists and Politicians Won’t Tell You About Global Warming, Pielke writes that there’s no upward trend in the landfalls of tropical cyclones. Even floods in Brisbane aren’t getting worse – just check out the city’s 19th-century floods. Pielke says the consensus of peer-reviewed research on this point – that climate change is not yet worsening disasters – is as strong as any consensus in climate science.
  • It’s true that floods and hurricanes do more damage every decade. However, that’s because ever more people, owning ever more “stuff”, live in vulnerable spots.
  • When it comes to preventing today’s disasters, the squabble about climate change is just a distraction. The media usually has room for only one environmental argument: is climate change happening? This pits virtually all climate scientists against a band of self-taught freelance sceptics, many of whom think the “global warming hoax” is a ruse got up by 1960s radicals as a trick to bring in socialism. (I know, I get the sceptics’ e-mails.) Sometimes in this squabble, climate scientists are tempted to overstate their case, and to say that the latest disaster proves that the climate is changing. This is bad science. It also gives the sceptics something dubious to attack. Better to ignore the sceptics, and have more useful debates about disasters and climate change – which, for now, are two separate problems.
Weiye Loh

Book Review: Future Babble by Dan Gardner « Critical Thinking « Skeptic North - 0 views

  • I predict that you will find this review informative. If you do, you will congratulate my foresight. If you don’t, you’ll forget I was wrong.
  • My playful intro summarizes the main thesis of Gardner’s excellent book, Future Babble: Why Expert Predictions Fail – and Why We Believe Them Anyway.
  • In Future Babble, the research area explored is the validity of expert predictions, and the primary researcher examined is Philip Tetlock. In the early 1980s, Tetlock set out to better understand the accuracy of predictions made by experts by conducting a methodologically sound large-scale experiment.
  • Gardner presents Tetlock’s experimental design in an excellent way, making it accessible to the lay person. Concisely, Tetlock examined 27,450 judgments in which 284 experts were presented with clear questions whose answers could later be shown to be true or false (e.g., “Will the official unemployment rate be higher, lower or the same a year from now?”). For each prediction, the expert had to answer clearly and express their degree of certainty as a percentage (e.g., dead certain = 100%). The use of precise numbers expands the statistical options and removes the complications of vague or ambiguous language. (A worked scoring example follows this list.)
  • Tetlock found the surprising and disturbing truth “that experts’ predictions were no more accurate than random guesses.” (p. 26) An important caveat is that there was a wide range of capability, with some experts being completely out of touch, and others able to make successful predictions.
  • “What distinguishes the impressive few from the borderline delusional is not whether they’re liberal or conservative. Tetlock’s data showed political beliefs made no difference to an expert’s accuracy. The same is true of optimists and pessimists. It also made no difference if experts had a doctorate, extensive experience, or access to classified information. Nor did it make a difference if experts were political scientists, historians, journalists, or economists.” (p. 26)
  • The experts who did poorly were not comfortable with complexity and uncertainty, and tended to reduce most problems to some core theoretical theme. It was as if they saw the world through one lens or had one big idea that everything else had to fit into. By contrast, the experts who did decently were self-critical, used multiple sources of information and were more comfortable with uncertainty and correcting their errors. Their thinking style almost results in a paradox: “The experts who were more accurate than others tended to be less confident they were right.” (p.27)
  • Gardner then introduces the terms ‘Hedgehog’ and ‘Fox’ to refer to bad and good predictors respectively. Hedgehogs are the ones you see pushing the same big idea, while Foxes tend to stay in the background, questioning the very possibility of prediction and making cautious proposals. Foxes are more likely to be correct. Unfortunately, it is Hedgehogs that we see on the news.
  • One of Tetlock’s findings was that “the bigger the media profile of an expert, the less accurate his predictions.” (p.28)
  • Chapter 2 – The Unpredictable World: An exploration of how many events in the world are simply unpredictable. Gardner discusses chaos theory and necessary and sufficient conditions for events to occur. He supports the idea of actually saying “I don’t know,” which many experts are reluctant to do.
  • Chapter 3 – In the Minds of Experts: A more detailed examination of Hedgehogs and Foxes. Gardner discusses randomness and the illusion of control while using narratives to illustrate his points à la Gladwell. This chapter provides a lot of context and background information that should be very useful to those less initiated.
  • Chapter 6 – Everyone Loves a Hedgehog: More about predictions and how the media pick up hedgehog stories and talking points without much investigation into their underlying source or concern for accuracy. It is a good demolition of the absurdity of so many news “discussion shows.” Gardner demonstrates how the media prefer a show where Hedgehogs square off against each other, and it is important that these commentators not be challenged lest they become exposed and, by association, implicate the flawed structure of the program/network. Gardner really singles out certain people, like Paul Ehrlich, and shows how they have been wrong many times and yet can still get an audience.
  • “An assertion that cannot be falsified by any conceivable evidence is nothing more than dogma. It can’t be debated. It can’t be proven or disproven. It’s just something people choose to believe or not for reasons that have nothing to do with fact and logic. And dogma is what predictions become when experts and their followers go to ridiculous lengths to dismiss clear evidence that they failed.”
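Because Tetlock’s experts attached an explicit probability to each forecast, their accuracy can be scored with a proper scoring rule such as the Brier score: the mean squared gap between stated confidence and what actually happened (lower is better, and always answering 50% on binary questions scores 0.25). The sketch below uses made-up forecasts of my own, not Tetlock’s data, to show why confident Hedgehogs can score worse than hedging Foxes even at the same hit rate.

```python
def brier_score(forecasts):
    """forecasts: list of (stated_probability, outcome) pairs, outcome 1 or 0."""
    return sum((p - o) ** 2 for p, o in forecasts) / len(forecasts)

# A "Hedgehog": dead certain every time, right only half the time.
hedgehog = [(1.0, 1), (1.0, 0), (1.0, 1), (1.0, 0)]

# A "Fox": hedged at 60%, right the same half of the time.
fox = [(0.6, 1), (0.6, 0), (0.6, 1), (0.6, 0)]

print(f"Hedgehog Brier score: {brier_score(hedgehog):.2f}")  # 0.50, worse than coin-flipping
print(f"Fox Brier score:      {brier_score(fox):.2f}")       # 0.26, near the 0.25 coin-flip floor
# Identical hit rates, very different scores: the metric punishes overconfidence.
```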
Weiye Loh

Leong Sze Hian stands corrected? | The Online Citizen - 0 views

  • In your article, you make the argument that the “Straits Times Forum Editor was merely amending his (my) letter to cite the correct statistics. For example, the Education Minister said ‘How children from the bottom one-third by socio-economic background fare: One in two scores in the top two-thirds at PSLE’ – but Mr Samuel Wee wrote ‘His statement is backed up with the statistic that 50% of children from the bottom third of the socio-economic ladder score in the bottom third of the Primary School Leaving Examination.’” Kind sir, the statistics state that 1 in 2 are in the top 66.6% (which, incidentally, includes the top fifth of the bottom 50%!). Does it not stand to reason, then, that if 50% are in the top 66.6%, the remaining 50% are in the bottom 33.3%, as I stated in my letter?
  • Also, perhaps you were not aware of the existence of this resource, but here is a graph from the Straits Times illustrating the fact that only 10% of children from one-to-three room flats make it to university–which is to say, 90% of them don’t. http://www.straitstimes.com/STI/STIMEDIA/pdf/20110308/a10.pdf I look forward to your reply, Mr Leong. Thank you for taking the time to read this message.
  • we should, wherever possible, try to agree to disagree, as it is healthy to have and to encourage different viewpoints.
    • Weiye Loh
       
      Does that mean that every viewpoint can and should be accepted as correct to encourage differences? 
  • If I say I think it is fair in Singapore, because half of the bottom one-third of the people make it to the top two-thirds, it does not mean that someone can quote me and say that I said what I said because half the bottom one-third of people did not make it. I think it is alright to say that I do not agree entirely with what was said, because does it also mean on the flip side that half of the bottom one-third of the people did not make it? This is what I mean by quoting one out of context, by using statistics that I did not say, and implying that I did, or by innuendo.
  • Moreover, depending on the methodology, definition, sampling, etc., half of the bottom one-third of the people making it does not necessarily mean that half did not make it, because some may not be in the population for various reasons, like emigration, not turning up, transfer, or whether adjustments are made for the mobility of people up or down the social strata over time. If, for example, I did not use a particular statistic to state my case, I don’t think it is appropriate to quote me and say that you agree with me by citing statistics from a third-party source, like the MOE chart in the Straits Times article, instead of quoting the statistics that I actually cited.
  • I cannot find anything in any of the media reports to say with certainty that the Minister backed up his remarks with direct reference to the MOE chart. There is also nothing in the narrative stating that only 10 per cent of children from one-to-three room flats make it to university – which is to say, that 90 per cent of them don’t. The ‘90 per cent’ cannot be attributed to what the Minister said; at best it is the writer’s interpretation of the MOE chart.
  • Interesting exchange of letters. Samuel’s interpretation of the statistics provided by Ng Eng Hen and ST is correct. There is little doubt about it. While I can see where Leong Sze Hian is coming from, I don’t totally agree with him. Specifically, Samuel’s first statement (only ~10% of students living in 1-3 room flats make it to university) is directed at ST’s report that education is a good social leveller, not at Ng. It is therefore a valid point to make. (A quick arithmetic check of the disputed figures follows below.)
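For what it is worth, the disputed figure reduces to taking a complement, which a few lines make explicit. The numbers below are simply those quoted in the exchange, not independently verified.

```python
# "Top two-thirds" and "bottom third" partition the PSLE results,
# so the two shares of any group must sum to 100%.
share_in_top_two_thirds = 0.50  # "One in two scores in the top two-thirds at PSLE"
share_in_bottom_third = 1.0 - share_in_top_two_thirds

print(f"bottom-third pupils scoring in the bottom third: {share_in_bottom_third:.0%}")
# Prints 50%: Wee's restatement follows by complement. Leong's remaining
# objections concern attribution and sampling caveats, not this arithmetic.
```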