New Media Ethics 2009 course: Group items matching "Creationism" in title, tags, annotations or url

Weiye Loh

The Mysterious Decline Effect | Wired Science | Wired.com

  • Question #1: Does this mean I don’t have to believe in climate change? Me: I’m afraid not. One of the sad ironies of scientific denialism is that we tend to be skeptical of precisely the wrong kind of scientific claims. In poll after poll, Americans have dismissed two of the most robust and widely tested theories of modern science: evolution by natural selection and climate change. These are theories that have been verified in thousands of different ways by thousands of different scientists working in many different fields. (This doesn’t mean, of course, that such theories won’t change or get modified – the strength of science is that nothing is settled.) Instead of wasting public debate on creationism or the rhetoric of Senator Inhofe, I wish we’d spend more time considering the value of spinal fusion surgery, or second generation antipsychotics, or the verity of the latest gene association study. The larger point is that we need to do a better job of considering the context behind every claim. In 1951, the Harvard philosopher Willard Van Orman Quine published “Two Dogmas of Empiricism.” In the essay, Quine compared the truths of science to a spider’s web, in which the strength of the lattice depends upon its interconnectedness. (Quine: “The unit of empirical significance is the whole of science.”) One of the implications of Quine’s paper is that, when evaluating the power of a given study, we need to also consider the other studies and untested assumptions that it depends upon. Don’t just fixate on the effect size – look at the web. Unfortunately for the denialists, climate change and natural selection have very sturdy webs.
  • biases are not fraud. We sometimes forget that science is a human pursuit, mingled with all of our flaws and failings. (Perhaps that explains why an episode like Climategate gets so much attention.) If there’s a single theme that runs through the article it’s that finding the truth is really hard. It’s hard because reality is complicated, shaped by a surreal excess of variables. But it’s also hard because scientists aren’t robots: the act of observation is simultaneously an act of interpretation.
  • (As Paul Simon sang, “A man sees what he wants to see and disregards the rest.”) Most of the time, these distortions are unconscious – we don’t even know we are misperceiving the data. However, even when the distortion is intentional it still rarely rises to the level of outright fraud. Consider the story of Mike Rossner. He’s executive director of the Rockefeller University Press, and helps oversee several scientific publications, including The Journal of Cell Biology. In 2002, while trying to format a scientific image in Photoshop that was going to appear in one of the journals, Rossner noticed that the background of the image contained distinct intensities of pixels. “That’s a hallmark of image manipulation,” Rossner told me. “It means the scientist has gone in and deliberately changed what the data looks like. What’s disturbing is just how easy this is to do.” This led Rossner and his colleagues to begin analyzing every image in every accepted paper. They soon discovered that approximately 25 percent of all papers contained at least one “inappropriately manipulated” picture. Interestingly, the vast, vast majority of these manipulations (~99 percent) didn’t affect the interpretation of the results. Instead, the scientists seemed to be photoshopping the pictures for aesthetic reasons: perhaps a line on a gel was erased, or a background blur was deleted, or the contrast was exaggerated. In other words, they wanted to publish pretty images. That’s a perfectly understandable desire, but it gets problematic when that same basic instinct – we want our data to be neat, our pictures to be clean, our charts to be clear – is transposed across the entire scientific process.
  • One of the philosophy papers that I kept on thinking about while writing the article was Nancy Cartwright’s essay “Do the Laws of Physics State the Facts?” Cartwright used numerous examples from modern physics to argue that there is often a basic trade-off between scientific “truth” and experimental validity, so that the laws that are the most true are also the most useless. “Despite their great explanatory power, these laws [such as gravity] do not describe reality,” Cartwright writes. “Instead, fundamental laws describe highly idealized objects in models.”  The problem, of course, is that experiments don’t test models. They test reality.
  • Cartwright’s larger point is that many essential scientific theories – those laws that explain things – are not actually provable, at least in the conventional sense. This doesn’t mean that gravity isn’t true or real. There is, perhaps, no truer idea in all of science. (Feynman famously referred to gravity as the “greatest generalization achieved by the human mind.”) Instead, what the anomalies of physics demonstrate is that there is no single test that can define the truth. Although we often pretend that experiments and peer-review and clinical trials settle the truth for us – that we are mere passive observers, dutifully recording the results – the actuality of science is a lot messier than that. Richard Rorty said it best: “To say that we should drop the idea of truth as out there waiting to be discovered is not to say that we have discovered that, out there, there is no truth.” Of course, the very fact that the facts aren’t obvious, that the truth isn’t “waiting to be discovered,” means that science is intensely human. It requires us to look, to search, to plead with nature for an answer.
Weiye Loh

Rationally Speaking: The problem of replicability in science

  • The problem of replicability in science, by Massimo Pigliucci (illustration from xkcd)
  • In recent months much has been written about the apparent fact that a surprising, indeed disturbing, number of scientific findings cannot be replicated, or when replicated the effect size turns out to be much smaller than previously thought.
  • Arguably, the recent streak of articles on this topic began with one penned by David Freedman in The Atlantic, and provocatively entitled “Lies, Damned Lies, and Medical Science.” In it, the major character was John Ioannidis, the author of some influential meta-studies about the low degree of replicability and high number of technical flaws in a significant portion of published papers in the biomedical literature.
  • As Freedman put it in The Atlantic: “80 percent of non-randomized studies (by far the most common type) turn out to be wrong, as do 25 percent of supposedly gold-standard randomized trials, and as much as 10 percent of the platinum-standard large randomized trials.” Ioannidis himself was quoted uttering some sobering words for the medical community (and the public at large): “Science is a noble endeavor, but it’s also a low-yield endeavor. I’m not sure that more than a very small percentage of medical research is ever likely to lead to major improvements in clinical outcomes and quality of life. We should be very comfortable with that fact.”
  • Julia and I actually addressed this topic during a Rationally Speaking podcast, featuring as guest our friend Steve Novella, of Skeptics’ Guide to the Universe and Science-Based Medicine fame. But while Steve did quibble with the tone of the Atlantic article, he agreed that Ioannidis’ results are well known and accepted by the medical research community. Steve did point out that it should not be surprising that results get better and better as one moves toward more stringent protocols like large randomized trials, but it seems to me that one should be surprised (actually, appalled) by the fact that even there the percentage of flawed studies is high — not to mention the fact that most studies are in fact neither large nor properly randomized.
  • The second big recent blow to public perception of the reliability of scientific results is an article published in The New Yorker by Jonah Lehrer, entitled “The truth wears off.” Lehrer also mentions Ioannidis, but the bulk of his essay is about findings in psychiatry, psychology and evolutionary biology (and even in research on the paranormal!).
  • In these disciplines there are now several documented cases of results that were initially spectacularly positive — for instance the effects of second generation antipsychotic drugs, or the hypothesized relationship between a male’s body symmetry and the quality of his genes — that turned out to be increasingly difficult to replicate over time, with the original effect sizes being cut down dramatically, or even disappearing altogether.
  • As Lehrer concludes at the end of his article: “Such anomalies demonstrate the slipperiness of empiricism. Although many scientific ideas generate conflicting results and suffer from falling effect sizes, they continue to get cited in the textbooks and drive standard medical practice. Why? Because these ideas seem true. Because they make sense. Because we can’t bear to let them go. And this is why the decline effect is so troubling.”
  • None of this should actually be particularly surprising to any practicing scientist. If you have spent a significant time of your life in labs and reading the technical literature, you will appreciate the difficulties posed by empirical research, not to mention a number of issues such as the fact that few scientists ever actually bother to replicate someone else’s results, for the simple reason that there is no Nobel (or even funded grant, or tenured position) waiting for the guy who arrived second.
  • In the midst of this I was directed by a tweet by my colleague Neil deGrasse Tyson (who has also appeared on the RS podcast, though in a different context) to a recent ABC News article penned by John Allen Paulos, which meant to explain the decline effect in science.
  • Paulos’ article is indeed concise and on the mark (though several of the explanations he proposes were already brought up in both the Atlantic and New Yorker essays), but it doesn’t really make things much better.
  • Paulos suggests that one explanation for the decline effect is the well known statistical phenomenon of the regression toward the mean. This phenomenon is responsible, among other things, for a fair number of superstitions: you’ve probably heard of some athletes’ and other celebrities’ fear of being featured on the cover of a magazine after a particularly impressive series of accomplishments, because this brings “bad luck,” meaning that the following year one will not be able to repeat the performance at the same level. This is actually true, not because of magical reasons, but simply as a result of the regression to the mean: extraordinary performances are the result of a large number of factors that have to line up just right for the spectacular result to be achieved. The statistical chances of such an alignment repeating itself are low, so inevitably next year’s performance will likely be below par. Paulos correctly argues that this also explains some of the decline effect of scientific results: the first discovery might have been the result of a number of factors that are unlikely to repeat themselves in exactly the same way, thus reducing the effect size when the study is replicated.
  • Another major determinant of the unreliability of scientific results mentioned by Paulos is the well known problem of publication bias: crudely put, science journals (particularly the high-profile ones, like Nature and Science) are interested only in positive, spectacular, “sexy” results, which creates a powerful filter against negative or marginally significant results. What you see in science journals, in other words, isn’t a statistically representative sample of scientific results, but a highly biased one, in favor of positive outcomes. No wonder that when people try to repeat the feat they often come up empty handed. (A toy simulation at the end of this item’s notes shows how this filter, combined with regression to the mean, produces a decline effect.)
  • A third cause for the problem, not mentioned by Paulos but addressed in the New Yorker article, is the selective reporting of results by scientists themselves. This is essentially the same phenomenon as the publication bias, except that this time it is scientists themselves, not editors and reviewers, who don’t bother to submit for publication results that are either negative or not strongly conclusive. Again, the outcome is that what we see in the literature isn’t all the science that we ought to see. And it’s no good to argue that it is the “best” science, because the quality of scientific research is measured by the appropriateness of the experimental protocols (including the use of large samples) and of the data analyses — not by whether the results happen to confirm the scientist’s favorite theory.
  • The conclusion of all this is not, of course, that we should throw the baby (science) out with the bath water (bad or unreliable results). But scientists should also be under no illusion that these are rare anomalies that do not affect scientific research at large. Too much emphasis is being put on the “publish or perish” culture of modern academia, with the result that graduate students are explicitly instructed to go for the SPU’s — Smallest Publishable Units — when they have to decide how much of their work to submit to a journal. That way they maximize the number of their publications, which maximizes the chances of landing a postdoc position, and then a tenure track one, and then of getting grants funded, and finally of getting tenure. The result is that, according to statistics published by Nature, it turns out that about ⅓ of published studies are never cited (not to mention replicated!).
  • “Scientists these days tend to keep up the polite fiction that all science is equal. Except for the work of the misguided opponent whose arguments we happen to be refuting at the time, we speak as though every scientist’s field and methods of study are as good as every other scientist’s, and perhaps a little better. This keeps us all cordial when it comes to recommending each other for government grants. ... We speak piously of taking measurements and making small studies that will ‘add another brick to the temple of science.’ Most such bricks lie around the brickyard.”
    • Weiye Loh
       
      Written by John Platt in a "Science" article published in 1964
  • Most damning of all, however, is the potential effect that all of this may have on science’s already dubious reputation with the general public (think evolution-creation, vaccine-autism, or climate change).
  • “If we don’t tell the public about these problems, then we’re no better than non-scientists who falsely claim they can heal. If the drugs don’t work and we’re not sure how to treat something, why should we claim differently? Some fear that there may be less funding because we stop claiming we can prove we have miraculous treatments. But if we can’t really provide those miracles, how long will we be able to fool the public anyway? The scientific enterprise is probably the most fantastic achievement in human history, but that doesn’t mean we have a right to overstate what we’re accomplishing.”
  • Joseph T. Lapp said... But is any of this new for science? Perhaps science has operated this way all along, full of fits and starts, mostly duds. How do we know that this isn't the optimal way for science to operate? My issues are with the understanding of science that high school graduates have, and with the reporting of science.
    • Weiye Loh
       
      It's the media at fault again.
  • What seems to have emerged in recent decades is a change in the institutional setting that got science advancing spectacularly since the establishment of the Royal Society. Flaws in the system such as corporate funded research, pal-review instead of peer-review, publication bias, science entangled with policy advocacy, and suchlike, may be distorting the environment, making it less suitable for the production of good science, especially in some fields.
  • Remedies should exist, but they should evolve rather than being imposed on a reluctant sociological-economic science establishment driven by powerful motives such as professional advance or funding. After all, who or what would have the authority to impose those rules, other than the scientific establishment itself?
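A minimal simulation makes the mechanics behind these explanations concrete. The sketch below is my own illustration, not something from Pigliucci's post, and every number in it is an invented assumption: a small true effect, noisy studies, and journals that print only "spectacular" results. It shows the published literature inflating the effect and replications regressing toward the true mean, i.e. a decline effect manufactured by the filter alone.

```python
# Toy model of the decline effect: publication bias plus regression
# to the mean. All parameters are illustrative assumptions.
import random
import statistics

random.seed(42)

TRUE_EFFECT = 0.2   # assumed small real effect size
NOISE_SD = 0.5      # assumed between-study noise
THRESHOLD = 0.8     # assumed cutoff for a "sexy" publishable result

def run_study() -> float:
    """One study's estimate: the true effect plus random error."""
    return random.gauss(TRUE_EFFECT, NOISE_SD)

# Journals see many studies but print only the striking ones.
published = [e for e in (run_study() for _ in range(10_000)) if e > THRESHOLD]

# A replication is a fresh draw from the same distribution,
# so it regresses toward the true mean.
replications = [run_study() for _ in published]

print(f"true effect:             {TRUE_EFFECT:.2f}")
print(f"mean published effect:   {statistics.mean(published):.2f}")
print(f"mean replication effect: {statistics.mean(replications):.2f}")
```

With these assumed numbers the mean published effect comes out around five times the true effect, while the replications cluster near 0.2: the "decline" appears even though nothing about reality changed between the original studies and the replications.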
Weiye Loh

No Science please, we're Anthropologists « Critical Thinking « Skeptic North

  • The debate is between researchers in science-based anthropological disciplines like archaeology, physical anthropology and forensic anthropology — and anthropologists who focus on the more humanities-based issues like race, ethnicity and gender.
  • Those that defend the old mandate, members of the fields that are science based, are interested in relying on the scientific method to inform their theories about anthropology, and in ensuring that due diligence is done on new theories and that research is conducted based on sound principles. In opposition are members who view themselves as advocates and activists. As they see it, research on culture, race, and gender is only harmed by science as it represents the cold arm of colonial imperialism.
  • Viewing this as more than a simple cosmetic change, he compared the attacks and challenges on anthropology to creationism in that they both are “based on the rejection of rational argument and thought.”
  • When the American Anthropological Association attempted to clarify its position, it issued a statement in which it stated: “the Executive Board recognizes and endorses the crucial place of the scientific method in much anthropological research.” To further clarify matters, it went on to describe the discipline: “Anthropology is a holistic and expansive discipline that covers the full breadth of human history and culture.”
  • Damon Dozier, the association’s director of public affairs, is further quoted saying “We mean holistic in terms of the diversity of the discipline.”
  • Despite the attempts to head off a huge rift, there appears to be lingering doubt as to the direction the American Anthropological Association is going and even more concern that the field of anthropology is under siege from post-modern attacks on its science foundations.
  • One of the most important contributions of science to the world has been a method of inquiry that has proven itself unequalled in explaining the natural world. The scientific method is, and should, be foundational in any field where the goal is to explain the natural world.
  • The so-called “hard sciences” understand this. Where things get muddled is in the “soft sciences” like anthropology, history, and psychology. For some reason these fields have proven especially vulnerable to post-modernism and have fallen prey to the schizophrenic notion that science is “western” and that trying to use science to explain things is another branch of imperialism.
  • The so-called “soft sciences” are occasionally put in the position of making assumptions. When you have a hypothesis you want to test, you unfortunately can’t travel back in time and do an experiment. Therefore, relying on the evidence you already have and employing your critical thinking skills you formulate a rational assumption and await the opportunity to confirm or deny it. It’s not based on a “hunch” or conjured up from the imagination. It’s based on rational skepticism.
Weiye Loh

Breakthrough Europe: Towards a Social Theory of Climate Change

  • Lever-Tracy confronted sociologists head on about their worrisome silence on the issue. Why have sociologists failed to address the greatest and most overwhelming challenge facing modern society? Why have the figureheads of the discipline, such as Anthony Giddens and Ulrich Beck, so far refused to apply their seminal notions of structuration and the risk society to the issue?
  • Earlier, we re-published an important contribution by Ulrich Beck, the world-renowned German sociologist and a Breakthrough Senior Fellow. More recently, Current Sociology published a powerful response by Reiner Grundmann of Aston University and Nico Stehr of Zeppelin University.
  • sociologists should not rush into the discursive arena without asking some critical questions in advance, questions such as: What exactly could sociology contribute to the debate? And, is there something we urgently need that is not addressed by other disciplines or by political proposals?
  • The authors disagree with Lever-Tracy's observation that the lack of interest in climate change among sociologists is driven by a widespread suspicion of naturalistic explanations, teleological arguments and environmental determinism.
  • While conceding that Lever-Tracy's observation may be partially true, the authors argue that more important processes are at play, including cautiousness on the part of sociologists to step into a heavily politicized debate; methodological differences with the natural sciences; and sensitivity about locating climate change in the longue durée.
  • Secondly, while Lever-Tracy argues that "natural and social change are now in lockstep with each other, operating on the same scales," and that therefore a multidisciplinary approach is needed, Grundmann and Stehr suggest that the true challenge is interdisciplinarity, as opposed to multidisciplinarity.
  • Thirdly, and this is possibly the most striking observation of the article, Grundmann and Stehr challenge Lever-Tracy's argument that natural scientists have successfully made the case for anthropogenic climate change, and that therefore social scientists should cease to endlessly question this scientific consensus on the basis of a skeptical postmodern 'deconstructionism'.
  • As opposed to both Lever-Tracy's positivist view and the radical postmodern deconstructionist view, Grundmann and Stehr take the social constructivist view, which argues that every idea is socially constructed and therefore the product of human interpretation and communication. This raises the 'intractable' specters of discourse and framing, to which we will return in a second.
  • Finally, Lever-Tracy holds that climate change needs to be posited "firmly at the heart of the discipline." Grundmann and Stehr, however, emphasize that "if this is going to [be] more than wishful thinking, we need to carefully consider the prospects of such an enterprise."
  • The importance of framing climate change in a way that allows it to resonate with the concerns of the average citizen is an issue that the Breakthrough Institute has long emphasized. The apocalyptic politics of fear often associated with climate change, in particular, tends to have a counterproductive effect on public opinion. Realizing this, Grundmann and Stehr issue an important warning to sociologists: "the inherent alarmism in many social science contributions on climate change merely repeats the central message provided by mainstream media." In other words, it fails to provide the kind of distantiated observation needed to approach the issue with at least a mild degree of objectivity or impartiality.
  • While this tension is symptomatic of many social scientific attempts to get involved, we propose to study these very underlying assumptions. For example, we should ask: Does the dramatization of events lead to effective political responses? Do we need a politics of fear? Is scientific consensus instrumental for sound policies? And more generally, what are the relations between a changing technological infrastructure, social shifts and belief systems? What contribution can bottom-up initiatives have in fighting climate change? What roles are there for markets, hierarchies and voluntary action? How was it possible that the 'fight against climate change' rose from a marginal discourse to a hegemonic one (from heresy to dogma)? And will the discourse remain hegemonic or will too much public debate about climate change lead to 'climate change fatigue'?
  • In this respect, Grundmann and Stehr make another crucial observation: "the severity of a problem does not mean that we as sociologists should forget about our analytical apparatus." Bringing the analytical apparatus of sociology back in, the hunting season for positivist approaches to knowledge and nature is opened. Grundmann and Stehr consequently criticize not only Lever-Tracy's unspoken adherence to a positivist nature-society duality, taking instead a more dialectical Marxian approach to the relationship between man and his environment, but they also criticize her idea that incremental increases in our scientific knowledge of climate change and its impacts will automatically coalesce into successful and meaningful policy responses.
  • Political decisions about climate change are made on the basis of scientific research and a host of other (economic, political, cultural) considerations. Regarding the scientific dimension, it is a common perception (one that Lever-Tracy seems to share) that the more knowledge we have, the better the political response will be. This is the assumption of the linear model of policy-making that has been dominant in the past but debunked time and again (Godin, 2006). What we increasingly realize is that knowledge creation leads to an excess of information and 'objectivity' (Sarewitz, 2000). Even the consensual mechanisms of the IPCC lead to an increase in options because knowledge about climate change increases.
  • Instead, Grundmann and Stehr propose to look carefully at how we frame climate change socially and whether the hegemonic climate discourse is actually contributing to successful political action or hampering it. Defending this social constructivist approach from the unfounded allegation that it would play into the hands of the climate skeptics, the authors note that defining climate change as a social construction ... is not to diminish its importance, relevance, or reality. It simply means that sociologists study the process whereby something (like anthropogenic climate change) is transformed from a conjecture into an accepted fact. With regard to policy, we observe a near exclusive focus on carbon dioxide emissions. This framing has proven counterproductive, as the Hartwell paper and other sources demonstrate (see Eastin et al., 2010; Prins et al., 2010). Reducing carbon emissions in the short term is among the most difficult tasks. More progress could be made by a re-framing of the issue, not as an issue of human sinfulness, but of human dignity. [emphasis added]
  • These observations allow the authors to come full circle, arriving right back at their first observation about the real reasons why sociologists have so far kept silent on climate change. Somehow, "there seems to be the curious conviction that lest you want to be accused of helping the fossil fuel lobbies and the climate skeptics, you better keep quiet."
Weiye Loh

Religion: Faith in science : Nature News

  • The Templeton Foundation claims to be a friend of science. So why does it make so many researchers uneasy?
  • With a current endowment estimated at US$2.1 billion, the organization continues to pursue Templeton's goal of building bridges between science and religion. Each year, it doles out some $70 million in grants, more than $40 million of which goes to research in fields such as cosmology, evolutionary biology and psychology.
  • however, many scientists find it troubling — and some see it as a threat. Jerry Coyne, an evolutionary biologist at the University of Chicago, Illinois, calls the foundation "sneakier than the creationists". Through its grants to researchers, Coyne alleges, the foundation is trying to insinuate religious values into science. "It claims to be on the side of science, but wants to make faith a virtue," he says.
  • But other researchers, both with and without Templeton grants, say that they find the foundation remarkably open and non-dogmatic. "The Templeton Foundation has never in my experience pressured, suggested or hinted at any kind of ideological slant," says Michael Shermer, editor of Skeptic, a magazine that debunks pseudoscience, who was hired by the foundation to edit an essay series entitled 'Does science make belief in God obsolete?'
  • The debate highlights some of the challenges facing the Templeton Foundation after the death of its founder in July 2008, at the age of 95.
  • With the help of a $528-million bequest from Templeton, the foundation has been radically reframing its research programme. As part of that effort, it is reducing its emphasis on religion to make its programmes more palatable to the broader scientific community. Like many of his generation, Templeton was a great believer in progress, learning, initiative and the power of human imagination — not to mention the free-enterprise system that allowed him, a middle-class boy from Winchester, Tennessee, to earn billions of dollars on Wall Street. The foundation accordingly allocates 40% of its annual grants to programmes with names such as 'character development', 'freedom and free enterprise' and 'exceptional cognitive talent and genius'.
  • Unlike most of his peers, however, Templeton thought that the principles of progress should also apply to religion. He described himself as "an enthusiastic Christian" — but was also open to learning from Hinduism, Islam and other religious traditions. Why, he wondered, couldn't religious ideas be open to the type of constructive competition that had produced so many advances in science and the free market?
  • That question sparked Templeton's mission to make religion "just as progressive as medicine or astronomy".
  • Early Templeton prizes had nothing to do with science: the first went to the Catholic missionary Mother Teresa of Calcutta in 1973.
  • By the 1980s, however, Templeton had begun to realize that fields such as neuroscience, psychology and physics could advance understanding of topics that are usually considered spiritual matters — among them forgiveness, morality and even the nature of reality. So he started to appoint scientists to the prize panel, and in 1985 the award went to a research scientist for the first time: Alister Hardy, a marine biologist who also investigated religious experience. Since then, scientists have won with increasing frequency.
  • "There's a distinct feeling in the research community that Templeton just gives the award to the most senior scientist they can find who's willing to say something nice about religion," says Harold Kroto, a chemist at Florida State University in Tallahassee, who was co-recipient of the 1996 Nobel Prize in Chemistry and describes himself as a devout atheist.
  • Yet Templeton saw scientists as allies. They had what he called "the humble approach" to knowledge, as opposed to the dogmatic approach. "Almost every scientist will agree that they know so little and they need to learn," he once said.
  • Templeton wasn't interested in funding mainstream research, says Barnaby Marsh, the foundation's executive vice-president. Templeton wanted to explore areas — such as kindness and hatred — that were not well known and did not attract major funding agencies. Marsh says Templeton wondered, "Why is it that some conflicts go on for centuries, yet some groups are able to move on?"
  • Templeton's interests gave the resulting list of grants a certain New Age quality (See Table 1). For example, in 1999 the foundation gave $4.6 million for forgiveness research at the Virginia Commonwealth University in Richmond, and in 2001 it donated $8.2 million to create an Institute for Research on Unlimited Love (that is, altruism and compassion) at Case Western Reserve University in Cleveland, Ohio. "A lot of money wasted on nonsensical ideas," says Kroto. Worse, says Coyne, these projects are profoundly corrupting to science, because the money tempts researchers into wasting time and effort on topics that aren't worth it. If someone is willing to sell out for a million dollars, he says, "Templeton is there to oblige him".
  • At the same time, says Marsh, the 'dean of value investing', as Templeton was known on Wall Street, had no intention of wasting his money on junk science or unanswerables such as whether God exists. So before pursuing a scientific topic he would ask his staff to get an assessment from appropriate scholars — a practice that soon evolved into a peer-review process drawing on experts from across the scientific community.
  • Because Templeton didn't like bureaucracy, adds Marsh, the foundation outsourced much of its peer review and grant giving. In 1996, for example, it gave $5.3 million to the American Association for the Advancement of Science (AAAS) in Washington DC, to fund efforts that work with evangelical groups to find common ground on issues such as the environment, and to get more science into seminary curricula. In 2006, Templeton gave $8.8 million towards the creation of the Foundational Questions Institute (FQXi), which funds research on the origins of the Universe and other fundamental issues in physics, under the leadership of Anthony Aguirre, an astrophysicist at the University of California, Santa Cruz, and Max Tegmark, a cosmologist at the Massachusetts Institute of Technology in Cambridge.
  • But external peer review hasn't always kept the foundation out of trouble. In the 1990s, for example, Templeton-funded organizations gave book-writing grants to Guillermo Gonzalez, an astrophysicist now at Grove City College in Pennsylvania, and William Dembski, a philosopher now at the Southwestern Baptist Theological Seminary in Fort Worth, Texas. After obtaining the grants, both later joined the Discovery Institute — a think-tank based in Seattle, Washington, that promotes intelligent design. Other Templeton grants supported a number of college courses in which intelligent design was discussed. Then, in 1999, the foundation funded a conference at Concordia University in Mequon, Wisconsin, in which intelligent-design proponents confronted critics. Those awards became a major embarrassment in late 2005, during a highly publicized court fight over the teaching of intelligent design in schools in Dover, Pennsylvania. A number of media accounts of the intelligent design movement described the Templeton Foundation as a major supporter — a charge that Charles Harper, then senior vice-president, was at pains to deny.
  • Some foundation officials were initially intrigued by intelligent design, Harper told The New York Times. But disillusionment set in — and Templeton funding stopped — when it became clear that the theory was part of a political movement from the Christian right wing, not science. Today, the foundation website explicitly warns intelligent-design researchers not to bother submitting proposals: they will not be considered.
  • Avowedly antireligious scientists such as Coyne and Kroto see the intelligent-design imbroglio as a symptom of their fundamental complaint that religion and science should not mix at all. "Religion is based on dogma and belief, whereas science is based on doubt and questioning," says Coyne, echoing an argument made by many others. "In religion, faith is a virtue. In science, faith is a vice." The purpose of the Templeton Foundation is to break down that wall, he says — to reconcile the irreconcilable and give religion scholarly legitimacy.
  • Foundation officials insist that this is backwards: questioning is their reason for being. Religious dogma is what they are fighting. That does seem to be the experience of many scientists who have taken Templeton money. During the launch of FQXi, says Aguirre, "Max and I were very suspicious at first. So we said, 'We'll try this out, and the minute something smells, we'll cut and run.' It never happened. The grants we've given have not been connected with religion in any way, and they seem perfectly happy about that."
  • John Cacioppo, a psychologist at the University of Chicago, also had concerns when he started a Templeton-funded project in 2007. He had just published a paper with survey data showing that religious affiliation had a negative correlation with health among African-Americans — the opposite of what he assumed the foundation wanted to hear. He was bracing for a protest when someone told him to look at the foundation's website. They had displayed his finding on the front page. "That made me relax a bit," says Cacioppo.
  • Yet, even scientists who give the foundation high marks for openness often find it hard to shake their unease. Sean Carroll, a physicist at the California Institute of Technology in Pasadena, is willing to participate in Templeton-funded events — but worries about the foundation's emphasis on research into 'spiritual' matters. "The act of doing science means that you accept a purely material explanation of the Universe, that no spiritual dimension is required," he says.
  • It hasn't helped that Jack Templeton is much more politically and religiously conservative than his father was. The foundation shows no obvious rightwards trend in its grant-giving and other activities since John Templeton's death — and it is barred from supporting political activities by its legal status as a not-for-profit corporation. Still, many scientists find it hard to trust an organization whose president has used his personal fortune to support right-leaning candidates and causes such as the 2008 ballot initiative that outlawed gay marriage in California.
  • Scientists' discomfort with the foundation is probably inevitable in the current political climate, says Scott Atran, an anthropologist at the University of Michigan in Ann Arbor. The past 30 years have seen the growing power of the Christian religious right in the United States, the rise of radical Islam around the world, and religiously motivated terrorist attacks such as those in the United States on 11 September 2001. Given all that, says Atran, many scientists find it almost impossible to think of religion as anything but fundamentalism at war with reason.
  • the foundation has embraced the theme of 'science and the big questions' — an open-ended list that includes topics such as 'Does the Universe have a purpose?'
  • Towards the end of Templeton's life, says Marsh, he became increasingly concerned that this reaction was getting in the way of the foundation's mission: that the word 'religion' was alienating too many good scientists.
  • The peer-review and grant-making system has also been revamped: whereas in the past the foundation ran an informal mix of projects generated by Templeton and outside grant seekers, the system is now organized around an annual list of explicit funding priorities.
  • The foundation is still a work in progress, says Jack Templeton — and it always will be. "My father believed," he says, "we were all called to be part of an ongoing creative process. He was always trying to make people think differently." "And he always said, 'If you're still doing today what you tried to do two years ago, then you're not making progress.'" 
Weiye Loh

takchek (读书): When Scientific Research and Higher Education become just Political Football

  • A mere two years after the passage of the economic stimulus package, the now Republican-controlled House of Representatives has started swinging its budget-cutting axe at scientific research and higher education. One point stood out in the midst of all this "fiscal responsibility" talk: "The House bill does not specify cuts to five of the Office of Science's six programs, namely, basic energy sciences, high-energy physics, nuclear physics, fusion energy sciences, and advanced scientific computing. However, it explicitly whacks funding for the biological and environmental research program from $588 million to $302 million, a 49% reduction that would effectively zero out the program for the remainder of the year. The program supports much of DOE's climate and bioenergy research and in the past has funded much of the federal government's work on decoding the human genome." (Science, 25 February 2011: Vol. 331 no. 6020, pp. 997-998. DOI: 10.1126/science.331.6020.997) Do the terms Big Oil, Creationism/Intelligent Design come to your mind?
  • In other somewhat related news, tenure rights are being weakened in Louisiana and state legislatures are trying to have greater control over how colleges are run. It is hard not to see that there seems to be a coordinated assault on academia (presumably since many academics are seen by the Republican right as leftist liberals). Lawmakers are inserting themselves even more directly into the classroom in South Carolina, where a proposal would require professors to teach a minimum of nine credit hours per semester. "I think we need to have professors in the classroom and not on sabbatical and out researching and doing things to that effect," State Rep. Murrell G. Smith Jr., a Republican, told the Associated Press. I think they are attempting to turn research universities into trade/vocational schools.
Weiye Loh

How We Know by Freeman Dyson | The New York Review of Books

  • Another example illustrating the central dogma is the French optical telegraph.
  • The telegraph was an optical communication system with stations consisting of large movable pointers mounted on the tops of sixty-foot towers. Each station was manned by an operator who could read a message transmitted by a neighboring station and transmit the same message to the next station in the transmission line.
  • The distance between neighbors was about seven miles. Along the transmission lines, optical messages in France could travel faster than drum messages in Africa. When Napoleon took charge of the French Republic in 1799, he ordered the completion of the optical telegraph system to link all the major cities of France from Calais and Paris to Toulon and onward to Milan. The telegraph became, as Claude Chappe had intended, an important instrument of national power. Napoleon made sure that it was not available to private users.
  • Unlike the drum language, which was based on spoken language, the optical telegraph was based on written French. Chappe invented an elaborate coding system to translate written messages into optical signals. Chappe had the opposite problem from the drummers. The drummers had a fast transmission system with ambiguous messages. They needed to slow down the transmission to make the messages unambiguous. Chappe had a painfully slow transmission system with redundant messages. The French language, like most alphabetic languages, is highly redundant, using many more letters than are needed to convey the meaning of a message. Chappe’s coding system allowed messages to be transmitted faster. Many common phrases and proper names were encoded by only two optical symbols, with a substantial gain in speed of transmission. The composer and the reader of the message had code books listing the message codes for eight thousand phrases and names. For Napoleon it was an advantage to have a code that was effectively cryptographic, keeping the content of the messages secret from citizens along the route.
  • After these two historical examples of rapid communication in Africa and France, the rest of Gleick’s book is about the modern development of information technology.
  • The modern history is dominated by two Americans, Samuel Morse and Claude Shannon. Samuel Morse was the inventor of Morse Code. He was also one of the pioneers who built a telegraph system using electricity conducted through wires instead of optical pointers deployed on towers. Morse launched his electric telegraph in 1838 and perfected the code in 1844. His code used short and long pulses of electric current to represent letters of the alphabet.
  • Morse was ideologically at the opposite pole from Chappe. He was not interested in secrecy or in creating an instrument of government power. The Morse system was designed to be a profit-making enterprise, fast and cheap and available to everybody. At the beginning the price of a message was a quarter of a cent per letter. The most important users of the system were newspaper correspondents spreading news of local events to readers all over the world. Morse Code was simple enough that anyone could learn it. The system provided no secrecy to the users. If users wanted secrecy, they could invent their own secret codes and encipher their messages themselves. The price of a message in cipher was higher than the price of a message in plain text, because the telegraph operators could transcribe plain text faster. It was much easier to correct errors in plain text than in cipher. (A toy Morse encoder among the sketches after this list illustrates the code table.)
  • Claude Shannon was the founding father of information theory. For a hundred years after the electric telegraph, other communication systems such as the telephone, radio, and television were invented and developed by engineers without any need for higher mathematics. Then Shannon supplied the theory to understand all of these systems together, defining information as an abstract quantity inherent in a telephone message or a television picture. Shannon brought higher mathematics into the game.
  • When Shannon was a boy growing up on a farm in Michigan, he built a homemade telegraph system using Morse Code. Messages were transmitted to friends on neighboring farms, using the barbed wire of their fences to conduct electric signals. When World War II began, Shannon became one of the pioneers of scientific cryptography, working on the high-level cryptographic telephone system that allowed Roosevelt and Churchill to talk to each other over a secure channel. Shannon’s friend Alan Turing was also working as a cryptographer at the same time, in the famous British Enigma project that successfully deciphered German military codes. The two pioneers met frequently when Turing visited New York in 1943, but they belonged to separate secret worlds and could not exchange ideas about cryptography.
  • In 1945 Shannon wrote a paper, “A Mathematical Theory of Cryptography,” which was stamped SECRET and never saw the light of day. He published in 1948 an expurgated version of the 1945 paper with the title “A Mathematical Theory of Communication.” The 1948 version appeared in the Bell System Technical Journal, the house journal of the Bell Telephone Laboratories, and became an instant classic. It is the founding document for the modern science of information. After Shannon, the technology of information raced ahead, with electronic computers, digital cameras, the Internet, and the World Wide Web.
  • According to Gleick, the impact of information on human affairs came in three installments: first the history, the thousands of years during which people created and exchanged information without the concept of measuring it; second the theory, first formulated by Shannon; third the flood, in which we now live.
  • The event that made the flood plainly visible occurred in 1965, when Gordon Moore stated Moore’s Law. Moore was an electrical engineer, co-founder of the Intel Corporation, a company that manufactured components for computers and other electronic gadgets. His law said that the price of electronic components would decrease and their numbers would increase by a factor of two every eighteen months. This implied that the price would decrease and the numbers would increase by a factor of a hundred every decade. Moore’s prediction of continued growth has turned out to be astonishingly accurate during the forty-five years since he announced it. In these four and a half decades, the price has decreased and the numbers have increased by a factor of a billion, nine powers of ten. Nine powers of ten are enough to turn a trickle into a flood. (The doubling arithmetic is checked in a short calculation after this list.)
  • Gordon Moore was in the hardware business, making hardware components for electronic machines, and he stated his law as a law of growth for hardware. But the law applies also to the information that the hardware is designed to embody. The purpose of the hardware is to store and process information. The storage of information is called memory, and the processing of information is called computing. The consequence of Moore’s Law for information is that the price of memory and computing decreases and the available amount of memory and computing increases by a factor of a hundred every decade. The flood of hardware becomes a flood of information.
  • In 1949, one year after Shannon published the rules of information theory, he drew up a table of the various stores of memory that then existed. The biggest memory in his table was the US Library of Congress, which he estimated to contain one hundred trillion bits of information. That was at the time a fair guess at the sum total of recorded human knowledge. Today a memory disc drive storing that amount of information weighs a few pounds and can be bought for about a thousand dollars. Information, otherwise known as data, pours into memories of that size or larger, in government and business offices and scientific laboratories all over the world. Gleick quotes the computer scientist Jaron Lanier describing the effect of the flood: “It’s as if you kneel to plant the seed of a tree and it grows so fast that it swallows your whole town before you can even rise to your feet.”
  • On December 8, 2010, Gleick published on The New York Review’s blog an illuminating essay, “The Information Palace.” It was written too late to be included in his book. It describes the historical changes of meaning of the word “information,” as recorded in the latest quarterly online revision of the Oxford English Dictionary. The word first appears in 1386 in a parliamentary report with the meaning “denunciation.” The history ends with the modern usage, “information fatigue,” defined as “apathy, indifference or mental exhaustion arising from exposure to too much information.”
  • The consequences of the information flood are not all bad. One of the creative enterprises made possible by the flood is Wikipedia, started ten years ago by Jimmy Wales. Among my friends and acquaintances, everybody distrusts Wikipedia and everybody uses it. Distrust and productive use are not incompatible. Wikipedia is the ultimate open source repository of information. Everyone is free to read it and everyone is free to write it. It contains articles in 262 languages written by several million authors. The information that it contains is totally unreliable and surprisingly accurate. It is often unreliable because many of the authors are ignorant or careless. It is often accurate because the articles are edited and corrected by readers who are better informed than the authors.
  • Jimmy Wales hoped when he started Wikipedia that the combination of enthusiastic volunteer writers with open source information technology would cause a revolution in human access to knowledge. The rate of growth of Wikipedia exceeded his wildest dreams. Within ten years it has become the biggest storehouse of information on the planet and the noisiest battleground of conflicting opinions. It illustrates Shannon’s law of reliable communication. Shannon’s law says that accurate transmission of information is possible in a communication system with a high level of noise. Even in the noisiest system, errors can be reliably corrected and accurate information transmitted, provided that the transmission is sufficiently redundant. That is, in a nutshell, how Wikipedia works. (A toy majority-vote decoder among the sketches after this list shows redundancy beating noise.)
  • The information flood has also brought enormous benefits to science. The public has a distorted view of science, because children are taught in school that science is a collection of firmly established truths. In fact, science is not a collection of truths. It is a continuing exploration of mysteries. Wherever we go exploring in the world around us, we find mysteries. Our planet is covered by continents and oceans whose origin we cannot explain. Our atmosphere is constantly stirred by poorly understood disturbances that we call weather and climate. The visible matter in the universe is outweighed by a much larger quantity of dark invisible matter that we do not understand at all. The origin of life is a total mystery, and so is the existence of human consciousness. We have no clear idea how the electrical discharges occurring in nerve cells in our brains are connected with our feelings and desires and actions.
  • Even physics, the most exact and most firmly established branch of science, is still full of mysteries. We do not know how much of Shannon’s theory of information will remain valid when quantum devices replace classical electric circuits as the carriers of information. Quantum devices may be made of single atoms or microscopic magnetic circuits. All that we know for sure is that they can theoretically do certain jobs that are beyond the reach of classical devices. Quantum computing is still an unexplored mystery on the frontier of information theory. Science is the sum total of a great multitude of mysteries. It is an unending argument between a great multitude of voices. It resembles Wikipedia much more than it resembles the Encyclopaedia Britannica.
  • The rapid growth of the flood of information in the last ten years made Wikipedia possible, and the same flood made twenty-first-century science possible. Twenty-first-century science is dominated by huge stores of information that we call databases. The information flood has made it easy and cheap to build databases. One example of a twenty-first-century database is the collection of genome sequences of living creatures belonging to various species from microbes to humans. Each genome contains the complete genetic information that shaped the creature to which it belongs. The genome database is rapidly growing and is available for scientists all over the world to explore. Its origin can be traced to the year 1939, when Shannon wrote his Ph.D. thesis with the title “An Algebra for Theoretical Genetics.”
  • Shannon was then a graduate student in the mathematics department at MIT. He was only dimly aware of the possible physical embodiment of genetic information. The true physical embodiment of the genome is the double helix structure of DNA molecules, discovered by Francis Crick and James Watson fourteen years later. In 1939 Shannon understood that the basis of genetics must be information, and that the information must be coded in some abstract algebra independent of its physical embodiment. Without any knowledge of the double helix, he could not hope to guess the detailed structure of the genetic code. He could only imagine that in some distant future the genetic information would be decoded and collected in a giant database that would define the total diversity of living creatures. It took only sixty years for his dream to come true.
  • In the twentieth century, genomes of humans and other species were laboriously decoded and translated into sequences of letters in computer memories. The decoding and translation became cheaper and faster as time went on, the price decreasing and the speed increasing according to Moore’s Law. The first human genome took fifteen years to decode and cost about a billion dollars. Now a human genome can be decoded in a few weeks and costs a few thousand dollars. Around the year 2000, a turning point was reached, when it became cheaper to produce genetic information than to understand it. Now we can pass a piece of human DNA through a machine and rapidly read out the genetic information, but we cannot read out the meaning of the information. We shall not fully understand the information until we understand in detail the processes of embryonic development that the DNA orchestrated to make us what we are.
  • The explosive growth of information in our human society is a part of the slower growth of ordered structures in the evolution of life as a whole. Life has for billions of years been evolving with organisms and ecosystems embodying increasing amounts of information. The evolution of life is a part of the evolution of the universe, which also evolves with increasing amounts of information embodied in ordered structures, galaxies and stars and planetary systems. In the living and in the nonliving world, we see a growth of order, starting from the featureless and uniform gas of the early universe and producing the magnificent diversity of weird objects that we see in the sky and in the rain forest. Everywhere around us, wherever we look, we see evidence of increasing order and increasing information. The technology arising from Shannon’s discoveries is only a local acceleration of the natural growth of information.
  • Lord Kelvin, one of the leading physicists of the nineteenth century, promoted the heat death dogma, predicting that the flow of heat from warmer to cooler objects will result in a decrease of temperature differences everywhere, until all temperatures ultimately become equal. Life needs temperature differences, to avoid being stifled by its waste heat. So life will disappear.
  • Thanks to the discoveries of astronomers in the twentieth century, we now know that the heat death is a myth. The heat death can never happen, and there is no paradox. The best popular account of the disappearance of the paradox is a chapter, “How Order Was Born of Chaos,” in the book Creation of the Universe, by Fang Lizhi and his wife Li Shuxian.2 Fang Lizhi is doubly famous as a leading Chinese astronomer and a leading political dissident. He is now pursuing his double career at the University of Arizona.
  • The belief in a heat death was based on an idea that I call the cooking rule. The cooking rule says that a piece of steak gets warmer when we put it on a hot grill. More generally, the rule says that any object gets warmer when it gains energy, and gets cooler when it loses energy. Humans have been cooking steaks for thousands of years, and nobody ever saw a steak get colder while cooking on a fire. The cooking rule is true for objects small enough for us to handle. If the cooking rule is always true, then Lord Kelvin’s argument for the heat death is correct.
  • The cooking rule is not true, however, for objects of astronomical size, for which gravitation is the dominant form of energy. The sun is a familiar example. As the sun loses energy by radiation, it becomes hotter and not cooler. Since the sun is made of compressible gas squeezed by its own gravitation, loss of energy causes it to become smaller and denser, and the compression causes it to become hotter. For almost all astronomical objects, gravitation dominates, and they have the same unexpected behavior. Gravitation reverses the usual relation between energy and temperature. In the domain of astronomy, when heat flows from hotter to cooler objects, the hot objects get hotter and the cool objects get cooler. As a result, temperature differences in the astronomical universe tend to increase rather than decrease as time goes on. There is no final state of uniform temperature, and there is no heat death. Gravitation gives us a universe hospitable to life. Information and order can continue to grow for billions of years in the future, as they have evidently grown in the past. (A two-line derivation of this inverted behavior, via the virial theorem, follows these notes.)
  • The vision of the future as an infinite playground, with an unending sequence of mysteries to be understood by an unending sequence of players exploring an unending supply of information, is a glorious vision for scientists. Scientists find the vision attractive, since it gives them a purpose for their existence and an unending supply of jobs. The vision is less attractive to artists and writers and ordinary people. Ordinary people are more interested in friends and family than in science. Ordinary people may not welcome a future spent swimming in an unending flood of information.
  • A darker view of the information-dominated universe was described in a famous story, “The Library of Babel,” by Jorge Luis Borges in 1941.3 Borges imagined his library, with an infinite array of books and shelves and mirrors, as a metaphor for the universe.
  • Gleick’s book has an epilogue entitled “The Return of Meaning,” expressing the concerns of people who feel alienated from the prevailing scientific culture. The enormous success of information theory came from Shannon’s decision to separate information from meaning. His central dogma, “Meaning is irrelevant,” declared that information could be handled with greater freedom if it was treated as a mathematical abstraction independent of meaning. The consequence of this freedom is the flood of information in which we are drowning. The immense size of modern databases gives us a feeling of meaninglessness. Information in such quantities reminds us of Borges’s library extending infinitely in all directions. It is our task as humans to bring meaning back into this wasteland. As finite creatures who think and feel, we can create islands of meaning in the sea of information. Gleick ends his book with Borges’s image of the human condition: “We walk the corridors, searching the shelves and rearranging them, looking for lines of meaning amid leagues of cacophony and incoherence, reading the history of the past and of the future, collecting our thoughts and collecting the thoughts of others, and every so often glimpsing mirrors, in which we may recognize creatures of the information.”
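As a rough check on the sequencing-cost collapse described in these notes (the billion-dollar and few-thousand-dollar figures come from the annotation itself; the eight-year gap between the two price points is my own assumption), a few lines of arithmetic show the implied halving time:

```python
import math

# Figures quoted in the notes above (rough, illustrative):
first_cost = 1e9     # ~$1 billion for the first human genome
recent_cost = 3e3    # ~a few thousand dollars per genome now
years_between = 8    # assumed gap between the two price points

halvings = math.log2(first_cost / recent_cost)   # ~18.3 cost halvings
halving_time = years_between * 12 / halvings     # in months

print(f"{halvings:.1f} halvings, one every {halving_time:.1f} months")
```

Even with generous error bars on the assumed figures, the implied halving time of roughly five months is far shorter than Moore's Law's two-year cadence, which is exactly why producing genetic information became cheaper than understanding it.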
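The failure of the cooking rule for self-gravitating objects, noted a few annotations above, can be derived in two lines from the virial theorem. A minimal sketch, assuming an idealized bound system in virial equilibrium whose temperature tracks its kinetic energy:

```latex
\[
2K + U = 0 \;\Longrightarrow\; E = K + U = -K ,
\]
\[
T \propto K = -E \quad\text{so}\quad dE < 0 \;\Longrightarrow\; dK > 0 \;\Longrightarrow\; dT > 0 .
\]
```

Radiating energy away makes the kinetic energy, and hence the temperature, rise: such a system has a negative heat capacity, which is why the sun heats up as it loses energy and why kitchen-scale thermodynamics does not extrapolate to the stars.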
Weiye Loh

Churnalism or news? How PRs have taken over the media | Media | The Guardian - 0 views

  • The website, churnalism.com, created by charity the Media Standards Trust, allows readers to paste press releases into a "churn engine". It then compares the text with a constantly updated database of more than 3m articles. The results, which give articles a "churn rating", show the percentage of any given article that has been reproduced from publicity material. The Guardian was given exclusive access to churnalism.com prior to launch. It revealed how all media organisations are at times simply republishing, verbatim, material sent to them by marketing companies and campaign groups. (A toy sketch of how such an overlap score might be computed follows these annotations.)
  • Meanwhile, an independent film-maker, Chris Atkins, has revealed how he duped the BBC into running an entirely fictitious story about Downing Street's new cat to coincide with the site's launch.

    The director created a Facebook page in the name of a fictitious character, "Tim Sutcliffe", who claimed the cat – which came from Battersea Cats Home – had belonged to his aunt Margaret. The story appeared in the Daily Mail and Metro, before receiving a prominent slot on BBC Radio 5 Live.

    [Audio: BBC Radio 5 Live's Gaby Logan talks about the fictitious cat story]

    Atkins, who was not involved in creating churnalism.com, uses spoof stories to highlight the failure of journalists to corroborate stories. He was behind an infamous prank last year that led to the BBC running a news package on a hoax YouTube video purporting to show urban foxhunters.

  • The creation of churnalism.com is likely to unnerve overworked journalists and the press officers who feed them. "People don't realise how much churn they're being fed every day," said Martin Moore, director of the trust, which seeks to improve standards in news. "Hopefully this will be an eye-opener."
  • ...2 more annotations...
  • Interestingly, all media outlets appear particularly susceptible to PR material disseminated by supermarkets: the Mail appears to have a particular appetite for publicity from Asda and Tesco, while the Guardian favours Waitrose releases.
  • Moore said one unexpected discovery has been that the BBC news website appears particularly prone to churning publicity material. "Part of the reason is presumably because they feel a duty to put out so many government pronouncements," Moore said. "But the BBC also has a lot to produce in regions that the newspapers don't cover."
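The annotations above describe the churn engine only by its output, a percentage of an article reproduced from a press release; the site's actual matching algorithm is not given here. The following is therefore only an illustrative sketch of one way such an overlap score could be computed, using shared word n-grams; the function names and the 15-word window are my own assumptions, not the Media Standards Trust's method:

```python
def ngrams(text, n=15):
    """Return the set of n-word shingles in a text (case-folded)."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def churn_rating(press_release, article, n=15):
    """Fraction of the article's n-grams that also occur in the press release.

    A crude stand-in for a churn rating: 1.0 means every n-word run of the
    article appears verbatim in the release; 0.0 means none do.
    """
    article_shingles = ngrams(article, n)
    if not article_shingles:
        return 0.0
    shared = article_shingles & ngrams(press_release, n)
    return len(shared) / len(article_shingles)

# Usage: churn_rating(open("release.txt").read(), open("article.txt").read())
```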
Weiye Loh

Report: Piracy a "global pricing problem" with only one solution - 0 views

  • Over the last three years, 35 researchers contributed to the Media Piracy Project, released last week by the Social Science Research Council. Their mission was to examine media piracy in emerging economies, which account for most of the world's population, and to find out just how and why piracy operates in places like Russia, Mexico, and India.
  • Their conclusion is not that citizens of such piratical societies are somehow morally deficient or opposed to paying for content. Instead, they write that “high prices for media goods, low incomes, and cheap digital technologies are the main ingredients of global media piracy. If piracy is ubiquitous in most parts of the world, it is because these conditions are ubiquitous.”
  • When legitimate CDs, DVDs, and computer software cost five to ten times more (relative to local incomes) than they do in the US and Europe, simply ratcheting up copyright enforcement won't do enough to fix the problem. In the view of the report's authors, the only real solution is the creation of local companies that “actively compete on price and services for local customers” as they sell movies, music, and more.
  • ...7 more annotations...
  • Some markets have local firms that compete on price to offer legitimate content (think the US, which has companies like Hulu, Netflix, Apple, and Microsoft that compete to offer legal video content). But the authors conclude that, in most of the world, legitimate copyrighted goods are only distributed by huge multinational corporations whose dominant goals are not to service a large part of local markets but to “protect the pricing structure in the high-income countries that generate most of their profits.”
  • This might increase profits globally, but it has led to disaster in many developing economies, where piracy may run north of 90 percent. Given access to cheap digital tools, but charged terrific amounts of money for legitimate versions of content, users choose piracy.
  • In Russia, for instance, researchers noted that legal versions of the film The Dark Knight went for $15. That price, akin to what a US buyer would pay, might sound reasonable until you realize that Russians make far less money in a year than US workers do. As a percentage of their wages, that $15 price is actually equivalent to a US consumer dropping $75 on the film. Pirate versions can be had for one-third the price. (The income adjustment is made explicit in the short sketch after these annotations.)
  • Simple crackdowns on pirate behavior won't work in the absence of pricing and other reforms, say the report's authors (who also note that even "developed" economies routinely pirate TV shows and movies that are not made legally available to them for days, weeks, or months after they originally appear elsewhere).
  • The "strong moralization of the debate” makes it difficult to discuss issues beyond enforcement, however, and the authors slam the content companies for lacking any credible "endgame" to their constant requests for more civil and police powers in the War on Piracy.
  • Piracy is a “signal of unmet consumer demand.”
  • Our studies raise concerns that it may be a long time before such accommodations to reality reach the international policy arena. Hardline enforcement positions may be futile at stemming the tide of piracy, but the United States bears few of the costs of such efforts, and US companies reap most of the modest benefits. This is a recipe for continued US pressure on developing countries, very possibly long after media business models in the United States and other high-income countries have changed.
  •  
    A major new report from a consortium of academic researchers concludes that media piracy can't be stopped through "three strikes" Internet disconnections, Web censorship, more police powers, higher statutory damages, or tougher criminal penalties. That's because the piracy of movies, music, video games, and software is "better described as a global pricing problem." And the only way to solve it is by changing the price.
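The Dark Knight comparison above is a simple proportional adjustment of price by income. A minimal sketch; the absolute income figures are invented placeholders, and only their 5:1 ratio, implied by the $15-to-$75 comparison in the annotation, matters:

```python
def income_adjusted_price(local_price, reference_income, local_income):
    """Express a local price as the equivalent burden at a reference income."""
    return local_price * (reference_income / local_income)

# Legal DVD at $15 where incomes are ~1/5 of the reference economy's:
print(income_adjusted_price(15, reference_income=50_000, local_income=10_000))      # 75.0
# Pirate copy at one-third the legal price:
print(income_adjusted_price(15 / 3, reference_income=50_000, local_income=10_000))  # 25.0
```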
Weiye Loh

Diary of A Singaporean Mind: Nuclear Crisis : Separating Hyperbole from Reality.... - 0 views

  • The media and pundits stepped on the "fear creation accelerator", focussing on the possibility of disastrous outcomes while ignoring possible solutions and options.
  • Nobody can say for sure where this crisis is headed. As of today, the risk of a total meltdown has been reduced. However, if one was listening to some segments of the media earlier this week, disaster was the only possible outcome. Fear and panic itself would have caused a disaster: imagine the mess created by millions fleeing Tokyo in a haphazard manner - the sick, old and invalid left behind, food & water distribution disrupted - it would have led to far more deaths than the worst-case meltdown, which would have ended with the reactors being entombed. It also shows us the importance of leadership we can trust - the Japanese Minister Yukio Edano held five press conferences every day [Link] to update the nation on the dynamic situation (compare that with the initial handling of the SARS outbreak).
  • I hope the Japanese succeed in getting the nuclear reactors under control. Extraordinary crisis requires extraordinary leadership, extraordinary sacrifice and extraordinary courage. In the confusion and fear, it is hard for people not to panic and flee, but most of the Japanese in Tokyo stayed calm despite all sorts of scares. If another group of people were put through a crisis, the response might be completely different.
  • ...1 more annotation...
  • There is a tendency to conclude that govts with the best expert advice have made this decision because there is a real danger of something sinister happening. But remember govts are also under pressure to act because they are made up of politicians - also, they may be making precautionary moves because they have little to lose and have to be seen as being pro-active. How real is the danger of harmful radiation reaching Tokyo, and should you leave if you're in Tokyo? There were many people doing a "wait and see" before Wednesday, but once the US & UK govts called for a pull-out, the fear factor rose several notches, and if you're a Japanese in Tokyo watching all the foreigners "abandoning" your city, you start to feel some anxiety and later panic. One EU official used the word "apocalypse" [Link] to describe the situation in Japan and the fear index hit the roof... then a whole herd of experts came out to paint more dire scenarios, saying the Japanese have lost all control of the nuclear plants. All this led the public to think that calamity is the most likely outcome of the unfolding saga, and if you make a decision based on all this, you will just run for the exits if you're in Tokyo. All this is happening while the Japanese govt is trying to calm the people and prevent pandemonium after the triple disaster hit the country. In China, people have emptied the supermarket shelves of iodized salt because of media reports that consuming iodine can block radioactive iodine from being absorbed by the thyroid gland, where it can cause thyroid cancer. There are also reports of people getting ill after ingesting iodine pills out of fear of radiation.
Weiye Loh

Land Destroyer: Alternative Economics - 0 views

  • Peer to peer file sharing (P2P) has made media distribution free and has become the bane of media monopolies. P2P file sharing means digital files can be copied and distributed at no cost. CDs, DVDs, and other older forms of holding media are no longer necessary, nor is the cost involved in making them or distributing them along a traditional logistical supply chain. Disc burners, however, allow users the ability to create their own physical copies at a fraction of the cost of buying the media from the stores. Supply and demand is turned on its head: the more popular a certain file becomes via demand, the more of it is available for sharing, and the easier it is to obtain. Supply and demand increase in tandem towards a lower "price" of obtaining the said file. Consumers demand more as price decreases. Producers naturally want to produce more of something as price increases. Somewhere in between, consumers and producers meet at the market price or "market equilibrium." P2P technology eliminates material scarcity, thus the more a file is in demand, the more people end up downloading it, and the easier it is for others to find it and download it. Consider the implications this would have if technology made physical objects as easy to "share" as information is now. (A toy simulation of this inverted supply curve follows these annotations.)
  • In the end, it is not government regulations, legal contrivances, or licenses that govern information, but rather the free market mechanism commonly referred to as Adam Smith's self regulating "Invisible Hand of the Market." In other words, people selfishly seeking accurate information for their own benefit encourage producers to provide the best possible information to meet their demand. While this is not possible in a monopoly, particularly the corporate media monopoly of the "left/right paradigm" of false choice, it is inevitable in the field of real competition that now exists online due to information technology.
  • Compounding the establishment's troubles are cheaper cameras and cheaper, more capable software for 3D graphics, editing, mixing, and other post production tasks, allowing for the creation of an alternative publishing, audio and video industry. "Underground" counter-corporate music and film has been around for a long time but through the combination of technology and the zealous corporate lawyers disenfranchising a whole new generation that now seeks an alternative, it is truly coming of age.
  • ...3 more annotations...
  • With a growing community of people determined to become collaborative producers rather than fit into the producer/consumer paradigm, and 3D files for physical objects already being shared like movies and music, the implications are profound. Products, and the manufacturing technology used to make them, will continue to drop in price and become easier for individuals rather than large corporations to make, just as media is now shifting into the hands of the common people. And like the shift of information, industry will move from the elite and their agenda of preserving their power, to the end of empowering the people.
  • In a future alternative economy where everyone is a collaborative designer, producer, and manufacturer instead of passive consumers and when problems like "global climate change," "overpopulation," and "fuel crises" cross our path, we will counter them with technical solutions, not political indulgences like carbon taxes, and not draconian decrees like "one-child policies."
  • We will become the literal architects of our own future in this "personal manufacturing" revolution. While these technologies may still appear primitive, or somewhat "useless" or "impractical", we must remember where our personal computers stood on the eve of the dawning of the information age and how quickly they changed our lives. And while many of us may be unaware of this unfolding revolution, you can bet the globalists, power brokers, and all those that stand to lose from it not only see it but are already actively fighting against it. Understandably, it takes some technical know-how to jump into the personal manufacturing revolution. In part 2 of "Alternative Economics" we will explore real-world "low-tech" solutions to becoming self-sufficient and local, and rediscover the empowerment granted by doing so.
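As flagged in the first annotation, the inverted supply curve can be made concrete with a toy simulation. Every constant and the linear reseeding rule below are invented for illustration; this is not drawn from any real P2P measurement:

```python
def simulate_p2p(steps=10, demand_per_step=100, reseed_fraction=0.2):
    """Toy model: downloads create new seeders, and the effort ('price') of
    obtaining the file falls as seeders multiply, the reverse of a
    conventional supply curve for scarce goods."""
    seeders = 1
    rows = []
    for t in range(steps):
        cost = 1.0 / seeders                               # abstract acquisition effort
        seeders += int(demand_per_step * reseed_fraction)  # some downloaders reseed
        rows.append((t, seeders, round(cost, 4)))
    return rows

for t, seeders, cost in simulate_p2p():
    print(f"step {t}: seeders={seeders}, cost={cost}")
```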
Weiye Loh

McKinsey & Company - Clouds, big data, and smart assets: Ten tech-enabled business trends to watch - 0 views

  • 1. Distributed cocreation moves into the mainstream. In the past few years, the ability to organise communities of Web participants to develop, market, and support products and services has moved from the margins of business practice to the mainstream. Wikipedia and a handful of open-source software developers were the pioneers. But in signs of the steady march forward, 70 per cent of the executives we recently surveyed said that their companies regularly created value through Web communities. Similarly, more than 68m bloggers post reviews and recommendations about products and services.
  • for every success in tapping communities to create value, there are still many failures. Some companies neglect the up-front research needed to identify potential participants who have the right skill sets and will be motivated to participate over the longer term. Since cocreation is a two-way process, companies must also provide feedback to stimulate continuing participation and commitment. Getting incentives right is important as well: cocreators often value reputation more than money. Finally, an organisation must gain a high level of trust within a Web community to earn the engagement of top participants.
  • 2. Making the network the organisation. In earlier research, we noted that the Web was starting to force open the boundaries of organisations, allowing nonemployees to offer their expertise in novel ways. We called this phenomenon "tapping into a world of talent." Now many companies are pushing substantially beyond that starting point, building and managing flexible networks that extend across internal and often even external borders. The recession underscored the value of such flexibility in managing volatility. We believe that the more porous, networked organisations of the future will need to organise work around critical tasks rather than molding it to constraints imposed by corporate structures.
  • ...10 more annotations...
  • 3. Collaboration at scale. Across many economies, the number of people who undertake knowledge work has grown much more quickly than the number of production or transactions workers. Knowledge workers typically are paid more than others, so increasing their productivity is critical. As a result, there is broad interest in collaboration technologies that promise to improve these workers' efficiency and effectiveness. While the body of knowledge around the best use of such technologies is still developing, a number of companies have conducted experiments, as we see in the rapid growth rates of video and Web conferencing, expected to top 20 per cent annually during the next few years.
  • 4. The growing ‘Internet of Things’. The adoption of RFID (radio-frequency identification) and related technologies was the basis of a trend we first recognised as "expanding the frontiers of automation." But these methods are rudimentary compared with what emerges when assets themselves become elements of an information system, with the ability to capture, compute, communicate, and collaborate around information—something that has come to be known as the "Internet of Things." Embedded with sensors, actuators, and communications capabilities, such objects will soon be able to absorb and transmit information on a massive scale and, in some cases, to adapt and react to changes in the environment automatically. These "smart" assets can make processes more efficient, give products new capabilities, and spark novel business models. Auto insurers in Europe and the United States are testing these waters with offers to install sensors in customers' vehicles. The result is new pricing models that base charges for risk on driving behavior rather than on a driver's demographic characteristics. Luxury-auto manufacturers are equipping vehicles with networked sensors that can automatically take evasive action when accidents are about to happen. In medicine, sensors embedded in or worn by patients continuously report changes in health conditions to physicians, who can adjust treatments when necessary. Sensors in manufacturing lines for products as diverse as computer chips and pulp and paper take detailed readings on process conditions and automatically make adjustments to reduce waste, downtime, and costly human interventions.
  • 5. Experimentation and big data. Could the enterprise become a full-time laboratory? What if you could analyse every transaction, capture insights from every customer interaction, and didn't have to wait for months to get data from the field? What if…? Data are flooding in at rates never seen before—doubling every 18 months—as a result of greater access to customer data from public, proprietary, and purchased sources, as well as new information gathered from Web communities and newly deployed smart assets. These trends are broadly known as "big data." Technology for capturing and analysing information is widely available at ever-lower price points. But many companies are taking data use to new levels, using IT to support rigorous, constant business experimentation that guides decisions and to test new products, business models, and innovations in customer experience. In some cases, the new approaches help companies make decisions in real time. This trend has the potential to drive a radical transformation in research, innovation, and marketing. (A quick check of what an 18-month doubling time compounds to appears after this list.)
  • Using experimentation and big data as essential components of management decision making requires new capabilities, as well as organisational and cultural change. Most companies are far from accessing all the available data. Some haven't even mastered the technologies needed to capture and analyse the valuable information they can access. More commonly, they don't have the right talent and processes to design experiments and extract business value from big data, which require changes in the way many executives now make decisions: trusting instincts and experience over experimentation and rigorous analysis. To get managers at all echelons to accept the value of experimentation, senior leaders must buy into a "test and learn" mind-set and then serve as role models for their teams.
  • 6. Wiring for a sustainable world. Even as regulatory frameworks continue to evolve, environmental stewardship and sustainability clearly are C-level agenda topics. What's more, sustainability is fast becoming an important corporate-performance metric—one that stakeholders, outside influencers, and even financial markets have begun to track. Information technology plays a dual role in this debate: it is both a significant source of environmental emissions and a key enabler of many strategies to mitigate environmental damage. At present, information technology's share of the world's environmental footprint is growing because of the ever-increasing demand for IT capacity and services. Electricity produced to power the world's data centers generates greenhouse gases on the scale of countries such as Argentina or the Netherlands, and these emissions could increase fourfold by 2020. McKinsey research has shown, however, that the use of IT in areas such as smart power grids, efficient buildings, and better logistics planning could eliminate five times the carbon emissions that the IT industry produces.
  • 7. Imagining anything as a service. Technology now enables companies to monitor, measure, customise, and bill for asset use at a much more fine-grained level than ever before. Asset owners can therefore create services around what have traditionally been sold as products. Business-to-business (B2B) customers like these service offerings because they allow companies to purchase units of a service and to account for them as a variable cost rather than undertake large capital investments. Consumers also like this "paying only for what you use" model, which helps them avoid large expenditures, as well as the hassles of buying and maintaining a product.
  • In the IT industry, the growth of "cloud computing" (accessing computer resources provided through networks rather than running software or storing data on a local computer) exemplifies this shift. Consumer acceptance of Web-based cloud services for everything from e-mail to video is of course becoming universal, and companies are following suit. Software as a service (SaaS), which enables organisations to access services such as customer relationship management, is growing at a 17 per cent annual rate. The biotechnology company Genentech, for example, uses Google Apps for e-mail and to create documents and spreadsheets, bypassing capital investments in servers and software licenses. This development has created a wave of computing capabilities delivered as a service, including infrastructure, platform, applications, and content. And vendors are competing, with innovation and new business models, to match the needs of different customers.
  • 8. The age of the multisided business model. Multisided business models create value through interactions among multiple players rather than traditional one-on-one transactions or information exchanges. In the media industry, advertising is a classic example of how these models work. Newspapers, magazines, and television stations offer content to their audiences while generating a significant portion of their revenues from third parties: advertisers. Other revenue, often through subscriptions, comes directly from consumers. More recently, this advertising-supported model has proliferated on the Internet, underwriting Web content sites, as well as services such as search and e-mail (see trend number seven, "Imagining anything as a service," earlier in this article). It is now spreading to new markets, such as enterprise software: Spiceworks offers IT-management applications to 950,000 users at no cost, while it collects advertising from B2B companies that want access to IT professionals.
  • 9. Innovating from the bottom of the pyramid. The adoption of technology is a global phenomenon, and the intensity of its usage is particularly impressive in emerging markets. Our research has shown that disruptive business models arise when technology combines with extreme market conditions, such as customer demand for very low price points, poor infrastructure, hard-to-access suppliers, and low cost curves for talent. With an economic recovery beginning to take hold in some parts of the world, high rates of growth have resumed in many developing nations, and we're seeing companies built around the new models emerging as global players. Many multinationals, meanwhile, are only starting to think about developing markets as wellsprings of technology-enabled innovation rather than as traditional manufacturing hubs.
  • 10. Producing public good on the grid. The role of governments in shaping global economic policy will expand in coming years. Technology will be an important factor in this evolution by facilitating the creation of new types of public goods while helping to manage them more effectively. This last trend is broad in scope and draws upon many of the other trends described above.
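As noted under trend 5, an 18-month doubling time compounds quickly. A one-line check (the 18-month figure is the one quoted above; the ten-year horizon is my own choice of illustration):

```python
doublings = 10 * 12 / 18   # doublings in a decade at one per 18 months
print(f"{2 ** doublings:.0f}x more data after ten years")   # ~102x
```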
Weiye Loh

Analysis: Hold the panic on cell phones and cancer | Phones | iOS Central | Macworld - 0 views

  • Today’s cell phones are, essentially, extremely sophisticated radios and, as such, emit electromagnetic waves. Much like the vast majority of radiation that surrounds us—from visible light to AM and FM radio waves—electromagnetic waves do not possess enough energy to interact directly with the tissues in our bodies in a way that can cause direct damage. “The radiation that cell phones emit is nowhere near the kind of radiation that x-ray machines, for example, emit,” says Perras. “X-rays […] have much, much shorter wavelengths. Consequently, [they] carry much more energy and thus have much more penetrating power, which is required to be able to image the interior of the human body.”
  • X-rays and other “hard” waves are called ionizing radiation because they can interact with the human body in a way that leads to the creation of chemical compounds called free radicals that can, in turn, be responsible for mutations and the incidence of cancer. (A back-of-the-envelope photon-energy comparison follows these annotations.)
  • The focus of much of the ongoing scientific research, then, is on whether the radiation emitted by cell phones is focused enough to be absorbed into the body and cause heating, which could, in the long run, damage human tissue and eventually lead to cancer.
  • ...4 more annotations...
  • The issue is particularly important because most users still hold their phones close to the head; since the brain is particularly sensitive to external stimuli, even a small amount of heat could lead to medical trouble in the long term.
  • What makes it challenging to determine if a link between cell phones and cancer actually exists are the many variables involved. “The incidence of brain tumors is quite small, making it more difficult to study in large numbers,” says Dr. Eric Olyejar, a Radiation Oncologist from Ironwood Cancer and Research Centers, based in Chandler, Ariz. That means “quantifying the lifetime dose each patient received is extremely difficult.”
  • To make things more difficult, cancer often develops as a result of many different factors. “Family history, exposure to chemicals or radiation, growth defects, the amount of radiation that is actually coming from the phone, amount of time used, proximity to the brain, skull thickness, and wave frequency are only a few of the many variables,” Olyejar says.
  • Cell phones have become so ubiquitous that it’s hard to compare the health of users and non-users.
  •  
    Do cell phones cause cancer? Nobody really knows for sure, but scientists are determined to keep an eye on the ever-evolving evidence that continues to accumulate on the subject. That's the gist of a report recently released by the World Health Organization's International Agency for Research on Cancer (IARC), the United Nations body responsible for oncological studies. In the report, IARC scientists have classified cell phone usage as a possible cause of cancer, meaning that, while the data currently available is still inconclusive, the subject deserves further research before a call can be made one way or another.
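The ionizing versus non-ionizing distinction in the first two annotations is quantitative: a photon's energy is E = hf, and it takes on the order of electron-volts to break chemical bonds. A back-of-the-envelope comparison; the 900 MHz carrier and the soft X-ray frequency are illustrative round numbers, not figures from the article:

```python
h = 6.626e-34    # Planck's constant, J*s
eV = 1.602e-19   # joules per electron-volt

def photon_energy_eV(freq_hz):
    return h * freq_hz / eV

cell_phone = photon_energy_eV(900e6)   # 900 MHz carrier: ~3.7e-6 eV
xray = photon_energy_eV(3e18)          # soft X-ray: ~1.2e4 eV

print(f"cell phone photon: {cell_phone:.2e} eV")
print(f"x-ray photon:      {xray:.2e} eV")
print(f"ratio:             {xray / cell_phone:.1e}")
```

Molecular ionization energies sit around 10 eV, roughly a million times more than a single cell-phone photon carries, which is why the open research question is tissue heating rather than direct ionization.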
Weiye Loh

It's Even Less in Your Genes by Richard C. Lewontin | The New York Review of Books - 0 views

  • One of the complications is that the effective environment is defined by the life activities of the organism itself.
  • Thus, as organisms evolve, their environments necessarily evolve with them. Although classic Darwinism is framed by referring to organisms adapting to environments, the actual process of evolution involves the creation of new “ecological niches” as new life forms come into existence. Part of the ecological niche of an earthworm is the tunnel excavated by the worm and part of the ecological niche of a tree is the assemblage of fungi associated with the tree’s root system that provide it with nutrients.
  • The distinction between organisms and their environments remains deeply embedded in our consciousness. Partly this is due to the inertia of educational institutions and materials.
  • ...7 more annotations...
  • But the problem is deeper than simply intellectual inertia. It goes back, ultimately, to the unconsidered differentiations we make—at every moment when we distinguish among objects—between those in the foreground of our consciousness and the background places in which the objects happen to be situated. Moreover, this distinction creates a hierarchy of objects. We are conscious not only of the skin that encloses and defines the object, but of bits and pieces of that object, each of which must have its own “skin.” That is the problem of anatomization. A car has a motor and brakes and a transmission and an outer body that, at appropriate moments, become separate objects of our consciousness, objects that at least some knowledgeable person recognizes as coherent entities.
  • Evelyn Fox Keller sees “The Mirage of a Space Between Nature and Nurture” as a consequence of our false division of the world into living objects without sufficient consideration of the external milieu in which they are embedded, since organisms help create effective environments through their own life activities.
  • The central point of her analysis has been that gender itself (as opposed to sex) is socially constructed, and that construction has influenced the development of science: If there is a single point on which all feminist scholarship…has converged, it is the importance of recognizing the social construction of gender…. All of my work on gender and science proceeds from this basic recognition. My endeavor has been to call attention to the ways in which the social construction of a binary opposition between “masculine” and “feminine” has influenced the social construction of science.
  • A major critical concern of Fox Keller’s present book is the widespread attempt to partition in some quantitative way the contribution made to human variation by differences in biological inheritance, that is, differences in genes, as opposed to differences in life experience. She wants to make clear a distinction between analyzing the relative strength of the causes of variation among individuals and groups, an analysis that is coherent in principle, and simply assigning the relative contributions of biological and environmental causes to the value of some character in an individual.
  • It is, for example, all very well to say that genetic variation is responsible for 76 percent of the observed variation in adult height among American women while the remaining 24 percent is a consequence of differences in nutrition. The implication is that if all variation in nutrition were abolished then 24 percent of the observed height variation among individuals in the population in the next generation would disappear. To say, however, that 76 percent of Evelyn Fox Keller’s height was caused by her genes and 24 percent by her nutrition does not make sense. The nonsensical implication of trying to partition the causes of her individual height would be that if she never ate anything she would still be three quarters as tall as she is.
  • In fact, Keller is too optimistic about the assignment of causes of variation even when considering variation in a population. As she herself notes parenthetically, the assignment of relative proportions of population variation to different causes in a population depends on there being no specific interaction between the causes.
  • Keller’s rather casual treatment of the interaction between causal factors in the case of the drummers, despite her very great sophistication in analyzing the meaning of variation, is a symptom of a fault that is deeply embedded in the analytic training and thinking of both natural and social scientists. If there are several variable factors influencing some phenomenon, how are we to assign the relative importance to each in determining total variation? Let us take an extreme example. Suppose that we plant seeds of each of two different varieties of corn in two different locations with the following results measured in bushels of corn produced (see Table 1, not reproduced in this excerpt; a numerical reconstruction appears after these notes). There are differences between the varieties in their yield from location to location and there are differences between locations from variety to variety. So, both variety and location matter. But there is no average variation between locations when averaged over varieties or between varieties when averaged over locations. Just by knowing the variation in yield associated with location and variety separately does not tell us which factor is the more important source of variation; nor do the facts of location and variety exhaust the description of that variation.
  •  
    In trying to analyze the natural world, scientists are seldom aware of the degree to which their ideas are influenced both by their way of perceiving the everyday world and by the constraints that our cognitive development puts on our formulations. At every moment of perception of the world around us, we isolate objects as discrete entities with clear boundaries while we relegate the rest to a background in which the objects exist.
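Lewontin's two-variety, two-location corn example (his Table 1, referenced above) is easy to reconstruct numerically. The bushel figures below are invented to satisfy his description, not taken from the review:

```python
import numpy as np

# Rows: varieties A and B. Columns: locations 1 and 2. Yields in bushels (invented).
yields = np.array([[10.0, 20.0],
                   [20.0, 10.0]])

print(yields.mean(axis=1))  # [15. 15.]: varieties identical when averaged over locations
print(yields.mean(axis=0))  # [15. 15.]: locations identical when averaged over varieties
print(yields.var())         # 25.0: yet yields plainly vary from cell to cell
```

All of the variation sits in the variety-by-location interaction; averaging over either factor erases it, so no partition into percent-due-to-variety and percent-due-to-location describes what is actually happening.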
Weiye Loh

gladwell dot com - something borrowed - 0 views

  • Intellectual-property doctrine isn't a straightforward application of the ethical principle "Thou shalt not steal." At its core is the notion that there are certain situations where you can steal. The protections of copyright, for instance, are time-limited; once something passes into the public domain, anyone can copy it without restriction. Or suppose that you invented a cure for breast cancer in your basement lab. Any patent you received would protect your intellectual property for twenty years, but after that anyone could take your invention.
  • You get an initial monopoly on your creation because we want to provide economic incentives for people to invent things like cancer drugs. But everyone gets to steal your breast-cancer cure—after a decent interval—because it is also in society's interest to let as many people as possible copy your invention; only then can others learn from it, and build on it, and come up with better and cheaper alternatives. This balance between the protecting and the limiting of intellectual property
  • Stanford law professor Lawrence Lessig argues in his new book "Free Culture": In ordinary language, to call a copyright a "property" right is a bit misleading, for the property of copyright is an odd kind of property. . . . I understand what I am taking when I take the picnic table you put in your backyard. I am taking a thing, the picnic table, and after I take it, you don't have it. But what am I taking when I take the good idea you had to put a picnic table in the backyard—by, for example, going to Sears, buying a table, and putting it in my backyard? What is the thing that I am taking then? The point is not just about the thingness of picnic tables versus ideas, though that is an important difference. The point instead is that in the ordinary case—indeed, in practically every case except for a narrow range of exceptions—ideas released to the world are free. I don't take anything from you when I copy the way you dress—though I might seem weird if I do it every day. . . . Instead, as Thomas Jefferson said (and this is especially true when I copy the way someone dresses), "He who receives an idea from me, receives instruction himself without lessening mine; as he who lights his taper at mine, receives light without darkening me."
  • ...5 more annotations...
  • Lessig argues that, when it comes to drawing this line between private interests and public interests in intellectual property, the courts and Congress have, in recent years, swung much too far in the direction of private interests.
  • We could have sat in his living room playing at musical genealogy for hours. Did the examples upset him? Of course not, because he knew enough about music to know that these patterns of influence—cribbing, tweaking, transforming—were at the very heart of the creative process.
  • True, copying could go too far. There were times when one artist was simply replicating the work of another, and to let that pass inhibited true creativity. But it was equally dangerous to be overly vigilant in policing creative expression, because if Led Zeppelin hadn't been free to mine the blues for inspiration we wouldn't have got "Whole Lotta Love," and if Kurt Cobain couldn't listen to "More Than a Feeling" and pick out and transform the part he really liked we wouldn't have "Smells Like Teen Spirit"—and, in the evolution of rock, "Smells Like Teen Spirit" was a real step forward from "More Than a Feeling." A successful music executive has to understand the distinction between borrowing that is transformative and borrowing that is merely derivative, and that distinction, I realized, was what was missing from the discussion of Bryony Lavery's borrowings. Yes, she had copied my work. But no one was asking why she had copied it, or what she had copied, or whether her copying served some larger purpose.
  • It also matters how Lavery chose to use my words. Borrowing crosses the line when it is used for a derivative work. It's one thing if you're writing a history of the Kennedys, like Doris Kearns Goodwin, and borrow, without attribution, from another history of the Kennedys. But Lavery wasn't writing another profile of Dorothy Lewis. She was writing a play about something entirely new—about what would happen if a mother met the man who killed her daughter. And she used my descriptions of Lewis's work and the outline of Lewis's life as a building block in making that confrontation plausible.
  • This is the second problem with plagiarism. It is not merely extremist. It has also become disconnected from the broader question of what does and does not inhibit creativity. We accept the right of one writer to engage in a full-scale knockoff of another—think how many serial-killer novels have been cloned from "The Silence of the Lambs." Yet, when Kathy Acker incorporated parts of a Harold Robbins sex scene verbatim in a satiric novel, she was denounced as a plagiarist (and threatened with a lawsuit).
  •  
    Under copyright law, what matters is not that you copied someone else's work. What matters is what you copied, and how much you copied.
Paul Melissa

"Can a robot commit a war crime?" - 1 views

http://www.newscientist.com/blog/technology/2008/02/military-turing-test-would-make-war.html This question was raised at the conference on The Ethics of Autonomous Milit...

started by Paul Melissa on 15 Oct 09 no follow-up yet
Weiye Loh

God is not the Creator, claims academic - Telegraph - 1 views

  • Professor Ellen van Wolde, a respected Old Testament scholar and author, claims the first sentence of Genesis "in the beginning God created the Heaven and the Earth" is not a true translation of the Hebrew.
  • She said she eventually concluded the Hebrew verb "bara", which is used in the first sentence of the book of Genesis, does not mean "to create" but to "spatially separate". The first sentence should now read "in the beginning God separated the Heaven and the Earth"
  • She said: "It meant to say that God did create humans and animals, but not the Earth itself."
  • ...1 more annotation...
  • She said she hoped that her conclusions would spark "a robust debate", since her findings are not only new, but would also touch the hearts of many religious people. She said: "Maybe I am even hurting myself. I consider myself to be religious and the Creator used to be very special, as a notion of trust. I want to keep that trust." A spokesman for Radboud University said: "The new interpretation is a complete shake-up of the story of the Creation as we know it." Prof Van Wolde added: "The traditional view of God the Creator is untenable now."
Magdaleine

Immortality only 20 years away says scientist - 9 views

wow interesting! like a start to the creation of super heroes!! it kind of sounds like science is playing God here, determining and extending lives. it is already evident now in this society without...

nanotechnology rights divide

Weiye Loh

Rationally Speaking: Evolution as pseudoscience? - 0 views

  • I have been intrigued by an essay by my colleague Michael Ruse, entitled “Evolution and the idea of social progress,” published in a collection that I am reviewing, Biology and Ideology from Descartes to Dawkins (gotta love the title!), edited by Denis Alexander and Ronald Numbers.
  • Ruse's essay in the Alexander-Numbers collection questions the received story about the early evolution of evolutionary theory, which sees the stuff that immediately preceded Darwin — from Lamarck to Erasmus Darwin — as protoscience, the immature version of the full fledged science that biology became after Chuck's publication of the Origin of Species. Instead, Ruse thinks that pre-Darwinian evolutionists really engaged in pseudoscience, and that it took a very conscious and precise effort on Darwin’s part to sweep away all the garbage and establish a discipline with empirical and theoretical content analogous to that of the chemistry and physics of the time.
  • Ruse asserts that many serious intellectuals of the late 18th and early 19th century actually thought of evolution as pseudoscience, and he is careful to point out that the term “pseudoscience” had been used at least since 1843 (by the physiologist Francois Magendie).
  • ...17 more annotations...
  • Ruse’s somewhat surprising yet intriguing claim is that “before Charles Darwin, evolution was an epiphenomenon of the ideology of [social] progress, a pseudoscience and seen as such. Liked by some for that very reason, despised by others for that very reason.”
  • Indeed, the link between evolution and the idea of human social-cultural progress was very strong before Darwin, and was one of the main things Darwin got rid of.
  • The encyclopedist Denis Diderot was typical in this respect: “The Tahitian is at a primary stage in the development of the world, the European is at its old age. The interval separating us is greater than that between the new-born child and the decrepit old man.” Similar nonsensical views can be found in Lamarck, Erasmus, and Chambers, the anonymous author of The Vestiges of the Natural History of Creation, usually considered the last protoscientific book on evolution to precede the Origin.
  • On the other side of the divide were social conservatives like the great anatomist George Cuvier, who rejected the idea of evolution — according to Ruse — not as much on scientific grounds as on political and ideological ones. Indeed, books like Erasmus’ Zoonomia and Chambers’ Vestiges were simply not much better than pseudoscientific treatises on, say, alchemy before the advent of modern chemistry.
  • People were well aware of this sorry situation, so much so that astronomer John Herschel referred to the question of the history of life as “the mystery of mysteries,” a phrase consciously adopted by Darwin in the Origin. Darwin set out to solve that mystery under the influence of three great thinkers: Newton, the above-mentioned Herschel, and the philosopher William Whewell (whom Darwin knew and assiduously frequented in his youth).
  • Darwin was a graduate of the University of Cambridge, which had also been Newton’s home. Chuck got drilled early on during his Cambridge education with the idea that good science is about finding mechanisms (vera causa), something like the idea of gravitational attraction underpinning Newtonian mechanics. He reflected that all the talk of evolution up to then — including his grandfather’s — was empty, without a mechanism that could turn the idea into a scientific research program.
  • The second important influence was Herschel’s Preliminary Discourse on the Study of Natural Philosophy, published in 1831 and read by Darwin shortly thereafter, in which Herschel sets out to give his own take on what today we would call the demarcation problem, i.e. what methodology is distinctive of good science. One of Herschel’s points was to stress the usefulness of analogical reasoning
  • Finally, and perhaps most crucially, Darwin also read (twice!) Whewell’s History of the Inductive Sciences, which appeared in 1837. In it, Whewell sets out his notion that good scientific inductive reasoning proceeds by a consilience of ideas, a situation in which multiple independent lines of evidence point to the same conclusion.
  • the first part of the Origin, where Darwin introduces the concept of natural selection by way of analogy with artificial selection can be read as the result of Herschel’s influence (natural selection is the vera causa of evolution)
  • the second part of the book, constituting Darwin's famous “long argument,” applies Whewell’s method of consilience by bringing in evidence from a number of disparate fields, from embryology to paleontology to biogeography.
  • What, then, happened to the strict coupling of the ideas of social and biological progress that had preceded Darwin? While he still believed in the former, the latter was no longer an integral part of evolution, because natural selection makes things “better” only in a relative fashion. There is no meaningful sense in which, say, a large brain is better than very fast legs or sharp claws, as long as you still manage to have dinner and avoid being dinner by the end of the day (or, more precisely, by the time you reproduce).
  • Ruse’s claim that evolution transitioned not from protoscience to science, but from pseudoscience, makes sense to me given the historical and philosophical developments. It wasn’t the first time either. Just think about the already mentioned shift from alchemy to chemistry
  • Of course, the distinction between pseudoscience and protoscience is itself fuzzy, but we do have what I think are clear examples of the latter that cannot reasonably be confused with the former, SETI for one, and arguably Ptolemaic astronomy. We also have pretty obvious instances of pseudoscience (the usual suspects: astrology, ufology, etc.), so the distinction — as long as it is not stretched beyond usefulness — is interesting and defensible.
  • It is amusing to speculate which, if any, of the modern pseudosciences (cryonics, singularitarianism) might turn out to be able to transition in one form or another to actual sciences. To do so, they may need to find their philosophically and scientifically savvy Darwin, and a likely bet — if history teaches us anything — is that, should they succeed in this transition, their mature form will look as different from the original as chemistry and alchemy. Or as Darwinism and pre-Darwinian evolutionism.
  • Darwin called the Origin "one long argument," but I really do think that recognizing that the book contains (at least) two arguments could help to dispel that whole "just a theory" canard. The first half of the book is devoted to demonstrating that natural selection is the true cause of evolution; vera causa arguments require proof that the cause's effect be demonstrated as fact, so the second half of the book is devoted to a demonstration that evolution has really happened. In other words, evolution is a demonstrable fact and natural selection is the theory that explains that fact, just as the motion of the planets is a fact and gravity is a theory that explains it.
  • Cryogenics is the study of the production of low temperatures and the behavior of materials at those temperatures. It is a legitimate branch of physics and has been for a long time. I think you meant 'cryonics'.
  • The Singularity means different things to different people. It is uncharitable to dismiss all "singularitarians" by debunking Kurzweil. He is low hanging fruit. Reach for something higher.
  •  
    "before Charles Darwin, evolution was an epiphenomenon of the ideology of [social] progress, a pseudoscience and seen as such. Liked by some for that very reason, despised by others for that very reason."
Weiye Loh

Rationally Speaking: Ray Kurzweil and the Singularity: visionary genius or pseudoscientific crank? - 0 views

  • I will focus on a single detailed essay he wrote entitled “Superintelligence and Singularity,” which was originally published as chapter 1 of his The Singularity is Near (Viking 2005), and has been reprinted in an otherwise insightful collection edited by Susan Schneider, Science Fiction and Philosophy.
  • Kurzweil begins by telling us that he gradually became aware of the coming Singularity, in a process that, somewhat peculiarly, he describes as a “progressive awakening” — a phrase with decidedly religious overtones. He defines the Singularity as “a future period during which the pace of technological change will be so rapid, its impact so deep, that human life will be irreversibly transformed.” Well, by that definition, we have been through several “singularities” already, as technology has often rapidly and irreversibly transformed our lives.
  • The major piece of evidence for Singularitarianism is what “I [Kurzweil] have called the law of accelerating returns (the inherent acceleration of the rate of evolution, with technological evolution as a continuation of biological evolution).”
  • ...9 more annotations...
  • the first obvious serious objection is that technological “evolution” is in no logical way a continuation of biological evolution — the word “evolution” here being applied with completely different meanings. And besides, there is no scientifically sensible way in which biological evolution has been accelerating over the several billion years of its operation on our planet. So much for scientific accuracy and logical consistency.
  • here is a bit that will give you an idea of why some people think of Singularitarianism as a secular religion: “The Singularity will allow us to transcend [the] limitations of our biological bodies and brains. We will gain power over our fates. Our mortality will be in our own hands. We will be able to live as long as we want.”
  • Fig. 2 of that essay shows a progression through (again, entirely arbitrary) six “epochs,” with the next one (#5) occurring when there will be a merger between technological and human intelligence (somehow, a good thing), and the last one (#6) labeled as nothing less than “the universe wakes up” — a nonsensical outcome further described as “patterns of matter and energy in the universe becom[ing] saturated with intelligence processes and knowledge.” This isn’t just science fiction, it is bad science fiction.
  • “a serious assessment of the history of technology reveals that technological change is exponential. Exponential growth is a feature of any evolutionary process.” First, it is highly questionable that one can even measure “technological change” on a coherent uniform scale. Yes, we can plot the rate of, say, increase in microprocessor speed, but that is but one aspect of “technological change.” As for the idea that any evolutionary process features exponential growth, I don’t know where Kurzweil got it, but it is simply wrong, for one thing because biological evolution does not have any such feature — as any student of Biology 101 ought to know.
  • Kurzweil’s ignorance of evolution is manifested again a bit later, when he claims — without argument, as usual — that “Evolution is a process of creating patterns of increasing order. ... It’s the evolution of patterns that constitutes the ultimate story of the world. ... Each stage or epoch uses the information-processing methods of the previous epoch to create the next.” I swear, I was fully expecting a scholarly reference to Deepak Chopra at the end of that sentence. Again, “evolution” is a highly heterogeneous term that picks completely different concepts, such as cosmic “evolution” (actually just change over time), biological evolution (which does have to do with the creation of order, but not in Kurzweil’s blatantly teleological sense), and technological “evolution” (which is certainly yet another type of beast altogether, since it requires intelligent design). And what on earth does it mean that each epoch uses the “methods” of the previous one to “create” the next one?
  • As we have seen, the whole idea is that human beings will merge with machines during the ongoing process of ever accelerating evolution, an event that will eventually lead to the universe awakening to itself, or something like that. Now here is the crucial question: how come this has not happened already?
  • To appreciate the power of this argument you may want to refresh your memory about the Fermi Paradox, a serious (though in that case, not a knockdown) argument against the possibility of extraterrestrial intelligent life. The story goes that physicist Enrico Fermi (the inventor of the first nuclear reactor) was having lunch with some colleagues, back in 1950. His companions were waxing poetic about the possibility, indeed the high likelihood, that the galaxy is teeming with intelligent life forms. To which Fermi asked something along the lines of: “Well, where are they, then?”
  • The idea is that even under very pessimistic (i.e., very un-Kurzweil-like) expectations about how quickly an intelligent civilization would spread across the galaxy (without even violating the speed-of-light limit!), and given the mind-boggling length of time the galaxy has already existed, it becomes difficult (though, again, not impossible) to explain why we haven’t seen the darn aliens yet. (A back-of-the-envelope version of this timescale argument follows these notes.)
  • Now, translate that to Kurzweil’s much more optimistic predictions about the Singularity (which allegedly will occur around 2045, conveniently just a bit after Kurzweil’s expected demise, given that he is 63 at the time of this writing). Considering that there is no particular reason to think that planet earth, or the human species, has to be the one destined to trigger the big event, why is it that the universe hasn’t already “awakened” as a result of a Singularity occurring somewhere else at some other time?
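The timescale comparison behind this Fermi-style objection is easy to make explicit. A rough sketch with deliberately conservative, invented numbers (a 100,000-light-year galaxy, colonization at a thousandth of light speed):

```python
galaxy_diameter_ly = 1e5   # light-years, approximate
expansion_speed_c = 1e-3   # 0.1% of light speed: a very slow civilization
galaxy_age_yr = 1e10       # order-of-magnitude age

crossing_time_yr = galaxy_diameter_ly / expansion_speed_c   # 1e8 years
fraction_of_age = crossing_time_yr / galaxy_age_yr          # 0.01

print(f"crossing time: {crossing_time_yr:.0e} years")
print(f"that is only {fraction_of_age:.0%} of the galaxy's age")
```

Even at that crawl, an expanding civilization would have had on the order of a hundred galaxy-crossing times available since the galaxy formed, which is what gives the question "where are they?" its bite against any claim that singularities are generic and imminent.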