
New Media Ethics 2009 course: Group items tagged 'Intellectual'


Weiye Loh

Roger Pielke Jr.'s Blog: Innovation in Drug Development: An Inverse Moore's Law? - 0 views

  • Today's FT has this interesting graph and an accompanying story, showing a sort of inverse Moore's Law of drug development. Over almost 60 years the number of new drugs developed per unit of investment has declined in a fairly constant manner, and some drug companies are now slashing their R&D budgets. (A constant fractional decline of this kind is sketched, with assumed numbers, in the code example after this list.)
  • why this trend has occurred.  The FT points to a combination of low-hanging fruit that has been plucked and increasing costs of drug development. To some observers, that reflects the end of the mid to late 20th century golden era for drug discovery, when first-generation medicines such as antibiotics and beta-blockers to treat high blood pressure transformed healthcare. At the same time, regulatory demands to prove safety and efficacy have grown firmer. The result is larger and more costly clinical trials, and high failure rates for experimental drugs.
  • Others point to flawed innovation policies in industry and governments: “The markets treat drug companies as though research and development spending destroys value,” says Jack Scannell, an analyst at Bernstein Research. “People have stopped distinguishing the good from the bad. All those which performed well returned cash to shareholders. Unless the industry can articulate what the problem is, I don’t expect that to change.”
  • Mr [Andrew] Baum [of Morgan Stanley] argues that the solution for drug companies is to share the risks of research with others. That means reducing in-house investment in research, and instead partnering and licensing experimental medicines from smaller companies after some of the early failures have been eliminated.
  • Chas Bountra of Oxford university calls for a more radical partnership combining industry and academic research. “What we are trying to do is just too difficult,” he says. “No one organisation can do it, so we have to pool resources and expertise.” He suggests removing intellectual property rights until a drug is in mid-stage testing in humans, which would make academics more willing to co-operate because they could publish their results freely. The sharing of data would enable companies to avoid duplicating work.
  • The challenge is for academia and biotech companies to fill the research gap. Mr Ratcliffe argues that after a lull in 2009 and 2010, private capital is returning to the sector – as demonstrated by a particular buzz at JPMorgan’s new year biotech conference in California.
  • Patrick Vallance, senior vice-president for discovery at GSK, is cautious about deferring patents until so late, arguing that drug companies need to be able to protect their intellectual property in order to fund expensive late-stage development. But he too is experimenting with ways to co-operate more closely with academics over longer periods. He is also championing the “externalisation” of the company’s pipeline, with biotech and university partners accounting for half the total. GSK has earmarked £50m to support fledgling British companies, many “wrapped around” the group’s sites. One such example is Convergence, a spin-out from a GSK lab researching pain relief.
  • Big pharmaceutical companies are scrambling to find ways to overcome the loss of tens of billions of dollars in revenue as patents on top-selling drugs run out. Many sound similar notes about encouraging entrepreneurialism in their ranks, making smart deals and capitalizing on emerging-market growth. But their actual plans are often quite different—and each carries significant risks. Novartis AG, for instance, is so convinced that diversification is the best course that the company has a considerable business selling low-priced generics. Meantime, Bristol-Myers Squibb Co. has decided to concentrate on innovative medicines, shedding so many nonpharmaceutical units that it has become midsize. GlaxoSmithKline PLC is still investing in research, but like Pfizer it has narrowed the range of disease areas in which it's seeking new treatments. Underlying the divergence is a deep-seated philosophical dispute over the merits of the heavy investment that companies must make to discover new drugs. By most estimates, bringing a new molecule to market costs drug makers more than $1 billion. Industry officials have been engaged in a vigorous debate over whether the investment is worth it, or whether they should leave it to others whose work they can acquire or license after a demonstration of strong potential.
  • To what extent can approaches to innovation influence the trend line in the graph above? I don't think that anyone really knows the answer. The different approaches being taken by Merck and Pfizer, for instance, represent a real-world policy experiment: The contrast between Merck and Pfizer reflects the very different personal approaches of their CEOs. An accountant by training, Mr. Read has held various business positions during a three-decade career at Pfizer. The 57-year-old cited torcetrapib, a cholesterol medicine that the company spent more than $800 million developing but then pulled due to safety concerns, as an example of the kind of wasteful spending Pfizer would avoid. "We're going to have metrics," Mr. Read said. He wants Pfizer to stop "always investing on hope rather than strong signals and the quality of the science, the quality of the medicine." Mr. Frazier, 56, a Harvard-educated lawyer who joined Merck in 1994 from private practice, said the company was sticking by its own troubled heart drug, vorapaxar. Mr. Frazier said he wanted to see all of the data from the trials before rushing to judgment. "We believe in the innovation approach," he said.
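One way to read "declined in a fairly constant manner": if R&D productivity loses a roughly constant fraction each year, drugs developed per billion dollars decays exponentially, the mirror image of Moore's Law. The minimal sketch below illustrates only that shape; the starting productivity and the nine-year halving time are assumptions for the demo, not figures from the FT story.

    # Illustrative sketch: a constant fractional decline in R&D productivity
    # is exponential decay -- a straight line on a log scale, hence the
    # "inverse Moore's Law" label.
    # ASSUMPTIONS (not from the FT article): productivity halves every 9 years,
    # starting from 10 approved drugs per $1bn of R&D in 1950.

    START_YEAR = 1950
    DRUGS_PER_BILLION_AT_START = 10.0  # assumed starting productivity
    HALVING_TIME_YEARS = 9.0           # assumed halving time

    def drugs_per_billion(year: int) -> float:
        """Drugs approved per $1bn of R&D under a constant fractional decline."""
        elapsed = year - START_YEAR
        return DRUGS_PER_BILLION_AT_START * 2.0 ** (-elapsed / HALVING_TIME_YEARS)

    for year in range(1950, 2011, 10):
        print(f"{year}: {drugs_per_billion(year):7.3f} drugs per $1bn")
    # With these assumed parameters, 60 years gives a roughly 100-fold decline:
    # the same shape as the FT graph, whatever the true numbers are.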
Weiye Loh

Rationally Speaking: Are Intuitions Good Evidence? - 0 views

  • Is it legitimate to cite one’s intuitions as evidence in a philosophical argument?
  • appeals to intuitions are ubiquitous in philosophy. What are intuitions? Well, that’s part of the controversy, but most philosophers view them as intellectual “seemings.” George Bealer, perhaps the most prominent defender of intuitions-as-evidence, writes, “For you to have an intuition that A is just for it to seem to you that A… Of course, this kind of seeming is intellectual, not sensory or introspective (or imaginative).”2 Other philosophers have characterized them as “noninferential belief due neither to perception nor introspection”3 or alternatively as “applications of our ordinary capacities for judgment.”4
  • Philosophers may not agree on what, exactly, intuition is, but that doesn’t stop them from using it. “Intuitions often play the role that observation does in science – they are data that must be explained, confirmers or the falsifiers of theories,” Brian Talbot says.5 Typically, the way this works is that a philosopher challenges a theory by applying it to a real or hypothetical case and showing that it yields a result which offends his intuitions (and, he presumes, his readers’ as well).
  • For example, John Searle famously appealed to intuition to challenge the notion that a computer could ever understand language: “Imagine a native English speaker who knows no Chinese locked in a room full of boxes of Chinese symbols (a data base) together with a book of instructions for manipulating the symbols (the program). Imagine that people outside the room send in other Chinese symbols which, unknown to the person in the room, are questions in Chinese (the input). And imagine that by following the instructions in the program the man in the room is able to pass out Chinese symbols which are correct answers to the questions (the output)… If the man in the room does not understand Chinese on the basis of implementing the appropriate program for understanding Chinese then neither does any other digital computer solely on that basis because no computer, qua computer, has anything the man does not have.” Should we take Searle’s intuition that such a system would not constitute “understanding” as good evidence that it would not? Many critics of the Chinese Room argument argue that there is no reason to expect our intuitions about intelligence and understanding to be reliable.
  • Ethics leans especially heavily on appeals to intuition, with a whole school of ethicists (“intuitionists”) maintaining that a person can see the truth of general ethical principles not through reason, but because he “just sees without argument that they are and must be true.”6
  • Intuitions are also called upon to rebut ethical theories such as utilitarianism: maximizing overall utility would require you to kill one innocent person if, in so doing, you could harvest her organs and save five people in need of transplants. Such a conclusion is taken as a reductio ad absurdum, requiring utilitarianism to be either abandoned or radically revised – not because the conclusion is logically wrong, but because it strikes nearly everyone as intuitively wrong.
  • British philosopher G.E. Moore used intuition to argue that the existence of beauty is good irrespective of whether anyone ever gets to see and enjoy that beauty. Imagine two planets, he said, one full of stunning natural wonders – trees, sunsets, rivers, and so on – and the other full of filth. Now suppose that nobody will ever have the opportunity to glimpse either of those two worlds. Moore concluded, “Well, even so, supposing them quite apart from any possible contemplation by human beings; still, is it irrational to hold that it is better that the beautiful world should exist than the one which is ugly? Would it not be well, in any case, to do what we could to produce it rather than the other? Certainly I cannot help thinking that it would."7
  • Although similar appeals to intuition can be found throughout all the philosophical subfields, their validity as evidence has come under increasing scrutiny over the last two decades, from philosophers such as Hilary Kornblith, Robert Cummins, Stephen Stich, Jonathan Weinberg, and Jaakko Hintikka (links go to representative papers from each philosopher on this issue). The severity of their criticisms varies from Weinberg's warning that "We simply do not know enough about how intuitions work," to Cummins' wholesale rejection of philosophical intuition as "epistemologically useless."
  • One central concern for the critics is that a single question can inspire totally different, and mutually contradictory, intuitions in different people.
  • For example, I disagree with Moore’s intuition that it would be better for a beautiful planet to exist than an ugly one even if there were no one around to see it. I can’t understand what the words “better” and “worse,” let alone “beautiful” and “ugly,” could possibly mean outside the domain of the experiences of conscious beings
  • If we want to take philosophers’ intuitions as reason to believe a proposition, then the existence of opposing intuitions leaves us in the uncomfortable position of having reason to believe both a proposition and its opposite.
  • “I suspect there is overall less agreement than standard philosophical practice presupposes, because having the ‘right’ intuitions is the entry ticket to various subareas of philosophy,” Weinberg says.
  • But the problem that intuitions are often not universally shared is overshadowed by another problem: even if an intuition is universally shared, that doesn’t mean it’s accurate. For in fact there are many universal intuitions that are demonstrably false.
  • People who have not been taught otherwise typically assume that an object dropped out of a moving plane will fall straight down to earth, at exactly the same latitude and longitude from which it was dropped. What will actually happen is that, because the object begins its fall with the same forward momentum it had while it was on the plane, it will continue to travel forward, tracing out a curve as it falls and not a straight line (see the worked kinematics example after this list). "Considering the inadequacies of ordinary physical intuitions, it is natural to wonder whether ordinary moral intuitions might be similarly inadequate," Princeton's Gilbert Harman has argued,9 and the same could be said for our intuitions about consciousness, metaphysics, and so on.
  • We can’t usually “check” the truth of our philosophical intuitions externally, with an experiment or a proof, the way we can in physics or math. But it’s not clear why we should expect intuitions to be true. If we have an innate tendency towards certain intuitive beliefs, it’s likely because they were useful to our ancestors.
  • But there’s no reason to expect that the intuitions which were true in the world of our ancestors would also be true in other, unfamiliar contexts
  • And for some useful intuitions, such as moral ones, “truth” may have been beside the point. It’s not hard to see how moral intuitions in favor of fairness and generosity would have been crucial to the survival of our ancestors’ tribes, as would the intuition to condemn tribe members who betrayed those reciprocal norms. If we can account for the presence of these moral intuitions by the fact that they were useful, is there any reason left to hypothesize that they are also “true”? The same question could be asked of the moral intuitions which Jonathan Haidt has classified as “purity-based” – an aversion to incest, for example, would clearly have been beneficial to our ancestors. Since that fact alone suffices to explain the (widespread) presence of the “incest is morally wrong” intuition, why should we take that intuition as evidence that “incest is morally wrong” is true?
  • The still-young debate over intuition will likely continue to rage, especially since it’s intertwined with a rapidly growing body of cognitive and social psychological research examining where our intuitions come from and how they vary across time and place.
  • its resolution bears on the work of literally every field of analytic philosophy, except perhaps logic. Can analytic philosophy survive without intuition? (If so, what would it look like?) And can the debate over the legitimacy of appeals to intuition be resolved with an appeal to intuition?
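The falling-object case above is easy to check with two lines of kinematics. A minimal sketch, with illustrative assumed numbers (a plane at 2,000 m doing 100 m/s, air resistance ignored):

    # Why the "falls straight down" intuition fails: the object keeps the
    # plane's forward velocity, so it traces a parabola, not a vertical line.
    # Assumed, illustrative numbers: altitude 2000 m, speed 100 m/s, no drag.

    import math

    ALTITUDE_M = 2000.0     # assumed drop height
    PLANE_SPEED_MS = 100.0  # assumed horizontal speed at release
    G = 9.81                # gravitational acceleration, m/s^2

    fall_time = math.sqrt(2 * ALTITUDE_M / G)      # time to reach the ground
    forward_distance = PLANE_SPEED_MS * fall_time  # horizontal drift meanwhile

    print(f"fall time:        {fall_time:5.1f} s")         # ~20 s
    print(f"forward distance: {forward_distance:6.0f} m")  # ~2000 m
    # Roughly two kilometres of forward travel -- nowhere near the "same
    # latitude and longitude" the untrained intuition predicts.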
Weiye Loh

Skepticblog » A Creationist Challenge - 0 views

  • The commenter starts with some ad hominems, asserting that my post is biased and emotional. They provide no evidence or argument to support this assertion. And of course they don’t even attempt to counter any of the arguments I laid out. They then follow up with an argument from authority – he can link to a PhD creationist – so there.
  • The article that the commenter links to is by Henry M. Morris, founder of the Institute for Creation Research (ICR) – a young-earth creationist organization. Morris was (he died in 2006 following a stroke) a PhD – in civil engineering. This point is irrelevant to his actual arguments. I bring it up only to put the commenter's argument from authority into perspective. No disrespect to engineers – but they are not biologists. They have no expertise relevant to the question of evolution – no more than my MD. So let's stick to the arguments themselves.
  • The article by Morris is an overview of so-called Creation Science, of which Morris was a major architect. The arguments he presents are all old creationist canards, long deconstructed by scientists. In fact I address many of them in my original refutation. Creationists generally are not very original – they recycle old arguments endlessly, regardless of how many times they have been destroyed.
  • Morris also makes heavy use of the “taking a quote out of context” strategy favored by creationists. His quotes are often from secondary sources and are incomplete.
  • A more scholarly (i.e. intellectually honest) approach would be to cite actual evidence to support a point. If you are going to cite an authority, then make sure the quote is relevant, in context, and complete.
  • And even better, cite a number of sources to show that the opinion is representative. Rather we get single, partial, and often outdated quotes without context.
  • (nature is not, it turns out, cleanly divided into “kinds”, which have no operational definition). He also repeats this canard: Such variation is often called microevolution, and these minor horizontal (or downward) changes occur fairly often, but such changes are not true “vertical” evolution. This is the microevolution/macroevolution false dichotomy. It is only “often called” this by creationists – not by actual evolutionary scientists. There is no theoretical or empirical division between macro and micro evolution. There is just evolution, which can result in the full spectrum of change from minor tweaks to major changes.
  • Morris wonders why there are no "dats" – dog-cat transitional species. He misses the hierarchical nature of evolution. As evolution proceeds, and creatures develop a greater and greater evolutionary history behind them, they increasingly are committed to their body plan. This results in a nested hierarchy of groups – which is reflected in taxonomy (the naming scheme of living things).
  • once our distant ancestors developed the basic body plan of chordates, they were committed to that body plan. Subsequent evolution resulted in variations on that plan, each of which then developed further variations, etc. But evolution cannot go backward, undo evolutionary changes and then proceed down a different path. Once an evolutionary line has developed into a dog, evolution can produce variations on the dog, but it cannot go backwards and produce a cat.
  • Stephen Jay Gould described this distinction as the difference between disparity and diversity. Disparity (the degree of morphological difference) actually decreases over evolutionary time, as lineages go extinct and the surviving lineages are committed to fewer and fewer basic body plans. Meanwhile, diversity (the number of variations on a body plan) within groups tends to increase over time.
  • the kind of evolutionary changes that were happening in the past, when species were relatively undifferentiated (compared to contemporary species) is indeed not happening today. Modern multi-cellular life has 600 million years of evolutionary history constraining its future evolution – which was not true of species at the base of the evolutionary tree. But modern species are indeed still evolving.
  • Here is a list of research documenting observed instances of speciation. The list is from 1995, and there are more recent examples to add to the list. Here are some more. And here is a good list with references of more recent cases.
  • Next Morris tries to convince the reader that there is no evidence for evolution in the past, focusing on the fossil record. He repeats the false claim (again, which I already dealt with) that there are no transitional fossils: Even those who believe in rapid evolution recognize that a considerable number of generations would be required for one distinct “kind” to evolve into another more complex kind. There ought, therefore, to be a considerable number of true transitional structures preserved in the fossils — after all, there are billions of non-transitional structures there! But (with the exception of a few very doubtful creatures such as the controversial feathered dinosaurs and the alleged walking whales), they are not there.
  • I deal with this question at length here, pointing out that there are numerous transitional fossils for the evolution of terrestrial vertebrates, mammals, whales, birds, turtles, and yes – humans from ape ancestors. There are many more examples, these are just some of my favorites.
  • Much of what follows (as you can see, it takes far more space to correct the lies and distortions of Morris than it did to create them) is classic denialism – misinterpreting the state of the science, and confusing lack of information about the details of evolution with lack of confidence in the fact of evolution. Here are some examples – he quotes Niles Eldredge: "It is a simple ineluctable truth that virtually all members of a biota remain basically stable, with minor fluctuations, throughout their durations. . . ." So how do evolutionists arrive at their evolutionary trees from fossils of organisms which didn't change during their durations? Beware the "…" – that means that meaningful parts of the quote are being omitted. I happen to have the book (The Pattern of Evolution) from which Morris mined that particular quote. Here's the rest of it: "(Remember, by 'biota' we mean the commonly preserved plants and animals of a particular geological interval, which occupy regions often as large as Roger Tory Peterson's 'eastern' region of North American birds.) And when these systems change – when the older species disappear, and new ones take their place – the change happens relatively abruptly and in lockstep fashion."
  • Eldredge was one of the authors (with Gould) of punctuated equilibrium theory. This states that, if you look at the fossil record, what we see are species emerging, persisting with little change for a while, and then disappearing from the fossil record. They theorize that most species most of the time are at equilibrium with their environment, and so do not change much. But these periods of equilibrium are punctuated by disequilibrium – periods of change when species will have to migrate, evolve, or go extinct.
  • This does not mean that speciation does not take place. And if you look at the fossil record we see a pattern of descendant species emerging from ancestor species over time – in a nice evolutionary pattern. Morris gives a complete misrepresentation of Eldredge's point – once again we see an astounding degree of intellectual dishonesty in his methods.
  • Morris next tackles the genetic evidence, writing: More often is the argument used that similar DNA structures in two different organisms proves common evolutionary ancestry. Neither argument is valid. There is no reason whatever why the Creator could not or would not use the same type of genetic code based on DNA for all His created life forms. This is evidence for intelligent design and creation, not evolution.
  • Here is an excellent summary of the multiple lines of molecular evidence for evolution. Basically, if we look at the sequence of DNA, the variations in trinucleotide codes for amino acids, and amino acids for proteins, and transposons within DNA we see a pattern that can only be explained by evolution (or a mischievous god who chose, for some reason, to make life look exactly as if it had evolved – a non-falsifiable notion).
  • The genetic code is essentially composed of four letters (ACGT for DNA), and every triplet of three letters equates to a specific amino acid. There are 64 (4^3) possible three-letter combinations, and 20 amino acids. A few combinations are used for housekeeping, like a code to indicate where a gene stops, but the rest code for amino acids. There are more combinations than amino acids, so most amino acids are coded for by multiple combinations. This means that a mutation producing a one-letter change might turn one code for a particular amino acid into another code for the same amino acid. This is called a silent mutation because it does not result in any change in the resulting protein. (A toy code sketch after this list's annotations illustrates the point.)
  • It also means that there are very many possible codes for any individual protein. The question is: which codes, out of the gazillions of possible codes, do we find for each type of protein in different species? If each "kind" were created separately there would not need to be any relationship. Each kind could have its own variation, or they could all be identical if they were essentially copied (plus any mutations accruing since creation, which would be minimal). But if life evolved then we would expect that the exact sequence of DNA code would be similar in related species, but progressively different (through silent mutations) over evolutionary time.
  • This is precisely what we find – in every protein we have examined. This pattern is required if evolution is true. It cannot be explained by random chance (the probability is absurdly tiny – essentially zero). And it makes no sense from a creationist perspective. This same pattern (a branching hierarchy) emerges when we look at amino acid substitutions in proteins and other aspects of the genetic code.
  • Morris goes for the second law of thermodynamics again – in the exact way that I already addressed. He responds to scientists correctly pointing out that the Earth is an open system, by writing: This naive response to the entropy law is typical of evolutionary dissimulation. While it is true that local order can increase in an open system if certain conditions are met, the fact is that evolution does not meet those conditions. Simply saying that the earth is open to the energy from the sun says nothing about how that raw solar heat is converted into increased complexity in any system, open or closed. The fact is that the best known and most fundamental equation of thermodynamics says that the influx of heat into an open system will increase the entropy of that system, not decrease it. All known cases of decreased entropy (or increased organization) in open systems involve a guiding program of some sort and one or more energy conversion mechanisms.
  • Energy has to be transformed into a usable form in order to do the work necessary to decrease entropy. That’s right. That work is done by life. Plants take solar energy (again – I’m not sure what “raw solar heat” means) and convert it into food. That food fuels the processes of life, which include development and reproduction. Evolution emerges from those processes- therefore the conditions that Morris speaks of are met.
  • But Morris next makes a very confused argument: Evolution has neither of these. Mutations are not “organizing” mechanisms, but disorganizing (in accord with the second law). They are commonly harmful, sometimes neutral, but never beneficial (at least as far as observed mutations are concerned). Natural selection cannot generate order, but can only “sieve out” the disorganizing mutations presented to it, thereby conserving the existing order, but never generating new order.
  • The notion that evolution (as if it’s a thing) needs to use energy is hopelessly confused. Evolution is a process that emerges from the system of life – and life certainly can use solar energy to decrease its entropy, and by extension the entropy of the biosphere. Morris slips into what is often presented as an information argument.  (Yet again – already dealt with. The pattern here is that we are seeing a shuffling around of the same tired creationists arguments.) It is first not true that most mutations are harmful. Many are silent, and many of those that are not silent are not harmful. They may be neutral, they may be a mixed blessing, and their relative benefit vs harm is likely to be situational. They may be fatal. And they also may be simply beneficial.
  • But mutations are not "disorganizing" – that does not even make sense. It seems to be based on a purely creationist notion that species are in some privileged perfect state, and any mutation can only take them farther from that perfection. For those who actually understand biology, life is a kluge of compromises and variation. Mutations are mostly lateral moves from one chaotic state to another. They are not directional. But they do provide raw material, variation, for natural selection. Natural selection cannot generate variation, but it can select among that variation to provide differential survival. This is an old game played by creationists – mutations are not selective, and natural selection is not creative (does not increase variation). These are true but irrelevant, because mutations increase variation and information, and selection then produces the differential survival of better-adapted variation.
  • Morris finishes with a long rambling argument that evolution is religion: "Evolution is promoted by its practitioners as more than mere science. Evolution is promulgated as an ideology, a secular religion — a full-fledged alternative to Christianity, with meaning and morality . . . . Evolution is a religion. This was true of evolution in the beginning, and it is true of evolution still today." Morris ties evolution to atheism, which, he argues, makes it a religion. This assumes, of course, that atheism is a religion. That depends on how you define atheism and how you define religion – but it is mostly wrong. Atheism is a lack of belief in one particular supernatural claim – that does not qualify it as a religion.
  • Regarding the atheism = religion argument, it reminds me of a great analogy that I first heard on Twitter from Evil Eye (paraphrased): saying atheism is a religion is like saying "not collecting stamps" is a hobby.
  •  
    One of my earlier posts on SkepticBlog was Ten Major Flaws in Evolution: A Refutation, published two years ago. Occasionally a creationist shows up to snipe at the post, like this one: "i read this and found it funny. It supposedly gives a scientific refutation, but it is full of more bias than fox news, and a lot of emotion as well. here's a scientific case by an actual scientists, you know, one with a ph. D, and he uses statements by some of your favorite evolutionary scientists to insist evolution doesn't exist. i challenge you to write a refutation on this one. http://www.icr.org/home/resources/resources_tracts_scientificcaseagainstevolution/" Challenge accepted.
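The codon arithmetic in the annotations above (4^3 = 64 triplets coding for 20 amino acids, hence "silent" mutations) can be made concrete with a toy table. This is a minimal sketch using a hand-picked slice of the standard genetic code, not the full 64-entry table:

    # Toy illustration of codon redundancy and silent mutations.
    # Only a small slice of the standard genetic code is shown; the full
    # table has 64 codons (4^3) covering 20 amino acids plus stop signals.

    CODON_TABLE = {
        # Leucine is coded by six codons -- heavy redundancy.
        "CTT": "Leu", "CTC": "Leu", "CTA": "Leu", "CTG": "Leu",
        "TTA": "Leu", "TTG": "Leu",
        "ATG": "Met",                # methionine has exactly one codon
        "TTT": "Phe", "TTC": "Phe",  # phenylalanine has two
        "TAA": "STOP",               # one of the "housekeeping" stop codons
    }

    def is_silent(old_codon: str, new_codon: str) -> bool:
        """A point mutation is silent if both codons give the same amino acid."""
        return CODON_TABLE[old_codon] == CODON_TABLE[new_codon]

    print(is_silent("CTT", "CTC"))  # True: one-letter change, still leucine
    print(is_silent("TTT", "TTA"))  # False: Phe -> Leu changes the protein

Because silent changes leave the protein untouched, related species can carry the same protein "spelled" with progressively different DNA, which is exactly the branching pattern the post describes.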
Weiye Loh

Debating the Value of College in America : The New Yorker - 0 views

  • Society needs a mechanism for sorting out its more intelligent members from its less intelligent ones
  • Society wants to identify intelligent people early on so that it can funnel them into careers that maximize their talents. It wants to get the most out of its human resources. College is a process that is sufficiently multifaceted and fine-grained to do this. College is, essentially, a four-year intelligence test. Students have to demonstrate intellectual ability over time and across a range of subjects. If they’re sloppy or inflexible or obnoxious—no matter how smart they might be in the I.Q. sense—those negatives will get picked up in their grades.
  • college also sorts people according to aptitude. It separates the math types from the poetry types. At the end of the process, graduates get a score, the G.P.A., that professional schools and employers can trust as a measure of intellectual capacity and productive potential. It’s important, therefore, that everyone is taking more or less the same test.
  • ...2 more annotations...
  • College exposes future citizens to material that enlightens and empowers them, whatever careers they end up choosing. In performing this function, college also socializes. It takes people with disparate backgrounds and beliefs and brings them into line with mainstream norms of reason and taste. Independence of mind is tolerated in college, and even honored, but students have to master the accepted ways of doing things before they are permitted to deviate. Ideally, we want everyone to go to college, because college gets everyone on the same page. It’s a way of producing a society of like-minded grownups.
  • If you like the first theory, then it doesn’t matter which courses students take, or even what is taught in them, as long as they’re rigorous enough for the sorting mechanism to do its work. All that matters is the grades. If you prefer the second theory, then you might consider grades a useful instrument of positive or negative reinforcement, but the only thing that matters is what students actually learn. There is stuff that every adult ought to know, and college is the best delivery system for getting that stuff into people’s heads.
Weiye Loh

normblog: Johann Hari and the meaning of plagiarism - 0 views

  • Johann says that the accusation of plagiarism against him is 'totally false', and in support of the claim he writes that: "Plagiarism is presenting somebody else's intellectual work as your own - whereas I have always accurately attributed the ideas of (say) Gideon Levy to Gideon Levy." But this is subtly - and self-servingly - to narrow the meaning of the word 'plagiarism'. By saying, first, 'intellectual work' and then segueing from that into 'the ideas of' Gideon Levy, Johann omits forms of plagiarism that involve using the work of others without making due acknowledgement. This is certainly a meaning of plagiarism. If I report as having happened to me an encounter on a New York street with Wayne Gretzky, using the exact words of someone who really did meet Wayne Gretzky on a New York street, and I pretend it happened to me, then that is plagiarism. I'm not stealing anyone's 'ideas', in the way that this is usually meant; I'm not passing off as my own an argument, or conceptual proposal, or novel thesis, of some writer or thinker. But I am improperly drawing on the work of others.
  •  
    Johann Hari and the meaning of plagiarism. http://t.co/drgsoZl
Weiye Loh

The Death of Postmodernism And Beyond | Philosophy Now - 0 views

  • Most of the undergraduates who will take ‘Postmodern Fictions’ this year will have been born in 1985 or after, and all but one of the module’s primary texts were written before their lifetime. Far from being ‘contemporary’, these texts were published in another world, before the students were born: The French Lieutenant’s Woman, Nights at the Circus, If on a Winter’s Night a Traveller, Do Androids Dream of Electric Sheep? (and Blade Runner), White Noise: this is Mum and Dad’s culture. Some of the texts (‘The Library of Babel’) were written even before their parents were born. Replace this cache with other postmodern stalwarts – Beloved, Flaubert’s Parrot, Waterland, The Crying of Lot 49, Pale Fire, Slaughterhouse 5, Lanark, Neuromancer, anything by B.S. Johnson – and the same applies. It’s all about as contemporary as The Smiths, as hip as shoulder pads, as happening as Betamax video recorders. These are texts which are just coming to grips with the existence of rock music and television; they mostly do not dream even of the possibility of the technology and communications media – mobile phones, email, the internet, computers in every house powerful enough to put a man on the moon – which today’s undergraduates take for granted.
  • somewhere in the late 1990s or early 2000s, the emergence of new technologies re-structured, violently and forever, the nature of the author, the reader and the text, and the relationships between them.
  • Postmodernism, like modernism and romanticism before it, fetishised [ie placed supreme importance on] the author, even when the author chose to indict or pretended to abolish him or herself. But the culture we have now fetishises the recipient of the text to the degree that they become a partial or whole author of it. Optimists may see this as the democratisation of culture; pessimists will point to the excruciating banality and vacuity of the cultural products thereby generated (at least so far).
  • Pseudo-modernism also encompasses contemporary news programmes, whose content increasingly consists of emails or text messages sent in commenting on the news items. The terminology of ‘interactivity’ is equally inappropriate here, since there is no exchange: instead, the viewer or listener enters – writes a segment of the programme – then departs, returning to a passive role. Pseudo-modernism also includes computer games, which similarly place the individual in a context where they invent the cultural content, within pre-delineated limits. The content of each individual act of playing the game varies according to the particular player.
  • The pseudo-modern cultural phenomenon par excellence is the internet. Its central act is that of the individual clicking on his/her mouse to move through pages in a way which cannot be duplicated, inventing a pathway through cultural products which has never existed before and never will again. This is a far more intense engagement with the cultural process than anything literature can offer, and gives the undeniable sense (or illusion) of the individual controlling, managing, running, making up his/her involvement with the cultural product. Internet pages are not ‘authored’ in the sense that anyone knows who wrote them, or cares. The majority either require the individual to make them work, like Streetmap or Route Planner, or permit him/her to add to them, like Wikipedia, or through feedback on, for instance, media websites. In all cases, it is intrinsic to the internet that you can easily make up pages yourself (eg blogs).
  • Where once special effects were supposed to make the impossible appear credible, CGI frequently [inadvertently] works to make the possible look artificial, as in much of Lord of the Rings or Gladiator. Battles involving thousands of individuals have really happened; pseudo-modern cinema makes them look as if they have only ever happened in cyberspace.
  • Similarly, television in the pseudo-modern age favours not only reality TV (yet another unapt term), but also shopping channels, and quizzes in which the viewer calls to guess the answer to riddles in the hope of winning money.
  • The purely ‘spectacular’ function of television, as with all the arts, has become a marginal one: what is central now is the busy, active, forging work of the individual who would once have been called its recipient. In all of this, the ‘viewer’ feels powerful and is indeed necessary; the ‘author’ as traditionally understood is either relegated to the status of the one who sets the parameters within which others operate, or becomes simply irrelevant, unknown, sidelined; and the ‘text’ is characterised both by its hyper-ephemerality and by its instability. It is made up by the ‘viewer’, if not in its content then in its sequence – you wouldn’t read Middlemarch by going from page 118 to 316 to 401 to 501, but you might well, and justifiably, read Ceefax that way.
  • A pseudo-modern text lasts an exceptionally brief time. Unlike, say, Fawlty Towers, reality TV programmes cannot be repeated in their original form, since the phone-ins cannot be reproduced, and without the possibility of phoning-in they become a different and far less attractive entity.
  • If scholars give the date they referenced an internet page, it is because the pages disappear or get radically re-cast so quickly. Text messages and emails are extremely difficult to keep in their original form; printing out emails does convert them into something more stable, like a letter, but only by destroying their essential, electronic state.
  • The cultural products of pseudo-modernism are also exceptionally banal
  • Much text messaging and emailing is vapid in comparison with what people of all educational levels used to put into letters.
  • A triteness, a shallowness dominates all.
  • In music, the pseudo-modern superseding of the artist-dominated album as monolithic text by the downloading and mix-and-matching of individual tracks on to an iPod, selected by the listener, was certainly prefigured by the music fan's creation of compilation tapes a generation ago. But a shift has occurred, in that what was a marginal pastime of the fan has become the dominant and definitive way of consuming music, rendering the idea of the album as a coherent work of art, a body of integrated meaning, obsolete.
  • To a degree, pseudo-modernism is no more than a technologically motivated shift to the cultural centre of something which has always existed (similarly, metafiction has always existed, but was never so fetishised as it was by postmodernism). Television has always used audience participation, just as theatre and other performing arts did before it; but as an option, not as a necessity: pseudo-modern TV programmes have participation built into them.
  • Whereas postmodernism called ‘reality’ into question, pseudo-modernism defines the real implicitly as myself, now, ‘interacting’ with its texts. Thus, pseudo-modernism suggests that whatever it does or makes is what is reality, and a pseudo-modern text may flourish the apparently real in an uncomplicated form: the docu-soap with its hand-held cameras (which, by displaying individuals aware of being regarded, give the viewer the illusion of participation); The Office and The Blair Witch Project, interactive pornography and reality TV; the essayistic cinema of Michael Moore or Morgan Spurlock.
  • whereas postmodernism favoured the ironic, the knowing and the playful, with their allusions to knowledge, history and ambivalence, pseudo-modernism’s typical intellectual states are ignorance, fanaticism and anxiety
  • pseudo-modernism lashes fantastically sophisticated technology to the pursuit of medieval barbarism – as in the uploading of videos of beheadings onto the internet, or the use of mobile phones to film torture in prisons. Beyond this, the destiny of everyone else is to suffer the anxiety of getting hit in the cross-fire. But this fatalistic anxiety extends far beyond geopolitics, into every aspect of contemporary life; from a general fear of social breakdown and identity loss, to a deep unease about diet and health; from anguish about the destructiveness of climate change, to the effects of a new personal ineptitude and helplessness, which yield TV programmes about how to clean your house, bring up your children or remain solvent.
  • Pseudo-modernism belongs to a world pervaded by the encounter between a religiously fanatical segment of the United States, a largely secular but definitionally hyper-religious Israel, and a fanatical sub-section of Muslims scattered across the planet: pseudo-modernism was not born on 11 September 2001, but postmodernism was interred in its rubble.
  • pseudo-modernist communicates constantly with the other side of the planet, yet needs to be told to eat vegetables to be healthy, a fact self-evident in the Bronze Age. He or she can direct the course of national television programmes, but does not know how to make him or herself something to eat – a characteristic fusion of the childish and the advanced, the powerful and the helpless. For varying reasons, these are people incapable of the “disbelief of Grand Narratives” which Lyotard argued typified postmodernists
  •  
    Postmodern philosophy emphasises the elusiveness of meaning and knowledge. This is often expressed in postmodern art as a concern with representation and an ironic self-awareness. And the argument that postmodernism is over has already been made philosophically. There are people who have essentially asserted that for a while we believed in postmodern ideas, but not any more, and from now on we're going to believe in critical realism. The weakness in this analysis is that it centres on the academy, on the practices and suppositions of philosophers who may or may not be shifting ground or about to shift - and many academics will simply decide that, finally, they prefer to stay with Foucault [arch postmodernist] than go over to anything else. However, a far more compelling case can be made that postmodernism is dead by looking outside the academy at current cultural production.
joanne ye

Measuring the effectiveness of online activism - 2 views

Reference: Krishnan, S. (2009, June 21). Measuring the effectiveness of online activism. The Hindu. Retrieved September 24, 2009, from Factiva. (Article can be found at bottom of the post) Summary...

online activism freedom control

started by joanne ye on 24 Sep 09 no follow-up yet
Chen Guo Lim

Anti plagiarism is (un)ethical - 20 views

I think there is a need to investigate the motivation behind using this software. Suppose a writer has recently come across an article that seems to have been plagiarised, and thus uses the software to ...

Turnitin plagiarism

Chen Guo Lim

Digging up the Past, but not necessarily forgotten. - 1 views

http://www.youtube.com/watch?v=rYoexSAInQY Firstly, let me apologise for the exclusivity of the language. I tried looking for an English song but could not recall one. In any case, when t...

Classical Pop

started by Chen Guo Lim on 26 Aug 09 no follow-up yet
YongTeck Lee

Illegal downloaders may face ban from internet - 5 views

http://news.yahoo.com/s/ap/20090825/ap_on_re_eu/eu_britain_downloading The UK may ban repeat offenders: those who illegally download and share copyrighted films and music could find their internet access ...

piracy internet Intellectual property

started by YongTeck Lee on 25 Aug 09 no follow-up yet
Reseena Abdullah

Pinkberry serves lawsuits to six frozen yogurt shops - 4 views

http://www.latimes.com/business/la-fi-pinkberry12-2008sep12,0,2520279.story The article is about an American frozen yogurt chain that is suing other 'copycat' businesses. One such chain was even c...

started by Reseena Abdullah on 25 Aug 09 no follow-up yet
Kathleen Tan

Forthcoming Leona Lewis tracks leaked onto the net by hackers - 11 views

http://www.dailymail.co.uk/tvshowbiz/article-1207707/Forthcoming-Leona-Lewis-tracks-leaked-internet-highest-profile-hacking-case.html Basically, hackers managed to gain access to unreleased tracks...

Intellectual property hacking piracy

started by Kathleen Tan on 25 Aug 09 no follow-up yet
Elaine Ong

Turning dolls into babies - 6 views

http://www.chroniclelive.co.uk/north-east-news/todays-evening-chronicle/2007/09/11/when-does-a-doll-become-a-baby-72703-19770082/ Just to share an interesting article about how dolls nowadays are ...

started by Elaine Ong on 25 Aug 09 no follow-up yet
Paul Melissa

Police raid 13 shops in Lucky Plaza - 13 views

http://www.tnp.sg/printfriendly/0,4139,209251,00.html 1) Officers from the Criminal Investigation Department (CID) raided 13 shops in Lucky Plaza and arrested 27 men and one woman, aged...

Pirated games Illegal modification

started by Paul Melissa on 24 Aug 09 no follow-up yet
Karin Tan

ASCAP Makes Outlandish Copyright Claims on Cell Phone Ringtones - 16 views

As it was at the beginnings of copyright law, the aim is to place value on IP so that people will have the motivation and incentive to produce and create even more in the future. Therefore, by saying th...

Copyright

Weiye Loh

The internet: is it changing the way we think? | Technology | The Observer - 0 views

  • American magazine the Atlantic lobs an intellectual grenade into our culture. In the summer of 1945, for example, it published an essay by the Massachusetts Institute of Technology (MIT) engineer Vannevar Bush entitled "As We May Think". It turned out to be the blueprint for what eventually emerged as the world wide web. Two summers ago, the Atlantic published an essay by Nicholas Carr, one of the blogosphere's most prominent (and thoughtful) contrarians, under the headline "Is Google Making Us Stupid?".
  • Carr wrote, "I've had an uncomfortable sense that someone, or something, has been tinkering with my brain, remapping the neural circuitry, reprogramming the memory. My mind isn't going – so far as I can tell – but it's changing. I'm not thinking the way I used to think. I can feel it most strongly when I'm reading. Immersing myself in a book or a lengthy article used to be easy. My mind would get caught up in the narrative or the turns of the argument and I'd spend hours strolling through long stretches of prose. That's rarely the case anymore. Now my concentration often starts to drift after two or three pages. I get fidgety, lose the thread, begin looking for something else to do. I feel as if I'm always dragging my wayward brain back to the text. The deep reading that used to come naturally has become a struggle."
  • Carr's target was not really the world's leading search engine, but the impact that ubiquitous, always-on networking is having on our cognitive processes. His argument was that our deepening dependence on networking technology is indeed changing not only the way we think, but also the structure of our brains.
  • Carr's article touched a nerve and has provoked a lively, ongoing debate on the net and in print (he has now expanded it into a book, The Shallows: What the Internet Is Doing to Our Brains). This is partly because he's an engaging writer who has vividly articulated the unease that many adults feel about the way their modi operandi have changed in response to ubiquitous networking.
  • Who bothers to write down or memorise detailed information any more, for example, when they know that Google will always retrieve it if it's needed again? The web has become, in a way, a global prosthesis for our collective memory.
  • easy to dismiss Carr's concern as just the latest episode of the moral panic that always accompanies the arrival of a new communications technology. People fretted about printing, photography, the telephone and television in analogous ways. It even bothered Plato, who argued that the technology of writing would destroy the art of remembering.
  • many commentators who accept the thrust of his argument seem not only untroubled by its far-reaching implications but are positively enthusiastic about them. When the Pew Research Centre's Internet & American Life project asked its panel of more than 370 internet experts for their reaction, 81% of them agreed with the proposition that "people's use of the internet has enhanced human intelligence".
  • As a writer, thinker, researcher and teacher, what I can attest to is that the internet is changing our habits of thinking, which isn't the same thing as changing our brains. The brain is like any other muscle – if you don't stretch it, it gets both stiff and flabby. But if you exercise it regularly, and cross-train, your brain will be flexible, quick, strong and versatile.
  • The internet is analogous to a weight-training machine for the brain, as compared with the free weights provided by libraries and books. Each method has its advantage, but used properly one works you harder. Weight machines are directive and enabling: they encourage you to think you've worked hard without necessarily challenging yourself. The internet can be the same: it often tells us what we think we know, spreading misinformation and nonsense while it's at it. It can substitute surface for depth, imitation for originality, and its passion for recycling would surpass the most committed environmentalist.
  • I've seen students' thinking habits change dramatically: if information is not immediately available via a Google search, students are often stymied. But of course what a Google search provides is not the best, wisest or most accurate answer, but the most popular one.
  • But knowledge is not the same thing as information, and there is no question to my mind that the access to raw information provided by the internet is unparalleled and democratising. Admittance to elite private university libraries and archives is no longer required, as they increasingly digitise their archives. We've all read the jeremiads that the internet sounds the death knell of reading, but people read online constantly – we just call it surfing now. What they are reading is changing, often for the worse; but it is also true that the internet increasingly provides a treasure trove of rare books, documents and images, and as long as we have free access to it, then the internet can certainly be a force for education and wisdom, and not just for lies, damned lies, and false statistics.
  • In the end, the medium is not the message, and the internet is just a medium, a repository and an archive. Its greatest virtue is also its greatest weakness: it is unselective. This means that it is undiscriminating, in both senses of the word. It is indiscriminate in its principles of inclusion: anything at all can get into it. But it also – at least so far – doesn't discriminate against anyone with access to it. This is changing rapidly, of course, as corporations and governments seek to exert control over it. Knowledge may not be the same thing as power, but it is unquestionably a means to power. The question is, will we use the internet's power for good, or for evil? The jury is very much out. The internet itself is disinterested: but what we use it for is not.
  •  
    The internet: is it changing the way we think? American writer Nicholas Carr's claim that the internet is not only shaping our lives but physically altering our brains has sparked a lively and ongoing debate, says John Naughton. Below, a selection of writers and experts offer their opinion
Weiye Loh

An insider's view of academic censorship in Singapore | Asian Correspondent - 0 views

  • Mark, who is now assistant professor of history at the University of Hong Kong, talks candidly about the censorship, both self-imposed and external, that guided his research and writing.
  • During my 6 years in the city, I definitely became ever more acutely aware of "political sensitivities". Thus, there were comments that came up in interviews with some of Singapore's former political detainees (interviews which are cited in the book) that were not included because they would have possibly resulted in libel actions. There were other things, such as the deviousness of LKY's political negotiations with the British in the late 50s and early 60s, which we could have gone into further (the details have been published) rather than just pointing to them in the footnotes. Was this the result of a subconscious self-censorship or a desire to move the story on? I'm still thinking about that one. But I do recall that, as a foreign academic working at the National Univ. of Singapore, you inevitably became careful about what sort of public criticism you directed at your paymasters. No doubt, this carefulness ultimately seeps into you (though I think good work can be done in Singapore, nevertheless, and many people in academia there continue to do it).
  • The decision to halt Singapore: a Biography in 1965, and in that sense narrow the narrative, was a very conscious one. I am still not comfortable tackling Singapore's political history after 1965, given the current political constraints in the Republic, and the official control of the archive. I have told publishers who have enquired about us extending the story or writing a sequel that this would involve a narrative far more critical of the ruling party. Repressive political measures that might have garnered a degree of popular support in the turbulent early-60s became, I believe, for many Singaporeans, less justifiable and more reprehensible in the 70s and 80s (culminating with the disgust that many people felt over the treatment of Catholic agitators involved in the so-called "Marxist conspiracy" of 1987).
  • As for the rise of the PAP, my personal view is that in the late 1950s the PAP was the only viable alternative to colonial rule, once Marshall had bailed - that is, in terms of getting Singapore out of its postwar social and economic predicament. As much as my heart is with the idealists who founded the Barisan, I'm not sure they would have achieved the same practical results as the PAP did in its first 5 years, had they got into power. There were already rifts in the Barisan prior to Operation Cold Store in 1963, and the more one looks into the party at this time, the more chaotic it appears. (Undoubtedly, this chaos was also a result of the pressures exerted upon it by the PAP.)
  • when the Barisan was systematically destroyed, hopeless though its leaders might have proved as technocrats, Singapore turned a corner. From 1963, economic success and political stability were won at the expense of freedom of expression and 'responsible dissent', generating a conformity, an intellectual sterility and a deep loss of historical identity that I hope the Epilogue to the book conveys. That's basically my take on the rise of the PAP. The party became something very different from 1963.
  •  
    An insider's view of academic censorship in Singapore
Weiye Loh

CancerGuide: The Median Isn't the Message - 0 views

  • Statistics recognizes different measures of an "average," or central tendency. The mean is our usual concept of an overall average - add up the items and divide them by the number of sharers
  • The median, a different measure of central tendency, is the half-way point.
  • A politician in power might say with pride, "The mean income of our citizens is $15,000 per year." The leader of the opposition might retort, "But half our citizens make less than $10,000 per year." Both are right, but neither cites a statistic with impassive objectivity. The first invokes a mean, the second a median. (Means are higher than medians in such cases because one millionaire may outweigh hundreds of poor people in setting a mean; but he can balance only one mendicant in calculating a median.) A code sketch at the end of this piece reproduces the contrast.
  • ...7 more annotations...
  • The larger issue that creates a common distrust or contempt for statistics is more troubling. Many people make an unfortunate and invalid separation between heart and mind, or feeling and intellect. In some contemporary traditions, abetted by attitudes stereotypically centered on Southern California, feelings are exalted as more "real" and the only proper basis for action - if it feels good, do it - while intellect gets short shrift as a hang-up of outmoded elitism. Statistics, in this absurd dichotomy, often become the symbol of the enemy. As Hilaire Belloc wrote, "Statistics are the triumph of the quantitative method, and the quantitative method is the victory of sterility and death."
  • This is a personal story of statistics, properly interpreted, as profoundly nurturant and life-giving. It declares holy war on the downgrading of intellect by telling a small story about the utility of dry, academic knowledge about science. Heart and head are focal points of one body, one personality.
  • We still carry the historical baggage of a Platonic heritage that seeks sharp essences and definite boundaries. (Thus we hope to find an unambiguous "beginning of life" or "definition of death," although nature often comes to us as irreducible continua.) This Platonic heritage, with its emphasis on clear distinctions and separated immutable entities, leads us to view statistical measures of central tendency wrongly, indeed opposite to the appropriate interpretation in our actual world of variation, shadings, and continua. In short, we view means and medians as the hard "realities," and the variation that permits their calculation as a set of transient and imperfect measurements of this hidden essence. If the median is the reality and variation around the median just a device for its calculation, then "I will probably be dead in eight months" may pass as a reasonable interpretation.
  • But all evolutionary biologists know that variation itself is nature's only irreducible essence. Variation is the hard reality, not a set of imperfect measures for a central tendency. Means and medians are the abstractions. Therefore, I looked at the mesothelioma statistics quite differently - and not only because I am an optimist who tends to see the doughnut instead of the hole, but primarily because I know that variation itself is the reality. I had to place myself amidst the variation. When I learned about the eight-month median, my first intellectual reaction was: fine, half the people will live longer; now what are my chances of being in that half. I read for a furious and nervous hour and concluded, with relief: damned good. I possessed every one of the characteristics conferring a probability of longer life: I was young; my disease had been recognized in a relatively early stage; I would receive the nation's best medical treatment; I had the world to live for; I knew how to read the data properly and not despair.
  • Another technical point then added even more solace. I immediately recognized that the distribution of variation about the eight-month median would almost surely be what statisticians call "right skewed." (In a symmetrical distribution, the profile of variation to the left of the central tendency is a mirror image of variation to the right. In skewed distributions, variation to one side of the central tendency is more stretched out - left skewed if extended to the left, right skewed if stretched out to the right.) The distribution of variation had to be right skewed, I reasoned. After all, the left of the distribution contains an irrevocable lower boundary of zero (since mesothelioma can only be identified at death or before). Thus, there isn't much room for the distribution's lower (or left) half - it must be scrunched up between zero and eight months. But the upper (or right) half can extend out for years and years, even if nobody ultimately survives. The distribution must be right skewed, and I needed to know how long the extended tail ran - for I had already concluded that my favorable profile made me a good candidate for that part of the curve. A small simulation at the end of this item illustrates this asymmetry.
  • The distribution was indeed strongly right skewed, with a long tail (however small) that extended for several years above the eight-month median. I saw no reason why I shouldn't be in that small tail, and I breathed a very long sigh of relief. My technical knowledge had helped. I had read the graph correctly. I had asked the right question and found the answers. I had obtained, in all probability, the most precious of all possible gifts in the circumstances - substantial time.
  • One final point about statistical distributions. They apply only to a prescribed set of circumstances - in this case to survival with mesothelioma under conventional modes of treatment. If circumstances change, the distribution may alter. I was placed on an experimental protocol of treatment and, if fortune holds, will be in the first cohort of a new distribution with high median and a right tail extending to death by natural causes at advanced old age.
  •  
    The Median Isn't the Message by Stephen Jay Gould
Weiye Loh
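
The mean-versus-median contrast above is easy to check numerically. Here is a minimal sketch in Python (standard library only); the income figures are invented for illustration rather than taken from Gould's essay:

    from statistics import mean, median

    # 99 ordinary earners plus one millionaire: a crudely right-skewed sample.
    incomes = [10_000] * 99 + [1_000_000]

    print(mean(incomes))    # 19900.0 -- the politician's proud "average"
    print(median(incomes))  # 10000.0 -- the opposition leader's retort

One millionaire lifts the mean by almost $10,000 yet cannot budge the median, which is how the two "averages" can support opposite political claims.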
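
Gould's right-skew reasoning can be illustrated the same way. The sketch below assumes a lognormal survival curve with its median pinned at eight months; the distribution family and the spread parameter sigma are assumptions chosen for illustration, not the actual mesothelioma data he consulted:

    import math
    import random

    random.seed(1)

    MEDIAN_MONTHS = 8.0
    mu = math.log(MEDIAN_MONTHS)  # for a lognormal, median = exp(mu)
    sigma = 1.0                   # assumed spread; larger sigma -> longer right tail

    # Draw 100,000 hypothetical survival times (in months) and sort them.
    survival = sorted(random.lognormvariate(mu, sigma) for _ in range(100_000))
    n = len(survival)

    print(f"median   ~ {survival[n // 2]:.1f} months")         # ~8: half live longer
    print(f"mean     ~ {sum(survival) / n:.1f} months")        # ~13: pulled up by the tail
    print(f"95th pct ~ {survival[int(0.95 * n)]:.1f} months")  # ~41: years past the median
    print(f"alive past 2 years ~ {sum(s > 24 for s in survival) / n:.0%}")  # ~14%

The left half of the curve is scrunched between zero and eight months, since survival cannot be negative, while the right half stretches out for years; under these assumed parameters roughly one patient in seven is still alive past two years, even though "the median is eight months".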

The meritocratic route to oblivion « Yawning Bread on Wordpress - 0 views

  • Part of the problem with meritocracy is that it homogenizes in the name of diversity: It skims the cream from every race and class and population, puts all of the best and brightest through the same educational conveyor belt, and comes out with a ruling class that’s cosmetically diverse but intellectually conformist, and that tends to huddle together rather than spreading out to enrich the country as a whole.
  • Meritocracy co-opts people who might otherwise become its critics.
  •  
    The meritocratic route to oblivion
Weiye Loh

What Is Skepticism? Week 3: Skepticism vs. Denial « Skepticism « Critical Thi... - 0 views

  • Everyone is a skeptic nowadays, or so it seems. From climate change to evolution to vaccination, large proportions of the population claim to be skeptical about many of the claims of mainstream science. So why are we, members of the skeptical community, not rejoicing?
  • A skeptic, in popular discourse, is simply someone who denies a particular claim. But true skepticism, as espoused by philosophers and scientists for millennia, is more an intellectual attitude than a position on a specific issue. A skeptic is someone who always demands sufficient evidence or reasons before accepting a claim. This skeptical attitude – its opposite is credulity – leads skeptics to reject as unfounded any claim that cannot withstand the rigours of the scientific method, which includes controlled experimental testing. The more extraordinary the claim, the more rigorously it must be tested before a skeptic will be willing to accept it.
  • Skepticism does not always lead to denial. Extraordinary claims require extraordinary evidence, but sometimes that extraordinary evidence can be provided. Einstein’s theory of relativity, which holds that matter can change the very shape of space and time, is an extraordinary claim, yet it has stood up to the most demanding of scientific testing.
  • ...1 more annotation...
  • Let us turn to the climate change “skeptics”. Are they just being more demanding than us in their skepticism? After all, nothing in science is ever certain; some room for doubt always exists. For that doubt to warrant disbelief in the face of all the positive evidence, however, skeptics would require significant contrary evidence, or a plausible alternative theory which fits the data. But climate change deniers have not provided any such evidence or theory (theories involving variations in solar activity simply don’t fit the data). Nor have they shown significant inclination to provide such evidence, generally being content to gesture frantically at any minor mistake, no matter how irrelevant, in the climate change literature. In fact, in denying climate change, these “skeptics” find themselves committed to claims no less extraordinary than the ones they deny, yet with far less evidence.
  •  
    Skepticism vs. Denial