
New Media Ethics 2009 course - Group items matching "Public Relations" in title, tags, annotations or url

Weiye Loh

Science, Strong Inference -- Proper Scientific Method - 0 views

  • Scientists these days tend to keep up a polite fiction that all science is equal. Except for the work of the misguided opponent whose arguments we happen to be refuting at the time, we speak as though every scientist's field and methods of study are as good as every other scientist's and perhaps a little better. This keeps us all cordial when it comes to recommending each other for government grants.
  • Why should there be such rapid advances in some fields and not in others? I think the usual explanations that we tend to think of - such as the tractability of the subject, or the quality or education of the men drawn into it, or the size of research contracts - are important but inadequate. I have begun to believe that the primary factor in scientific advance is an intellectual one. These rapidly moving fields are fields where a particular method of doing scientific research is systematically used and taught, an accumulative method of inductive inference that is so effective that I think it should be given the name of "strong inference." I believe it is important to examine this method, its use and history and rationale, and to see whether other groups and individuals might learn to adopt it profitably in their own scientific and intellectual work. In its separate elements, strong inference is just the simple and old-fashioned method of inductive inference that goes back to Francis Bacon. The steps are familiar to every college student and are practiced, off and on, by every scientist. The difference comes in their systematic application. Strong inference consists of applying the following steps to every problem in science, formally and explicitly and regularly: (1) devising alternative hypotheses; (2) devising a crucial experiment (or several of them), with alternative possible outcomes, each of which will, as nearly as possible, exclude one or more of the hypotheses; (3) carrying out the experiment so as to get a clean result; (4) recycling the procedure, making subhypotheses or sequential hypotheses to refine the possibilities that remain, and so on.
  • On any new problem, of course, inductive inference is not as simple and certain as deduction, because it involves reaching out into the unknown. Steps 1 and 2 require intellectual inventions, which must be cleverly chosen so that hypothesis, experiment, outcome, and exclusion will be related in a rigorous syllogism; and the question of how to generate such inventions is one which has been extensively discussed elsewhere (2, 3). What the formal schema reminds us to do is to try to make these inventions, to take the next step, to proceed to the next fork, without dawdling or getting tied up in irrelevancies.
  • ...28 more annotations...
  • It is clear why this makes for rapid and powerful progress. For exploring the unknown, there is no faster method; this is the minimum sequence of steps. Any conclusion that is not an exclusion is insecure and must be rechecked. Any delay in recycling to the next set of hypotheses is only a delay. Strong inference, and the logical tree it generates, are to inductive reasoning what the syllogism is to deductive reasoning in that it offers a regular method for reaching firm inductive conclusions one after the other as rapidly as possible.
  • "But what is so novel about this?" someone will say. This is the method of science and always has been, why give it a special name? The reason is that many of us have almost forgotten it. Science is now an everyday business. Equipment, calculations, lectures become ends in themselves. How many of us write down our alternatives and crucial experiments every day, focusing on the exclusion of a hypothesis? We may write our scientific papers so that it looks as if we had steps 1, 2, and 3 in mind all along. But in between, we do busywork. We become "method- oriented" rather than "problem-oriented." We say we prefer to "feel our way" toward generalizations. We fail to teach our students how to sharpen up their inductive inferences. And we do not realize the added power that the regular and explicit use of alternative hypothesis and sharp exclusion could give us at every step of our research.
  • A distinguished cell biologist rose and said, "No two cells give the same properties. Biology is the science of heterogeneous systems." And he added privately, "You know there are scientists, and there are people in science who are just working with these over-simplified model systems - DNA chains and in vitro systems - who are not doing science at all. We need their auxiliary work: they build apparatus, they make minor studies, but they are not scientists." To which Cy Levinthal replied: "Well, there are two kinds of biologists, those who are looking to see if there is one thing that can be understood and those who keep saying it is very complicated and that nothing can be understood. . . . You must study the simplest system you think has the properties you are interested in."
  • At the 1958 Conference on Biophysics, at Boulder, there was a dramatic confrontation between the two points of view. Leo Szilard said: "The problems of how enzymes are induced, of how proteins are synthesized, of how antibodies are formed, are closer to solution than is generally believed. If you do stupid experiments, and finish one a year, it can take 50 years. But if you stop doing experiments for a little while and think how proteins can possibly be synthesized, there are only about 5 different ways, not 50! And it will take only a few experiments to distinguish these." One of the young men added: "It is essentially the old question: How small and elegant an experiment can you perform?" These comments upset a number of those present. An electron microscopist said, "Gentlemen, this is off the track. This is philosophy of science." Szilard retorted, "I was not quarreling with third-rate scientists; I was quarreling with first-rate scientists."
  • Any criticism or challenge to consider changing our methods strikes of course at all our ego-defenses. But in this case the analytical method offers the possibility of such great increases in effectiveness that it is unfortunate that it cannot be regarded more often as a challenge to learning rather than as a challenge to combat. Many of the recent triumphs in molecular biology have in fact been achieved on just such "oversimplified model systems," very much along the analytical lines laid down in the 1958 discussion. They have not fallen to the kind of men who justify themselves by saying "No two cells are alike," regardless of how true that may ultimately be. The triumphs are in fact triumphs of a new way of thinking.
  • the emphasis on strong inference
  • is also partly due to the nature of the fields themselves. Biology, with its vast informational detail and complexity, is a "high-information" field, where years and decades can easily be wasted on the usual type of "low-information" observations or experiments if one does not think carefully in advance about what the most important and conclusive experiments would be. And in high-energy physics, both the "information flux" of particles from the new accelerators and the million-dollar costs of operation have forced a similar analytical approach. It pays to have a top-notch group debate every experiment ahead of time; and the habit spreads throughout the field.
  • Historically, I think, there have been two main contributions to the development of a satisfactory strong-inference method. The first is that of Francis Bacon (13). He wanted a "surer method" of "finding out nature" than either the logic-chopping or all-inclusive theories of the time or the laudable but crude attempts to make inductions "by simple enumeration." He did not merely urge experiments, as some suppose; he showed the fruitfulness of interconnecting theory and experiment so that the one checked the other. Of the many inductive procedures he suggested, the most important, I think, was the conditional inductive tree, which proceeded from alternative hypotheses (possible "causes," as he calls them), through crucial experiments ("Instances of the Fingerpost"), to exclusion of some alternatives and adoption of what is left ("establishing axioms"). His Instances of the Fingerpost are explicitly at the forks in the logical tree, the term being borrowed "from the fingerposts which are set up where roads part, to indicate the several directions."
  • Here was a method that could separate off the empty theories! Bacon said the inductive method could be learned by anybody, just like learning to "draw a straighter line or more perfect circle . . . with the help of a ruler or a pair of compasses." "My way of discovering sciences goes far to level men's wit and leaves but little to individual excellence, because it performs everything by the surest rules and demonstrations." Even occasional mistakes would not be fatal. "Truth will sooner come out from error than from confusion."
  • Nevertheless there is a difficulty with this method. As Bacon emphasizes, it is necessary to make "exclusions." He says, "The induction which is to be available for the discovery and demonstration of sciences and arts, must analyze nature by proper rejections and exclusions, and then, after a sufficient number of negatives come to a conclusion on the affirmative instances." "[To man] it is granted only to proceed at first by negatives, and at last to end in affirmatives after exclusion has been exhausted." Or, as the philosopher Karl Popper says today, there is no such thing as proof in science - because some later alternative explanation may be as good or better - so that science advances only by disproofs. There is no point in making hypotheses that are not falsifiable because such hypotheses do not say anything; "it must be possible for an empirical scientific system to be refuted by experience" (14).
  • The difficulty is that disproof is a hard doctrine. If you have a hypothesis and I have another hypothesis, evidently one of them must be eliminated. The scientist seems to have no choice but to be either soft-headed or disputatious. Perhaps this is why so many tend to resist the strong analytical approach and why some great scientists are so disputatious.
  • Fortunately, it seems to me, this difficulty can be removed by the use of a second great intellectual invention, the "method of multiple hypotheses," which is what was needed to round out the Baconian scheme. This is a method that was put forward by T.C. Chamberlin (15), a geologist at Chicago at the turn of the century, who is best known for his contribution to the Chamberlin-Moulton hypothesis of the origin of the solar system.
  • Chamberlin says our trouble is that when we make a single hypothesis, we become attached to it. "The moment one has offered an original explanation for a phenomenon which seems satisfactory, that moment affection for his intellectual child springs into existence, and as the explanation grows into a definite theory his parental affections cluster about his offspring and it grows more and more dear to him. . . . There springs up also unwittingly a pressing of the theory to make it fit the facts and a pressing of the facts to make them fit the theory..." "To avoid this grave danger, the method of multiple working hypotheses is urged. It differs from the simple working hypothesis in that it distributes the effort and divides the affections. . . . Each hypothesis suggests its own criteria, its own method of proof, its own method of developing the truth, and if a group of hypotheses encompass the subject on all sides, the total outcome of means and of methods is full and rich."
  • The conflict and exclusion of alternatives that is necessary to sharp inductive inference has been all too often a conflict between men, each with his single Ruling Theory. But whenever each man begins to have multiple working hypotheses, it becomes purely a conflict between ideas. It becomes much easier then for each of us to aim every day at conclusive disproofs - at strong inference - without either reluctance or combativeness. In fact, when there are multiple hypotheses, which are not anyone's "personal property," and when there are crucial experiments to test them, the daily life in the laboratory takes on an interest and excitement it never had, and the students can hardly wait to get to work to see how the detective story will come out. It seems to me that this is the reason for the development of those distinctive habits of mind and the "complex thought" that Chamberlin described, the reason for the sharpness, the excitement, the zeal, the teamwork - yes, even international teamwork - in molecular biology and high-energy physics today. What else could be so effective?
  • Unfortunately, I think, there are other areas of science today that are sick by comparison, because they have forgotten the necessity for alternative hypotheses and disproof. Each man has only one branch - or none - on the logical tree, and it twists at random without ever coming to the need for a crucial decision at any point. We can see from the external symptoms that there is something scientifically wrong. The Frozen Method, The Eternal Surveyor, The Never Finished, The Great Man With a Single Hypothesis, The Little Club of Dependents, The Vendetta, The All-Encompassing Theory Which Can Never Be Falsified.
  • a "theory" of this sort is not a theory at all, because it does not exclude anything. It predicts everything, and therefore does not predict anything. It becomes simply a verbal formula which the graduate student repeats and believes because the professor has said it so often. This is not science, but faith; not theory, but theology. Whether it is hand-waving or number-waving, or equation-waving, a theory is not a theory unless it can be disproved. That is, unless it can be falsified by some possible experimental outcome.
  • the work methods of a number of scientists have been testimony to the power of strong inference. Is success not due in many cases to systematic use of Bacon's "surest rules and demonstrations" as much as to rare and unattainable intellectual power? Faraday's famous diary (16), or Fermi's notebooks (3, 17), show how these men believed in the effectiveness of daily steps in applying formal inductive methods to one problem after another.
  • Surveys, taxonomy, design of equipment, systematic measurements and tables, theoretical computations - all have their proper and honored place, provided they are parts of a chain of precise induction of how nature works. Unfortunately, all too often they become ends in themselves, mere time-serving from the point of view of real scientific advance, a hypertrophied methodology that justifies itself as a lore of respectability.
  • We speak piously of taking measurements and making small studies that will "add another brick to the temple of science." Most such bricks just lie around the brickyard (20). Tables of constants have their place and value, but the study of one spectrum after another, if not frequently re-evaluated, may become a substitute for thinking, a sad waste of intelligence in a research laboratory, and a mistraining whose crippling effects may last a lifetime.
  • Beware of the man of one method or one instrument, either experimental or theoretical. He tends to become method-oriented rather than problem-oriented. The method-oriented man is shackled; the problem-oriented man is at least reaching freely toward what is most important. Strong inference redirects a man to problem-orientation, but it requires him to be willing repeatedly to put aside his last methods and teach himself new ones.
  • anyone who asks the question about scientific effectiveness will also conclude that much of the mathematizing in physics and chemistry today is irrelevant if not misleading. The great value of mathematical formulation is that when an experiment agrees with a calculation to five decimal places, a great many alternative hypotheses are pretty well excluded (though the Bohr theory and the Schrödinger theory both predict exactly the same Rydberg constant!). But when the fit is only to two decimal places, or one, it may be a trap for the unwary; it may be no better than any rule-of-thumb extrapolation, and some other kind of qualitative exclusion might be more rigorous for testing the assumptions and more important to scientific understanding than the quantitative fit.
  • Today we preach that science is not science unless it is quantitative. We substitute correlations for causal studies, and physical equations for organic reasoning. Measurements and equations are supposed to sharpen thinking, but, in my observation, they more often tend to make the thinking noncausal and fuzzy. They tend to become the object of scientific manipulation instead of auxiliary tests of crucial inferences.
  • Many - perhaps most - of the great issues of science are qualitative, not quantitative, even in physics and chemistry. Equations and measurements are useful when and only when they are related to proof; but proof or disproof comes first and is in fact strongest when it is absolutely convincing without any quantitative measurement.
  • you can catch phenomena in a logical box or in a mathematical box. The logical box is coarse but strong. The mathematical box is fine-grained but flimsy. The mathematical box is a beautiful way of wrapping up a problem, but it will not hold the phenomena unless they have been caught in a logical box to begin with.
  • Of course it is easy - and all too common - for one scientist to call the others unscientific. My point is not that my particular conclusions here are necessarily correct, but that we have long needed some absolute standard of possible scientific effectiveness by which to measure how well we are succeeding in various areas - a standard that many could agree on and one that would be undistorted by the scientific pressures and fashions of the times and the vested interests and busywork that they develop. It is not public evaluation I am interested in so much as a private measure by which to compare one's own scientific performance with what it might be. I believe that strong inference provides this kind of standard of what the maximum possible scientific effectiveness could be - as well as a recipe for reaching it.
  • The strong-inference point of view is so resolutely critical of methods of work and values in science that any attempt to compare specific cases is likely to sound both smug and destructive. Mainly one should try to teach it by example and by exhorting to self-analysis and self-improvement only in general terms.
  • one severe but useful private test - a touchstone of strong inference - that removes the necessity for third-person criticism, because it is a test that anyone can learn to carry with him for use as needed. It is our old friend the Baconian "exclusion," but I call it "The Question." Obviously it should be applied as much to one's own thinking as to others'. It consists of asking in your own mind, on hearing any scientific explanation or theory put forward, "But sir, what experiment could disprove your hypothesis?"; or, on hearing a scientific experiment described, "But sir, what hypothesis does your experiment disprove?"
  • It is not true that all science is equal; or that we cannot justly compare the effectiveness of scientists by any method other than a mutual-recommendation system. The man to watch, the man to put your money on, is not the man who wants to make "a survey" or a "more detailed study" but the man with the notebook, the man with the alternative hypotheses and the crucial experiments, the man who knows how to answer your Question of disproof and is already working on it.
  •  
    There is so much bad science and bad statistics in media reports, publications, and everyday conversation that I think it is important to understand facts, proofs, and the associated pitfalls.
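
The stepwise procedure excerpted above (devise rival hypotheses, design a crucial experiment whose possible outcomes exclude some of them, run it, recycle with what remains) is concrete enough to sketch in code. The snippet below is only a toy illustration of that exclusion loop; the hypotheses, experiments, and outcomes are invented placeholders, not anything taken from Platt's paper.

```python
# Toy sketch of the "strong inference" exclusion loop described above.
# All hypotheses, experiments, and outcomes here are invented placeholders.

hypotheses = {
    "H1: synthesis needs no template",
    "H2: synthesis uses an RNA template",
    "H3: synthesis uses a DNA template",
}

# Each crucial "experiment" records an observed outcome and the rival
# hypotheses that the outcome excludes.
experiments = [
    {"name": "label RNA and track incorporation",
     "outcome": "label appears in new protein",
     "excludes": {"H1: synthesis needs no template"}},
    {"name": "block access to DNA during synthesis",
     "outcome": "synthesis continues unchanged",
     "excludes": {"H3: synthesis uses a DNA template"}},
]

for exp in experiments:
    survivors = hypotheses - exp["excludes"]
    print(f"{exp['name']}: {exp['outcome']} "
          f"(excluded {len(hypotheses) - len(survivors)}, {len(survivors)} remain)")
    hypotheses = survivors  # recycle the procedure with whatever is left

print("Surviving hypotheses:", hypotheses)
```
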
Weiye Loh

Rod Beckstrom proposes ways to reclaim control over our online selves. - Project Syndicate - 0 views

  • As the virtual world expands, so, too, do breaches of trust and misuse of personal data. Surveillance has increased public unease – and even paranoia – about state agencies. Private companies that trade in personal data have incited the launch of a “reclaim privacy” movement. As one delegate at a recent World Economic Forum debate noted: “The more connected we have become, the more privacy we have given up.”
  • Now that our personal data have become such a valuable asset, companies are coming under increasing pressure to develop online business models that protect rather than exploit users’ private information. In particular, Internet users want to stop companies befuddling their customers with convoluted and legalistic service agreements in order to extract and sell their data.
  • Hyper-connectivity not only creates new commercial opportunities; it also changes the way ordinary people think about their lives. The so-called FoMo (fear of missing out) syndrome reflects the anxieties of a younger generation whose members feel compelled to capture instantly everything they do and see. Ironically, this hyper-connectivity has increased our insularity, as we increasingly live through our electronic devices. Neuroscientists believe that this may even have altered how we now relate to one another in the real world.
  • ...1 more annotation...
  • At the heart of this debate is the need to ensure that in a world where many, if not all, of the important details of our lives – including our relationships – exist in cyber-perpetuity, people retain, or reclaim, some level of control over their online selves. While the world of forgetting may have vanished, we can reshape the new one in a way that benefits rather than overwhelms us. Our overriding task is to construct a digital way of life that reinforces our existing sense of ethics and values, with security, trust, and fairness at its heart.
  •  
    "We must answer profound questions about the way we live. Should everyone be permanently connected to everything? Who owns which data, and how should information be made public? Can and should data use be regulated, and, if so, how? And what role should government, business, and ordinary Internet users play in addressing these issues?"
Weiye Loh

The Way We Live Now - Metric Mania - NYTimes.com - 0 views

  • In the realm of public policy, we live in an age of numbers.
  • do we hold an outsize belief in our ability to gauge complex phenomena, measure outcomes and come up with compelling numerical evidence? A well-known quotation usually attributed to Einstein is “Not everything that can be counted counts, and not everything that counts can be counted.” I’d amend it to a less eloquent, more prosaic statement: Unless we know how things are counted, we don’t know if it’s wise to count on the numbers.
  • The problem isn’t with statistical tests themselves but with what we do before and after we run them.
  • ...9 more annotations...
  • First, we count if we can, but counting depends a great deal on previous assumptions about categorization. Consider, for example, the number of homeless people in Philadelphia, or the number of battered women in Atlanta, or the number of suicides in Denver. Is someone homeless if he’s unemployed and living with his brother’s family temporarily? Do we require that a woman self-identify as battered to count her as such? If a person starts drinking day in and day out after a cancer diagnosis and dies from acute cirrhosis, did he kill himself? The answers to such questions significantly affect the count.
  • Second, after we’ve gathered some numbers relating to a phenomenon, we must reasonably aggregate them into some sort of recommendation or ranking. This is not easy. By appropriate choices of criteria, measurement protocols and weights, almost any desired outcome can be reached.
  • Are there good reasons the authors picked the criteria they did? Why did they weigh the criteria in the way they did?
  • Since the answer to the last question is usually yes, the problem of reasonable aggregation is no idle matter.
  • These two basic procedures — counting and aggregating — have important implications for public policy. Consider the plan to evaluate the progress of New York City public schools inaugurated by the city a few years ago. While several criteria were used, much of a school’s grade was determined by whether students’ performance on standardized state tests showed annual improvement. This approach risked putting too much weight on essentially random fluctuations and induced schools to focus primarily on the topics on the tests. It also meant that the better schools could receive mediocre grades because they were already performing well and had little room for improvement. Conversely, poor schools could receive high grades by improving just a bit.
  • Medical researchers face similar problems when it comes to measuring effectiveness.
  • Suppose that whenever people contract the disease, they always get it in their mid-60s and live to the age of 75. In the first region, an early screening program detects such people in their 60s. Because these people live to age 75, the five-year survival rate is 100 percent. People in the second region are not screened and thus do not receive their diagnoses until symptoms develop in their early 70s, but they, too, die at 75, so their five-year survival rate is 0 percent. The laissez-faire approach thus yields the same results as the universal screening program, yet if five-year survival were the criterion for effectiveness, universal screening would be deemed the best practice.
  • Because so many criteria can be used to assess effectiveness — median or mean survival times, side effects, quality of life and the like — there is a case to be made against mandating that doctors follow what seems at any given time to be the best practice. Perhaps, as some have suggested, we should merely nudge them with gentle incentives. A comparable tentativeness may be appropriate when devising criteria for effective schools.
  • Arrow’s Theorem, a famous result in mathematical economics, essentially states that no voting system satisfying certain minimal conditions can be guaranteed to always yield a fair or reasonable aggregation of the voters’ rankings of several candidates. A squishier analogue for the field of social measurement would say something like this: No method of measuring a societal phenomenon satisfying certain minimal conditions exists that can’t be second-guessed, deconstructed, cheated, rejected or replaced. This doesn’t mean we shouldn’t be counting — but it does mean we should do so with as much care and wisdom as we can muster.
  •  
    THE WAY WE LIVE NOW Metric Mania
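
One annotation above notes that "by appropriate choices of criteria, measurement protocols and weights, almost any desired outcome can be reached." A minimal sketch of that point follows; the schools, criteria, and scores are made up for illustration and do not come from the article.

```python
# Two hypothetical schools scored on three invented criteria (0-100).
scores = {
    "School A": {"test_gains": 90, "attendance": 60, "graduation": 55},
    "School B": {"test_gains": 55, "attendance": 85, "graduation": 90},
}

def best_school(weights):
    """Weighted-sum ranking: returns the top school and all totals."""
    totals = {school: sum(weights[c] * v for c, v in crit.items())
              for school, crit in scores.items()}
    return max(totals, key=totals.get), totals

# Same data, two defensible-sounding weighting schemes, opposite winners.
print(best_school({"test_gains": 0.7, "attendance": 0.15, "graduation": 0.15}))  # School A
print(best_school({"test_gains": 0.2, "attendance": 0.4, "graduation": 0.4}))    # School B
```
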
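The five-year-survival example in the excerpt can also be worked through directly. The sketch below simply encodes the article's stylized assumptions (everyone contracts the disease in their mid-60s and dies at 75; only the age at diagnosis differs between the two regions):

```python
def five_year_survival(age_at_diagnosis, age_at_death=75):
    """Return 1.0 if the patient is still alive five years after diagnosis, else 0.0."""
    return 1.0 if age_at_death - age_at_diagnosis >= 5 else 0.0

# Region with early screening: diagnosed in the mid-60s.
screened = five_year_survival(age_at_diagnosis=66)
# Region without screening: diagnosed when symptoms appear in the early 70s.
unscreened = five_year_survival(age_at_diagnosis=72)

print(f"Screened region:   {screened:.0%} five-year survival")    # 100%
print(f"Unscreened region: {unscreened:.0%} five-year survival")  # 0%
# Everyone in both regions still dies at 75: the metric moved, the outcome did not.
```
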
Weiye Loh

McKinsey & Company - Clouds, big data, and smart assets: Ten tech-enabled business trends to watch - 0 views

  • 1. Distributed cocreation moves into the mainstream. In the past few years, the ability to organise communities of Web participants to develop, market, and support products and services has moved from the margins of business practice to the mainstream. Wikipedia and a handful of open-source software developers were the pioneers. But in signs of the steady march forward, 70 per cent of the executives we recently surveyed said that their companies regularly created value through Web communities. Similarly, more than 68m bloggers post reviews and recommendations about products and services.
  • for every success in tapping communities to create value, there are still many failures. Some companies neglect the up-front research needed to identify potential participants who have the right skill sets and will be motivated to participate over the longer term. Since cocreation is a two-way process, companies must also provide feedback to stimulate continuing participation and commitment. Getting incentives right is important as well: cocreators often value reputation more than money. Finally, an organisation must gain a high level of trust within a Web community to earn the engagement of top participants.
  • 2. Making the network the organisation In earlier research, we noted that the Web was starting to force open the boundaries of organisations, allowing nonemployees to offer their expertise in novel ways. We called this phenomenon "tapping into a world of talent." Now many companies are pushing substantially beyond that starting point, building and managing flexible networks that extend across internal and often even external borders. The recession underscored the value of such flexibility in managing volatility. We believe that the more porous, networked organisations of the future will need to organise work around critical tasks rather than molding it to constraints imposed by corporate structures.
  • ...10 more annotations...
  • 3. Collaboration at scale Across many economies, the number of people who undertake knowledge work has grown much more quickly than the number of production or transactions workers. Knowledge workers typically are paid more than others, so increasing their productivity is critical. As a result, there is broad interest in collaboration technologies that promise to improve these workers' efficiency and effectiveness. While the body of knowledge around the best use of such technologies is still developing, a number of companies have conducted experiments, as we see in the rapid growth rates of video and Web conferencing, expected to top 20 per cent annually during the next few years.
  • 4. The growing ‘Internet of Things' The adoption of RFID (radio-frequency identification) and related technologies was the basis of a trend we first recognised as "expanding the frontiers of automation." But these methods are rudimentary compared with what emerges when assets themselves become elements of an information system, with the ability to capture, compute, communicate, and collaborate around information—something that has come to be known as the "Internet of Things." Embedded with sensors, actuators, and communications capabilities, such objects will soon be able to absorb and transmit information on a massive scale and, in some cases, to adapt and react to changes in the environment automatically. These "smart" assets can make processes more efficient, give products new capabilities, and spark novel business models. Auto insurers in Europe and the United States are testing these waters with offers to install sensors in customers' vehicles. The result is new pricing models that base charges for risk on driving behavior rather than on a driver's demographic characteristics. Luxury-auto manufacturers are equipping vehicles with networked sensors that can automatically take evasive action when accidents are about to happen. In medicine, sensors embedded in or worn by patients continuously report changes in health conditions to physicians, who can adjust treatments when necessary. Sensors in manufacturing lines for products as diverse as computer chips and pulp and paper take detailed readings on process conditions and automatically make adjustments to reduce waste, downtime, and costly human interventions.
  • 5. Experimentation and big data Could the enterprise become a full-time laboratory? What if you could analyse every transaction, capture insights from every customer interaction, and didn't have to wait for months to get data from the field? What if…? Data are flooding in at rates never seen before—doubling every 18 months—as a result of greater access to customer data from public, proprietary, and purchased sources, as well as new information gathered from Web communities and newly deployed smart assets. These trends are broadly known as "big data." Technology for capturing and analysing information is widely available at ever-lower price points. But many companies are taking data use to new levels, using IT to support rigorous, constant business experimentation that guides decisions and to test new products, business models, and innovations in customer experience. In some cases, the new approaches help companies make decisions in real time. This trend has the potential to drive a radical transformation in research, innovation, and marketing.
  • Using experimentation and big data as essential components of management decision making requires new capabilities, as well as organisational and cultural change. Most companies are far from accessing all the available data. Some haven't even mastered the technologies needed to capture and analyse the valuable information they can access. More commonly, they don't have the right talent and processes to design experiments and extract business value from big data, which require changes in the way many executives now make decisions: trusting instincts and experience over experimentation and rigorous analysis. To get managers at all echelons to accept the value of experimentation, senior leaders must buy into a "test and learn" mind-set and then serve as role models for their teams.
  • 6. Wiring for a sustainable world Even as regulatory frameworks continue to evolve, environmental stewardship and sustainability clearly are C-level agenda topics. What's more, sustainability is fast becoming an important corporate-performance metric—one that stakeholders, outside influencers, and even financial markets have begun to track. Information technology plays a dual role in this debate: it is both a significant source of environmental emissions and a key enabler of many strategies to mitigate environmental damage. At present, information technology's share of the world's environmental footprint is growing because of the ever-increasing demand for IT capacity and services. Electricity produced to power the world's data centers generates greenhouse gases on the scale of countries such as Argentina or the Netherlands, and these emissions could increase fourfold by 2020. McKinsey research has shown, however, that the use of IT in areas such as smart power grids, efficient buildings, and better logistics planning could eliminate five times the carbon emissions that the IT industry produces.
  • 7. Imagining anything as a service Technology now enables companies to monitor, measure, customise, and bill for asset use at a much more fine-grained level than ever before. Asset owners can therefore create services around what have traditionally been sold as products. Business-to-business (B2B) customers like these service offerings because they allow companies to purchase units of a service and to account for them as a variable cost rather than undertake large capital investments. Consumers also like this "paying only for what you use" model, which helps them avoid large expenditures, as well as the hassles of buying and maintaining a product.
  • In the IT industry, the growth of "cloud computing" (accessing computer resources provided through networks rather than running software or storing data on a local computer) exemplifies this shift. Consumer acceptance of Web-based cloud services for everything from e-mail to video is of course becoming universal, and companies are following suit. Software as a service (SaaS), which enables organisations to access services such as customer relationship management, is growing at a 17 per cent annual rate. The biotechnology company Genentech, for example, uses Google Apps for e-mail and to create documents and spreadsheets, bypassing capital investments in servers and software licenses. This development has created a wave of computing capabilities delivered as a service, including infrastructure, platform, applications, and content. And vendors are competing, with innovation and new business models, to match the needs of different customers.
  • 8. The age of the multisided business model. Multisided business models create value through interactions among multiple players rather than traditional one-on-one transactions or information exchanges. In the media industry, advertising is a classic example of how these models work. Newspapers, magazines, and television stations offer content to their audiences while generating a significant portion of their revenues from third parties: advertisers. Other revenue, often through subscriptions, comes directly from consumers. More recently, this advertising-supported model has proliferated on the Internet, underwriting Web content sites, as well as services such as search and e-mail (see trend number seven, "Imagining anything as a service," earlier in this article). It is now spreading to new markets, such as enterprise software: Spiceworks offers IT-management applications to 950,000 users at no cost, while it collects advertising from B2B companies that want access to IT professionals.
  • 9. Innovating from the bottom of the pyramid The adoption of technology is a global phenomenon, and the intensity of its usage is particularly impressive in emerging markets. Our research has shown that disruptive business models arise when technology combines with extreme market conditions, such as customer demand for very low price points, poor infrastructure, hard-to-access suppliers, and low cost curves for talent. With an economic recovery beginning to take hold in some parts of the world, high rates of growth have resumed in many developing nations, and we're seeing companies built around the new models emerging as global players. Many multinationals, meanwhile, are only starting to think about developing markets as wellsprings of technology-enabled innovation rather than as traditional manufacturing hubs.
  • 10. Producing public good on the grid The role of governments in shaping global economic policy will expand in coming years. Technology will be an important factor in this evolution by facilitating the creation of new types of public goods while helping to manage them more effectively. This last trend is broad in scope and draws upon many of the other trends described above.
Inosha Wickrama

ethical porn? - 50 views

I've seen that video recently. Anyway, some points I need to make. 1. Different countries have different ages of consent. Does that mean children mature faster in some countries and not in other...

pornography

Weiye Loh

Rationally Speaking: On Utilitarianism and Consequentialism - 0 views

  • Utilitarianism and consequentialism are different, yet closely related philosophical positions. Utilitarians are usually consequentialists, and the two views mesh in many areas, but each rests on a different claim
  • Utilitarianism's starting point is that we all attempt to seek happiness and avoid pain, and therefore our moral focus ought to center on maximizing happiness (or, human flourishing generally) and minimizing pain for the greatest number of people. This is both about what our goals should be and how to achieve them.
  • Consequentialism asserts that determining the greatest good for the greatest number of people (the utilitarian goal) is a matter of measuring outcome, and so decisions about what is moral should depend on the potential or realized costs and benefits of a moral belief or action.
  • ...17 more annotations...
  • first question we can reasonably ask is whether all moral systems are indeed focused on benefiting human happiness and decreasing pain.
  • Jeremy Bentham, the founder of utilitarianism, wrote the following in his Introduction to the Principles of Morals and Legislation: “When a man attempts to combat the principle of utility, it is with reasons drawn, without his being aware of it, from that very principle itself.”
  • Michael Sandel discusses this line of thought in his excellent book, Justice: What’s the Right Thing to Do?, and sums up Bentham’s argument as such: “All moral quarrels, properly understood, are [for Bentham] disagreements about how to apply the utilitarian principle of maximizing pleasure and minimizing pain, not about the principle itself.”
  • But Bentham’s definition of utilitarianism is perhaps too broad: are fundamentalist Christians or Muslims really utilitarians, just with different ideas about how to facilitate human flourishing?
  • one wonders whether this makes the word so all-encompassing in meaning as to render it useless.
  • Yet, even if pain and happiness are the objects of moral concern, so what? As philosopher Simon Blackburn recently pointed out, “Every moral philosopher knows that moral philosophy is functionally about reducing suffering and increasing human flourishing.” But is that the central and sole focus of all moral philosophies? Don’t moral systems vary in their core focuses?
  • Consider the observation that religious belief makes humans happier, on average
  • Secularists would rightly resist the idea that religious belief is moral if it makes people happier. They would reject the very idea because deep down, they value truth – a value that is non-negotiable. Utilitarians would assert that truth is just another utility, for people can only value truth if they take it to be beneficial to human happiness and flourishing.
  • . We might all agree that morality is “functionally about reducing suffering and increasing human flourishing,” as Blackburn says, but how do we achieve that? Consequentialism posits that we can get there by weighing the consequences of beliefs and actions as they relate to human happiness and pain. Sam Harris recently wrote: “It is true that many people believe that ‘there are non-consequentialist ways of approaching morality,’ but I think that they are wrong. In my experience, when you scratch the surface on any deontologist, you find a consequentialist just waiting to get out. For instance, I think that Kant's Categorical Imperative only qualifies as a rational standard of morality given the assumption that it will be generally beneficial (as J.S. Mill pointed out at the beginning of Utilitarianism). Ditto for religious morality.”
  • we might wonder about the elasticity of words, in this case consequentialism. Do fundamentalist Christians and Muslims count as consequentialists? Is consequentialism so empty of content that to be a consequentialist one need only think he or she is benefiting humanity in some way?
  • Harris’ argument is that one cannot adhere to a certain conception of morality without believing it is beneficial to society
  • This still seems somewhat obvious to me as a general statement about morality, but is it really the point of consequentialism? Not really. Consequentialism is much more focused than that. Consider the issue of corporal punishment in schools. Harris has stated that we would be forced to admit that corporal punishment is moral if studies showed that “subjecting children to ‘pain, violence, and public humiliation’ leads to ‘healthy emotional development and good behavior’ (i.e., it conduces to their general well-being and to the well-being of society). If it did, well then yes, I would admit that it was moral. In fact, it would appear moral to more or less everyone.” Harris is being rhetorical – he does not believe corporal punishment is moral – but the point stands.
  • An immediate pitfall of this approach is that it does not qualify corporal punishment as the best way to raise emotionally healthy children who behave well.
  • The virtue ethicists inside us would argue that we ought not to foster a society in which people beat and humiliate children, never mind the consequences. There is also a reasonable and powerful argument based on personal freedom. Don’t children have the right to be free from violence in the public classroom? Don’t children have the right not to suffer intentional harm without consent? Isn’t that part of their “moral well-being”?
  • If consequences were really at the heart of all our moral deliberations, we might live in a very different society.
  • what if economies based on slavery lead to an increase in general happiness and flourishing for their respective societies? Would we admit slavery was moral? I hope not, because we value certain ideas about human rights and freedom. Or, what if the death penalty truly deterred crime? And what if we knew everyone we killed was guilty as charged, meaning no need for The Innocence Project? I would still object, on the grounds that it is morally wrong for us to kill people, even if they have committed the crime of which they are accused. Certain things hold, no matter the consequences.
  • We all do care about increasing human happiness and flourishing, and decreasing pain and suffering, and we all do care about the consequences of our beliefs and actions. But we focus on those criteria to differing degrees, and we have differing conceptions of how to achieve the respective goals – making us perhaps utilitarians and consequentialists in part, but not in whole.
  •  
    Is everyone a utilitarian and/or consequentialist, whether or not they know it? That is what some people - from Jeremy Bentham and John Stuart Mill to Sam Harris - would have you believe. But there are good reasons to be skeptical of such claims.
Weiye Loh

Breakthrough Europe: "Coal Kills 4,000 Times More People Per Unit of Energy than Nuclear" - 0 views

  • Last year, for example, coal mining accidents killed 4,233 in China alone, while coal pollutants killed an estimated 13,200 Americans. And while you may remember a few of the 25 worst energy-related disasters of 2010, most went unnoticed by Western media and the public.
  • When you actually do the math, coal kills somewhere on the order of 4,000 times more people per unit of energy produced than nuclear power. Or to put it another way, outdoor air pollution, caused principally by the combustion of fossil fuels, kills as many people every 29 hours as will eventually die due to radiation exposure from the Chernobyl nuclear disaster, according to World Health Organization figures (Source: nuclear; air pollution).
  • Yet since coal-related deaths have a much lower profile than nuclear disasters, and because they largely occur in the conveniently far-away obscurity of the developing world, they tend to be severely underreported by the mainstream media in the West.
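
The "deaths per unit of energy" comparison above is just a ratio, and the shape of the calculation can be shown in a few lines. The figures in this sketch are illustrative placeholders chosen only to demonstrate the arithmetic; they are not values taken from the post or from the WHO.

```python
# Back-of-the-envelope deaths-per-TWh comparison.
# These figures are illustrative placeholders, not values from the article.
deaths_per_twh = {
    "coal": 100.0,      # assumed: mining accidents plus air-pollution mortality
    "nuclear": 0.025,   # assumed: accidents plus estimated long-term radiation deaths
}

ratio = deaths_per_twh["coal"] / deaths_per_twh["nuclear"]
print(f"Under these assumptions, coal is ~{ratio:,.0f}x deadlier per TWh than nuclear")
```
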
Chen Guo Lim

YouTube - Mika - Lady Jane - 0 views

shared by Chen Guo Lim on 26 Aug 09
  •  
    While I was watching this video, I suddenly had a desire to share it with my friends. Then I realised that there are serious ethics issues here. Such is the life of a NM4204 student. 1. Is it alright to video a clip of a live performance? Seeing as I have just spent a couple of hundred on a ticket, surely I am allowed to bring home some memories. Leaving uploading online aside, is the act of recording infringing on rights, seeing as it does not harm either party if the clip is stored on my device and viewed at my own time? 2. By us (me, that is to say) sharing this file with everyone in the class, have I stepped into the boundaries of infringing copyright, seeing as the asynchronous playback of this clip can constitute a public performance? In any case, enjoy this song first before you think about these. One of my favourite artists.
Weiye Loh

Science Warriors' Ego Trips - The Chronicle Review - The Chronicle of Higher Education - 0 views

  • By Carlin Romano. Standing up for science excites some intellectuals the way beautiful actresses arouse Warren Beatty, or career liberals boil the blood of Glenn Beck and Rush Limbaugh. It's visceral.
  • A brave champion of beleaguered science in the modern age of pseudoscience, this Ayn Rand protagonist sarcastically derides the benighted irrationalists and glows with a self-anointed superiority. Who wouldn't want to feel that sense of power and rightness?
  • You hear the voice regularly—along with far more sensible stuff—in the latest of a now common genre of science patriotism, Nonsense on Stilts: How to Tell Science From Bunk (University of Chicago Press), by Massimo Pigliucci, a philosophy professor at the City University of New York.
  • ...24 more annotations...
  • it mixes eminent common sense and frequent good reporting with a cocksure hubris utterly inappropriate to the practice it apotheosizes.
  • According to Pigliucci, both Freudian psychoanalysis and Marxist theory of history "are too broad, too flexible with regard to observations, to actually tell us anything interesting." (That's right—not one "interesting" thing.) The idea of intelligent design in biology "has made no progress since its last serious articulation by natural theologian William Paley in 1802," and the empirical evidence for evolution is like that for "an open-and-shut murder case."
  • Pigliucci offers more hero sandwiches spiced with derision and certainty. Media coverage of science is "characterized by allegedly serious journalists who behave like comedians." Commenting on the highly publicized Dover, Pa., court case in which U.S. District Judge John E. Jones III ruled that intelligent-design theory is not science, Pigliucci labels the need for that judgment a "bizarre" consequence of the local school board's "inane" resolution. Noting the complaint of intelligent-design advocate William Buckingham that an approved science textbook didn't give creationism a fair shake, Pigliucci writes, "This is like complaining that a textbook in astronomy is too focused on the Copernican theory of the structure of the solar system and unfairly neglects the possibility that the Flying Spaghetti Monster is really pulling each planet's strings, unseen by the deluded scientists."
  • Or is it possible that the alternate view unfairly neglected could be more like that of Harvard scientist Owen Gingerich, who contends in God's Universe (Harvard University Press, 2006) that it is partly statistical arguments—the extraordinary unlikelihood eons ago of the physical conditions necessary for self-conscious life—that support his belief in a universe "congenially designed for the existence of intelligent, self-reflective life"?
  • Even if we agree that capital "I" and "D" intelligent-design of the scriptural sort—what Gingerich himself calls "primitive scriptural literalism"—is not scientifically credible, does that make Gingerich's assertion, "I believe in intelligent design, lowercase i and lowercase d," equivalent to Flying-Spaghetti-Monsterism? Tone matters. And sarcasm is not science.
  • The problem with polemicists like Pigliucci is that a chasm has opened up between two groups that might loosely be distinguished as "philosophers of science" and "science warriors."
  • Philosophers of science, often operating under the aegis of Thomas Kuhn, recognize that science is a diverse, social enterprise that has changed over time, developed different methodologies in different subsciences, and often advanced by taking putative pseudoscience seriously, as in debunking cold fusion
  • The science warriors, by contrast, often write as if our science of the moment is isomorphic with knowledge of an objective world-in-itself—Kant be damned!—and any form of inquiry that doesn't fit the writer's criteria of proper science must be banished as "bunk." Pigliucci, typically, hasn't much sympathy for radical philosophies of science. He calls the work of Paul Feyerabend "lunacy," deems Bruno Latour "a fool," and observes that "the great pronouncements of feminist science have fallen as flat as the similarly empty utterances of supporters of intelligent design."
  • It doesn't have to be this way. The noble enterprise of submitting nonscientific knowledge claims to critical scrutiny—an activity continuous with both philosophy and science—took off in an admirable way in the late 20th century when Paul Kurtz, of the University at Buffalo, established the Committee for the Scientific Investigation of Claims of the Paranormal (Csicop) in May 1976. Csicop soon after launched the marvelous journal Skeptical Inquirer
  • Although Pigliucci himself publishes in Skeptical Inquirer, his contributions there exhibit his signature smugness. For an antidote to Pigliucci's overweening scientism 'tude, it's refreshing to consult Kurtz's curtain-raising essay, "Science and the Public," in Science Under Siege (Prometheus Books, 2009, edited by Frazier)
  • Kurtz's commandment might be stated, "Don't mock or ridicule—investigate and explain." He writes: "We attempted to make it clear that we were interested in fair and impartial inquiry, that we were not dogmatic or closed-minded, and that skepticism did not imply a priori rejection of any reasonable claim. Indeed, I insisted that our skepticism was not totalistic or nihilistic about paranormal claims."
  • Kurtz combines the ethos of both critical investigator and philosopher of science. Describing modern science as a practice in which "hypotheses and theories are based upon rigorous methods of empirical investigation, experimental confirmation, and replication," he notes: "One must be prepared to overthrow an entire theoretical framework—and this has happened often in the history of science ... skeptical doubt is an integral part of the method of science, and scientists should be prepared to question received scientific doctrines and reject them in the light of new evidence."
  • Pigliucci, alas, allows his animus against the nonscientific to pull him away from sensitive distinctions among various sciences to sloppy arguments one didn't see in such earlier works of science patriotism as Carl Sagan's The Demon-Haunted World: Science as a Candle in the Dark (Random House, 1995). Indeed, he probably sets a world record for misuse of the word "fallacy."
  • To his credit, Pigliucci at times acknowledges the nondogmatic spine of science. He concedes that "science is characterized by a fuzzy borderline with other types of inquiry that may or may not one day become sciences." Science, he admits, "actually refers to a rather heterogeneous family of activities, not to a single and universal method." He rightly warns that some pseudoscience—for example, denial of HIV-AIDS causation—is dangerous and terrible.
  • But at other points, Pigliucci ferociously attacks opponents like the most unreflective science fanatic
  • He dismisses Feyerabend's view that "science is a religion" as simply "preposterous," even though he elsewhere admits that "methodological naturalism"—the commitment of all scientists to reject "supernatural" explanations—is itself not an empirically verifiable principle or fact, but rather an almost Kantian precondition of scientific knowledge. An article of faith, some cold-eyed Feyerabend fans might say.
  • He writes, "ID is not a scientific theory at all because there is no empirical observation that can possibly contradict it. Anything we observe in nature could, in principle, be attributed to an unspecified intelligent designer who works in mysterious ways." But earlier in the book, he correctly argues against Karl Popper that susceptibility to falsification cannot be the sole criterion of science, because science also confirms. It is, in principle, possible that an empirical observation could confirm intelligent design—i.e., that magic moment when the ultimate UFO lands with representatives of the intergalactic society that planted early life here, and we accept their evidence that they did it.
  • "As long as we do not venture to make hypotheses about who the designer is and why and how she operates," he writes, "there are no empirical constraints on the 'theory' at all. Anything goes, and therefore nothing holds, because a theory that 'explains' everything really explains nothing."
  • Here, Pigliucci again mixes up what's likely or provable with what's logically possible or rational. The creation stories of traditional religions and scriptures do, in effect, offer hypotheses, or claims, about who the designer is—e.g., see the Bible.
  • Far from explaining nothing because it explains everything, such an explanation explains a lot by explaining everything. It just doesn't explain it convincingly to a scientist with other evidentiary standards.
  • A sensible person can side with scientists on what's true, but not with Pigliucci on what's rational and possible. Pigliucci occasionally recognizes that. Late in his book, he concedes that "nonscientific claims may be true and still not qualify as science." But if that's so, and we care about truth, why exalt science to the degree he does? If there's really a heaven, and science can't (yet?) detect it, so much the worse for science.
  • Pigliucci quotes a line from Aristotle: "It is the mark of an educated mind to be able to entertain a thought without accepting it." Science warriors such as Pigliucci, or Michael Ruse in his recent clash with other philosophers in these pages, should reflect on a related modern sense of "entertain." One does not entertain a guest by mocking, deriding, and abusing the guest. Similarly, one does not entertain a thought or approach to knowledge by ridiculing it.
  • Long live Skeptical Inquirer! But can we deep-six the egomania and unearned arrogance of the science patriots? As Descartes, that immortal hero of scientists and skeptics everywhere, pointed out, true skepticism, like true charity, begins at home.
  • Carlin Romano, critic at large for The Chronicle Review, teaches philosophy and media theory at the University of Pennsylvania.
  •  
    April 25, 2010 Science Warriors' Ego Trips
Weiye Loh

The Free Speech Blog: Official blog of Index on Censorship » Thank God for the Goats - 0 views

  • The US Supreme Court ruled yesterday by an 8-1 vote that the bizarre anti-gay funeral picketers belonging to the Westboro Baptist Church have a First Amendment right to free speech. Rev Fred Phelps and his crew have been waving placards with messages such as “Thank God for Dead Soldiers” and “AIDS Cures Fags” at military funerals to promote their belief that God is punishing the US for accepting homosexuality.
  • The Supreme Court decision (see below) overruled a previous award of over $10 million (reduced on appeal to $5 million) to the family of Lance Corporal Matthew Snyder in relation to a protest at his funeral.
  • First, undoubtedly debate about war, its causes and casualties is important. This was “speech” in a public place on an issue of public concern, even though the particular hypothesis is ridiculous and offensive. Free speech protection can’t, however, just be for views already presumed to be true. Secondly, protestors were scrupulous about staying within the letter of the law. They knew that they had to remain 1,000 feet from the funeral, for instance, and did not shout or otherwise disrupt the service. Preventing such orderly protests on issues of importance would have been a serious attack on civil liberties, even though the protestors displayed gross insensitivity to those mourning.
  • ...1 more annotation...
  • we should welcome this decision even though it protects bigots of limited reasoning ability about cause and effect who are indifferent to the feelings of the recently bereaved. The best response to hateful speech is surely counter-speech. At many recent military funerals, counter-protestors have arrived early in their thousands and occupied the prime spaces in the surrounding area. That is a far better reaction than a legal gagging order.
Weiye Loh

11.01.97 - Misconceptions about the causes of cancer lead to skewed priorities and wasted money, UC Berkeley researchers say - 0 views

  • "One of the big misconceptions is that artificial chemicals such as pesticides have a lot to do with human cancer, but that's just not true," says Bruce N. Ames, professor of biochemistry and molecular biology at the University of California at Berkeley and co-author of a new review of what is known about environmental pollution and cancer. "Nevertheless, it's conventional wisdom and society spends billions on this each year." "We consume more carcinogens in one cup of coffee than we get from the pesticide residues on all the fruits and vegetables we eat in a year," he adds.
  • While there may be many excellent reasons for cleaning up pollution of our air, water and soil, the researchers say, prevention of cancer is not one of them.
  • "The problem is that lifestyle changes are tough," says Gold, director of the Carcinogenic Potency Project at UC Berkeley's National Institute for Environmental Health Sciences Center and a senior scientist in the cell and molecular biology division at Lawrence Berkeley National Laboratory. "But by targeting pesticide residues as a major problem, we risk making fruits and vegetables more expensive and indirectly increasing cancer risks, especially among the poor."
  • ...10 more annotations...
  • Whereas 99.9 percent of all the chemicals we ingest are natural, 78 percent of the chemicals tested are synthetic. So when more than half of all synthetic chemicals are found to cause cancer in rodents, it's not surprising that people link cancer with synthetic chemicals. But of the natural chemicals in our diet that have been tested in animals, half also cause cancer, Gold says.
  • "We need to recognize that there are far more carcinogens in the natural world than in the synthetic world, and go after the important things, such as lifestyle change."
  • Misconception: Cancer rates are soaring. In fact, the researchers say, if lung cancer due to smoking is excluded, overall cancer deaths in the U.S. have declined 16 percent since 1950.
  • Misconception: Reducing pesticide residues is an effective way to prevent diet-related cancer. Because fruits and vegetables are of major importance in reducing cancer, the unintended effect of requiring expensive efforts to reduce the amount of pesticides remaining on fruits and vegetables will be to increase their cost. This will lead to an increase in cancer among low income people who no longer will be able to afford to eat them.
  • Misconception: Human exposures to carcinogens and other potential hazards are primarily due to synthetic chemicals. Americans actually eat about 10,000 times more natural pesticides from fruits and vegetables than synthetic pesticide residues on food. Natural pesticides are chemicals that plants produce to defend themselves against fungi, insects, and other predators. And half of all natural pesticides tested in rodents turn out to be rodent carcinogens. In addition, we consume many other carcinogens in foods because of the chemicals produced in cooking. In a single cup of roasted coffee, for example, the natural chemicals known to be rodent carcinogens are about equal in weight to an entire year's worth of synthetic pesticide residues.
  • Misconception: Cancer risks to humans can be assessed by standard high-dose animal cancer tests. In cancer tests, animals are given very high, nearly toxic doses. The effect on humans at lower doses is extrapolated from these results, as if the relationship were a straight line from high dose to low dose (a toy calculation illustrating this kind of extrapolation appears at the end of this item). However, the fact that half of all chemicals tested, whether natural or synthetic, turn out to cause cancer in rodents implies that this is an artifact of using high doses. High doses of any chemical can chronically kill cells and wound tissue, a risk factor for cancer. "Our conclusion is that the scientific evidence shows that there are high-dose effects," Ames says. "But even though government regulatory agencies recognize this, they still decide which synthetic chemicals to regulate based on linear extrapolation of high-dose cancer tests in animals."
  • Misconception: Synthetic chemicals pose greater carcinogenic hazards than natural chemicals. Naturally occurring carcinogens represent an enormous background compared to the low-dose exposures to residues of synthetic chemicals such as pesticides, the researchers conclude. These results call for a reevaluation of whether animal cancer tests are really useful guides for protecting the public against minor hypothetical risks.
  • Misconception: The toxicology of synthetic chemicals is different from that of natural chemicals. No evidence exists for this, but the assumption could lead to unfortunate tradeoffs between natural and synthetic pesticides. Recently, for example, when a new variety of highly insect-resistant celery was introduced on a farm, the workers handling the celery developed rashes when they were exposed to sunlight. The pest-resistant celery turned out to contain almost eight times more natural pesticide in the form of psoralens -- chemicals known to cause cancer and genetic mutations -- than common celery.
  • Misconception: Pesticides and other synthetic chemicals are disrupting human hormones. Claims that synthetic chemicals with hormonal activity contribute to cancer and reduced sperm count ignore the fact that natural chemicals have hormone-like activity millions of times greater than do traces of synthetic chemicals. Rather, lifestyle factors -- lack of exercise, obesity, alcohol use and reproductive history -- are known to lead to marked changes in hormone levels in the body.
  • Misconception: Regulating low, hypothetical risks advances public health. Society -- primarily the private sector -- will spend an estimated $140 billion to comply with environmental regulations this year, according to projections by the Environmental Protection Agency. Much of this is aimed at reducing low-level human exposure to chemicals solely because they are rodent carcinogens, despite the fact that this rationale is flawed. Our improved ability to detect even minuscule concentrations of chemicals makes regulation even more expensive.
  •  
    BERKELEY -- Despite a lack of convincing evidence that pollution is an important cause of human cancer, this misconception drives government policy today and results in billions of dollars spent to clean up minuscule amounts of synthetic chemicals, say two UC Berkeley researchers.
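    The "linear extrapolation" criticized in the annotations above can be made concrete with a toy calculation. The Python sketch below is purely illustrative and uses made-up numbers (none of the doses or risks come from the UC Berkeley researchers): it contrasts a straight-line, no-threshold extrapolation from a hypothetical high-dose rodent result with a threshold model under which doses below some cutoff carry no excess risk, the alternative reading Ames and Gold argue the high-dose data support.

        # Illustrative sketch (not from the article): linear "no-threshold"
        # extrapolation of a high-dose rodent result down to a low human dose,
        # versus a threshold model with no excess risk below a cutoff.
        # All numbers are hypothetical.

        def linear_no_threshold(dose, high_dose, risk_at_high_dose):
            """Risk assumed to scale as a straight line through the high-dose point."""
            return risk_at_high_dose * (dose / high_dose)

        def threshold_model(dose, threshold, high_dose, risk_at_high_dose):
            """No excess risk at or below the threshold; linear above it."""
            if dose <= threshold:
                return 0.0
            return risk_at_high_dose * (dose - threshold) / (high_dose - threshold)

        HIGH_DOSE = 100.0        # mg/kg/day, a near-toxic rodent dose (hypothetical)
        RISK_AT_HIGH_DOSE = 0.5  # excess tumour risk observed at that dose (hypothetical)
        human_dose = 0.001       # mg/kg/day, a far lower human exposure (hypothetical)

        print(linear_no_threshold(human_dose, HIGH_DOSE, RISK_AT_HIGH_DOSE))   # 5e-06: small but nonzero
        print(threshold_model(human_dose, 1.0, HIGH_DOSE, RISK_AT_HIGH_DOSE))  # 0.0: no predicted risk

    The point of the contrast is only that the low-dose risk estimate is driven by the extrapolation assumption, not by anything measured at low doses.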
Weiye Loh

DenialDepot: A word of caution to the BEST project team - 0 views

  • 1) Any errors, however inconsequential, will be taken Very Seriously and accusations of fraud will be made.
  • 2) If you adjust the raw data we will accuse you of fraudulently fiddling the figures whilst cooking the books.
  • 3) If you don't adjust the raw data we will accuse you of fraudulently failing to account for station biases and UHI.
  • 7) By all means publish all your source code, but we will still accuse you of hiding the methodology for your adjustments.
  • ...10 more annotations...
  • 8) If you publish results to your website and errors are found, we will accuse you of a Very Serious Error regardless of severity (see point #1) and bemoan the press release you made about your results even though you won't remember making any press release about your results.
  • 9) With regard to point #8 above, at extra cost and time to yourself you must employ someone to thoroughly check each monthly update before it is published online, even if this delays publication of the results till the end of the month. You might be surprised at this because no-one actually relies on such freshly published data anyway and aren't the many eyes of blog audit better than a single pair of eyes? Well that's irrelevant. See points #1 and #8.
  • 10) If you don't publish results promptly at the start of the month on the public website, but instead, say, publish the results to a private site for checks to be performed before release, we will accuse you of engaging in unscientific-like secrecy and massaging the data behind closed doors.
  • 14) If any region/station shows a warming trend that doesn't match the raw data, and we can't understand why, we will accuse you of fraud and dismiss the entire record. Don't expect us to have to read anything to understand results.
  • 15) You must provide all input datasets on your website. It's no good referencing NOAA's site and saying they "own" the GHCN data for example. I don't want their GHCN raw temperatures file, I want the one on your hard drive which you used for the analysis, even if you claim they are the same. If you don't do this we will accuse you of hiding the data and preventing us checking your results.
  • 24. In the event that you comply with all of the above, we will point out that a mere hundred-odd years of data is irrelevant next to the 4.5 billion year history of Earth. So why do you even bother?
  • 23) In the unlikely event that I haven't wasted enough of your time forcing you to comply with the above rules, I also demand to see all emails you have sent or will send during the period 1950 to 2050 that contain any of these keywords
  • 22) We don't need any scrutiny because our role isn't important.
  • 17) We will treat your record as if no alternative exists. As if your record is the make or break of Something Really Important (see point #1) and we just can't check the results in any other way.
  • 16) You are to blame for any station data your team uses. If we find out that a station you use is next to an AC Unit, we will conclude you personally planted the thermometer there to deliberately get warming.
  • an article today by Roger Pielke Nr. (no relation) that posited the fascinating concept that thermometers are just as capricious and unreliable proxies for temperature as tree rings. In fact probably more so, and re-computing global temperature by bristlecone pines would reveal the true trend of global cooling, which will be in all our best interests and definitely NOT just those of well-paying corporate entities.
  •  
    Dear Professor Muller and Team, If you want your Berkeley Earth Surface Temperature project to succeed and become the center of attention you need to learn from the vast number of mistakes Hansen and Jones have made with their temperature records. To aid this task I created a point-by-point list for you.
Weiye Loh

TODAYonline | Commentary | Trust us, we're academics ... or should you? - 0 views

  • the 2011 Edelman Trust Barometer, published by research firm StrategyOne, which surveyed 5,075 "informed publics" in 23 countries on their trust in business, government, institutions and individuals. One of the questions asked of respondents was: "If you heard information about a company from one of these people, how credible would that information be?". Of the eight groups of individuals - academic/expert, technical expert in company, financial/industry analyst, CEO, non-governmental organisation representative, government official, person like myself, and regular employee - academic/expert came out tops with a score of 70 per cent, followed by technical expert at 64 per cent.
  • the film on the global financial crisis Inside Job, which won the 2011 Academy Award for best documentary. One of the documentary's themes is the role a number of renowned academics, particularly academic economists, played in the global crisis. It highlighted potentially serious conflicts of interests related to significant compensation derived by these academics serving on boards of financial services firms and advising such firms.
  • Often, these academics also played key roles in shaping government policies relating to deregulation - most appear allergic to regulation of the financial services industry. The documentary argued that these academics from Ivy League universities had basically become advocates for financial services firms, which blinded them to firms' excesses. It noted that few academic economists saw the financial crisis coming, and suggested this might be because they were too busy making money from the industry.
  • ...12 more annotations...
  • It is difficult to say if the "failure" of the academics was due to an unstinting belief in free markets or conflicts of interest. Parts of the movie did appear to be trying too hard to prove the point. However, the threat posed by academics earning consulting fees that dwarf their academic compensation, and which might therefore impair their independence, is a real one.
  • One of the worst was the Ivy League university economics professor engaged by the Icelandic Chamber of Commerce to co-author a report on the Icelandic financial system. He concluded that the system was sound even though there were numerous warning signs. When he was asked how he arrived at his conclusions, he said he had talked to people and was misled by them. One wonders how many of his conclusions were actually based on rigorous analysis.
  • it is troubling if academics merely become mouthpieces for vested interests. The impression one gets from watching the movie certainly does not fit with the high level of trust in academics shown by the Edelman Trust Barometer.
  • As an academic, I have often been told that I can be independent and objective - that I should have no axe to grind and no wheels to grease. However, I worry about an erosion of trust in academics. This may be especially true in certain disciplines like business (which is mine, incidentally).
  • too many business school professors serve on US corporate boards and have lost their willingness to be critical of unethical business practices. In corporate scandals such as Enron and Satyam, academics from top business schools have not particularly covered themselves in glory.
  • It is more and more common for universities - in the US and here - to invite business people to serve on their boards.
  • universities and academics may lose their independence and objectivity in commenting on business issues critically, for fear of offending those who ultimately have an oversight role over the varsity's senior management.
  • Universities might also have business leaders serving on boards as potential donors, which would also confuse the role of board members and lead to conflicts of interest. In the Satyam scandal in India, the founder of Satyam sat on the board of the Indian School of Business, while the Dean of the Indian School of Business sat on Satyam's board. Satyam also made a significant donation to the Indian School of Business.
  • Universities are increasingly dependent on funding from industry and wealthy individuals as well as other sources, sometimes even dubious ones. The recent scandal at the London School of Economics involving its affiliation with Libya is an example.
  • It is important for universities to have robust gift policies as part of the risk management to protect their reputation, which can be easily tainted if a donation comes from a questionable source. It is especially important that donations do not cause universities to be captured by vested interests.
  • From time to time, people in industry ask me if I have been pressured by the university to tone down on my outspokenness on corporate governance issues. Thankfully, while there have been instances where varsity colleagues and friends in industry have conveyed messages from others to "tone down", I have felt relatively free to express my views. Of course, were I trying to earn more money from external consulting, I guess I would be less vocal.
  • I do worry about the loss of independence and, therefore, trust in academics and academic institutions if we are not careful about it.
Weiye Loh

Cadbury's Naomi Campbell ad not racist, rules watchdog | Media | guardian.co.uk - 0 views

  • The press ad for Cadbury's Bliss range of Dairy Milk chocolate – which ran with the strapline "move over Naomi, there's a new diva in town" – provoked outrage from the supermodel as well as campaigning group Operation Black Vote. Campbell said she was shocked by the ad, while her mother Valerie said she was "deeply upset by this racist advert". Cadbury initially defended the campaign, saying it was intended as a tongue-in-cheek play on her reputation for diva-style tantrums and had nothing to do with her skin colour. However, after taking legal advice, Cadbury withdrew the campaign and made a public apology on its corporate website.
  • The complainants objected that the ad was racially offensive because it compared a black woman to a bar of chocolate. However, the ASA council said that the ad was "likely to be understood to refer to Naomi Campbell's reputation for 'diva-style' behaviour rather than her race". "On this basis the council decided that the ad was unlikely to be seen as racist or to cause serious or widespread offence," the ASA added.
  •  
    The advertising watchdog has thrown out complaints accusing an ad by Cadbury of racism for comparing model Naomi Campbell to a bar of chocolate. This decision follows an assessment by the council of the Advertising Standards Authority on whether to launch an investigation to see if the press campaign is in breach of the advertising code relating to racism.
Weiye Loh

Ian Burrell: 'Hackgate' is a story that refuses to go away - Commentators, Opinion - The Independent - 0 views

  • Mr Murdoch's close henchman Les Hinton assured MPs that the affair had been dealt with and when, two years later, Mr Coulson – by now director of communications for David Cameron – appeared before a renewed parliamentary inquiry he seemed confident of being fireproof. "We did not use subterfuge of any kind unless there was a clear public interest in doing so," he told MPs. When Scotland Yard concluded that, despite more allegations of hacking, there was nothing new to investigate, Wapping and Mr Coulson must again have concluded the affair was over.
  • But after an election campaign in which the Conservatives were roundly supported by Mr Murdoch's papers, a succession of further claimants against the News of the World has come forward. Sienna Miller, among others, seems determined to take her case to court, compelling Mulcaire to reveal his handlers and naming in court documents Ian Edmondson, once one of Coulson's executives. Mr Edmondson is now suspended. But the story is unlikely to end there.
  • When Rupert Murdoch came to England last October to deliver a lecture, there were some in the audience who raised eyebrows when the media mogul broke off from a paean to Baroness Thatcher to say of his journalists: "We will vigorously pursue the truth – and we will not tolerate wrongdoing." The latter comment seemed to refer to the long-running phone-hacking scandal involving the News of the World, the tabloid he has owned for 41 years. Mr Murdoch's executives at his British headquarters in Wapping, east London, tried to draw a veil over the paper's own dirty secrets in 2007 and had no doubt assured him that the matter was history. Yet here was the boss, four years later, having to vouch for his organisation's honesty.
  •  
    The news agenda changes fast in tabloid journalism but Hackgate has been a story that refuses to go away. When the private investigator Glenn Mulcaire and the News of the World journalist Clive Goodman were jailed for conspiring to intercept the voicemails of members of the royal household, Wapping quickly closed ranks. The editor Andy Coulson was obliged to fall on his sword - while denying knowledge of illegality - and Goodman was condemned as a rogue operator.
Weiye Loh

Rationally Speaking: Liberal Democracy's Constant Tension: The Openness of Debate - 0 views

  • These questions essentially get at the issue of openness of debate. Openness includes at least two aspects, which are inevitably closely related, indeed hard to separate: who (or, whose ideas) can enter the debate; and how long should debate last before it ends (and people move to the next topic, or act on conclusions from the past debate).
  • The first issue would seem an easy one: no person, nor any person’s ideas, can be barred from debate.
  • John Stuart Mill's On Liberty, which is a cornerstone work of the modern liberal society: "If all mankind minus one were of one opinion, and only one person were of the contrary opinion, mankind would be no more justified in silencing that one person, than he, if he had the power, would be justified in silencing mankind. … To refuse a hearing to an opinion, because they are sure that it is false, is to assume that their certainty is the same thing as absolute certainty. All silencing of discussion is an assumption of infallibility. …" (Mill, 23, 28).
  • ...3 more annotations...
  • how long should political liberals have let debate last? Here we reach another issue to parse, that of dividing the spheres of discourse of politics and society.The political sphere includes lawmakers, who have their name for a reason: they make laws. They cannot sit around and debate endlessly. They must, at some point, push legislation through (which is at the center of the debate over filibuster reform).
  • take note of another passage from Mill: “It is the duty of governments, and of individuals, to form the truest opinions they can; to form them carefully, and never impose them on others unless they are quite sure of being right. But when they are sure … it is not conscientiousness but cowardice to shrink from acting on their opinions. ... Men, and governments, must act to the best of their ability. There is no such thing as absolute certainty, but there is assurance sufficient for the purposes of human life." (Mill, 25-26, emphasis added).
  • Yet our division of spheres of discourse means passage of a bill – or even defeat – does not mark the end of debate. Indeed, many Americans continued to discuss the merits of the legislation, with some even filing lawsuits arguing it was unconstitutional (I think these stand little chance of going anywhere). American society at large can and will continue to have the conversation about health insurance reform. Then, in the next election, they will bring their beliefs to the polls. They will expect those voted in to act. And then, the conversation will continue. Politics is a continuous process. By dividing up spheres of discourse into political and societal, we see that debate never really ends – it’s just that sometimes lawmakers need to get on with their job, and leave debate to the public.
  •  
    TUESDAY, JULY 06, 2010 Liberal Democracy's Constant Tension: The Openness of Debate
Weiye Loh

nanopolitan: Medicine, Trials, Conflict of Interest, Disclosures - 0 views

  • Some 1500 documents revealed in litigation provide unprecedented insights into how pharmaceutical companies promote drugs, including the use of vendors to produce ghostwritten manuscripts and place them into medical journals.
  • Dozens of ghostwritten reviews and commentaries published in medical journals and supplements were used to promote unproven benefits and downplay harms of menopausal hormone therapy (HT), and to cast raloxifene and other competing therapies in a negative light.
  • the pharmaceutical company Wyeth used ghostwritten articles to mitigate the perceived risks of breast cancer associated with HT, to defend the unsupported cardiovascular “benefits” of HT, and to promote off-label, unproven uses of HT such as the prevention of dementia, Parkinson's disease, vision problems, and wrinkles.
  • ...7 more annotations...
  • Given the growing evidence that ghostwriting has been used to promote HT and other highly promoted drugs, the medical profession must take steps to ensure that prescribers renounce participation in ghostwriting, and to ensure that unscrupulous relationships between industry and academia are avoided rather than courted.
  • Twenty-five out of 32 highly paid consultants to medical device companies in 2007, or their publishers, failed to reveal the financial connections in journal articles the following year, according to a [recent] study.
  • The study compared major payments to consultants by orthopedic device companies with financial disclosures the consultants later made in medical journal articles, and found them lacking in public transparency. “We found a massive, dramatic system failure,” said David J. Rothman, a professor and president of the Institute on Medicine as a Profession at Columbia University, who wrote the study with two other Columbia researchers, Susan Chimonas and Zachary Frosch.
  • Carl Elliott in The Chronicle of Higher Education: The Secret Lives of Big Pharma's 'Thought Leaders':
  • See also a related NYTimes report -- Menopause, as Brought to You by Big Pharma by Natasha Singer and Duff Wilson -- from December 2009. Duff Wilson reports in the NYTimes: Medical Industry Ties Often Undisclosed in Journals:
  • Pharmaceutical companies hire KOL's [Key Opinion Leaders] to consult for them, to give lectures, to conduct clinical trials, and occasionally to make presentations on their behalf at regulatory meetings or hearings.
  • KOL's do not exactly endorse drugs, at least not in ways that are too obvious, but their opinions can be used to market them—sometimes by word of mouth, but more often by quasi-academic activities, such as grand-rounds lectures, sponsored symposia, or articles in medical journals (which may be ghostwritten by hired medical writers). While pharmaceutical companies seek out high-status KOL's with impressive academic appointments, status is only one determinant of a KOL's influence. Just as important is the fact that a KOL is, at least in theory, independent. [...]
  •  
    Medicine, Trials, Conflict of Interest, Disclosures Just a bunch of links -- mostly from the US -- that paint a troubling picture of the state of ethics in biomedical fields:
Weiye Loh

After Wakefield: Undoing a decade of damaging debate « Skepticism « Critical Thinking « Skeptic North - 0 views

  • Mass vaccination completely eradicated smallpox, which had been killing one in seven children. Public health campaigns have also eliminated diphtheria, and reduced the incidence of pertussis, tetanus, measles, rubella and mumps to near zero.
  • when vaccination rates drop, diseases can reemerge in the population. Measles is currently endemic in the United Kingdom, after vaccination rates dropped below 80%. When diphtheria immunization dropped in Russia and Ukraine in the early 1990s, there were over 100,000 cases with 1,200 deaths. In Nigeria in 2001, unfounded fears of the polio vaccine led to a drop in vaccinations, a re-emergence of infection, and the spread of polio to ten other countries.
  • one reason for these drops in vaccination rates that has experienced a dramatic upsurge over the past decade or so has been the fear that vaccines cause autism. The connection between autism and vaccines, in particular the measles, mumps, rubella (MMR) vaccine, has its roots in a paper published by Andrew Wakefield in 1998 in the medical journal The Lancet. This link has already been completely and thoroughly debunked – there is no evidence to substantiate this connection. But over the past two weeks, the full extent of the deception propagated by Wakefield was revealed. The British Medical Journal has a series of articles from journalist Brian Deer (part 1, part 2), who spent years digging into the facts behind Wakefield, his research, and the Lancet paper.
  • ...3 more annotations...
  • Wakefield’s original paper (now retracted) attempted to link gastrointestinal symptoms and regressive autism in 12 children to the administration of the MMR vaccine. Last year Wakefield was stripped of his medical license for unethical behaviour, including undeclared conflicts of interest.  The most recent revelations demonstrate that it wasn’t just sloppy research – it was fraud.
  • Unbelievably, some groups still hold Wakefield up as some sort of martyr, but now we have the facts: Three of the 9 children said to have autism didn’t have autism at all. The paper claimed all 12 children were normal, before administration of the vaccine. In fact, 5 had developmental delays that were detected prior to the administration of the vaccine. Behavioural symptoms in some children were claimed in the paper as being closely related to the vaccine administration, but documentation showed otherwise. What were initially determined to be “unremarkable” colon pathology reports were changed to “non-specific colitis” after a secondary review. Parents were recruited for the “study” by anti-vaccinationists. The study was designed and funded to support future litigation.
  • As Dr. Paul Offit has been quoted as saying, you can’t unring a bell. So what’s going to stop this bell from ringing? Perhaps an awareness of its fraudulent basis will do more to change perceptions than a decade of scientific investigation has been able to achieve. For the sake of population health, we hope so.
Weiye Loh

FT.com / Business education / Soapbox - Popular fads replace relevant teaching - 0 views

  • There is a great divide in business schools, one that few outsiders are aware of. It is the divide between research and teaching. There is little relation between them. What is being taught in management books and classrooms is usually not based on rigorous research and vice-versa; the research published in prestigious academic journals seldom finds its way into the MBA classroom.
  • Since none of this research is really intended to be used in the classroom, or to be communicated to managers in some other form, it is not suited to serve that purpose. The goal is publication in a prestigious academic journal, but that does not make it useful or even offer a guarantee that the research findings provide much insight into the workings of business reality.
  • This is not a new problem. In 1994, Don Hambrick, then the president of the Academy of Management, said: "We read each other's papers in our journals and write our own papers so that we may, in turn, have an audience . . . an incestuous, closed loop". Management research is not required to be relevant. Consequently much of it is not.
  • ...6 more annotations...
  • But business education clearly also suffers. What is being taught in management courses is usually not based on solid scientific evidence. Instead, it concerns the generalisation of individual business cases or the lessons from popular management books. Such books often are based on the appealing formula that they look at several successful companies, see what they have in common and conclude that other companies should strive to do the same thing.
  • how do you know that the advice provided is reasonable, or if it comes from tomorrow’s Enrons, RBSs, Lehmans and WorldComs? How do you know that today’s advice and cases will not later be heralded as the epitome of mismanagement?
  • In the 1990s, ISO9000 (a quality management systems standard) spread through many industries. But research by professors Mary Benner and Mike Tushman showed that its adoption could, in time, lead to a fall in innovation (because ISO9000 does not allow for deviations from a set standard, which innovation requires), making the adopter worse off. This research was overlooked by practitioners, many business schools continued to applaud the benefits of ISO9000 in their courses, while firms continued – and still do – to implement the practice, ignorant of its potential pitfalls. Yet this research offers a clear example of the possible benefits of scientific research methods: rigorous research that reveals unintended consequences to expose the true nature of a business practice.
  • such research with important practical implications unfortunately is the exception rather than the rule. Moreover, even relevant research is largely ignored in business education – as happened to the findings by Benner and Tushman.
  • Of course one should not make the mistake that business cases and business books based on personal observation and opinion are without value. They potentially offer a great source of practical experience. Similarly, it would be naive to assume that scientific research can provide custom-made answers. Rigorous management research could and should provide the basis for skilled managers to make better decisions. However, they cannot do that without the in-depth knowledge of their specific organisation and circumstances.
  • at present, business schools largely fail in providing rigorous, evidence-based teaching.
Weiye Loh

Breakthrough Europe: Towards a Social Theory of Climate Change - 0 views

  • Lever-Tracy confronted sociologists head on about their worrisome silence on the issue. Why have sociologists failed to address the greatest and most overwhelming challenge facing modern society? Why have the figureheads of the discipline, such as Anthony Giddens and Ulrich Beck, so far refused to apply their seminal notions of structuration and the risk society to the issue?
  • Earlier, we re-published an important contribution by Ulrich Beck, the world-renowned German sociologist and a Breakthrough Senior Fellow. More recently, Current Sociology published a powerful response by Reiner Grundmann of Aston University and Nico Stehr of Zeppelin University.
  • sociologists should not rush into the discursive arena without asking some critical questions in advance, questions such as: What exactly could sociology contribute to the debate? And, is there something we urgently need that is not addressed by other disciplines or by political proposals?
  • ...12 more annotations...
  • The authors disagree with Lever-Tracy's observation that the lack of interest in climate change among sociologists is driven by a widespread suspicion of naturalistic explanations, teleological arguments and environmental determinism.
  • While conceding that Lever-Tracy's observation may be partially true, the authors argue that more important processes are at play, including cautiousness on the part of sociologists to step into a heavily politicized debate; methodological differences with the natural sciences; and sensitivity about locating climate change in the longue durée.
  • Secondly, while Lever-Tracy argues that "natural and social change are now in lockstep with each other, operating on the same scales," and that therefore a multidisciplinary approach is needed, Grundmann and Stehr suggest that the true challenge is interdisciplinarity, as opposed to multidisciplinarity.
  • Thirdly, and this is possibly the most striking observation of the article, Grundmann and Stehr challenge Lever-Tracy's argument that natural scientists have successfully made the case for anthropogenic climate change, and that therefore social scientists should cease to endlessly question this scientific consensus on the basis of a skeptical postmodern 'deconstructionism'.
  • As opposed to both Lever-Tracy's positivist view and the radical postmodern deconstructionist view, Grundmann and Stehr take the social constructivist view, which argues that every idea is socially constructed and therefore the product of human interpretation and communication. This raises the 'intractable' specters of discourse and framing, to which we will return in a second.
  • Finally, Lever-Tracy holds that climate change needs to be posited "firmly at the heart of the discipline." Grundmann and Stehr, however, emphasize that "if this is going to [be] more than wishful thinking, we need to carefully consider the prospects of such an enterprise."
  • The importance of framing climate change in a way that allows it to resonate with the concerns of the average citizen is an issue that the Breakthrough Institute has long emphasized. Especially the apocalyptic politics of fear that is often associated with climate change tends to have a counterproductive effect on public opinion. Realizing this, Grundmann and Stehr make an important warning to sociologists: "the inherent alarmism in many social science contributions on climate change merely repeats the central message provided by mainstream media." In other words, it fails to provide the kind of distantiated observation needed to approach the issue with at least a mild degree of objectivity or impartiality.
  • While this tension is symptomatic of many social scientific attempts to get involved, we propose to study these very underlying assumptions. For example, we should ask: Does the dramatization of events lead to effective political responses? Do we need a politics of fear? Is scientific consensus instrumental for sound policies? And more generally, what are the relations between a changing technological infrastructure, social shifts and belief systems? What contribution can bottom-up initiatives have in fighting climate change? What roles are there for markets, hierarchies and voluntary action? How was it possible that the 'fight against climate change' rose from a marginal discourse to a hegemonic one (from heresy to dogma)? And will the discourse remain hegemonic or will too much pub¬lic debate about climate change lead to 'climate change fatigue'?
  • In this respect, Grundmann and Stehr make another crucial observation: "the severity of a problem does not mean that we as sociologists should forget about our analytical apparatus." Bringing the analytical apparatus of sociology back in, the hunting season for positivist approaches to knowledge and nature is opened. Grundmann and Stehr consequently criticize not only Lever-Tracy's unspoken adherence to a positivist nature-society duality, taking instead a more dialectical Marxian approach to the relationship between man and his environment, but they also criticize her idea that incremental increases in our scientific knowledge of climate change and its impacts will automatically coalesce into successful and meaningful policy responses.
  • Political decisions about climate change are made on the basis of scientific research and a host of other (economic, political, cultural) considerations. Regarding the scientific dimension, it is a common perception (one that Lever-Tracy seems to share) that the more knowledge we have, the better the political response will be. This is the assumption of the linear model of policy-making that has been dominant in the past but debunked time and again (Godin, 2006). What we increasingly realize is that knowledge creation leads to an excess of information and 'objectivity' (Sarewitz, 2000). Even the consensual mechanisms of the IPCC lead to an increase in options because knowledge about climate change increases.
  • Instead, Grundmann and Stehr propose to look carefully at how we frame climate change socially and whether the hegemonic climate discourse is actually contributing to successful political action or hampering it. Defending this social constructivist approach from the unfounded allegation that it would play into the hands of the climate skeptics, the authors note that defining climate change as a social construction ... is not to diminish its importance, relevance, or reality. It simply means that sociologists study the process whereby something (like anthropogenic climate change) is transformed from a conjecture into an accepted fact. With regard to policy, we observe a near exclusive focus on carbon dioxide emissions. This framing has proven counterproductive, as the Hartwell paper and other sources demonstrate (see Eastin et al., 2010; Prins et al., 2010). Reducing carbon emissions in the short term is among the most difficult tasks. More progress could be made by a re-framing of the issue, not as an issue of human sinfulness, but of human dignity. [emphasis added]
  • These observations allow the authors to come full circle, arriving right back at their first observation about the real reasons why sociologists have so far kept silent on climate change. Somehow, "there seems to be the curious conviction that lest you want to be accused of helping the fossil fuel lobbies and the climate skeptics, you better keep quiet."
  •  
    Towards a Social Theory of Climate Change