
Home/ New Media Ethics 2009 course/ Group items tagged Efficiency


Weiye Loh

Is 'More Efficient' Always Better? - NYTimes.com - 1 views

  • Efficiency is the seemingly value-free standard economists use when they make the case for particular policies — say, free trade, more liberal immigration policies, cap-and-trade policies on environmental pollution, the all-volunteer army or congestion tolls. The concept of efficiency is used to justify a reliance on free-market principles, rather than the government, to organize the health care sector, or to make recommendations on taxation, government spending and monetary policy. All of these public policies have one thing in common: They create winners and losers among members of society.
  • can it be said that a more efficient resource allocation is better than a less efficient one, given the changes in the distribution of welfare among members of society that these allocations imply?
  • Suppose a restructuring of the economy has the effect of increasing the growth of average gross domestic product per capita, but that the benefits of that growth accrue disproportionately to a minority of citizens, while others are worse off as a result, as appears to have been the case in the United States in the last several decades. Can economists judge this to be a good thing?
  • ...1 more annotation...
  • Indeed, how useful is efficiency as a normative guide to public policy? Can economists legitimately base their advocacy of particular policies on that criterion? That advocacy, especially when supported by mathematical notation and complex graphs, may look like economic science. But when greater efficiency is accompanied by a redistribution of economic privilege in society, subjective ethical dimensions inevitably get baked into the economist’s recommendations.
Weiye Loh

The Breakthrough Institute: New Report: How Efficiency Can Increase Energy Consumption - 0 views

  • There is a large expert consensus and strong evidence that below-cost energy efficiency measures drive a rebound in energy consumption that erodes much and in some cases all of the expected energy savings, concludes a new report by the Breakthrough Institute. "Energy Emergence: Rebound and Backfire as Emergent Phenomena" covers over 96 published journal articles and is one of the largest reviews of the peer-reviewed journal literature to date. (Readers in a hurry can download Breakthrough's PowerPoint demonstration here or download the full paper here.)
  • In a statement accompanying the report, Breakthrough Institute founders Ted Nordhaus and Michael Shellenberger wrote, "Below-cost energy efficiency is critical for economic growth and should thus be aggressively pursued by governments and firms. However, it should no longer be considered a direct and easy way to reduce energy consumption or greenhouse gas emissions." The lead author of the new report is Jesse Jenkins, Breakthrough's Director of Energy and Climate Policy; Nordhaus and Shellenberger are co-authors.
  • The findings of the new report are significant because governments have in recent years relied heavily on energy efficiency measures as a means to cut greenhouse gases. "I think we have to have a strong push toward energy efficiency," said President Obama recently. "We know that's the low-hanging fruit, we can save as much as 30 percent of our current energy usage without changing our quality of life." While there is robust evidence for rebound in academic peer-reviewed journals, it has largely been ignored by major analyses, including the widely cited 2009 McKinsey and Co. study on the cost of reducing greenhouse gases.
  • ...2 more annotations...
  • The idea that increased energy efficiency can increase energy consumption at the macro-economic level strikes many as a new idea, or paradoxical, but it was first observed in 1865 by British economist William Stanley Jevons, who pointed out that Watt's more efficient steam engine and other technical improvements that increased the efficiency of coal consumption actually increased rather than decreased demand for coal. More efficient engines, Jevons argued, would increase future coal consumption by lowering the effective price of energy, thus spurring greater demand and opening up useful and profitable new ways to utilize coal. Jevons was proven right, and the reality of what is today known as "Jevons Paradox" has long been uncontroversial among economists.
  • Economists have long observed that increasing the productivity of any single factor of production -- whether labor, capital, or energy -- increases demand for all of those factors. This is one of the basic dynamics of economic growth. Luddites who feared there would be fewer jobs with the emergence of weaving looms were proved wrong by the lower price of woven clothing and by demand that skyrocketed (and has continued to increase) ever since. And today, no economist would posit that an X% improvement in labor productivity would lead directly to an X% reduction in employment. In fact, the opposite is widely expected: labor productivity is a chief driver of economic growth and thus increases in employment overall. There is no evidence, the report points out, that energy is any different, as per capita energy consumption everywhere on earth continues to rise, even as economies become more efficient each year.
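The price-mediated rebound mechanism Jevons described can be sketched with a toy constant-elasticity demand model. This is an illustrative assumption for exposition, not the methodology of the Breakthrough report: when efficiency improves by some factor, the effective price of the energy service falls, demand for the service rises, and actual energy use depends on how elastic that demand is.

```python
def energy_after_efficiency_gain(baseline_energy, gain, elasticity):
    """Energy use after efficiency improves by a factor `gain`.

    The effective price of the energy service falls by 1/gain, so
    service demand rises by gain**elasticity (constant price
    elasticity of magnitude `elasticity`); energy used is service
    demand divided by efficiency. A naive engineering estimate
    assumes elasticity == 0 (demand unchanged).
    """
    return baseline_energy * gain ** (elasticity - 1.0)

e0 = 100.0  # arbitrary baseline energy units
print(energy_after_efficiency_gain(e0, 2.0, 0.0))  # 50.0 -- full engineering savings
print(energy_after_efficiency_gain(e0, 2.0, 0.5))  # ~70.7 -- rebound erodes savings
print(energy_after_efficiency_gain(e0, 2.0, 1.2))  # ~114.9 -- backfire (Jevons Paradox)
```

With elasticity below 1, efficiency still saves energy but less than the engineering estimate; at exactly 1 the savings vanish; above 1, consumption rises outright, which is the "backfire" case the report describes.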
Weiye Loh

When Value Judgments Masquerade as Science - NYTimes.com - 0 views

  • Most people think of the term in the context of production of goods and services: more efficient means more valuable output is wrung from a given bundle of real resources (which is good) or that fewer real resources are burned up to produce a given output (which is also good).
  • In economics, efficiency is also used to evaluate alternative distributions of an available set of goods and services among members of society. In this context, I distinguished in last week’s post between changes in public policies (reallocations of economic welfare) that make some people feel better off and none feel worse off and those that make some people feel better off but others feel worse off.
  • consider whether economists should ever become advocates for a revaluation of China’s currency, the renminbi — or, alternatively, for imposing higher tariffs on Chinese imports. Such a policy would tend to improve the lot of shareholders and employees of manufacturers competing with Chinese imports. Yet it would make American consumers of Chinese goods worse off. If the renminbi were significantly and artificially undervalued against the United States dollar, relative to a free-market exchange rate without government intervention, that would be tantamount to China running a giant, perennial sale on Chinese goods sold to the United States. If you’re an American consumer, what’s not to like about that? So why are so many economists advocating an end to this sale?
  • ...9 more annotations...
  • Strict constructionists argue that their analyses should confine themselves strictly to positive (that is, descriptive) analysis: identify who wins and who loses from a public policy, and how much, but leave judgments about the social merits of the policy to politicians.
  • a researcher’s political ideology or vested interest in a particular theory can still enter even ostensibly descriptive analysis by the data set chosen for the research; the mathematical transformations of raw data and the exclusion of so-called outlier data; the specific form of the mathematical equations posited for estimation; the estimation method used; the number of retrials in estimation to get what strikes the researcher as “plausible” results, and the manner in which final research findings are presented. This is so even among natural scientists discussing global warming. As the late medical journalist Victor Cohn once quoted a scientist, “I would not have seen it if I did not believe it.”
  • anyone who sincerely believes that seemingly scientific, positive research in the sciences — especially the social sciences — is invariably free of the researcher’s own predilections is a Panglossian optimist.
  • majority of economists have been unhappy for more than a century with the limits that the strict constructionist school would place upon their professional purview. They routinely do enter the forum in which public policy is debated
  • The problem with welfare analysis is not so much that ethical dimensions typically enter into it, but that economists pretend that is not so. They do so by justifying their normative dicta with appeal to the seemingly scientific but actually value-laden concept of efficiency.
  • economics is not a science that only describes, measures, explains and predicts human interests, values and policies — it also evaluates, promotes, endorses or rejects them. The predicament of economics and all other social sciences consists in their failure to acknowledge honestly their value orientation in their pathetic and inauthentic pretension to emulate the natural sciences they presume to be value free.
  • By the Kaldor-Hicks criterion, a public policy is judged to enhance economic efficiency and overall social welfare — and therefore is to be recommended by economists to decision-makers — if those who gain from the policy could potentially bribe those who lose from it into accepting it and still be better off (Kaldor), or those who lose from it were unable to bribe the gainers into forgoing the policy (Hicks). That the bribe was not paid merely underscores the point.
  • In applications, the Kaldor-Hicks criterion and the efficiency criterion amount to the same thing. When Jack gains $10 and Jill loses $5, social gains increase by $5, so the policy is a good one. When Jack gains $10 and Jill loses $15, there is a deadweight loss of $5, so the policy is bad. Evidently, on the Kaldor-Hicks criterion one need not know who Jack and Jill are, nor anything about their economic circumstances. Furthermore, a truly stunning implication of the criterion is that if a public policy takes $X away from one citizen and gives it to another, and nothing else changes, then such a policy is welfare neutral. Would any non-economist buy that proposition?
  • Virtually all modern textbooks in economics base their treatment of efficiency on Kaldor-Hicks, usually without acknowledging the ethical dimensions of the concept. I use these texts in my economics courses as, I suppose, do most of my colleagues around the world. But I explicitly alert my students to the ethical pitfalls in normative welfare economics, with commentaries such as “How Economists Bastardized Benthamite Utilitarianism” and “The Welfare Economics of Health Insurance,” or with assignments that force students to think about this issue. My advice to students and readers is: When you hear us economists wax eloquent on the virtue of greater efficiency — beware!
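The Jack-and-Jill arithmetic above reduces to a one-line test. The sketch below (the function name and two-person society are hypothetical simplifications, not the economists' formal apparatus) makes the criterion's blindness to distribution explicit:

```python
def net_social_gain(dollar_changes):
    """Kaldor-Hicks in practice: sum the dollar gains and losses.

    Positive -> 'efficient', recommend the policy; negative ->
    deadweight loss, reject it; zero -> 'welfare neutral'. Who wins,
    who loses, and their circumstances never enter the calculation.
    """
    return sum(dollar_changes)

assert net_social_gain([10, -5]) == 5      # Jack +$10, Jill -$5: a 'good' policy
assert net_social_gain([10, -15]) == -5    # deadweight loss of $5: a 'bad' policy
assert net_social_gain([100, -100]) == 0   # pure $X transfer: 'welfare neutral'
```

The last line is the "truly stunning implication" in the text: taking $100 from one citizen and handing it to another scores exactly zero, so the criterion pronounces it a matter of indifference.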
Weiye Loh

Let There Be More Efficient Light - NYTimes.com - 0 views

  • LAST week Michele Bachmann, a Republican representative from Minnesota, introduced a bill to roll back efficiency standards for light bulbs, which include a phasing out of incandescent bulbs in favor of more energy-efficient bulbs. The “government has no business telling an individual what kind of light bulb to buy,” she declared.
  • But this opposition ignores another, more important bit of American history: the critical role that government-mandated standards have played in scientific and industrial innovation.
  • inventions alone weren’t enough to guarantee progress. Indeed, at the time the lack of standards for everything from weights and measures to electricity — even the gallon, for example, had eight definitions — threatened to overwhelm industry and consumers with a confusing array of incompatible choices.
  • ...5 more annotations...
  • This wasn’t the case everywhere. Germany’s standards agency, established in 1887, was busy setting rules for everything from the content of dyes to the process for making porcelain; other European countries soon followed suit. Higher-quality products, in turn, helped the growth in Germany’s trade exceed that of the United States in the 1890s. America finally got its act together in 1894, when Congress standardized the meaning of what are today common scientific measures, including the ohm, the volt, the watt and the henry, in line with international metrics. And, in 1901, the United States became the last major economic power to establish an agency to set technological standards. The result was a boom in product innovation in all aspects of life during the 20th century. Today we can go to our hardware store and choose from hundreds of light bulbs that all conform to government-mandated quality and performance standards.
  • Technological standards not only promote innovation — they also can help protect one country’s industries from falling behind those of other countries. Today China, India and other rapidly growing nations are adopting standards that speed the deployment of new technologies. Without similar requirements to manufacture more technologically advanced products, American companies risk seeing the overseas markets for their products shrink while innovative goods from other countries flood the domestic market. To prevent that from happening, America needs not only to continue developing standards, but also to devise a strategy to apply them consistently and quickly.
  • The best approach would be to borrow from Japan, whose Top Runner program sets energy-efficiency standards by identifying technological leaders in a particular industry — say, washing machines — and mandating that the rest of the industry keep up. As technologies improve, the standards change as well, enabling a virtuous cycle of improvement. At the same time, the government should work with businesses to devise multidimensional standards, so that consumers don’t balk at products because they sacrifice, say, brightness and cost for energy efficiency.
  • This is not to say that innovation doesn’t bring disruption, and American policymakers can’t ignore the jobs that are lost when government standards sweep older technologies into the dustbin of history. An effective way forward on light bulbs, then, would be to apply standards only to those manufacturers that produce or import in large volume. Meanwhile, smaller, legacy light-bulb producers could remain, cushioning the blow to workers and meeting consumer demand.
  • Technologies and the standards that guide their deployment have revolutionized American society. They’ve been so successful, in fact, that the role of government has become invisible — so much so that even members of Congress should be excused for believing the government has no business mandating your choice of light bulbs.
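The Top Runner ratchet described above can be sketched as a simple loop (names and efficiency numbers are hypothetical; the real Japanese program sets product-specific targets and compliance deadlines):

```python
def top_runner_step(fleet_efficiencies):
    """One cycle of a Top Runner-style ratchet: the new mandatory
    standard is the best efficiency currently achieved in the
    industry, and laggards must at least meet it by the deadline."""
    standard = max(fleet_efficiencies)
    return standard, [max(e, standard) for e in fleet_efficiencies]

fleet = [0.8, 1.0, 1.3]        # hypothetical washing-machine efficiency scores
standard, fleet = top_runner_step(fleet)
print(standard, fleet)         # 1.3 [1.3, 1.3, 1.3]

# Later innovation by any single firm raises the bar for everyone,
# which is the 'virtuous cycle' the article describes.
fleet[0] = 1.6
standard, fleet = top_runner_step(fleet)
print(standard, fleet)         # 1.6 [1.6, 1.6, 1.6]
```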
Weiye Loh

The Problem with Climate Change | the kent ridge common - 0 views

  • what is climate change? From a scientific point of view, it is simply a statistical change in atmospheric variables (temperature, precipitation, humidity etc). It has been occurring ever since the Earth came into existence, far before humans even set foot on the planet: our climate has been fluctuating between warm periods and ice ages, with further variations within. In fact, we are living in a warm interglacial period in the middle of an ice age.
  • Global warming has often been portrayed in apocalyptic tones, whether from the mouth of the media or environmental groups: the daily news tells of natural disasters happening at a frightening pace, of crop failures due to strange weather, of mass extinctions and coral die-outs. When the devastating tsunami struck Southeast Asia years ago, some said it was the wrath of God against human mistreatment of the environment; when hurricane Katrina dealt out a catastrophe, others said it was because of (America’s) failure to deal with climate change. Science gives the figures and trends, and people take these to extremes.
  • One immediate problem with blaming climate change for every weather-related disaster or phenomenon is that it reduces humans’ responsibility of mitigating or preventing it. If natural disasters are already, as their name suggests, natural, adding the tag ‘global warming’ or ‘climate change’ emphasizes the dominance of natural forces, and our inability to do anything about it. Surely, humans cannot undo climate change? Even at Cancun, amid the carbon cuts that have been promised, questions are being brought up on whether they are sufficient to reverse our actions and ‘save’ the planet.  Yet the talk about this remote, omnipotent force known as climate change obscures the fact that we can, and have always been, thinking of ways to reduce the impact of natural hazards. Forecasting, building better infrastructure and coordinating more efficient responses – all these are far more desirable than wading in woe. For example, we will do better at preventing floods in Singapore by tackling the problems rather than singing in praise of God.
  • ...5 more annotations...
  • However, a greater concern lies in the notion of climate change itself. Climate change is in essence one kind of nature-society relationship, in which humans influence the climate through greenhouse gas (particularly CO2) emissions, and the climate strikes back by heating up and going crazy at times. This can be further simplified into a battle between humans and CO2: reducing CO2 guards against climate change, and increasing it aggravates the consequences. This view is anchored in scientists’ recommendation that a ‘safe’ level of CO2 should be at 350 parts per million (ppm) instead of the current 390. Already, the need to reduce CO2 is understood, as is evident in the push for greener fuels, more efficient means of production, the proliferation of ‘green’ products and companies, and most recently, the Cancun talks.
  • So can there be anything wrong with reducing CO2? No, there isn’t, but singling out CO2 as the culprit of climate change or of the environmental problems we face prevents us from looking within. What do I mean? The enemy, CO2, is an ‘other’, an externality produced by our economic systems but never an inherent component of the systems. Thus, we can declare war on the gas or on climate change without taking a step back and questioning: is there anything wrong with the way we develop?  Take Singapore for example: the government pledged to reduce carbon emissions by 16% under ‘business as usual’ standards, which says nothing about how ‘business’ is going to be changed other than having less carbon emissions (in fact, it is even questionable whether CO2 levels will decrease, as ‘business as usual’ standards project a steady increase in CO2 emissions each year). With the development of green technologies, decrease in carbon emissions will mainly be brought about by increased energy efficiency and switch to alternative fuels (including the insidious nuclear energy).
  • Thus, the way we develop will hardly be changed. Nobody questions whether our neoliberal system of development, which relies heavily on consumption to drive economies, needs to be looked into. We assume that it is the right way to develop, and only tweak it for the amount of externalities produced. Whether or not we should be measuring development by the Gross Domestic Product (GDP) or if welfare is correlated to the amount of goods and services consumed is never considered. Even the UN-REDD (Reducing Emissions from Deforestation and Forest Degradation) scheme which aims to pay forest-rich countries for protecting their forests, ends up putting a price tag on them. The environment is being subsumed under the economy, when it should be that the economy is re-looked to take the environment into consideration.
  • when the world is celebrating after having held at bay the dangerous greenhouse gas, why would anyone bother rethinking the economy? Yet we should, simply because there are alternative nature-society relationships and discourses about nature that are of equal or greater importance than global warming. Annie Leonard’s informative videos on The Story of Stuff and specific products like electronics, bottled water and cosmetics shed light on the dangers of our ‘throw-away culture’ on the planet and poorer countries. What if the enemy were consumerism instead? Framing it that way would force countries (especially richer ones) to fundamentally question the nature of development, instead of just applying a quick technological fix. This is so much more difficult (and less economically viable), alongside other issues like environmental injustices – e.g. pollution or dumping of waste by Trans-National Corporations in poorer countries and removal of indigenous land rights. It is no wonder that we choose to disregard internal problems and focus instead on an external enemy; when CO2 is the culprit, the solution is too simple and detached from the communities that are affected by changes in their environment.
  • We hence need to allow for a greater politics of the environment. What I am proposing is not to diminish our action to reduce carbon emissions, for I do believe that it is part of the environmental problem that we are facing. What instead should be done is to reduce our fixation on CO2 as the main or only driver of climate change, and on climate change as the most pertinent nature-society issue we are facing. We should understand that there are many other ways of thinking about the environment; ‘developing’ countries, for example, tend to have a closer relationship with their environment – it is not something ‘out there’ but constantly interacted with for food, water, regulating services and cultural value. Their views and the impact of the socio-economic forces (often from TNCs and multi-lateral organizations like IMF) that shape the environment must also be taken into account, as must alternative meanings of sustainable development. Thus, even as we pat ourselves on the back for having achieved something significant at Cancun, our action should not and must not end there. Even if climate change hogs the headlines now, we must embrace more plurality in environmental discourse, for nature is not, and never was, as simple as climate change alone. And hopefully sometime in the future, alongside a multi-lateral conference on climate change, the world can have one which rethinks the meaning of development.
  •  
    Chen Jinwen
Weiye Loh

It's Only A Theory: From the 2010 APA in Boston: Neuropsychology and ethics - 0 views

  • Joshua Greene from Harvard, known for his research on "neuroethics," the neurological underpinnings of ethical decision making in humans. The title of Greene's talk was "Beyond point-and-shoot morality: why cognitive neuroscience matters for ethics."
  • What Greene is interested in is finding out which factors moral judgment is sensitive to, and whether it is sensitive to the relevant factors. He presented his dual process theory of morality. In this respect, he proposed an analogy with a camera. Cameras have automatic (point and shoot) settings as well as manual controls. The first mode is good enough for most purposes, the second allows the user to fine tune the settings more carefully. The two modes allow for a nice combination of efficiency and flexibility.
  • The idea is that the human brain also has two modes, a set of efficient automatic responses and a manual mode that makes us more flexible in response to non-standard situations. The non-moral example is our response to potential threats. Here the amygdala is very fast and efficient at focusing on potential threats (e.g., the outline of eyes in the dark), even when there actually is no threat (it's a controlled experiment in a lab, no lurking predator around).
  • ...12 more annotations...
  • Delayed gratification illustrates the interaction between the two modes. The brain is attracted by immediate rewards, no matter what kind. However, when larger rewards are eventually going to become available, other parts of the brain come into play to override (sometimes) the immediate urge.
  • Greene's research shows that our automatic setting is "Kantian," meaning that our intuitive responses are deontological, rule driven. The manual setting, on the other hand, tends to be more utilitarian / consequentialist. Accordingly, the first mode involves emotional areas of the brain, the second one involves more cognitive areas.
  • The evidence comes from the (in)famous trolley dilemma and its many variations.
  • when people refuse to intervene in the footbridge (as opposed to the lever) version of the dilemma, they do so because of a strong emotional response, which contradicts the otherwise utilitarian calculus they make when considering the lever version.
  • psychopaths turn out to be more utilitarian than normal subjects - presumably not because consequentialism is inherently pathological, but because their emotional responses are stunted. Mood also affects the results, with people exposed to comedy (to enhance mood), for instance, more likely to say that it is okay to push the guy off the footbridge.
  • In a more recent experiment, subjects were asked to say which action carried the better consequences, which made them feel worse, and which was overall morally acceptable. The idea was to separate the cognitive, emotional and integrative aspects of moral decision making. Predictably, activity in the amygdala correlated with deontological judgment, activity in more cognitive areas was associated with utilitarianism, and different brain regions became involved in integrating the two.
  • Another recent experiment used visual vs. verbal descriptions of moral dilemmas. Turns out that more visual people tend to behave emotionally / deontologically, while more verbal people are more utilitarian.
  • studies show that interfering with moral judgment by engaging subjects with a cognitive task slows down (though it does not reverse) utilitarian judgment, but has no effect on deontological judgment. Again, in agreement with the conclusion that the former type of judgment is the result of cognition, the latter of emotion.
  • Nice to know, by the way, that when experimenters controlled for "real world expectations" that people have about trolleys, or when they used more realistic scenarios than trolleys and bridges, the results don't vary. In other words, trolley thought experiments are actually informative, contrary to popular criticisms.
  • What factors affect people's decision making in moral judgment? The main one is proximity, with people feeling much stronger obligations if they are present to the event posing the dilemma, or even relatively near (a disaster happens in a nearby country), as opposed to when they are far (a country on the other side of the world).
  • Greene's general conclusion is that neuroscience matters to ethics because it reveals the hidden mechanisms of human moral decision making. However, he says this is interesting to philosophers because it may lead us to question ethical theories that are implicitly or explicitly based on such judgments. But neither philosophical deontology nor consequentialism are in fact based on common moral judgments, seems to me. They are the result of explicit analysis. (Though Greene raises the possibility that some philosophers engage in rationalizing rather than reasoning, as in Kant's famously convoluted idea that masturbation is wrong because one is using oneself as a means to an end...)
  • this is not to say that understanding moral decision making in humans isn't interesting or in fact even helpful in real life cases. An example of the latter is the common moral condemnation of incest, which is an emotional reaction that probably evolved to avoid genetically diseased offspring. It follows that science can tell us that there is nothing morally wrong in cases of incest when precautions have been taken to avoid pregnancy (and assuming psychological reactions are also accounted for). Greene puts this in terms of science helping us to transform difficult ought questions into easier ought questions.
Weiye Loh

McKinsey & Company - Clouds, big data, and smart assets: Ten tech-enabled business tren... - 0 views

  • 1. Distributed cocreation moves into the mainstream In the past few years, the ability to organise communities of Web participants to develop, market, and support products and services has moved from the margins of business practice to the mainstream. Wikipedia and a handful of open-source software developers were the pioneers. But in signs of the steady march forward, 70 per cent of the executives we recently surveyed said that their companies regularly created value through Web communities. Similarly, more than 68m bloggers post reviews and recommendations about products and services.
  • for every success in tapping communities to create value, there are still many failures. Some companies neglect the up-front research needed to identify potential participants who have the right skill sets and will be motivated to participate over the longer term. Since cocreation is a two-way process, companies must also provide feedback to stimulate continuing participation and commitment. Getting incentives right is important as well: cocreators often value reputation more than money. Finally, an organisation must gain a high level of trust within a Web community to earn the engagement of top participants.
  • 2. Making the network the organisation In earlier research, we noted that the Web was starting to force open the boundaries of organisations, allowing nonemployees to offer their expertise in novel ways. We called this phenomenon "tapping into a world of talent." Now many companies are pushing substantially beyond that starting point, building and managing flexible networks that extend across internal and often even external borders. The recession underscored the value of such flexibility in managing volatility. We believe that the more porous, networked organisations of the future will need to organise work around critical tasks rather than molding it to constraints imposed by corporate structures.
  • ...10 more annotations...
  • 3. Collaboration at scale Across many economies, the number of people who undertake knowledge work has grown much more quickly than the number of production or transactions workers. Knowledge workers typically are paid more than others, so increasing their productivity is critical. As a result, there is broad interest in collaboration technologies that promise to improve these workers' efficiency and effectiveness. While the body of knowledge around the best use of such technologies is still developing, a number of companies have conducted experiments, as we see in the rapid growth rates of video and Web conferencing, expected to top 20 per cent annually during the next few years.
  • 4. The growing ‘Internet of Things' The adoption of RFID (radio-frequency identification) and related technologies was the basis of a trend we first recognised as "expanding the frontiers of automation." But these methods are rudimentary compared with what emerges when assets themselves become elements of an information system, with the ability to capture, compute, communicate, and collaborate around information—something that has come to be known as the "Internet of Things." Embedded with sensors, actuators, and communications capabilities, such objects will soon be able to absorb and transmit information on a massive scale and, in some cases, to adapt and react to changes in the environment automatically. These "smart" assets can make processes more efficient, give products new capabilities, and spark novel business models. Auto insurers in Europe and the United States are testing these waters with offers to install sensors in customers' vehicles. The result is new pricing models that base charges for risk on driving behavior rather than on a driver's demographic characteristics. Luxury-auto manufacturers are equipping vehicles with networked sensors that can automatically take evasive action when accidents are about to happen. In medicine, sensors embedded in or worn by patients continuously report changes in health conditions to physicians, who can adjust treatments when necessary. Sensors in manufacturing lines for products as diverse as computer chips and pulp and paper take detailed readings on process conditions and automatically make adjustments to reduce waste, downtime, and costly human interventions.
  • 5. Experimentation and big data Could the enterprise become a full-time laboratory? What if you could analyse every transaction, capture insights from every customer interaction, and didn't have to wait for months to get data from the field? What if…? Data are flooding in at rates never seen before—doubling every 18 months—as a result of greater access to customer data from public, proprietary, and purchased sources, as well as new information gathered from Web communities and newly deployed smart assets. These trends are broadly known as "big data." Technology for capturing and analysing information is widely available at ever-lower price points. But many companies are taking data use to new levels, using IT to support rigorous, constant business experimentation that guides decisions and to test new products, business models, and innovations in customer experience. In some cases, the new approaches help companies make decisions in real time. This trend has the potential to drive a radical transformation in research, innovation, and marketing.
  • Using experimentation and big data as essential components of management decision making requires new capabilities, as well as organisational and cultural change. Most companies are far from accessing all the available data. Some haven't even mastered the technologies needed to capture and analyse the valuable information they can access. More commonly, they don't have the right talent and processes to design experiments and extract business value from big data, which require changes in the way many executives now make decisions: trusting instincts and experience over experimentation and rigorous analysis. To get managers at all echelons to accept the value of experimentation, senior leaders must buy into a "test and learn" mind-set and then serve as role models for their teams.
  • 6. Wiring for a sustainable world Even as regulatory frameworks continue to evolve, environmental stewardship and sustainability clearly are C-level agenda topics. What's more, sustainability is fast becoming an important corporate-performance metric—one that stakeholders, outside influencers, and even financial markets have begun to track. Information technology plays a dual role in this debate: it is both a significant source of environmental emissions and a key enabler of many strategies to mitigate environmental damage. At present, information technology's share of the world's environmental footprint is growing because of the ever-increasing demand for IT capacity and services. Electricity produced to power the world's data centers generates greenhouse gases on the scale of countries such as Argentina or the Netherlands, and these emissions could increase fourfold by 2020. McKinsey research has shown, however, that the use of IT in areas such as smart power grids, efficient buildings, and better logistics planning could eliminate five times the carbon emissions that the IT industry produces.
  • 7. Imagining anything as a service Technology now enables companies to monitor, measure, customise, and bill for asset use at a much more fine-grained level than ever before. Asset owners can therefore create services around what have traditionally been sold as products. Business-to-business (B2B) customers like these service offerings because they allow companies to purchase units of a service and to account for them as a variable cost rather than undertake large capital investments. Consumers also like this "paying only for what you use" model, which helps them avoid large expenditures, as well as the hassles of buying and maintaining a product.
  • In the IT industry, the growth of "cloud computing" (accessing computer resources provided through networks rather than running software or storing data on a local computer) exemplifies this shift. Consumer acceptance of Web-based cloud services for everything from e-mail to video is of course becoming universal, and companies are following suit. Software as a service (SaaS), which enables organisations to access services such as customer relationship management, is growing at a 17 per cent annual rate. The biotechnology company Genentech, for example, uses Google Apps for e-mail and to create documents and spreadsheets, bypassing capital investments in servers and software licenses. This development has created a wave of computing capabilities delivered as a service, including infrastructure, platform, applications, and content. And vendors are competing, with innovation and new business models, to match the needs of different customers.
  • 8. The age of the multisided business model Multisided business models create value through interactions among multiple players rather than traditional one-on-one transactions or information exchanges. In the media industry, advertising is a classic example of how these models work. Newspapers, magazines, and television stations offer content to their audiences while generating a significant portion of their revenues from third parties: advertisers. Other revenue, often through subscriptions, comes directly from consumers. More recently, this advertising-supported model has proliferated on the Internet, underwriting Web content sites, as well as services such as search and e-mail (see trend number seven, "Imagining anything as a service," earlier in this article). It is now spreading to new markets, such as enterprise software: Spiceworks offers IT-management applications to 950,000 users at no cost, while it collects advertising from B2B companies that want access to IT professionals.
  • 9. Innovating from the bottom of the pyramid The adoption of technology is a global phenomenon, and the intensity of its usage is particularly impressive in emerging markets. Our research has shown that disruptive business models arise when technology combines with extreme market conditions, such as customer demand for very low price points, poor infrastructure, hard-to-access suppliers, and low cost curves for talent. With an economic recovery beginning to take hold in some parts of the world, high rates of growth have resumed in many developing nations, and we're seeing companies built around the new models emerging as global players. Many multinationals, meanwhile, are only starting to think about developing markets as wellsprings of technology-enabled innovation rather than as traditional manufacturing hubs.
  • 10. Producing public good on the grid The role of governments in shaping global economic policy will expand in coming years. Technology will be an important factor in this evolution by facilitating the creation of new types of public goods while helping to manage them more effectively. This last trend is broad in scope and draws upon many of the other trends described above.
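Trend 5 above (experimentation and big data) boils down to routine statistical testing of business changes. A minimal sketch of the kind of analysis involved, using a two-proportion z-test; the conversion figures and the "new checkout page" scenario are illustrative assumptions, not from the article:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-statistic for an A/B experiment.

    conv_a/conv_b: conversions observed in each variant,
    n_a/n_b: users exposed to each variant.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)  # pooled conversion rate under H0
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical experiment: variant B (new checkout page) vs. control A.
z = two_proportion_z(conv_a=200, n_a=5000, conv_b=260, n_b=5000)
print(round(z, 2))  # |z| > 1.96 would suggest a real difference at ~95% confidence
```

This is only the final significance check; the annotations' larger point is organisational, i.e. running such tests constantly rather than occasionally.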
Weiye Loh

journalism.sg » Racial and religious offence: Why censorship doesn't cut it - 1 views

  • All societies use a mix of approaches to address offensive speech. In international law, as at the European Court of Human Rights, and in more and more jurisdictions, there is a growing feeling that the law should really be a last resort, used only for the most extreme speech: speech that incites violence in a very direct way, or that is part of a campaign that violates the rights of minorities to live free of discrimination. In contrast, simply insulting and offending others, even if feelings are very hurt, is not seen as something that should invite a legal response. Using the law to protect feelings is too great an encroachment on freedom of speech.
  • Our laws are written very broadly, such that any sort of offence, even if it does not threaten imminent violence, is seen as deserving of strict regulation. This probably reflects a very strong social consensus that race and religion should be handled delicately. So we tend to rely on strong government. The state protects racial and religious sensibilities from offence, using censorship when there’s a danger of words and actions causing hurt.
  • in almost all cases, state action was instigated by complaints from members of the public. This is quite unlike political censorship, where action is initiated by the government, often with great resistance and opposition from netizens. In a string of cases involving racial and religious offence, however, it’s the netizens who tend to demand action, sometimes acting like a lynch mob.
  • in many cases, the offensive messages were spread further by those reporting the offence.
  • What is the justification for strong police action against any form of speech? Why do we sometimes feel that it may not be enough to counter bad speech with good speech in free and open debate, and that we must instead use the law to stop the bad speech? Surely, it must be because we think the bad speech is so dangerous that it can cause immediate harm; or because we don’t trust the public to respond rationally, so we don’t know if good speech would indeed triumph in open debate. Usually, if we call in the authorities, it must be because we have a mental picture of offensive speech being like lighting a match in a combustible atmosphere. It is dangerous and there’s no time to debate the merits of that match – we just have to put it out. The irony of most of the cases that we have seen in the past few years is that the people demanding government action, as if the offensive words were explosive, were also those who helped to spread them. It is like helping to spread a fire while calling for the fire brigade.
  • their act of spreading the offensive content must mean that they did not actually believe that the expression was really that dangerous in the sense of prompting violence through reprisal attacks or riots. In reposting the offensive words or pictures, they showed that they actually trusted the public enough to respond sympathetically – they had faith that enough people would add their voices to the outrage that they themselves felt when they saw the offensive images or videos or words.
  • This then raises the question, why the need to involve the police at all? If Singaporeans are grown-up enough to defend their society against offensive speech, why have calls for prosecution and censorship become the automatic response? I wonder if this is an example of the well-known habit of unthinkingly relying on government to solve all our problems, even when a little bit of effort in the form of grassroots action can do the job.
  • The next time people encounter racist or religiously offensive speech, it would be nice to see swift responses from credible and respected civil society groups, Members of Parliament, and other ordinary citizens. If the speaker doesn’t get the message, organise boycotts, for example, and give him or her the clear message that our society isn’t going to take such offence lying down. The more we can respond ourselves through open debate and grassroots action, without the need to ask law and order to step in, the stronger our society will be.
  •  
    No matter how hard we work at developing media literacy, we should not expect to be rid of all racially offensive speech online. There are two broad ways to respond to these breaches. We can reach out horizontally and together with our fellow citizens repair the damage by persuading others to reject harmful ideas. Or, we can reach up vertically to government, getting the authorities to act against irresponsible speech by using the law. The advantage of the latter is that it seems more efficient, punishing those who cross the line of acceptability and violate social norms, and deterring others from doing the same. The horizontal approach works through persuasion rather than the law, so it is slower and not foolproof.
Weiye Loh

Judge dismisses authors' case against Google Books | Internet & Media - CNET News - 0 views

  •  
    "In my view, Google Books provides significant public benefits. It advances the progress of the arts and sciences, while maintaining respectful consideration for the rights of authors and other creative individuals, and without adversely impacting the rights of copyright holders. It has become an invaluable research tool that permits students, teachers, librarians, and others to more efficiently identify and locate books. It has given scholars the ability, for the first time, to conduct full-text searches of tens of millions of books. It preserves books, in particular out-of-print and old books that have been forgotten in the bowels of libraries, and it gives them new life. It facilitates access to books for print-disabled and remote or underserved populations. It generates new audiences and creates new sources of income for authors and publishers. Indeed, all society benefits."
Weiye Loh

How should we use data to improve our lives? - By Michael Agger - Slate Magazine - 0 views

  • The Swiss economists Bruno Frey and Alois Stutzer argue that people do not appreciate the real cost of a long commute. And especially when that commute is unpredictable, it takes a toll on our daily well-being.
  • imagine if we shared our commuting information so that we could calculate the average commute from various locations around a city. When the growing family of four pulls up to a house for sale in New Jersey, the listing would indicate not only the price and the number of bathrooms but also the rush-hour commute time to Midtown Manhattan. That would be valuable information to have, since buyers could realistically weigh the tradeoffs of remaining in a smaller space closer to work against moving to a larger space and taking on a longer commute.
  • In a cover story for the New York Times Magazine, the writer Gary Wolf documented the followers of “The Data-Driven Life,” programmers, students, and self-described geeks who track various aspects of their lives. Seth Roberts does a daily math exercise to measure small changes in his mental acuity. Kiel Gilleade is a "Body Blogger" who shares his heart rate via Twitter. On the more extreme end, Mark Carranza has a searchable database of every idea he's had since 1984. They're not alone. This community continues to thrive, and its efforts are chronicled at a blog called the Quantified Self, co-founded by Wolf and Kevin Kelly.
  • If you've ever asked Nike+ to log your runs or given Google permission to keep your search history, you've participated in a bit of self-tracking. Now that more people have location-aware smartphones and the Web has made data easy to share, personal data is poised to become an important tool to understand how we live, and how we all might live better. One great example of this phenomenon in action is the site Cure Together, which allows you to enter your symptoms—for, say, "anxiety" or "insomnia"—and the various remedies you've tried to feel better. One thing the site does is aggregate this information and present the results in chart form. Here is the chart for depression:
  • Instead of being isolated in your own condition, you can now see what has worked for others. The same principle is at work at the site Fuelly, where you can "track, share, and compare" your miles per gallon and see how efficient certain makes and models really are.
  • Businesses are also using data tracking to spur their employees to accomplishing companywide goals: Wal-Mart partnered with Zazengo to help employees track their "personal sustainability" actions such as making a home-cooked meal or buying local produce. The app Rescue Time, which records all of the activity on your computer, gives workers an easy way to account for their time. And that comes in handy when you want to show the boss how efficient telecommuting can be.
  •  
    Data for a better planet
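The commute-sharing idea in the annotations above is, at its core, simple aggregation: pool individual trip reports and publish a per-location average that a real-estate listing could display. A minimal sketch; the place names and times below are made up:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical shared commute reports: (origin, minutes to Midtown Manhattan)
reports = [
    ("Maplewood, NJ", 52), ("Maplewood, NJ", 61), ("Maplewood, NJ", 48),
    ("Hoboken, NJ", 25), ("Hoboken, NJ", 31),
]

# Group individual reports by origin
by_origin = defaultdict(list)
for origin, minutes in reports:
    by_origin[origin].append(minutes)

# Average rush-hour commute per location, as a listing site might show it
averages = {origin: round(mean(times), 1) for origin, times in by_origin.items()}
print(averages)
```

The same grouping-and-averaging pattern underlies sites like Fuelly (miles per gallon by make and model) and CureTogether (remedy effectiveness by symptom).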
Weiye Loh

Jonathan Stray » Measuring and improving accuracy in journalism - 0 views

  • Accuracy is a hard thing to measure because it’s a hard thing to define. There are subjective and objective errors, and no standard way of determining whether a reported fact is true or false
  • The last big study of mainstream reporting accuracy found errors (defined below) in 59% of 4,800 stories across 14 metro newspapers. This level of inaccuracy — where about one in every two articles contains an error — has persisted for as long as news accuracy has been studied, over seven decades now.
  • With the explosion of available information, more than ever it’s time to get serious about accuracy, about knowing which sources can be trusted. Fortunately, there are emerging techniques that might help us to measure media accuracy cheaply, and then increase it.
  • We could continuously sample a news source’s output to produce ongoing accuracy estimates, and build social software to help the audience report and filter errors. Meticulously applied, this approach would give a measure of the accuracy of each information source, and a measure of the efficiency of their corrections process (currently only about 3% of all errors are corrected.)
  • Real world reporting isn’t always clearly “right” or “wrong,” so it will often be hard to decide whether something is an error or not. But we’re not going for ultimate Truth here,  just a general way of measuring some important aspect of the idea we call “accuracy.” In practice it’s important that the error counting method is simple, clear and repeatable, so that you can compare error rates of different times and sources.
  • Subjective errors, though by definition involving judgment, should not be dismissed as merely differences in opinion. Sources found such errors to be about as common as factual errors and often more egregious [as rated by the sources.] But subjective errors are a very complex category
  • One of the major problems with previous news accuracy metrics is the effort and time required to produce them. In short, existing accuracy measurement methods are expensive and slow. I’ve been wondering if we can do better, and a simple idea comes to mind: sampling. The core idea is this: news sources could take an ongoing random sample of their output and check it for accuracy — a fact-check spot check.
  • Standard statistical theory tells us what the error on that estimate will be for any given number of samples (If I’ve got this right, the relevant formula is standard error of a population proportion estimate without replacement.) At a sample rate of a few stories per day, daily estimates of error rate won’t be worth much. But weekly and monthly aggregates will start to produce useful accuracy estimates
  • the first step would be admitting how inaccurate journalism has historically been. Then we have to come up with standardized accuracy evaluation procedures, in pursuit of metrics that capture enough of what we mean by “true” to be worth optimizing. Meanwhile, we can ramp up the efficiency of our online corrections processes until we find as many useful, legitimate errors as possible with as little staff time as possible. It might also be possible to do data mining on types of errors and types of stories to figure out if there are patterns in how an organization fails to get facts right.
  • I’d love to live in a world where I could compare the accuracy of information sources, where errors got found and fixed with crowd-sourced ease, and where news organizations weren’t shy about telling me what they did and did not know. Basic factual accuracy is far from the only measure of good journalism, but perhaps it’s an improvement over the current sad state of affairs
  •  
    Professional journalism is supposed to be "factual," "accurate," or just plain true. Is it? Has news accuracy been getting better or worse in the last decade? How does it vary between news organizations, and how do other information sources rate? Is professional journalism more or less accurate than everything else on the internet? These all seem like important questions, so I've been poking around, trying to figure out what we know and don't know about the accuracy of our news sources. Meanwhile, the online news corrections process continues to evolve, which gives us hope that the news will become more accurate in the future.
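The sampling proposal in the annotations above leans on the standard error of a population proportion estimated without replacement. A small sketch of that calculation, with the finite-population correction; the story counts below are hypothetical, not from the article:

```python
import math

def proportion_se(p_hat, n, N):
    """Standard error of an estimated error rate p_hat, from a spot-check
    sample of n stories drawn without replacement out of N published
    stories (finite-population correction applied)."""
    fpc = (N - n) / (N - 1)
    return math.sqrt(p_hat * (1 - p_hat) / n * fpc)

# Hypothetical month: 600 stories published, 90 spot-checked, 45 had an error.
p_hat = 45 / 90                      # estimated error rate: 0.5
se = proportion_se(p_hat, n=90, N=600)
print(f"error rate ~ {p_hat:.2f} +/- {1.96 * se:.2f}")  # ~95% interval
```

As the annotation notes, a few checks per day give a noisy daily estimate, but the interval tightens as weekly and monthly samples accumulate (the standard error shrinks roughly as 1/sqrt(n)).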
juliet huang

Government 2.0 - it's the community, stupid - 8 views

This article comments on the government's use of the internet to comment and collaborate with the public. According to the author, efficient use of tools to interact with the public will create a "c...

started by juliet huang on 16 Sep 09 no follow-up yet
YongTeck Lee

State 2.0: a new front end? - 3 views

http://www.opendemocracy.net/article/state-2-0-a-new-front-end A paragraph in the article sums up what the article is about quite well: "Indeed, new problematics are emerging around online democr...

digital democracy

started by YongTeck Lee on 15 Sep 09 no follow-up yet
Weiye Loh

Privacy in Singapore - 9 views

http://unpan1.un.org/intradoc/groups/public/documents/APCITY/UNPAN002553.pdf There is no general data protection or privacy law in Singapore. The government has been aggressive in using surveill...

Singapore Privacy Electronic Road Pricing Surveillance

Weiye Loh

Hayek, The Use of Knowledge in Society | Library of Economics and Liberty - 0 views

  • the "data" from which the economic calculus starts are never for the whole society "given" to a single mind which could work out the implications and can never be so given.
  • The peculiar character of the problem of a rational economic order is determined precisely by the fact that the knowledge of the circumstances of which we must make use never exists in concentrated or integrated form but solely as the dispersed bits of incomplete and frequently contradictory knowledge which all the separate individuals possess.
  • The economic problem of society
  • is a problem of the utilization of knowledge which is not given to anyone in its totality.
  • who is to do the planning. It is about this question that all the dispute about "economic planning" centers. This is not a dispute about whether planning is to be done or not. It is a dispute as to whether planning is to be done centrally, by one authority for the whole economic system, or is to be divided among many individuals. Planning in the specific sense in which the term is used in contemporary controversy necessarily means central planning—direction of the whole economic system according to one unified plan. Competition, on the other hand, means decentralized planning by many separate persons. The halfway house between the two, about which many people talk but which few like when they see it, is the
  • Which of these systems is likely to be more efficient depends mainly on the question under which of them we can expect that fuller use will be made of the existing knowledge.
  • It may be admitted that, as far as scientific knowledge is concerned, a body of suitably chosen experts may be in the best position to command all the best knowledge available—though this is of course merely shifting the difficulty to the problem of selecting the experts.
  • Today it is almost heresy to suggest that scientific knowledge is not the sum of all knowledge. But a little reflection will show that there is beyond question a body of very important but unorganized knowledge which cannot possibly be called scientific in the sense of knowledge of general rules: the knowledge of the particular circumstances of time and place. It is with respect to this that practically every individual has some advantage over all others because he possesses unique information of which beneficial use might be made, but of which use can be made only if the decisions depending on it are left to him or are made with his active coöperation.
  • the relative importance of the different kinds of knowledge; those more likely to be at the disposal of particular individuals and those which we should with greater confidence expect to find in the possession of an authority made up of suitably chosen experts. If it is today so widely assumed that the latter will be in a better position, this is because one kind of knowledge, namely, scientific knowledge, occupies now so prominent a place in public imagination that we tend to forget that it is not the only kind that is relevant.
  • It is a curious fact that this sort of knowledge should today be generally regarded with a kind of contempt and that anyone who by such knowledge gains an advantage over somebody better equipped with theoretical or technical knowledge is thought to have acted almost disreputably. To gain an advantage from better knowledge of facilities of communication or transport is sometimes regarded as almost dishonest, although it is quite as important that society make use of the best opportunities in this respect as in using the latest scientific discoveries.
  • The common idea now seems to be that all such knowledge should as a matter of course be readily at the command of everybody, and the reproach of irrationality leveled against the existing economic order is frequently based on the fact that it is not so available. This view disregards the fact that the method by which such knowledge can be made as widely available as possible is precisely the problem to which we have to find an answer.
  • One reason why economists are increasingly apt to forget about the constant small changes which make up the whole economic picture is probably their growing preoccupation with statistical aggregates, which show a very much greater stability than the movements of the detail. The comparative stability of the aggregates cannot, however, be accounted for—as the statisticians occasionally seem to be inclined to do—by the "law of large numbers" or the mutual compensation of random changes.
  • the sort of knowledge with which I have been concerned is knowledge of the kind which by its nature cannot enter into statistics and therefore cannot be conveyed to any central authority in statistical form. The statistics which such a central authority would have to use would have to be arrived at precisely by abstracting from minor differences between the things, by lumping together, as resources of one kind, items which differ as regards location, quality, and other particulars, in a way which may be very significant for the specific decision. It follows from this that central planning based on statistical information by its nature cannot take direct account of these circumstances of time and place and that the central planner will have to find some way or other in which the decisions depending on them can be left to the "man on the spot."
  • We need decentralization because only thus can we insure that the knowledge of the particular circumstances of time and place will be promptly used. But the "man on the spot" cannot decide solely on the basis of his limited but intimate knowledge of the facts of his immediate surroundings. There still remains the problem of communicating to him such further information as he needs to fit his decisions into the whole pattern of changes of the larger economic system.
  • The problem which we meet here is by no means peculiar to economics but arises in connection with nearly all truly social phenomena, with language and with most of our cultural inheritance, and constitutes really the central theoretical problem of all social science. As Alfred Whitehead has said in another connection, "It is a profoundly erroneous truism, repeated by all copy-books and by eminent people when they are making speeches, that we should cultivate the habit of thinking what we are doing. The precise opposite is the case. Civilization advances by extending the number of important operations which we can perform without thinking about them." This is of profound significance in the social field. We make constant use of formulas, symbols, and rules whose meaning we do not understand and through the use of which we avail ourselves of the assistance of knowledge which individually we do not possess. We have developed these practices and institutions by building upon habits and institutions which have proved successful in their own sphere and which have in turn become the foundation of the civilization we have built up.
  • To assume all the knowledge to be given to a single mind in the same manner in which we assume it to be given to us as the explaining economists is to assume the problem away and to disregard everything that is important and significant in the real world.
  • That an economist of Professor Schumpeter's standing should thus have fallen into a trap which the ambiguity of the term "datum" sets to the unwary can hardly be explained as a simple error. It suggests rather that there is something fundamentally wrong with an approach which habitually disregards an essential part of the phenomena with which we have to deal: the unavoidable imperfection of man's knowledge and the consequent need for a process by which knowledge is constantly communicated and acquired. Any approach, such as that of much of mathematical economics with its simultaneous equations, which in effect starts from the assumption that people's knowledge corresponds with the objective facts of the situation, systematically leaves out what is our main task to explain. I am far from denying that in our system equilibrium analysis has a useful function to perform. But when it comes to the point where it misleads some of our leading thinkers into believing that the situation which it describes has direct relevance to the solution of practical problems, it is high time that we remember that it does not deal with the social process at all and that it is no more than a useful preliminary to the study of the main problem.
  •  
    The Use of Knowledge in Society Hayek, Friedrich A.(1899-1992)
Weiye Loh

Essay - The End of Tenure? - NYTimes.com - 0 views

  • The cost of a college education has risen, in real dollars, by 250 to 300 percent over the past three decades, far above the rate of inflation. Elite private colleges can cost more than $200,000 over four years. Total student-loan debt, at nearly $830 billion, recently surpassed total national credit card debt. Meanwhile, university presidents, who can make upward of $1 million annually, gravely intone that the $50,000 price tag doesn’t even cover the full cost of a year’s education.
  • Then your daughter reports that her history prof is a part-time adjunct, who might be making $1,500 for a semester’s work. There’s something wrong with this picture.
  • The higher-ed jeremiads of the last generation came mainly from the right. But this time, it’s the tenured radicals — or at least the tenured liberals — who are leading the charge. Hacker is a longtime contributor to The New York Review of Books and the author of the acclaimed study “Two Nations: Black and White, Separate, Hostile, Unequal,”
  • And these two books arrive at a time, unlike the early 1990s, when universities are, like many students, backed into a fiscal corner. Taylor writes of walking into a meeting one day and learning that Columbia’s endowment had dropped by “at least” 30 percent. Simply brushing off calls for reform, however strident and scattershot, may no longer be an option.
  • The labor system, for one thing, is clearly unjust. Tenured and tenure-track professors earn most of the money and benefits, but they’re a minority at the top of a pyramid. Nearly two-thirds of all college teachers are non-tenure-track adjuncts like Matt Williams, who told Hacker and Dreifus he had taught a dozen courses at two colleges in the Akron area the previous year, earning the equivalent of about $8.50 an hour by his reckoning. It is foolish that graduate programs are pumping new Ph.D.’s into a world without decent jobs for them. If some programs were phased out, teaching loads might be raised for some on the tenure track, to the benefit of undergraduate education.
  • it might well be time to think about vetoing Olympic-quality athletic facilities and trimming the ranks of administrators. At Williams, a small liberal arts college renowned for teaching, 70 percent of employees do something other than teach.
  • But Hacker and Dreifus go much further, all but calling for an end to the role of universities in the production of knowledge. Spin off the med schools and research institutes, they say. University presidents “should be musing about education, not angling for another center on antiterrorist technologies.” As for the humanities, let professors do research after-hours, on top of much heavier teaching schedules. “In other occupations, when people feel there is something they want to write, they do it on their own time and at their own expense,” the authors declare. But it seems doubtful that, say, “Battle Cry of Freedom,” the acclaimed Civil War history by Princeton’s James McPherson, could have been written on the weekends, or without the advance spadework of countless obscure monographs. If it is false that research invariably leads to better teaching, it is equally false to say that it never does.
  • Hacker’s home institution is the public Queens College, which has a spartan budget, commuter students and a three-or-four-course teaching load per semester. Taylor, by contrast, has spent his career on the elite end of higher education, but he is no less disillusioned. He shares Hacker and Dreifus’s concerns about overspecialized research and the unintended effects of tenure, which he believes blocks the way to fresh ideas. Taylor has backed away from some of the most incendiary proposals he made last year in a New York Times Op-Ed article, cheekily headlined “End the University as We Know It” — an article, he reports, that drew near-universal condemnation from academics and near-universal praise from everyone else. Back then, he called for the flat-out abolition of traditional departments, to be replaced by temporary, “problem-centered” programs focusing on issues like Mind, Space, Time, Life and Water. Now, he more realistically suggests the creation of cross-disciplinary “Emerging Zones.” He thinks professors need to get over their fear of corporate partnerships and embrace efficiency-enhancing technologies.
  • It is not news that America is a land of haves and have-nots. It is news that colleges are themselves dividing into haves and have-nots; they are becoming engines of inequality. And that — not whether some professors can afford to wear Marc Jacobs — is the real scandal.
  • The End of Tenure? By CHRISTOPHER SHEA. Published: September 3, 2010
Weiye Loh

Study: Airport Security Should Stop Racial Profiling | Smart Journalism. Real Solutions... - 0 views

  • Plucking out of line most of the vaguely Middle Eastern-looking men at the airport for heightened screening is no more effective at catching terrorists than randomly sampling everyone. It may even be less effective. Press stumbled across this counterintuitive concept — sometimes the best way to find something is not to weight it by probability — in the unrelated context of computational biology. The parallels to airport security struck him when a friend mentioned he was constantly being pulled out of line at the airport.
  • Racial profiling, in other words, doesn’t work because it devotes heightened resources to innocent people — and then devotes those resources to them repeatedly even after they’ve been cleared as innocent the first time. The actual terrorists, meanwhile, may sneak through while Transportation Security Administration agents are focusing their limited attention on the wrong passengers.
  • Press tested the theory in a series of probability equations (the ambitious can check his math here and here).
  • Sampling based on profiling is mathematically no more effective than uniform random sampling. The optimal strategy, rather, turns out to be something called “square-root sampling,” a compromise between the other two methods.
  • “Crudely,” Press writes of his findings in the journal Significance, if certain people are “nine times as likely to be the terrorist, we pull out only three times as many of them for special checks. Surprisingly, and bizarrely, this turns out to be the most efficient way of catching the terrorist.”
  • Square-root sampling, though, still represents a kind of profiling, and, Press adds, not one that could be realistically implemented at airports today. Square-root sampling only works if the profile probabilities are accurate in the first place — if we are able to say with mathematical certainty that some types of people are “nine times as likely to be the terrorist” compared to others. TSA agents in a crowded holiday terminal making snap judgments about facial hair would be far from this standard. “The nice thing about uniform sampling is there’s nothing to be inaccurate about, you don’t need any data, it never can be worse than you expect,” Press said. “As soon as you use profile probabilities, if the profile probabilities are just wrong, then the strong profiling just does worse than the random sampling.”
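Press’s sampling result can be checked with a few lines of arithmetic. In his with-replacement model, if person i is the attacker with prior probability p[i] and each screening selects person i with probability proportional to a weight w[i], the expected number of checks before the attacker is screened is (sum of w) times the sum of p[i]/w[i]. The sketch below uses an illustrative ten-person population, not data from the paper:

```python
import math

def expected_checks(p, w):
    """Expected number of with-replacement screenings until the attacker
    is selected, when person i is the attacker with prior p[i] and each
    screening picks person i with probability proportional to w[i].
    E = (sum of w) * sum_i(p[i] / w[i])."""
    total_w = sum(w)
    return total_w * sum(pi / wi for pi, wi in zip(p, w))

# Illustrative population: two travellers the profile rates nine times
# as likely as each of the other eight (numbers are assumptions).
raw = [9, 9, 1, 1, 1, 1, 1, 1, 1, 1]
p = [x / sum(raw) for x in raw]

uniform   = expected_checks(p, [1.0] * len(p))             # random sampling
profiled  = expected_checks(p, p)                          # strong profiling
sqrt_rule = expected_checks(p, [math.sqrt(x) for x in p])  # square-root sampling

print(round(uniform, 2), round(profiled, 2), round(sqrt_rule, 2))
# → 10.0 10.0 7.54
```

Strong profiling pulls out the “nine times as likely” travellers nine times as often yet gains nothing over random checks, while the square-root rule, which screens them only three times as often (since the square root of 9 is 3), minimizes the expected number of checks, matching Press’s “surprising, and bizarre” result.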
Weiye Loh

The Inequality That Matters - Tyler Cowen - The American Interest Magazine - 0 views

  • most of the worries about income inequality are bogus, but some are probably better grounded and even more serious than even many of their heralds realize.
  • In terms of immediate political stability, there is less to the income inequality issue than meets the eye. Most analyses of income inequality neglect two major points. First, the inequality of personal well-being is sharply down over the past hundred years and perhaps over the past twenty years as well. Bill Gates is much, much richer than I am, yet it is not obvious that he is much happier if, indeed, he is happier at all. I have access to penicillin, air travel, good cheap food, the Internet and virtually all of the technical innovations that Gates does. Like the vast majority of Americans, I have access to some important new pharmaceuticals, such as statins to protect against heart disease. To be sure, Gates receives the very best care from the world’s top doctors, but our health outcomes are in the same ballpark. I don’t have a private jet or take luxury vacations, and—I think it is fair to say—my house is much smaller than his. I can’t meet with the world’s elite on demand. Still, by broad historical standards, what I share with Bill Gates is far more significant than what I don’t share with him.
  • when average people read about or see income inequality, they don’t feel the moral outrage that radiates from the more passionate egalitarian quarters of society. Instead, they think their lives are pretty good and that they either earned through hard work or lucked into a healthy share of the American dream.
  • This is why, for example, large numbers of Americans oppose the idea of an estate tax even though the current form of the tax, slated to return in 2011, is very unlikely to affect them or their estates. In narrowly self-interested terms, that view may be irrational, but most Americans are unwilling to frame national issues in terms of rich versus poor. There’s a great deal of hostility toward various government bailouts, but the idea of “undeserving” recipients is the key factor in those feelings. Resentment against Wall Street gamesters hasn’t spilled over much into resentment against the wealthy more generally. The bailout for General Motors’ labor unions wasn’t so popular either—again, obviously not because of any bias against the wealthy but because a basic sense of fairness was violated. As of November 2010, congressional Democrats are of a mixed mind as to whether the Bush tax cuts should expire for those whose annual income exceeds $250,000; that is in large part because their constituents bear no animus toward rich people, only toward undeservedly rich people.
  • envy is usually local. At least in the United States, most economic resentment is not directed toward billionaires or high-roller financiers—not even corrupt ones. It’s directed at the guy down the hall who got a bigger raise. It’s directed at the husband of your wife’s sister, because the brand of beer he stocks costs $3 a case more than yours, and so on. That’s another reason why a lot of people aren’t so bothered by income or wealth inequality at the macro level. Most of us don’t compare ourselves to billionaires. Gore Vidal put it honestly: “Whenever a friend succeeds, a little something in me dies.”
  • Occasionally the cynic in me wonders why so many relatively well-off intellectuals lead the egalitarian charge against the privileges of the wealthy. One group has the status currency of money and the other has the status currency of intellect, so might they be competing for overall social regard? The high status of the wealthy in America, or for that matter the high status of celebrities, seems to bother our intellectual class most. That class composes a very small group, however, so the upshot is that growing income inequality won’t necessarily have major political implications at the macro level.
  • All that said, income inequality does matter—for both politics and the economy.
  • The numbers are clear: Income inequality has been rising in the United States, especially at the very top. The data show a big difference between two quite separate issues, namely income growth at the very top of the distribution and greater inequality throughout the distribution. The first trend is much more pronounced than the second, although the two are often confused.
  • When it comes to the first trend, the share of pre-tax income earned by the richest 1 percent of earners has increased from about 8 percent in 1974 to more than 18 percent in 2007. Furthermore, the richest 0.01 percent (the 15,000 or so richest families) had a share of less than 1 percent in 1974 but more than 6 percent of national income in 2007. As noted, those figures are from pre-tax income, so don’t look to the George W. Bush tax cuts to explain the pattern. Furthermore, these gains have been sustained and have evolved over many years, rather than coming in one or two small bursts between 1974 and today.[1]
  • At the same time, wage growth for the median earner has slowed since 1973. But that slower wage growth has afflicted large numbers of Americans, and it is conceptually distinct from the higher relative share of top income earners. For instance, if you take the 1979–2005 period, the average incomes of the bottom fifth of households increased only 6 percent while the incomes of the middle quintile rose by 21 percent. That’s a widening of the spread of incomes, but it’s not so drastic compared to the explosive gains at the very top.
  • The broader change in income distribution, the one occurring beneath the very top earners, can be deconstructed in a manner that makes nearly all of it look harmless. For instance, there is usually greater inequality of income among both older people and the more highly educated, if only because there is more time and more room for fortunes to vary. Since America is becoming both older and more highly educated, our measured income inequality will increase pretty much by demographic fiat. Economist Thomas Lemieux at the University of British Columbia estimates that these demographic effects explain three-quarters of the observed rise in income inequality for men, and even more for women.[2]
  • Attacking the problem from a different angle, other economists are challenging whether there is much growth in inequality at all below the super-rich. For instance, real incomes are measured using a common price index, yet poorer people are more likely to shop at discount outlets like Wal-Mart, which have seen big price drops over the past twenty years.[3] Once we take this behavior into account, it is unclear whether the real income gaps between the poor and middle class have been widening much at all. Robert J. Gordon, an economist from Northwestern University who is hardly known as a right-wing apologist, wrote in a recent paper that “there was no increase of inequality after 1993 in the bottom 99 percent of the population”, and that whatever overall change there was “can be entirely explained by the behavior of income in the top 1 percent.”[4]
  • And so we come again to the gains of the top earners, clearly the big story told by the data. It’s worth noting that over this same period of time, inequality of work hours increased too. The top earners worked a lot more and most other Americans worked somewhat less. That’s another reason why high earners don’t occasion more resentment: Many people understand how hard they have to work to get there. It also seems that most of the income gains of the top earners were related to performance pay—bonuses, in other words—and not wildly out-of-whack yearly salaries.[5]
  • It is also the case that any society with a lot of “threshold earners” is likely to experience growing income inequality. A threshold earner is someone who seeks to earn a certain amount of money and no more. If wages go up, that person will respond by seeking less work or by working less hard or less often. That person simply wants to “get by” in terms of absolute earning power in order to experience other gains in the form of leisure—whether spending time with friends and family, walking in the woods and so on. Luck aside, that person’s income will never rise much above the threshold.
  • The funny thing is this: For years, many cultural critics in and of the United States have been telling us that Americans should behave more like threshold earners. We should be less harried, more interested in nurturing friendships, and more interested in the non-commercial sphere of life. That may well be good advice. Many studies suggest that above a certain level more money brings only marginal increments of happiness. What isn’t so widely advertised is that those same critics have basically been telling us, without realizing it, that we should be acting in such a manner as to increase measured income inequality. Not only is high inequality an inevitable concomitant of human diversity, but growing income inequality may be, too, if lots of us take the kind of advice that will make us happier.
  • Why is the top 1 percent doing so well?
  • Steven N. Kaplan and Joshua Rauh have recently provided a detailed estimation of particular American incomes.[6] Their data do not comprise the entire U.S. population, but from partial financial records they find a very strong role for the financial sector in driving the trend toward income concentration at the top. For instance, for 2004, nonfinancial executives of publicly traded companies accounted for less than 6 percent of the top 0.01 percent income bracket. In that same year, the top 25 hedge fund managers combined appear to have earned more than all of the CEOs from the entire S&P 500. The number of Wall Street investors earning more than $100 million a year was nine times higher than the public company executives earning that amount. The authors also relate that they shared their estimates with a former U.S. Secretary of the Treasury, one who also has a Wall Street background. He thought their estimates of earnings in the financial sector were, if anything, understated.
  • Many of the other high earners are also connected to finance. After Wall Street, Kaplan and Rauh identify the legal sector as a contributor to the growing spread in earnings at the top. Yet many high-earning lawyers are doing financial deals, so a lot of the income generated through legal activity is rooted in finance. Other lawyers are defending corporations against lawsuits, filing lawsuits or helping corporations deal with complex regulations. The returns to these activities are an artifact of the growing complexity of the law and government growth rather than a tale of markets per se. Finance aside, there isn’t much of a story of market failure here, even if we don’t find the results aesthetically appealing.
  • When it comes to professional athletes and celebrities, there isn’t much of a mystery as to what has happened. Tiger Woods earns much more, even adjusting for inflation, than Arnold Palmer ever did. J.K. Rowling, the first billionaire author, earns much more than did Charles Dickens. These high incomes come, on balance, from the greater reach of modern communications and marketing. Kids all over the world read about Harry Potter. There is more purchasing power to spend on children’s books and, indeed, on culture and celebrities more generally. For high-earning celebrities, hardly anyone finds these earnings so morally objectionable as to suggest that they be politically actionable. Cultural critics can complain that good schoolteachers earn too little, and they may be right, but that does not make celebrities into political targets. They’re too popular. It’s also pretty clear that most of them work hard to earn their money, by persuading fans to buy or otherwise support their product. Most of these individuals do not come from elite or extremely privileged backgrounds, either. They worked their way to the top, and even if Rowling is not an author for the ages, her books tapped into the spirit of their time in a special way. We may or may not wish to tax the wealthy, including wealthy celebrities, at higher rates, but there is no need to “cure” the structural causes of higher celebrity incomes.
  • to be sure, the high incomes in finance should give us all pause.
  • The first factor driving high returns is sometimes called by practitioners “going short on volatility.” Sometimes it is called “negative skewness.” In plain English, this means that some investors opt for a strategy of betting against big, unexpected moves in market prices. Most of the time investors will do well by this strategy, since big, unexpected moves are outliers by definition. Traders will earn above-average returns in good times. In bad times they won’t suffer fully when catastrophic returns come in, as sooner or later is bound to happen, because the downside of these bets is partly socialized onto the Treasury, the Federal Reserve and, of course, the taxpayers and the unemployed.
  • if you bet against unlikely events, most of the time you will look smart and have the money to validate the appearance. Periodically, however, you will look very bad. Does that kind of pattern sound familiar? It happens in finance, too. Betting against a big decline in home prices is analogous to betting against the Wizards. Every now and then such a bet will blow up in your face, though in most years that trading activity will generate above-average profits and big bonuses for the traders and CEOs.
  • To this mix we can add the fact that many money managers are investing other people’s money. If you plan to stay with an investment bank for ten years or less, most of the people playing this investing strategy will make out very well most of the time. Everyone’s time horizon is a bit limited and you will bring in some nice years of extra returns and reap nice bonuses. And let’s say the whole thing does blow up in your face? What’s the worst that can happen? Your bosses fire you, but you will still have millions in the bank and that MBA from Harvard or Wharton. For the people actually investing the money, there’s barely any downside risk other than having to quit the party early. Furthermore, if everyone else made more or less the same mistake (very surprising major events, such as a busted housing market, affect virtually everybody), you’re hardly disgraced. You might even get rehired at another investment bank, or maybe a hedge fund, within months or even weeks.
  • Moreover, smart shareholders will acquiesce to or even encourage these gambles. They gain on the upside, while the downside, past the point of bankruptcy, is borne by the firm’s creditors. And will the bondholders object? Well, they might have a difficult time monitoring the internal trading operations of financial institutions. Of course, the firm’s trading book cannot be open to competitors, and that means it cannot be open to bondholders (or even most shareholders) either. So what, exactly, will they have in hand to object to?
  • Perhaps more important, government bailouts minimize the damage to creditors on the downside. Neither the Treasury nor the Fed allowed creditors to take any losses from the collapse of the major banks during the financial crisis. The U.S. government guaranteed these loans, either explicitly or implicitly. Guaranteeing the debt also encourages equity holders to take more risk. While current bailouts have not in general maintained equity values, and while share prices have often fallen to near zero following the bust of a major bank, the bailouts still give the bank a lifeline. Instead of the bank being destroyed, sometimes those equity prices do climb back out of the hole. This is true of the major surviving banks in the United States, and even AIG is paying back its bailout. For better or worse, we’re handing out free options on recovery, and that encourages banks to take more risk in the first place.
  • there is an unholy dynamic of short-term trading and investing, backed up by bailouts and risk reduction from the government and the Federal Reserve. This is not good. “Going short on volatility” is a dangerous strategy from a social point of view. For one thing, in so-called normal times, the finance sector attracts a big chunk of the smartest, most hard-working and most talented individuals. That represents a huge human capital opportunity cost to society and the economy at large. But more immediate and more important, it means that banks take far too many risks and go way out on a limb, often in correlated fashion. When their bets turn sour, as they did in 2007–09, everyone else pays the price.
  • And it’s not just the taxpayer cost of the bailout that stings. The financial disruption ends up throwing a lot of people out of work down the economic food chain, often for long periods. Furthermore, the Federal Reserve System has recapitalized major U.S. banks by paying interest on bank reserves and by keeping an unusually high interest rate spread, which allows banks to borrow short from Treasury at near-zero rates and invest in other higher-yielding assets and earn back lots of money rather quickly. In essence, we’re allowing banks to earn their way back by arbitraging interest rate spreads against the U.S. government. This is rarely called a bailout and it doesn’t count as a normal budget item, but it is a bailout nonetheless. This type of implicit bailout brings high social costs by slowing down economic recovery (the interest rate spreads require tight monetary policy) and by redistributing income from the Treasury to the major banks.
  • the “going short on volatility” strategy increases income inequality. In normal years the financial sector is flush with cash and high earnings. In implosion years a lot of the losses are borne by other sectors of society. In other words, financial crisis begets income inequality. Despite being conceptually distinct phenomena, the political economy of income inequality is, in part, the political economy of finance. Simon Johnson tabulates the numbers nicely: From 1973 to 1985, the financial sector never earned more than 16 percent of domestic corporate profits. In 1986, that figure reached 19 percent. In the 1990s, it oscillated between 21 percent and 30 percent, higher than it had ever been in the postwar period. This decade, it reached 41 percent. Pay rose just as dramatically. From 1948 to 1982, average compensation in the financial sector ranged between 99 percent and 108 percent of the average for all domestic private industries. From 1983, it shot upward, reaching 181 percent in 2007.[7]
  • There’s a second reason why the financial sector abets income inequality: the “moving first” issue. Let’s say that some news hits the market and that traders interpret this news at different speeds. One trader figures out what the news means in a second, while the other traders require five seconds. Still other traders require an entire day or maybe even a month to figure things out. The early traders earn the extra money. They buy the proper assets early, at the lower prices, and reap most of the gains when the other, later traders pile on. Similarly, if you buy into a successful tech company in the early stages, you are “moving first” in a very effective manner, and you will capture most of the gains if that company hits it big.
  • The moving-first phenomenon sums to a “winner-take-all” market. Only some relatively small number of traders, sometimes just one trader, can be first. Those who are first will make far more than those who are fourth or fifth. This difference will persist, even if those who are fourth come pretty close to competing with those who are first. In this context, first is first and it doesn’t matter much whether those who come in fourth pile on a month, a minute or a fraction of a second later. Those who bought (or sold, as the case may be) first have captured and locked in most of the available gains. Since gains are concentrated among the early winners, and the closeness of the runner-ups doesn’t so much matter for income distribution, asset-market trading thus encourages the ongoing concentration of wealth. Many investors make lots of mistakes and lose their money, but each year brings a new bunch of projects that can turn the early investors and traders into very wealthy individuals.
  • These two features of the problem—“going short on volatility” and “getting there first”—are related. Let’s say that Goldman Sachs regularly secures a lot of the best and quickest trades, whether because of its quality analysis, inside connections or high-frequency trading apparatus (it has all three). It builds up a treasure chest of profits and continues to hire very sharp traders and to receive valuable information. Those profits allow it to make “short on volatility” bets faster than anyone else, because if it messes up, it still has a large enough buffer to pad losses. This increases the odds that Goldman will repeatedly pull in spectacular profits.
  • Still, every now and then Goldman will go bust, or would go bust if not for government bailouts. But the odds are in any given year that it won’t because of the advantages it and other big banks have. It’s as if the major banks have tapped a hole in the social till and they are drinking from it with a straw. In any given year, this practice may seem tolerable—didn’t the bank earn the money fair and square by a series of fairly normal looking trades? Yet over time this situation will corrode productivity, because what the banks do bears almost no resemblance to a process of getting capital into the hands of those who can make most efficient use of it. And it leads to periodic financial explosions. That, in short, is the real problem of income inequality we face today. It’s what causes the inequality at the very top of the earning pyramid that has dangerous implications for the economy as a whole.
  • What about controlling bank risk-taking directly with tight government oversight? That is not practical. There are more ways for banks to take risks than even knowledgeable regulators can possibly control; it just isn’t that easy to oversee a balance sheet with hundreds of billions of dollars on it, especially when short-term positions are wound down before quarterly inspections. It’s also not clear how well regulators can identify risky assets. Some of the worst excesses of the financial crisis were grounded in mortgage-backed assets—a very traditional function of banks—not exotic derivatives trading strategies. Virtually any asset position can be used to bet long odds, one way or another. It is naive to think that underpaid, undertrained regulators can keep up with financial traders, especially when the latter stand to earn billions by circumventing the intent of regulations while remaining within the letter of the law.
  • For the time being, we need to accept the possibility that the financial sector has learned how to game the American (and UK-based) system of state capitalism. It’s no longer obvious that the system is stable at a macro level, and extreme income inequality at the top has been one result of that imbalance. Income inequality is a symptom, however, rather than a cause of the real problem. The root cause of income inequality, viewed in the most general terms, is extreme human ingenuity, albeit of a perverse kind. That is why it is so hard to control.
  • Another root cause of growing inequality is that the modern world, by so limiting our downside risk, makes extreme risk-taking all too comfortable and easy. More risk-taking will mean more inequality, sooner or later, because winners always emerge from risk-taking. Yet bankers who take bad risks (provided those risks are legal) simply do not end up with bad outcomes in any absolute sense. They still have millions in the bank, lots of human capital and plenty of social status. We’re not going to bring back torture, trial by ordeal or debtors’ prisons, nor should we. Yet the threat of impoverishment and disgrace no longer looms the way it once did, so we no longer can constrain excess financial risk-taking. It’s too soft and cushy a world.
  • Why don’t we simply eliminate the safety net for clueless or unlucky risk-takers so that losses equal gains overall? That’s a good idea in principle, but it is hard to put into practice. Once a financial crisis arrives, politicians will seek to limit the damage, and that means they will bail out major financial institutions. Had we not passed TARP and related policies, the United States probably would have faced unemployment rates of 25 percent or higher, as in the Great Depression. The political consequences would not have been pretty. Bank bailouts may sound quite interventionist, and indeed they are, but in relative terms they probably were the most libertarian policy we had on tap. It meant big one-time expenses, but, for the most part, it kept government out of the real economy (the General Motors bailout aside).
  • We probably don’t have any solution to the hazards created by our financial sector, not because plutocrats are preventing our political system from adopting appropriate remedies, but because we don’t know what those remedies are. Yet neither is another crisis immediately upon us. The underlying dynamic favors excess risk-taking, but banks at the current moment fear the scrutiny of regulators and the public and so are playing it fairly safe. They are sitting on money rather than lending it out. The biggest risk today is how few parties will take risks, and, in part, the caution of banks is driving our current protracted economic slowdown. According to this view, the long run will bring another financial crisis once moods pick up and external scrutiny weakens, but that day of reckoning is still some ways off.
  • Is the overall picture a shame? Yes. Is it distorting resource distribution and productivity in the meantime? Yes. Will it again bring our economy to its knees? Probably. Maybe that’s simply the price of modern society. Income inequality will likely continue to rise and we will search in vain for the appropriate political remedies for our underlying problems.
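The payoff asymmetry behind “going short on volatility” can be put in stylized numbers. Assume, purely for illustration, a strategy that collects a small premium in 90 percent of years and takes a catastrophic loss in the other 10 percent, with most of the crash loss socialized onto creditors, taxpayers and the bailout apparatus so the trader bears only a tenth of it:

```python
P_CALM, PREMIUM = 0.9, 10.0    # most years: steady above-average returns
P_CRASH, LOSS   = 0.1, -200.0  # rare years: the blow-up
TRADER_SHARE    = 0.1          # fraction of the crash loss the trader bears;
                               # the rest is socialized (illustrative number)

# Expected annual payoff to society vs. to the individual trader.
social_ev = P_CALM * PREMIUM + P_CRASH * LOSS                 # -11.0
trader_ev = P_CALM * PREMIUM + P_CRASH * LOSS * TRADER_SHARE  #  +7.0

# Chance the strategy simply never blows up over a ten-year horizon,
# so the trader "looks smart" the entire time.
looks_smart_decade = P_CALM ** 10

print(social_ev, trader_ev, round(looks_smart_decade, 2))
```

Society’s expected return is negative while the trader’s is positive, and roughly a third of such traders will sail through a full decade without a single bad year. Every number here is an assumption chosen to make the asymmetry visible, not an estimate of actual bank exposures.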
Weiye Loh

Let's make science metrics more scientific : Article : Nature - 0 views

  • Measuring and assessing academic performance is now a fact of scientific life.
  • Yet current systems of measurement are inadequate. Widely used metrics, from the newly fashionable Hirsch index to the 50-year-old citation index, are of limited use.[1]
  • Existing metrics do not capture the full range of activities that support and transmit scientific ideas, which can be as varied as mentoring, blogging or creating industrial prototypes.
  • ...15 more annotations...
  • narrow or biased measures of scientific achievement can lead to narrow and biased science.
  • Global demand for, and interest in, metrics should galvanize stakeholders — national funding agencies, scientific research organizations and publishing houses — to combine forces. They can set an agenda and foster research that establishes sound scientific metrics: grounded in theory, built with high-quality data and developed by a community with strong incentives to use them.
  • Scientists are often reticent to see themselves or their institutions labelled, categorized or ranked. Although happy to tag specimens as one species or another, many researchers do not like to see themselves as specimens under a microscope — they feel that their work is too complex to be evaluated in such simplistic terms. Some argue that science is unpredictable, and that any metric used to prioritize research money risks missing out on an important discovery from left field.
    • Weiye Loh
       
      It is ironic that while scientists feel that their work is too complex to be evaluated in simplistic terms or metrics, they nevertheless feel it is fine to evaluate the world in simplistic terms.
  • It is true that good metrics are difficult to develop, but this is not a reason to abandon them. Rather it should be a spur to basing their development in sound science. If we do not press harder for better metrics, we risk making poor funding decisions or sidelining good scientists.
  • Metrics are data driven, so developing a reliable, joined-up infrastructure is a necessary first step.
  • We need a concerted international effort to combine, augment and institutionalize these databases within a cohesive infrastructure.
  • On an international level, the issue of a unique researcher identification system is one that needs urgent attention. There are various efforts under way in the open-source and publishing communities to create unique researcher identifiers using the same principles as the Digital Object Identifier (DOI) protocol, which has become the international standard for identifying unique documents. The ORCID (Open Researcher and Contributor ID) project, for example, was launched in December 2009 by parties including Thomson Reuters and Nature Publishing Group. The engagement of international funding agencies would help to push this movement towards an international standard.
  • if all funding agencies used a universal template for reporting scientific achievements, it could improve data quality and reduce the burden on investigators.
    • Weiye Loh
       
      So in future, we'll only have one robust metric to evaluate scientific contribution? hmm...
  • Importantly, data collected for use in metrics must be open to the scientific community, so that metric calculations can be reproduced. This also allows the data to be efficiently repurposed.
  • As well as building an open and consistent data infrastructure, there is the added challenge of deciding what data to collect and how to use them. This is not trivial. Knowledge creation is a complex process, so perhaps alternative measures of creativity and productivity should be included in scientific metrics, such as the filing of patents, the creation of prototypes [4] and even the production of YouTube videos.
  • Perhaps publications in these different media should be weighted differently in different fields.
  • There needs to be a greater focus on what these data mean, and how they can be best interpreted.
  • This requires the input of social scientists, rather than just those more traditionally involved in data capture, such as computer scientists.
  • An international data platform supported by funding agencies could include a virtual 'collaboratory', in which ideas and potential solutions can be posited and discussed. This would bring social scientists together with working natural scientists to develop metrics and test their validity through wikis, blogs and discussion groups, thus building a community of practice. Such a discussion should be open to all ideas and theories and not restricted to traditional bibliometric approaches.
  • Far-sighted action can ensure that metrics goes beyond identifying 'star' researchers, nations or ideas, to capturing the essence of what it means to be a good scientist.
  • Let's make science metrics more scientific, by Julia Lane. Abstract: To capture the essence of good science, stakeholders must combine forces to create an open, sound and consistent system for measuring all the activities that make up academic productivity.
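The article's point about the narrowness of current metrics is easy to see in the Hirsch index it mentions: a researcher's entire record is collapsed into a single number, the largest h such that h of their papers each have at least h citations. A minimal sketch, using hypothetical citation counts:

```python
def h_index(citations):
    """Hirsch index: the largest h such that h papers have >= h citations each."""
    # Rank papers from most to least cited, then find the cutoff rank
    # where the citation count drops below the rank.
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Hypothetical citation counts for one researcher's papers.
print(h_index([10, 8, 5, 4, 3, 1]))  # → 4
```

Note how much the single number discards: the researcher above gets the same h whether their top paper has 10 citations or 10,000, which is exactly the kind of information loss that motivates the article's call for richer, multi-dimensional metrics.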
Weiye Loh

Roger Pielke Jr.'s Blog: Clean Tech Innovation and the "Iron Law" - 0 views

  • Cleantech companies just can’t seem to get it right. At least, that’s the notion Peter Thiel — a co-founder of PayPal and president of Clarium Capital — subscribes to when he looks at cleantech companies as potential investing opportunities. He made the comments at a Commonwealth Club event in San Francisco Wednesday.
  • That’s not because he doesn’t believe in the technology — he just doesn’t like the way the companies are run, he said. “Most of the people who run cleantech companies are sales people, not engineers,” Thiel said. “Something seems to have gone quite wrong with cleantech.”
  • most cleantech companies that try to develop alternative energy forms are building power sources that are more expensive. Solar panels, for example, are still not a cost-efficient way to generate power because companies have made the assumption that people will pay more for more environmentally friendly ways of producing energy, Thiel said. “We need something cheaper, not more expensive,” he said. “It doesn’t matter if the energy is cleaner, it doesn’t work if it’s more expensive.”