New Media Ethics 2009 course: Group items tagged Innovation

Weiye Loh

What If The Very Theory That Underlies Why We Need Patents Is Wrong? | Techdirt - 0 views

  • Scott Walker points us to a fascinating paper by Carliss Y. Baldwin and Eric von Hippel, suggesting that some of the most basic theories on which the patent system is based are wrong, and because of that, the patent system might hinder innovation.
  • There have been numerous other research papers and case studies suggesting that the patent system quite frequently hinders innovation, but this one approaches it from a different angle than ones we've seen before, and is actually quite convincing. It looks at the putative theory that innovation comes from a direct profit motive of a single corporation looking to sell the good in the market, and for that to work, the company needs to take the initial invention and get temporary monopoly protection to keep out competitors in order to recoup the cost of research and development.
  • the paper goes through a whole bunch of studies suggesting that quite frequently innovation happens through a very different process: either individuals or companies directly trying to solve a problem they themselves have (i.e., the initial motive is not to profit directly from sales, but to help them in something they were doing) or through a much more collaborative process, whereby multiple parties all contribute to the process of innovation, somewhat openly, recognizing that as each contributes some, everyone benefits. As the report notes: This result hinges on the fact that the innovative design itself is a non-rival good: each participant in a collaborative effort gets the value of the whole design, but incurs only a fraction of the design cost. (A worked version of this cost-sharing logic appears at the end of this entry.)
  • patents are designed to make that sort of thing more difficult, because the patent system assumes that the initial act of invention is the key point, rather than all the incremental innovations built on top of it that all parties can benefit from.
  • the report points to numerous studies showing that, when given the chance, many companies freely share their ideas with others, recognizing the direct benefit they get.
  • Even more importantly, the paper finds that due to technological advances and the ability to more rapidly and easily communicate and collaborate widely, these forms of innovation (innovation for direct use as well as collaborative innovation) are becoming more and more viable across a variety of industries, which in the past may have relied more on the old way of innovating (a single company innovating for the profit of selling that product).
  • because of the ease of communication and collaboration these days, there's tremendous incentive for those companies that innovate for their own use to collaborate with others, since improvements made by others help improve their own uses as well. Thus, the overall incentives are to move much more to a collaborative form of innovation in the market. That has huge implications for a patent system designed to help the "old model" of innovation (producer inventing for the market) and not the increasingly regular one (collaborative innovation for usage).
  • no one is saying that producer-based innovation (company inventing to sell on the market) doesn't occur or won't continue to occur. But it is an open policy question as to whether or not our innovation policies should favor that model over other models -- when evidence suggests that a significant amount of innovation occurs in these other ways -- and that amount is growing rapidly.
  • What If The Very Theory That Underlies Why We Need Patents Is Wrong? (from the collaborative-innovation-at-work dept)
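
The cost-sharing logic in the quoted non-rival-good sentence can be written out explicitly. Here is a minimal formulation; the notation is ours for illustration, not Baldwin and von Hippel's:

```latex
% Hypothetical formalization of the quoted claim; notation is ours, not the
% paper's. A design is a non-rival good: each of n collaborators receives the
% full design value V, while the total design cost C is shared, so each pays
% roughly C/n.
\[
  \text{net benefit per participant} \;=\; V - \frac{C}{n}
\]
% Collaboration pays whenever V > C/n, so a design that no single firm would
% fund alone (V < C) can still be worth building jointly once more than C/V
% participants share the cost.
```
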
Weiye Loh

Kevin Kelly and Steven Johnson on Where Ideas Come From | Magazine - 0 views

  • Say the word “inventor” and most people think of a solitary genius toiling in a basement. But two ambitious new books on the history of innovation—by Steven Johnson and Kevin Kelly, both longtime Wired contributors—argue that great discoveries typically spring not from individual minds but from the hive mind. In Where Good Ideas Come From: The Natural History of Innovation, Johnson draws on seven centuries of scientific and technological progress, from Gutenberg to GPS, to show what sorts of environments nurture ingenuity. He finds that great creative milieus, whether MIT or Los Alamos, New York City or the World Wide Web, are like coral reefs—teeming, diverse colonies of creators who interact with and influence one another.
  • Seven centuries are an eyeblink in the scope of Kelly’s book, What Technology Wants, which looks back over some 50,000 years of history and peers nearly that far into the future. His argument is similarly sweeping: Technology, Kelly believes, can be seen as a sort of autonomous life-form, with intrinsic goals toward which it gropes over the course of its long development. Those goals, he says, are much like the tendencies of biological life, which over time diversifies, specializes, and (eventually) becomes more sentient.
  • We share a fascination with the long history of simultaneous invention: cases where several people come up with the same idea at almost exactly the same time. Calculus, the electrical battery, the telephone, the steam engine, the radio—all these groundbreaking innovations were hit upon by multiple inventors working in parallel with no knowledge of one another.
  • It’s amazing that the myth of the lone genius has persisted for so long, since simultaneous invention has always been the norm, not the exception. Anthropologists have shown that the same inventions tended to crop up in prehistory at roughly similar times, in roughly the same order, among cultures on different continents that couldn’t possibly have contacted one another.
  • Also, there’s a related myth—that innovation comes primarily from the profit motive, from the competitive pressures of a market society. If you look at history, innovation doesn’t come just from giving people incentives; it comes from creating environments where their ideas can connect.
  • The musician Brian Eno invented a wonderful word to describe this phenomenon: scenius. We normally think of innovators as independent geniuses, but Eno’s point is that innovation comes from social scenes, from passionate and connected groups of people.
  • It turns out that the lone genius entrepreneur has always been a rarity—there’s far more innovation coming out of open, nonmarket networks than we tend to assume.
  • Really, we should think of ideas as connections, in our brains and among people. Ideas aren’t self-contained things; they’re more like ecologies and networks. They travel in clusters.
  • ideas are networks
  • In part, that’s because ideas that leap too far ahead are almost never implemented—they aren’t even valuable. People can absorb only one advance, one small hop, at a time. Gregor Mendel’s ideas about genetics, for example: He formulated them in 1865, but they were ignored for 35 years because they were too advanced. Nobody could incorporate them. Then, when the collective mind was ready and his idea was only one hop away, three different scientists independently rediscovered his work within roughly a year of one another.
  • Charles Babbage is another great case study. His “analytical engine,” which he started designing in the 1830s, was an incredibly detailed vision of what would become the modern computer, with a CPU, RAM, and so on. But it couldn’t possibly have been built at the time, and his ideas had to be rediscovered a hundred years later.
  • I think there are a lot of ideas today that are ahead of their time. Human cloning, autopilot cars, patent-free law—all are close technically but too many steps ahead culturally. Innovating is about more than just having the idea yourself; you also have to bring everyone else to where your idea is. And that becomes really difficult if you’re too many steps ahead.
  • The scientist Stuart Kauffman calls this the “adjacent possible.” At any given moment in evolution—of life, of natural systems, or of cultural systems—there’s a space of possibility that surrounds any current configuration of things. Change happens when you take that configuration and arrange it in a new way. But there are limits to how much you can change in a single move.
  • Which is why the great inventions are usually those that take the smallest possible step to unleash the most change. That was the difference between Tim Berners-Lee’s successful HTML code and Ted Nelson’s abortive Xanadu project. Both tried to jump into the same general space—a networked hypertext—but Tim’s approach did it with a dumb half-step, while Ted’s earlier, more elegant design required that everyone take five steps all at once.
  • Also, the steps have to be taken in the right order. You can’t invent the Internet and then the digital computer. This is true of life as well. The building blocks of DNA had to be in place before evolution could build more complex things. One of the key ideas I’ve gotten from you, by the way—when I read your book Out of Control in grad school—is this continuity between biological and technological systems.
  • technology is something that can give meaning to our lives, particularly in a secular world.
  • He had this bleak, soul-sucking vision of technology as an autonomous force for evil. You also present technology as a sort of autonomous force—as wanting something, over the long course of its evolution—but it’s a more balanced and ultimately positive vision, which I find much more appealing than the alternative.
  • As I started thinking about the history of technology, there did seem to be a sense in which, during any given period, lots of innovations were in the air, as it were. They came simultaneously. It appeared as if they wanted to happen. I should hasten to add that it’s not a conscious agency; it’s a lower form, something like the way an organism or bacterium can be said to have certain tendencies, certain trends, certain urges. But it’s an agency nevertheless.
  • technology wants increasing diversity—which is what I think also happens in biological systems, as the adjacent possible becomes larger with each innovation. As tech critics, I think we have to keep this in mind, because when you expand the diversity of a system, that leads to an increase in great things and an increase in crap.
  • the idea that the most creative environments allow for repeated failure.
  • And for wastes of time and resources. If you knew nothing about the Internet and were trying to figure it out from the data, you would reasonably conclude that it was designed for the transmission of spam and porn. And yet at the same time, there’s more amazing stuff available to us than ever before, thanks to the Internet.
  • To create something great, you need the means to make a lot of really bad crap. Another example is spectrum. One reason we have this great explosion of innovation in wireless right now is that the US deregulated spectrum. Before that, spectrum was something too precious to be wasted on silliness. But when you deregulate—and say, OK, now waste it—then you get Wi-Fi.
  • If we didn’t have genetic mutations, we wouldn’t have us. You need error to open the door to the adjacent possible.
  • image of the coral reef as a metaphor for where innovation comes from. So what, today, are some of the most reeflike places in the technological realm?
  • Twitter—not to see what people are having for breakfast, of course, but to see what people are talking about, the links to articles and posts that they’re passing along.
  • second example of an information coral reef, and maybe the less predictable one, is the university system. As much as we sometimes roll our eyes at the ivory-tower isolation of universities, they continue to serve as remarkable engines of innovation.
  • Life seems to gravitate toward these complex states where there’s just enough disorder to create new things. There’s a rate of mutation just high enough to let interesting new innovations happen, but not so many mutations that every new generation dies off immediately.
  • technology is an extension of life. Both life and technology are faces of the same larger system.
  • Kevin Kelly and Steven Johnson on Where Ideas Come From, Wired, September 27, 2010 (Wired October 2010 issue)
Weiye Loh

Apples and PCs: Who innovates more, Apple or HP? | The Economist - 1 views

  • In terms of processing power, speed, memory, and so on, how do Macs and PCs actually compare? And does Apple innovate in terms of basic hardware quality as often as or less often than the likes of HP, Compaq, and other producers? This question is of broader interest from an economist's point of view because it also has to do with the age-old question of whether competition or monopoly is a better spur to innovation. In a certain sense, Apple is a monopolist, and PC makers are in a more competitive market. (I say in a certain sense because obviously Macs and PCs are substitutes; it's just that they're more imperfect substitutes than two PCs are for each other, in part because of software migration issues.)
  • Schumpeter argued long ago that because a monopolist reaps the full reward from innovation, such firms would be more innovative. The case for patents relies in part on a version of this argument: companies are given monopoly rights over a new product for a period of time in order for them to be able to recoup the costs of innovation; without such protection, it is argued, they would not find it beneficial to innovate in the first place.
  • others have argued that competition spurs innovation by giving firms a way to differentiate themselves from their competitors (in a way, creating something new gives a company a temporary, albeit brief, "monopoly")
Weiye Loh

Roger Pielke Jr.'s Blog: Core Questions in the Governance of Innovation - 0 views

  • Today's NYT has a couple of interesting articles about technological innovations that we may not want, and that we may wish to regulate in some manner, formally or informally. These technologies suggest some core questions that lie at the heart of the management of innovation.
  • The first article discusses Google's Goggles, an application that allows people to search the internet based on an image taken by a smartphone. Google has decided not to allow this technology to include face recognition in its software, even though people have requested it.
  • Google could have put face recognition into the Goggles application; indeed, many users have asked for it. But Google decided against it because smartphones can be used to take pictures of individuals without their knowledge, and a face match could retrieve all kinds of personal information — name, occupation, address, workplace.
  • “It was just too sensitive, and we didn’t want to go there,” said Eric E. Schmidt, the chief executive of Google. “You want to avoid enabling stalker behavior.”
  • The second article focuses on innovations in high frequency trading in financial markets, which bears some responsibility for the so-called "flash crash" of May 6th last year, in which the DJIA plunged more than 700 points in just minutes.
  • One debate has focused on whether some traders are firing off fake orders thousands of times a second to slow down exchanges and mislead others. Michael Durbin, who helped build high-frequency trading systems for companies like Citadel and is the author of the book “All About High-Frequency Trading,” says that most of the industry is legitimate and benefits investors. But, he says, the rules need to be strengthened to curb some disturbing practices.
  • This situation raises what I see to be core questions in the governance of innovation: to what degree can innovation be shaped to achieve intended purposes? And to what degree can the consequences of innovation be anticipated?
Weiye Loh

Does patent/ copyright stifle or promote innovation? - 6 views

From a Critical Ethic perspective, who do patents and copyrights protect? What kind of ideologies underlie such a policy? I would argue that it is the capitalist ideologies, individualist ideolo...

MS Word patent copyright

Weiye Loh

Roger Pielke Jr.'s Blog: Innovation in Drug Development: An Inverse Moore's Law? - 0 views

  • Today's FT has this interesting graph and an accompanying story, showing a sort of inverse Moore's Law of drug development. Over almost 60 years the number of new drugs developed per unit of investment has declined in a fairly constant manner, and some drug companies are now slashing their R&D budgets. (An illustrative sketch of this decline appears at the end of this entry.)
  • why this trend has occurred.  The FT points to a combination of low-hanging fruit that has been plucked and increasing costs of drug development. To some observers, that reflects the end of the mid to late 20th century golden era for drug discovery, when first-generation medicines such as antibiotics and beta-blockers to treat high blood pressure transformed healthcare. At the same time, regulatory demands to prove safety and efficacy have grown firmer. The result is larger and more costly clinical trials, and high failure rates for experimental drugs.
  • Others point to flawed innovation policies in industry and governments: “The markets treat drug companies as though research and development spending destroys value,” says Jack Scannell, an analyst at Bernstein Research. “People have stopped distinguishing the good from the bad. All those which performed well returned cash to shareholders. Unless the industry can articulate what the problem is, I don’t expect that to change.”
  • Mr [Andrew] Baum [of Morgan Stanley] argues that the solution for drug companies is to share the risks of research with others. That means reducing in-house investment in research, and instead partnering and licensing experimental medicines from smaller companies after some of the early failures have been eliminated.
  • Chas Bountra of Oxford university calls for a more radical partnership combining industry and academic research. “What we are trying to do is just too difficult,” he says. “No one organisation can do it, so we have to pool resources and expertise.” He suggests removing intellectual property rights until a drug is in mid-stage testing in humans, which would make academics more willing to co-operate because they could publish their results freely. The sharing of data would enable companies to avoid duplicating work.
  • The challenge is for academia and biotech companies to fill the research gap. Mr Ratcliffe argues that after a lull in 2009 and 2010, private capital is returning to the sector – as demonstrated by a particular buzz at JPMorgan’s new year biotech conference in California.
  • Patrick Vallance, senior vice-president for discovery at GSK, is cautious about deferring patents until so late, arguing that drug companies need to be able to protect their intellectual property in order to fund expensive late-stage development. But he too is experimenting with ways to co-operate more closely with academics over longer periods. He is also championing the “externalisation” of the company’s pipeline, with biotech and university partners accounting for half the total. GSK has earmarked £50m to support fledgling British companies, many “wrapped around” the group’s sites. One such example is Convergence, a spin-out from a GSK lab researching pain relief.
  • Big pharmaceutical companies are scrambling to find ways to overcome the loss of tens of billions of dollars in revenue as patents on top-selling drugs run out. Many sound similar notes about encouraging entrepreneurialism in their ranks, making smart deals and capitalizing on emerging-market growth. But their actual plans are often quite different—and each carries significant risks. Novartis AG, for instance, is so convinced that diversification is the best course that the company has a considerable business selling low-priced generics. Meantime, Bristol-Myers Squibb Co. has decided to concentrate on innovative medicines, shedding so many nonpharmaceutical units that it has become midsize. GlaxoSmithKline PLC is still investing in research, but like Pfizer it has narrowed the range of disease areas in which it's seeking new treatments. Underlying the divergence is a deep-seated philosophical dispute over the merits of the heavy investment that companies must make to discover new drugs. By most estimates, bringing a new molecule to market costs drug makers more than $1 billion. Industry officials have been engaged in a vigorous debate over whether the investment is worth it, or whether they should leave it to others whose work they can acquire or license after a demonstration of strong potential.
  • To what extent can approaches to innovation influence the trend line in the graph above? I don't think that anyone really knows the answer. The different approaches being taken by Merck and Pfizer, for instance, represent a real world policy experiment: The contrast between Merck and Pfizer reflects the very different personal approaches of their CEOs. An accountant by training, Mr. Read has held various business positions during a three-decade career at Pfizer. The 57-year-old cited torcetrapib, a cholesterol medicine that the company spent more than $800 million developing but then pulled due to safety concerns, as an example of the kind of wasteful spending Pfizer would avoid. "We're going to have metrics," Mr. Read said. He wants Pfizer to stop "always investing on hope rather than strong signals and the quality of the science, the quality of the medicine." Mr. Frazier, 56, a Harvard-educated lawyer who joined Merck in 1994 from private practice, said the company was sticking by its own troubled heart drug, vorapaxar. Mr. Frazier said he wanted to see all of the data from the trials before rushing to judgment. "We believe in the innovation approach," he said.
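
As a rough illustration of the "fairly constant" decline described in this entry, here is a minimal sketch with assumed numbers. Jack Scannell, quoted above, and colleagues later dubbed the trend "Eroom's Law", estimating that new drugs per billion dollars of inflation-adjusted R&D spending have halved roughly every nine years; the 1950 baseline below is hypothetical.

```python
# Illustrative sketch of an exponential decline in R&D productivity.
# Assumptions: halving every 9 years (the Eroom's Law estimate) and a
# hypothetical 1950 baseline of 30 new drugs per $1bn of R&D.

HALVING_PERIOD_YEARS = 9
DRUGS_PER_BILLION_1950 = 30.0

def drugs_per_billion(year: int) -> float:
    """Expected new drugs per $1bn of R&D in a given year under the assumed trend."""
    elapsed = year - 1950
    return DRUGS_PER_BILLION_1950 * 0.5 ** (elapsed / HALVING_PERIOD_YEARS)

for year in range(1950, 2011, 10):
    print(f"{year}: {drugs_per_billion(year):6.2f} new drugs per $1bn R&D")
# Over 60 years the assumed trend compounds to roughly a 100-fold decline,
# which is the shape of the FT graph the post describes.
```
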
Weiye Loh

Let There Be More Efficient Light - NYTimes.com - 0 views

  • LAST week Michele Bachmann, a Republican representative from Minnesota, introduced a bill to roll back efficiency standards for light bulbs, which include a phasing out of incandescent bulbs in favor of more energy-efficient bulbs. The “government has no business telling an individual what kind of light bulb to buy,” she declared.
  • But this opposition ignores another, more important bit of American history: the critical role that government-mandated standards have played in scientific and industrial innovation.
  • inventions alone weren’t enough to guarantee progress. Indeed, at the time the lack of standards for everything from weights and measures to electricity — even the gallon, for example, had eight definitions — threatened to overwhelm industry and consumers with a confusing array of incompatible choices.
  • This wasn’t the case everywhere. Germany’s standards agency, established in 1887, was busy setting rules for everything from the content of dyes to the process for making porcelain; other European countries soon followed suit. Higher-quality products, in turn, helped the growth in Germany’s trade exceed that of the United States in the 1890s. America finally got its act together in 1894, when Congress standardized the meaning of what are today common scientific measures, including the ohm, the volt, the watt and the henry, in line with international metrics. And, in 1901, the United States became the last major economic power to establish an agency to set technological standards. The result was a boom in product innovation in all aspects of life during the 20th century. Today we can go to our hardware store and choose from hundreds of light bulbs that all conform to government-mandated quality and performance standards.
  • Technological standards not only promote innovation — they also can help protect one country’s industries from falling behind those of other countries. Today China, India and other rapidly growing nations are adopting standards that speed the deployment of new technologies. Without similar requirements to manufacture more technologically advanced products, American companies risk seeing the overseas markets for their products shrink while innovative goods from other countries flood the domestic market. To prevent that from happening, America needs not only to continue developing standards, but also to devise a strategy to apply them consistently and quickly.
  • The best approach would be to borrow from Japan, whose Top Runner program sets energy-efficiency standards by identifying technological leaders in a particular industry — say, washing machines — and mandating that the rest of the industry keep up. As technologies improve, the standards change as well, enabling a virtuous cycle of improvement. At the same time, the government should work with businesses to devise multidimensional standards, so that consumers don’t balk at products because they sacrifice, say, brightness and cost for energy efficiency.
  • This is not to say that innovation doesn’t bring disruption, and American policymakers can’t ignore the jobs that are lost when government standards sweep older technologies into the dustbin of history. An effective way forward on light bulbs, then, would be to apply standards only to those manufacturers that produce or import in large volume. Meanwhile, smaller, legacy light-bulb producers could remain, cushioning the blow to workers and meeting consumer demand.
  • Technologies and the standards that guide their deployment have revolutionized American society. They’ve been so successful, in fact, that the role of government has become invisible — so much so that even members of Congress should be excused for believing the government has no business mandating your choice of light bulbs.
Weiye Loh

Some Scientists Fear Computer Chips Will Soon Hit a Wall - NYTimes.com - 0 views

  • The problem has the potential to counteract an important principle in computing that has held true for decades: Moore’s Law. It was Gordon Moore, a founder of Intel, who first predicted that the number of transistors that could be nestled comfortably and inexpensively on an integrated circuit chip would double roughly every two years, bringing exponential improvements in consumer electronics.
  • In their paper, Dr. Burger and fellow researchers simulated the electricity used by more than 150 popular microprocessors and estimated that by 2024 computing speed would increase only 7.9 times, on average. By contrast, if there were no limits on the capabilities of the transistors, the maximum potential speedup would be nearly 47 times, the researchers said. (A back-of-the-envelope reading of these two figures appears at the end of this entry.)
  • Some scientists disagree, if only because new ideas and designs have repeatedly come along to preserve the computer industry’s rapid pace of improvement. Dr. Dally of Nvidia, for instance, is sanguine about the future of chip design. “The good news is that the old designs are really inefficient, leaving lots of room for innovation,” he said.
  • Shekhar Y. Borkar, a fellow at Intel Labs, called Dr. Burger’s analysis “right on the dot,” but added: “His conclusions are a little different than what my conclusions would have been. The future is not as golden as it used to be, but it’s not bleak either.” Dr. Borkar cited a variety of new design ideas that he said would help ease the limits identified in the paper. Intel recently developed a way to vary the power consumed by different parts of a processor, making it possible to have both slower, lower-power transistors as well as faster-switching ones that consume more power. Increasingly, today’s processor chips contain two or more cores, or central processing units, that make it possible to use multiple programs simultaneously. In the future, Intel computers will have different kinds of cores optimized for different kinds of problems, only some of which require high power.
  • And while Intel announced in May that it had found a way to use 3-D design to crowd more transistors onto a single chip, that technology does not solve the energy problem described in the paper about dark silicon. The authors of the paper said they had tried to account for some of the promised innovation, and they argued that the question was how far innovators could go in overcoming the power limits.
  • “It’s one of those ‘If we don’t innovate, we’re all going to die’ papers,” Dr. Patterson said in an e-mail. “I’m pretty sure it means we need to innovate, since we don’t want to die!”
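
A back-of-the-envelope reading of the two figures quoted above (7.9 times versus 47 times), assuming the projection window runs from roughly 2011, when the paper appeared, to 2024:

```python
import math

# Convert each projected total speedup into the performance-doubling time
# it implies. The 13-year window (2011 to 2024) is our assumption.
WINDOW_YEARS = 13

def implied_doubling_time(total_speedup: float) -> float:
    """Years per performance doubling implied by a total speedup over the window."""
    return WINDOW_YEARS / math.log2(total_speedup)

print(f"7.9x (power-limited):       one doubling every {implied_doubling_time(7.9):.1f} years")
print(f"47x (no transistor limits): one doubling every {implied_doubling_time(47):.1f} years")
# Power limits stretch the effective doubling time from about 2.3 years,
# close to the classic Moore's Law cadence, to about 4.4 years.
```
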
Jude John

What's so Original in Academic Research? - 26 views

Thanks for your comments. I may have appeared to be contradictory, but what I really meant was that ownership of IP should not be a motivating factor to innovate. I realise that in our capitalistic...

Weiye Loh

McKinsey & Company - Clouds, big data, and smart assets: Ten tech-enabled business tren... - 0 views

  • 1. Distributed cocreation moves into the mainstream. In the past few years, the ability to organise communities of Web participants to develop, market, and support products and services has moved from the margins of business practice to the mainstream. Wikipedia and a handful of open-source software developers were the pioneers. But in signs of the steady march forward, 70 per cent of the executives we recently surveyed said that their companies regularly created value through Web communities. Similarly, more than 68m bloggers post reviews and recommendations about products and services.
  • for every success in tapping communities to create value, there are still many failures. Some companies neglect the up-front research needed to identify potential participants who have the right skill sets and will be motivated to participate over the longer term. Since cocreation is a two-way process, companies must also provide feedback to stimulate continuing participation and commitment. Getting incentives right is important as well: cocreators often value reputation more than money. Finally, an organisation must gain a high level of trust within a Web community to earn the engagement of top participants.
  • 2. Making the network the organisation. In earlier research, we noted that the Web was starting to force open the boundaries of organisations, allowing nonemployees to offer their expertise in novel ways. We called this phenomenon "tapping into a world of talent." Now many companies are pushing substantially beyond that starting point, building and managing flexible networks that extend across internal and often even external borders. The recession underscored the value of such flexibility in managing volatility. We believe that the more porous, networked organisations of the future will need to organise work around critical tasks rather than molding it to constraints imposed by corporate structures.
  • 3. Collaboration at scale. Across many economies, the number of people who undertake knowledge work has grown much more quickly than the number of production or transactions workers. Knowledge workers typically are paid more than others, so increasing their productivity is critical. As a result, there is broad interest in collaboration technologies that promise to improve these workers' efficiency and effectiveness. While the body of knowledge around the best use of such technologies is still developing, a number of companies have conducted experiments, as we see in the rapid growth rates of video and Web conferencing, expected to top 20 per cent annually during the next few years.
  • 4. The growing ‘Internet of Things’. The adoption of RFID (radio-frequency identification) and related technologies was the basis of a trend we first recognised as "expanding the frontiers of automation." But these methods are rudimentary compared with what emerges when assets themselves become elements of an information system, with the ability to capture, compute, communicate, and collaborate around information—something that has come to be known as the "Internet of Things." Embedded with sensors, actuators, and communications capabilities, such objects will soon be able to absorb and transmit information on a massive scale and, in some cases, to adapt and react to changes in the environment automatically. These "smart" assets can make processes more efficient, give products new capabilities, and spark novel business models. Auto insurers in Europe and the United States are testing these waters with offers to install sensors in customers' vehicles. The result is new pricing models that base charges for risk on driving behavior rather than on a driver's demographic characteristics. Luxury-auto manufacturers are equipping vehicles with networked sensors that can automatically take evasive action when accidents are about to happen. In medicine, sensors embedded in or worn by patients continuously report changes in health conditions to physicians, who can adjust treatments when necessary. Sensors in manufacturing lines for products as diverse as computer chips and pulp and paper take detailed readings on process conditions and automatically make adjustments to reduce waste, downtime, and costly human interventions.
  • 5. Experimentation and big data. Could the enterprise become a full-time laboratory? What if you could analyse every transaction, capture insights from every customer interaction, and didn't have to wait for months to get data from the field? What if…? Data are flooding in at rates never seen before—doubling every 18 months—as a result of greater access to customer data from public, proprietary, and purchased sources, as well as new information gathered from Web communities and newly deployed smart assets. These trends are broadly known as "big data." Technology for capturing and analysing information is widely available at ever-lower price points. But many companies are taking data use to new levels, using IT to support rigorous, constant business experimentation that guides decisions and to test new products, business models, and innovations in customer experience. In some cases, the new approaches help companies make decisions in real time. This trend has the potential to drive a radical transformation in research, innovation, and marketing.
  • Using experimentation and big data as essential components of management decision making requires new capabilities, as well as organisational and cultural change. Most companies are far from accessing all the available data. Some haven't even mastered the technologies needed to capture and analyse the valuable information they can access. More commonly, they don't have the right talent and processes to design experiments and extract business value from big data, which require changes in the way many executives now make decisions: trusting instincts and experience over experimentation and rigorous analysis. To get managers at all echelons to accept the value of experimentation, senior leaders must buy into a "test and learn" mind-set and then serve as role models for their teams.
  • 6. Wiring for a sustainable world. Even as regulatory frameworks continue to evolve, environmental stewardship and sustainability clearly are C-level agenda topics. What's more, sustainability is fast becoming an important corporate-performance metric—one that stakeholders, outside influencers, and even financial markets have begun to track. Information technology plays a dual role in this debate: it is both a significant source of environmental emissions and a key enabler of many strategies to mitigate environmental damage. At present, information technology's share of the world's environmental footprint is growing because of the ever-increasing demand for IT capacity and services. Electricity produced to power the world's data centers generates greenhouse gases on the scale of countries such as Argentina or the Netherlands, and these emissions could increase fourfold by 2020. McKinsey research has shown, however, that the use of IT in areas such as smart power grids, efficient buildings, and better logistics planning could eliminate five times the carbon emissions that the IT industry produces.
  • 7. Imagining anything as a service. Technology now enables companies to monitor, measure, customise, and bill for asset use at a much more fine-grained level than ever before. Asset owners can therefore create services around what have traditionally been sold as products. Business-to-business (B2B) customers like these service offerings because they allow companies to purchase units of a service and to account for them as a variable cost rather than undertake large capital investments. Consumers also like this "paying only for what you use" model, which helps them avoid large expenditures, as well as the hassles of buying and maintaining a product.
  • In the IT industry, the growth of "cloud computing" (accessing computer resources provided through networks rather than running software or storing data on a local computer) exemplifies this shift. Consumer acceptance of Web-based cloud services for everything from e-mail to video is of course becoming universal, and companies are following suit. Software as a service (SaaS), which enables organisations to access services such as customer relationship management, is growing at a 17 per cent annual rate. The biotechnology company Genentech, for example, uses Google Apps for e-mail and to create documents and spreadsheets, bypassing capital investments in servers and software licenses. This development has created a wave of computing capabilities delivered as a service, including infrastructure, platform, applications, and content. And vendors are competing, with innovation and new business models, to match the needs of different customers.
  • 8. The age of the multisided business model. Multisided business models create value through interactions among multiple players rather than traditional one-on-one transactions or information exchanges. In the media industry, advertising is a classic example of how these models work. Newspapers, magazines, and television stations offer content to their audiences while generating a significant portion of their revenues from third parties: advertisers. Other revenue, often through subscriptions, comes directly from consumers. More recently, this advertising-supported model has proliferated on the Internet, underwriting Web content sites, as well as services such as search and e-mail (see trend number seven, "Imagining anything as a service," earlier in this article). It is now spreading to new markets, such as enterprise software: Spiceworks offers IT-management applications to 950,000 users at no cost, while it collects advertising from B2B companies that want access to IT professionals.
  • 9. Innovating from the bottom of the pyramid. The adoption of technology is a global phenomenon, and the intensity of its usage is particularly impressive in emerging markets. Our research has shown that disruptive business models arise when technology combines with extreme market conditions, such as customer demand for very low price points, poor infrastructure, hard-to-access suppliers, and low cost curves for talent. With an economic recovery beginning to take hold in some parts of the world, high rates of growth have resumed in many developing nations, and we're seeing companies built around the new models emerging as global players. Many multinationals, meanwhile, are only starting to think about developing markets as wellsprings of technology-enabled innovation rather than as traditional manufacturing hubs.
  • 10. Producing public good on the grid. The role of governments in shaping global economic policy will expand in coming years. Technology will be an important factor in this evolution by facilitating the creation of new types of public goods while helping to manage them more effectively. This last trend is broad in scope and draws upon many of the other trends described above.
Weiye Loh

The messy business of cleaning up carbon policy (and how to sell it to the electorate) ... - 0 views

  • 1. Putting a price on carbon is not only about the climate. Yes, humans are affecting the climate and reducing carbon dioxide emissions is a key commitment of this government, and indeed the stated views of the opposition. But there are other reasons to price carbon, primarily to put Australia at the forefront of a global energy technology revolution that is already underway. In future years and decades the world is going to need vastly more energy that is secure, reliable, clean and affordable. Achieving these outcomes will require an energy technology revolution. The purpose of pricing carbon is to raise the revenues needed to invest in this future, just as we invest in health, agriculture and defence.
  • 2. A price on carbon raises revenues to invest in stimulating that energy technology revolution. Australia emits almost 400 million tonnes of carbon dioxide into the atmosphere every year. In round numbers, every dollar of carbon tax per tonne on those emissions would raise about A$100 million. A significant portion of the proceeds from a carbon tax should be used to invest in energy technology innovation, using today’s energy economy to build a bridge to tomorrow’s economy. This is exactly the strategy that India has adopted with a small levy on coal and Germany has adopted with a tax on nuclear fuel rods, with proceeds in both instances invested into energy innovation.
  • 3. The purpose of a carbon tax is not to make energy, food, petrol or consumer goods appreciably more expensive. Just as scientists are in broad agreement that humans are affecting the global climate, economists and other experts are in broad agreement that we cannot revolutionise our energy economy through pricing mechanisms alone. Thus, we propose starting with a low carbon tax - one that has broad political support - and then committing to increasing it in a predictable manner over time. The Coalition has proposed a “direct action plan” on carbon policy that would cost A$30 billion over the next 8 years, which is the equivalent of about a $2.50 per tonne carbon tax. The question to be put to the Coalition is not whether we should be investing in a carbon policy, as we agree on that point, but how much and how it should be paid for. The Coalition’s plans leave unanswered how they would pay for their plan. A carbon tax offers a responsible and effective manner to raise funds without harming the economy or jobs. In fact, to the extent that investments in energy innovation bear fruit, new markets will be opened and new jobs will be created. The Coalition’s plan is not focused on energy technology innovation. The question for the Coalition should thus be: at what level would you set a carbon tax (or what other taxes would you raise?), and how would you invest the proceeds in a manner that accelerates energy technology innovation?
  • 4. Even a low carbon tax will make some goods cost a bit more, so it is important to help those who are most affected. Our carbon tax proposal is revenue neutral in the sense that we will lower other taxes in direct proportion to the impact, however modest, of a low carbon tax. We will do this with particular attention to those who may be most directly affected by a price on carbon. In addition, some portion of the revenue raised by a carbon tax will be returned to the public. But not all. It is important to invest in tomorrow’s energy technologies today and a carbon tax provides the mechanism for doing so.
Weiye Loh

MSDN Blogs - 0 views

  • Google could still put ads in front of more people than Facebook, but Facebook knows so much more about those people. Advertisers and publishers cherish this kind of personal information, so much so that they are willing to put the Facebook brand before their own. Exhibit A: www.facebook.com/nike, a company with the power and clout of Nike putting their own brand after Facebook’s?
  • As it turned out, sharing was not broken. Sharing was working fine and dandy, Google just wasn’t part of it. People were sharing all around us and seemed quite happy. A user exodus from Facebook never materialized. I couldn’t even get my own teenage daughter to look at Google+ twice, “social isn’t a product,” she told me after I gave her a demo, “social is people and the people are on Facebook.” Google was the rich kid who, after having discovered he wasn’t invited to the party, built his own party in retaliation. The fact that no one came to Google’s party became the elephant in the room.
Weiye Loh

The Dawn of Paid Search Without Keywords - Search Engine Watch (SEW) - 0 views

  • This year will fundamentally change how we think about and buy access to prospects, namely keywords. It is the dawn of paid search without keywords.
  • Google's search results were dominated by the "10 blue links" -- simple headlines, descriptions, and URLs to entice and satisfy searchers. Until they weren't. Universal search wove in images, video, and real-time updates.
  • For most of its history, too, AdWords has been presented in a text format even as the search results morphed into a multimedia experience. The result is that attention was pulled towards organic results at the expense of ads.
  • Google countered that trend with their big push for universal paid search in 2010. It was, perhaps, the most radical evolution to the paid search results since the introduction of Quality Score. Consider the changes:
  • New ad formats: Text is no longer the exclusive medium for advertising on Google. No format exemplifies that more than Product List Ads (and their cousin, Product Extensions). There is no headline, copy or display URL. Instead, it's just a product image, name, price and vendor slotted in the highest positions on the right side. What's more, you don't choose keywords. We also saw display creep into image search results with Image Search Ads and traditional display ads.
  • New calls-to-action: The way you satisfy your search with advertising on Google has evolved as well. Most notably, through the introduction of click-to-call as an option for mobile search ads (as well as the limited release AdWords call metrics). Similarly, more of the site experience is being pulled into the search results. The beta Comparison Ads creates a marketplace for loan and credit card comparison all on Google. The call to action is comparison and filtering, not just clicking on an ad.
  • New buying/monetization models: Cost-per-click (CPC) and cost-per-thousand-impressions (CPM) are no longer the only ways you can buy. Comparison Ads are sold on a cost-per-lead basis. Product listing ads are sold on a cost-per-acquisition (CPA) basis for some advertisers (CPC for most). (A short sketch at the end of this entry shows how these pricing models relate.)
  • New display targeting options: Remarketing (a.k.a. retargeting) brought highly focused display buys to the AdWords interface. Specifically, the ability to only show display ads to segments of people who visit your site, in many cases after clicking on a text ad.
  • New advertising automation: In a move that radically simplifies advertising for small businesses, Google began testing Google Boost. It involves no keyword research and no bidding. If you have a Google Places page, you can even do it without a website. It's virtually hands-off advertising for SMBs.
  • Of those changes, Google Product Listing Ads and Google Boost offer the best glimpse into the future of paid search without keywords. They're notable for dramatic departures in every step of how you advertise on Google: Targeting: Automated targeting toward certain audiences as determined by Google vs. keywords chosen by the advertiser. Ads: Product listing ads bring a product search like result in the top position in the right column and Boost promotes a map-like result in a preferred position above organic results. Pricing: CPA and monthly budget caps replace daily budgets and CPC bids.
  • For Google to continue their pace of growth, they need two things: Another line of business to complement AdWords, and display advertising is it. They've pushed even more aggressively into the channel, most notably with the acquisition of Invite Media, a demand side platform. To remove obstacles to profit and incremental growth within AdWords. These barriers are primarily how wide advertisers target and how much they pay for the people they reach (see: "Why Google Wants to Eliminate Bidding In Exchange for Your Profits").
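
To make the relationships among those buying models concrete, here is a minimal sketch. The click-through and conversion rates are assumed for illustration; they are not Google figures.

```python
# Minimal sketch of how CPC, CPM, and CPA pricing relate.
# All rates below are hypothetical.

def effective_cpm(cpc: float, ctr: float) -> float:
    """Cost per thousand impressions implied by a CPC bid and a click-through rate."""
    return cpc * ctr * 1000

def effective_cpa(cpc: float, conversion_rate: float) -> float:
    """Cost per acquisition implied by a CPC bid and a conversion rate."""
    return cpc / conversion_rate

cpc = 0.80          # assumed cost per click, in dollars
ctr = 0.02          # assumed click-through rate: 2% of impressions become clicks
conversion = 0.04   # assumed conversion rate: 4% of clicks become customers

print(f"Effective CPM: ${effective_cpm(cpc, ctr):.2f}")        # $16.00
print(f"Effective CPA: ${effective_cpa(cpc, conversion):.2f}") # $20.00
```
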
Weiye Loh

Breakthrough Europe: A (Heterodox) Lesson in Economics from Ha-Joon Chang - 0 views

  • But, to the surprise of the West, that steel mill grew to become POSCO, the world's third-largest and Asia's most profitable steel maker.
  • South Korea's developmental state, which relied on active government investment in R&D and crucial support for capital-intensive sectors in the form of start-up subsidies and infant industry protection, transformed the country into the richest on the Asian continent (with the exception of Singapore and Hong Kong). LG and Hyundai are similar legacies of Korea's spectacular industrial policy success.
  • Even though they were not trained as economists, the economic officials of East Asia knew some economics. However, especially until the 1970s, the economics they knew was mostly not of the free-market variety. The economics they happened to know was the economics of Karl Marx, Friedrich List, Joseph Schumpeter, Nicholas Kaldor and Albert Hirschman. Of course, these economists lived in different times, contended with different problems and had radically differing political views (ranging from the very right-wing List to the very left-wing Marx). However, there was a commonality between their economics. It was the recognition that capitalism develops through long-term investments and technological innovations that transform the productive structure, and not merely an expansion of existing structures, like inflating a balloon.
  • Arguing that governments can pick winners, Professor Chang urges us to reclaim economic planning, not as a token of centrally-planned communism, but rather as the simple reality behind our market economies today:
  • Capitalist economies are in large part planned. Governments in capitalist economies practice planning too, albeit on a more limited basis than under communist central planning. All of them finance a significant share of investment in R&D and infrastructure. Most of them plan a significant chunk of the economy through the planning of the activities of state-owned enterprises. Many capitalist governments plan the future shape of individual industrial sectors through sectoral industrial policy or even that of the national economy through indicative planning. More importantly, modern capitalist economies are made up of large, hierarchical corporations that plan their activities in great detail, even across national borders. Therefore, the question is not whether you plan or not. It is about planning the right things at the right levels.
  • Drawing a clear distinction between communist central planning and capitalist 'indicative' planning, Chang notes that the latter: ... involves the government ... setting some broad targets concerning key economic variables (e.g., investments in strategic industries, infrastructure development, exports) and working with, not against, the private sector to achieve them. Unlike under central planning, these targets are not legally binding; hence the adjective 'indicative'. However, the government will do its best to achieve them by mobilizing various carrots (e.g., subsidies, granting of monopoly rights) and sticks (e.g., regulations, influence through state-owned banks) at its disposal.
  • Chang observes that: France had great success in promoting investment and technological innovation through indicative planning in the 1950s and 60s, thereby overtaking the British economy as Europe's second industrial power. Other European countries, such as Finland, Norway and Austria, also successfully used indicative planning to upgrade their economies between the 1950s and the 1970s. The East Asian miracle economies of Japan, Korea and Taiwan used indicative planning too between the 1950s and 1980s. This is not to say that all indicative planning exercises have been successful; in India, for example, it has not. Nevertheless, the European and East Asian examples show that planning in certain forms is not incompatible with capitalism and may even promote capitalist development very well.
  • As we have argued before, the current crisis raging through Europe (in large part caused by free-market economics), forces us to reconsider our economic options. More than ever before, now is the time to rehabilitate indicative planning and industrial policy as key levers in our arsenal of policy tools.
  • A heterodox Cambridge economist exposes 23 myths behind the neoliberal free-market dogma and urges us to recognize that "capitalism develops through long-term investments and technological innovations," spearheaded by an activist state committed to sustainable economic development.
Weiye Loh

Net-Neutrality: The First Amendment of the Internet | LSE Media Policy Project - 0 views

  • debates about the nature, the architecture and the governing principles of the internet are not merely technical or economic discussions.  Above all, these debates have deep political, social, and cultural implications and become a matter of public, national and global interest.
  • In many ways, net neutrality could be considered the first amendment of the internet; no pun intended here. However, just as with freedom of speech, the principle of net neutrality cannot be approached as absolute or as a fetish. Even in a democracy we cannot say everything, all the time, in all contexts. Limiting the core principle of freedom of speech in a democracy is only possible in very specific circumstances, such as harm, racism or in view of the public interest. Along the same lines, compromising on the principle of net neutrality should be for very specific and clearly defined reasons that are transparent and do not serve commercial private interests, but rather public interests or are implemented in view of guaranteeing an excellent quality of service for all.
  • One of the only really convincing arguments of those challenging net neutrality is that due to the dramatic increases in streaming activity and data-exchange through peer-to-peer networks, the overall quality of service risks being compromised if we stick to data being treated on a first-come, first-served basis. We are being told that popular content will need to be stored closer to the consumer, which evidently comes at an extra cost.
  • Implicitly two separate debates are being collapsed here and I would argue that we need to separate both. The first one relates to the stability of the internet as an information and communication infrastructure because of the way we collectively use that infrastructure. The second debate is whether ISPs and telecommunication companies should be allowed to differentiate in their pricing between different levels of quality of access, both towards consumers and content providers.
  • Just as with freedom of speech, circumstances can be found in which the principle while still cherished and upheld, can be adapted and constrained to some extent. To paraphrase Tim Wu (2008), the aspiration should still be ‘to treat all content, sites, and platforms equally’, but maybe some forms of content should be treated more equally than others in order to guarantee an excellent quality of service for all. However, the societal and political implications of this need to be thought through in detail and as with freedom of speech itself, it will, I believe, require strict regulation and conditions.
  • Regarding the first debate on internet stability, a case can be made for allowing internet operators to differentiate between different types of data with different needs – if for any reason the quality of service of the internet as a whole cannot be guaranteed anymore. (A toy illustration of this kind of differentiation appears at the end of this entry.)
  • Concerning the second debate on differential pricing, it is fair to say that from a public interest and civic liberty perspective the consolidation and institutionalization of a commercially driven two-tiered internet is not acceptable and impossible to legitimate. As is allowing operators to differentiate in the quality of provision of certain kind of content above others.  A core principle such as net neutrality should never be relinquished for the sake of private interests and profit-making strategies – on behalf of industry or for others. If we need to compromise on net neutrality it would always have to be partial, to be circumscribed and only to improve the quality of service for all, not just for the few who can afford it.
  • Separating these two debates exposes the crux of the current net neutrality debate. In essence, we are being urged to give up on the principle of net neutrality in order to guarantee a good quality of service. However, this argument is actually a pretext for the telecom industry to make content providers pay for the facilitation of access to their audiences, the internet subscribers. And this can in turn be linked to another debate being waged amongst content providers: how do we make internet users pay for the content they access online? I won’t open that can of worms here, but I will make my point clear. Telecommunication industry efforts to make content providers pay for access to their audiences do not offer legitimate reasons to suspend the first amendment of the internet.
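To make the distinction concrete, here is a minimal Python sketch of what need-based differentiation could look like: packets are scheduled by latency sensitivity, not by who paid. The traffic classes, their ordering, and the packet names are illustrative assumptions, not any operator's actual policy.

```python
import heapq
from itertools import count

# Lower number = higher scheduling priority. Hypothetical classes only.
PRIORITY = {"realtime": 0, "streaming": 1, "bulk": 2}

_seq = count()  # tie-breaker: first-come, first-served within a class
queue = []

def enqueue(traffic_class, payload):
    """Schedule by the latency needs of the traffic, not by who paid."""
    heapq.heappush(queue, (PRIORITY[traffic_class], next(_seq), payload))

enqueue("bulk", "p2p chunk")           # arrives first
enqueue("realtime", "voip frame")      # arrives later but is latency-critical
enqueue("streaming", "video segment")

while queue:
    _, _, payload = heapq.heappop(queue)
    print(payload)  # voip frame, video segment, p2p chunk
```

The question raised above is not whether such scheduling is technically possible (it plainly is), but who defines the classes, on what grounds, and whether payment ever enters the sort key.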
Weiye Loh

BBC News - Facebook v academia: The gloves are off - 0 views

  •  
    "But this latest story once again sparked headlines around the world, even if articles often made the point that the research was not peer-reviewed. What was different, however, was Facebook's reaction. Previously, its PR team has gone into overdrive behind the scenes to rubbish this kind of research but said nothing in public. This time they used a new tactic, humour, to undermine the story. Mike Develin, a data scientist for the social network, published a note on Facebook mocking the Princeton team's "innovative use of Google search trends". He went on to use the same techniques to analyse the university's own prospects, concluding that a decline in searches over recent years "suggests that Princeton will have only half its current enrollment by 2018, and by 2021 it will have no students at all". Now, who knows, Facebook may well face an uncertain future. But academics looking to predict its demise have been put on notice - the company employs some pretty smart scientists who may take your research apart and fire back. The gloves are off."
Weiye Loh

Will Apple's culture hurt the iPhone? -  xinmsn tech & gadgets - 0 views

  •  
    Will Apple's culture hurt the iPhone? Open approach of competitors like Android spurs quicker innovation
Weiye Loh

Why Did 17 Million Students Go to College? - Innovations - The Chronicle of Higher Educ... - 0 views

  • Over 317,000 waiters and waitresses have college degrees (over 8,000 of them have doctoral or professional degrees), along with over 80,000 bartenders, and over 18,000 parking lot attendants. All told, some 17,000,000 Americans with college degrees are doing jobs that the BLS says require less than the skill levels associated with a bachelor’s degree.
  • Charles Murray’s thesis is that an increasing number of people attending college do not have the cognitive abilities or other attributes usually necessary for success at higher levels of learning. As more and more try to attend college, either college degrees will be watered down (something already happening, I suspect) or drop-out rates will rise.
  • interesting new study was posted on the Web site of America’s most prestigious economic-research organization, the National Bureau of Economic Research. Three highly regarded economists (one of whom has won the Nobel Prize in Economic Science) have produced “Estimating Marginal Returns to Education,” Working Paper 16474 of the NBER. After very sophisticated and elaborate analysis, the authors conclude “In general, marginal and average returns to college are not the same.” (p. 28)
  • even if, on average, an investment in higher education yields a good (say, 10 percent) rate of return, it does not follow that adding to existing investments will yield that return, partly for reasons outlined above. The authors (Pedro Carneiro, James Heckman, and Edward Vytlacil) make that point explicitly, stating “Some marginal expansions of schooling produce gains that are well below average returns, in general agreement with the analysis of Charles Murray.” (p. 29) A toy numeric illustration follows this list.
  • Once the economy improves, and history tells us it will improve within our lifetimes, those who already have a college degree under their belts will be better equipped to take advantage of new employment opportunities than those who don’t. Perhaps not because of the actual knowledge obtained through their degrees, but definitely as an offset to the social stigma that still exists for those who do not attend college. A college degree may not help a young person secure professional work immediately – so new graduates spend a few years waiting tables until the right opportunity comes along. So what? It’s probably good for them. But they have 40-50 years in the workforce ahead of them and need to be forward-thinking if they don’t want to wait tables for that entire time. If we stop encouraging all young people to view college as both a goal and a possibility, and start weeding out those whose “prior academic records suggest little likelihood of academic success” which, let’s face it, will happen in larger proportions in poorer schools, then in 20 years we’ll find that efforts to reduce socioeconomic gaps between minorities and non-minorities have been seriously undermined.
  • Bet you a lot of those janitors with PhDs are from the humanities, in particular ethnic studies, film studies,…basket weaving courses… or non-economics social sciences, e.g., sociology, anthropology of some never-heard-of country….There should be a buyer-beware warning on all those non-quantitative majors that make people into sophisticated malcontent complainers!
  • This article also presumes that the purpose of higher education is merely to train one for a career path and enhance future income. This devalues the university, turning it into a vocational training institution. There’s nothing in this data that suggests that they are “sophisticated complainers”; that’s an unwarranted inference.
  • it was mentioned that the Bill and Melinda Gates Foundation would like 80% of American youth to attend and graduate from college. It is a nice thought in many ways. As a teacher and professor, intellectually I am all for it (if the university experience is a serious one, which these days, I don’t know).
  • students’ expectations in attending college are not just intellectual; they are careerist (probably far more so)
  • This employment issue has more to do with levels of training and subsequent levels of expectation. When a Korean student emerges from 20 years of intense study with a university degree, he or she reasonably expects a “good” job — which is to say, a well-paying professional or managerial job with good forward prospects. But here’s the problem. There does not exist, nor will there ever exist, a society in which 80% of the available jobs are professional, managerial, comfortable, and well-paid. No way.
  • Korea has a number of other jobs, but some are low-paid service work, and many others — in factories, farming, fishing — are scorned as 3-D jobs (difficult, dirty, and dangerous). Educated Koreans don’t want them. So the country is importing labor in droves — from China, Vietnam, Cambodia, the Philippines, even Uzbekistan. In the countryside, rural Korean men are having such a difficult time finding prospective wives to share their agricultural lifestyle that fully 40% of rural marriages are to poor women from those other Asian countries, who are brought in by match-makers and marriage brokers.
  •  
    Why Did 17 Million Students Go to College?
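The marginal-versus-average distinction in the NBER paper is easy to illustrate numerically. The Python sketch below uses invented cohort returns; the figures are assumptions chosen only to show how the headline average can stay healthy while the return to the marginal student collapses.

```python
# Hypothetical annual returns for successive cohorts of entrants,
# ordered from best-prepared to least-prepared. Invented numbers.
cohort_returns = [0.16, 0.13, 0.10, 0.06, 0.02]

enrolled = []
for i, r in enumerate(cohort_returns, start=1):
    enrolled.append(r)
    avg = sum(enrolled) / i
    print(f"cohorts: {i}  marginal return: {r:.0%}  average return: {avg:.1%}")
```

With all five cohorts enrolled, the average return is still 9.4%, but the marginal student earns only 2%: expanding enrollment and quoting the average tell two different stories.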
Weiye Loh

Roger Pielke Jr.'s Blog: Clean Tech Innovation and the "Iron Law" - 0 views

  • Cleantech companies just can’t seem to get it right. At least, that’s the notion Peter Thiel — a co-founder of PayPal and president of Clarium Capital — subscribes to when he looks at cleantech companies as potential investing opportunities. He made the comments at a Commonwealth Club event in San Francisco Wednesday.
  • That’s not because he doesn’t believe in the technology — he just doesn’t like the way the companies are run, he said. “Most of the people who run cleantech companies are sales people, not engineers,” Thiel said. “Something seems to have gone quite wrong with cleantech.”
  • most cleantech companies that try to develop alternative energy forms are building power sources that are more expensive. Solar panels, for example, are still not a cost-efficient way to generate power because companies have made the assumption that people will pay more for more environmentally friendly ways of producing energy, Thiel said. “We need something cheaper, not more expensive,” he said. “It doesn’t matter if the energy is cleaner, it doesn’t work if it’s more expensive.”
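Thiel's price test can be made concrete with a back-of-the-envelope levelized-cost comparison. The sketch below uses the standard capital-recovery formula but entirely hypothetical cost and output figures; it illustrates the arithmetic, not real market data.

```python
def lcoe(capex_per_kw, annual_kwh_per_kw, om_per_kw_yr, rate=0.07, years=25):
    """Levelized cost of energy, in $/kWh, for 1 kW of capacity."""
    # Capital recovery factor: annualizes the upfront cost at `rate` over `years`.
    crf = rate * (1 + rate) ** years / ((1 + rate) ** years - 1)
    return (capex_per_kw * crf + om_per_kw_yr) / annual_kwh_per_kw

# Hypothetical inputs: solar at $3000/kW yielding 1500 kWh/kW-yr versus gas
# at $1000/kW yielding 6000 kWh/kW-yr (fuel folded into O&M for simplicity).
print(f"solar: ${lcoe(3000, 1500, 20):.3f}/kWh")
print(f"gas:   ${lcoe(1000, 6000, 200):.3f}/kWh")
```

On these invented inputs the clean option costs roughly four times as much per kWh, and Thiel's objection applies no matter how clean those kilowatt-hours are.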
Weiye Loh

FT.com / Business education / Soapbox - Popular fads replace relevant teaching - 0 views

  • There is a great divide in business schools, one that few outsiders are aware of. It is the divide between research and teaching. There is little relation between them. What is being taught in management books and classrooms is usually not based on rigorous research, and vice versa: the research published in prestigious academic journals seldom finds its way into the MBA classroom.
  • none of this research is really intended to be used in the classroom, or to be communicated to managers in some other form, so it is not suited to serve that purpose. The goal is publication in a prestigious academic journal, but that does not make it useful, or even guarantee that the research findings provide much insight into the workings of business reality.
  • is not a new problem. In 1994, Don Hambrick, then the president of the Academy of Management, said: “We read each other’s papers in our journals and write our own papers so that we may, in turn, have an audience . . . an incestuous, closed loop”. Management research is not required to be relevant; consequently, much of it is not.
  • But business education clearly also suffers. What is taught in management courses is usually not based on solid scientific evidence. Instead, it rests on the generalisation of individual business cases or on the lessons of popular management books. Such books are often based on the appealing formula of looking at several successful companies, seeing what they have in common, and concluding that other companies should strive to do the same (a simulation sketch after this list shows the selection bias built into that formula).
  • how do you know that the advice provided is reasonable, or whether it comes from tomorrow’s Enrons, RBSs, Lehmans and WorldComs? How do you know that today’s advice and cases will not later be heralded as the epitome of mismanagement?
  • In the 1990s, ISO9000 (a quality management systems standard) spread through many industries. But research by professors Mary Benner and Mike Tushman showed that its adoption could, in time, lead to a fall in innovation (because ISO9000 does not allow for the deviations from a set standard that innovation requires), making the adopter worse off. This research was overlooked by practitioners: many business schools continued to applaud the benefits of ISO9000 in their courses, while firms continued – and still do – to implement the practice, ignorant of its potential pitfalls. Yet this research offers a clear example of the possible benefits of scientific research methods: rigorous research that reveals unintended consequences and exposes the true nature of a business practice.
  • such research with important practical implications unfortunately is the exception rather than the rule. Moreover, even relevant research is largely ignored in business education – as happened to the findings by Benner and Tushman.
  • Of course one should not make the mistake of assuming that business cases and business books based on personal observation and opinion are without value. They potentially offer a great source of practical experience. Similarly, it would be naive to assume that scientific research can provide custom-made answers. Rigorous management research could and should provide the basis for skilled managers to make better decisions. However, managers cannot do that without in-depth knowledge of their specific organisation and circumstances.
  • at present, business schools largely fail in providing rigorous, evidence-based teaching.
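The look-at-the-winners formula criticised above fails for a statistical reason that a short simulation makes vivid. The Python sketch below is a deliberately artificial setup (random traits, success as pure luck), showing that a handful of top firms will usually appear to share "habits" by chance alone.

```python
import random

random.seed(1)
N_FIRMS, N_TRAITS = 1000, 20

# Each firm gets random yes/no traits and a success score that ignores them.
firms = [{"traits": [random.random() < 0.5 for _ in range(N_TRAITS)],
          "success": random.random()}
         for _ in range(N_FIRMS)]

# The business-book move: study only the most successful firms.
winners = sorted(firms, key=lambda f: f["success"], reverse=True)[:8]

# "Common habits of highly successful companies": traits most winners share.
for t in range(N_TRAITS):
    share = sum(f["traits"][t] for f in winners) / len(winners)
    if share >= 0.75:
        print(f"trait {t}: present in {share:.0%} of the winners")
# Expect a few such "lessons" even though, by construction, no trait
# has any effect on success.
```

Rigorous research designs of the kind the article calls for exist precisely to rule out this sort of selection on the dependent variable.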