New Media Ethics 2009 course / Group items tagged Power

Weiye Loh

The Breakthrough Institute: ANALYSIS: Nuclear Moratorium in Germany Could Cause Spike i... - 0 views

  • The German government announced today that it will shut down seven of the country's seventeen nuclear power plants for an indefinite period, a decision taken in response to widespread protests and a German public increasingly fearful of nuclear power after a nuclear emergency in Japan. The decision places a moratorium on a law that would extend the lifespan of these plants, and is uncharacteristic of Angela Merkel, whose government previously overturned its predecessor's decision to phase nuclear out of Germany's energy supply.
  • The seven plants, each built before 1980, represent 30% of Germany's nuclear electricity generation and 24% of its gross installed nuclear capacity. Shutting down these plants, or even just placing an indefinite hold on their operation, would be a major loss of zero-emissions generation capacity for Germany. The country currently relies on nuclear power from its seventeen nuclear power plants for about a quarter of its electricity supply.
  • The long-term closure of these plants would therefore seriously challenge Germany's emissions-reduction efforts, as the country tries to meet its goal of a 40% reduction below 1990 carbon emission levels by 2020.
  • ...4 more annotations...
  • The moratorium could cause a spike in CO2 emissions as Germany turns to its other, more carbon-intensive sources to supply its energy demand. Already, the country has been engaged in a "dash for coal", building dozens of new coal plants in response to the perverse incentives and intense lobbying from the coal industries made possible by the European Emissions Trading Scheme. (As previously reported by Breakthrough Europe).
  • if lost generation were made up for entirely by coal-fired plants, carbon emissions would increase annually by as much as 33 million tons. This would represent an overall 4% annual increase in carbon emissions for the country and an 8% increase in carbon emissions for the power sector alone.
  • Alternatively, should the country try to replace lost generation entirely with power from renewables, it would need to more than double generation of renewable energy, from where it currently stands at 97 billion kWh to about 237 billion kWh. As part of the country's low-carbon strategy, Germany has planned to deploy at least 20% renewable energy sources by 2020. If the nation now chooses to meet this goal by displacing nuclear plants, 2020 emissions levels would be higher than had the country otherwise phased out its carbon-intensive coal or natural gas plants.
  • *Carbon emission factors used are those estimated by the World Bank in 2009 for new coal-fired power plants (0.795 t CO2/MWh) and new gas-fired power plants (0.398 t CO2/MWh). **Data from Carbon Monitoring for Action, European Nuclear Society data, and the US Energy Information Administration.
  •  
    Carbon dioxide emissions in Germany may increase by 4 percent annually in response to a moratorium on seven of the country's oldest nuclear power plants, as power generation is shifted from nuclear power, a zero carbon source, to the other carbon-intensive energy sources that currently make up the country's energy supply.
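The article's figures can be checked with a short back-of-the-envelope calculation. This sketch uses only the numbers quoted in the annotations above (the 97 vs. 237 billion kWh renewables figures, the 30% share of the seven halted plants, and the World Bank emission factors); it is an illustration of the arithmetic, not the Breakthrough Institute's own model.

```python
# Back-of-the-envelope check of the emissions figures quoted above.
# From the article: replacing all nuclear generation with renewables
# would require going from 97 to ~237 billion kWh, so total nuclear
# output is ~140 billion kWh, and the seven halted plants supply 30%
# of that. Emission factors are the World Bank 2009 estimates quoted
# in the footnote.

COAL_FACTOR_T_PER_MWH = 0.795   # new coal-fired plants
GAS_FACTOR_T_PER_MWH = 0.398    # new gas-fired plants

nuclear_total_kwh = 237e9 - 97e9          # ~140 billion kWh
lost_kwh = 0.30 * nuclear_total_kwh       # output of the 7 plants
lost_mwh = lost_kwh / 1_000               # kWh -> MWh

extra_coal_t = lost_mwh * COAL_FACTOR_T_PER_MWH
extra_gas_t = lost_mwh * GAS_FACTOR_T_PER_MWH

print(f"Lost generation: {lost_kwh / 1e9:.0f} billion kWh")
print(f"Extra CO2 if replaced by coal: {extra_coal_t / 1e6:.1f} million t")
print(f"Extra CO2 if replaced by gas:  {extra_gas_t / 1e6:.1f} million t")
```

Running this reproduces the article's headline number: replacing the lost output entirely with new coal plants adds roughly 33 million tons of CO2 per year, while gas would add about half that.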
Weiye Loh

Roger Pielke Jr.'s Blog: You Take the Risks, We'll Take the Watts - 0 views

  • From Der Spiegel: Germany has been importing nuclear power from France and the Czech Republic since it switched off its seven oldest nuclear power stations last month in the wake of the Fukushima accident, power company RWE said on Monday. A spokesman for RWE confirmed a report in Bild newspaper that Germany had become a net importer of power since March 16. Previously, Germany had been a net electricity exporter because of its rising output of power from renewable energy sources. RWE said the country's power imports from France and the Czech Republic have amounted to as much as 3,000 megawatts and 2,000 megawatts respectively. Three quarters of France's power supply comes from nuclear energy, while the Czech Republic relies on reactors for 34 percent of its energy needs. Hildegard Müller, head of the German Association of Energy and Water Industries, also said on Monday that power imports were up. "Since March 17, there has been an increase in imports. Flows from France and the Czech Republic have doubled," she said.
  •  
    Germany's moratorium on nuclear power has not necessarily reduced its reliance on nuclear power by as much as one might think.
Weiye Loh

Breakthrough Europe: Emerging Economies Reassert Commitment to Nuclear Power - 0 views

  • Nearly half a billion of India's 1.2 billion citizens continue to live in energy poverty. According to the Chairman of the Indian Atomic Energy Commission, Srikumar Banerjee, "ours is a very power-hungry country. It is essential for us to have further electricity generation." The Chinese have cited similar concerns in sticking to their major plans to expand the country's nuclear energy sector. At its current GDP growth, China's electricity demand rises an average of 12 percent per year.
  • the Japanese nuclear crisis demonstrates the vast chasm in political priorities between the developing world and the post-material West.
  • Other regions that have reiterated their plans to stick to nuclear energy are Eastern Europe and the Middle East. The Prime Minister of Poland, the fastest growing country in the EU, has said that "fears of a nuclear disaster in Japan following last Friday's earthquake and tsunami would not disturb Poland's own plans to develop two nuclear plants." Russia and the Czech Republic have also restated their commitment to further nuclear development, while the Times reports that "across the Middle East, countries have been racing to build up nuclear power, as a growth and population boom has created unprecedented demand for energy." The United Arab Emirates is building four plants that will generate roughly a quarter of its power by 2020.
  • ...1 more annotation...
  • Some European leaders, including Angela Merkel, may be backtracking fast on their commitment to nuclear power, but despite yesterday's escalation of the ongoing crisis in Fukushima, Japan, there appear to be no signs that India, China and other emerging economies will succumb to a similar backlash. For the emerging economies, overcoming poverty and insecurity of supply remain the overriding priorities of energy policy.
  •  
    As the New York Times reports: The Japanese disaster has led some energy officials in the United States and in industrialized European nations to think twice about nuclear expansion. And if a huge release of radiation worsens the crisis, even big developing nations might reconsider their ambitious plans. But for now, while acknowledging the need for safety, they say their unmet energy needs give them little choice but to continue investing in nuclear power.
Weiye Loh

When information is power, these are the questions we should be asking | Online Journal... - 0 views

  • “There is absolutely no empiric evidence that shows that anyone actually uses the accounts produced by public bodies to make any decision. There is no group of principals analogous to investors. There are many lists of potential users of the accounts. The Treasury, CIPFA (the UK public sector accounting body) and others have said that users might include the public, taxpayers, regulators and oversight bodies. I would be prepared to put up a reward for anyone who could prove to me that any of these people have ever made a decision based on the financial reports of a public body. If there are no users of the information, then there is no point in making the reports better. If there are no users, more technically correct reports do nothing to improve the understanding of public finances. In effect, all that better reports do is legitimise the role of professional accountants in the accountability process.”
  • raw data – and the ability to interrogate that – should instead be made available because (quoting Anthony Hopwood): “Those with the power to determine what enters into organisational accounts have the means to articulate and diffuse their values and concerns, and subsequently to monitor, observe and regulate the actions of those that are now accounted for.”
  • Data is not just some opaque term; something for geeks: it’s information: the raw material we deal in as journalists. Knowledge. Power. The site of a struggle for control. And considering it’s a site that journalists have always fought over, it’s surprisingly placid as we enter one of the most important ages in the history of information control.
  • ...1 more annotation...
  • Three questions to ask of any transparency initiative: (1) If information is to be published in a database behind a form, then it's hidden in plain sight: it cannot easily be found by a journalist, and only simple questions will be answered. (2) If information is to be published in PDFs or JPEGs, or some format that you need proprietary software to see, then it cannot easily be questioned by a journalist. (3) If you will have to pass a test to use the information, then obstacles will be placed between the journalist and that information. The next time an organisation claims that it is opening up its information, tick those questions off. (If you want more, see Gurstein's list of 7 elements that are needed to make effective use of open data.)
  •  
    control of information still represents the exercise of power, and how shifts in that control as a result of the transparency/open data/linked data agenda are open to abuse, gaming, or spin.
Weiye Loh

Arianna Huffington: The Media Gets It Wrong on WikiLeaks: It's About Broken Trust, Not ... - 0 views

  • Too much of the coverage has been meta -- focusing on questions about whether the leaks were justified, while too little has dealt with the details of what has actually been revealed and what those revelations say about the wisdom of our ongoing effort in Afghanistan. There's a reason why the administration is so upset about these leaks.
  • True, there hasn't been one smoking-gun, bombshell revelation -- but that's certainly not to say the cables haven't been revealing. What there has been instead is more of the consistent drip, drip, drip of damning details we keep getting about the war.
  • It's notable that the latest leaks came out the same week President Obama went to Afghanistan for his surprise visit to the troops -- and made a speech about how we are "succeeding" and "making important progress" and bound to "prevail."
  • ...16 more annotations...
  • The WikiLeaks cables present quite a different picture. What emerges is one reality (the real one) colliding with another (the official one). We see smart, good-faith diplomats and foreign service personnel trying to make the truth on the ground match up to the one the administration has proclaimed to the public. The cables show the widening disconnect. It's like a foreign policy Ponzi scheme -- this one fueled not by the public's money, but the public's acquiescence.
  • The second aspect of the story -- the one that was the focus of the symposium -- is the changing relationship to government that technology has made possible.
  • Back in the year 2007, B.W. (Before WikiLeaks), Barack Obama waxed lyrical about government and the internet: "We have to use technology to open up our democracy. It's no coincidence that one of the most secretive administrations in our history has favored special interest and pursued policy that could not stand up to the sunlight."
  • Not long after the election, in announcing his "Transparency and Open Government" policy, the president proclaimed: "Transparency promotes accountability and provides information for citizens about what their Government is doing. Information maintained by the Federal Government is a national asset." Cut to a few years later. Now that he's defending a reality that doesn't match up to, well, reality, he's suddenly not so keen on the people having a chance to access this "national asset."
  • Even more wikironic are the statements by his Secretary of State who, less than a year ago, was lecturing other nations about the value of an unfettered and free internet. Given her description of the WikiLeaks as "an attack on America's foreign policy interests" that have put in danger "innocent people," her comments take on a whole different light. Some highlights: In authoritarian countries, information networks are helping people discover new facts and making governments more accountable... technologies with the potential to open up access to government and promote transparency can also be hijacked by governments to crush dissent and deny human rights... As in the dictatorships of the past, governments are targeting independent thinkers who use these tools. Now "making government accountable" is, as White House spokesman Robert Gibbs put it, a "reckless and dangerous action."
  • Jay Rosen, one of the participants in the symposium, wrote a brilliant essay entitled "From Judith Miller to Julian Assange." He writes: For the portion of the American press that still looks to Watergate and the Pentagon Papers for inspiration, and that considers itself a check on state power, the hour of its greatest humiliation can, I think, be located with some precision: it happened on Sunday, September 8, 2002. That was when the New York Times published Judith Miller and Michael Gordon's breathless, spoon-fed -- and ultimately inaccurate -- account of Iraqi attempts to buy aluminum tubes to produce fuel for a nuclear bomb.
  • Miller's after-the-facts-proved-wrong response, as quoted in a Michael Massing piece in the New York Review of Books, was: "My job isn't to assess the government's information and be an independent intelligence analyst myself. My job is to tell readers of The New York Times what the government thought about Iraq's arsenal." In other words, her job is to tell citizens what their government is saying, not, as Obama called for in his transparency initiative, what their government is doing.
  • As Jay Rosen put it: Today it is recognized at the Times and in the journalism world that Judy Miller was a bad actor who did a lot of damage and had to go. But it has never been recognized that secrecy was itself a bad actor in the events that led to the collapse, that it did a lot of damage, and parts of it might have to go. Our press has never come to terms with the ways in which it got itself on the wrong side of secrecy as the national security state swelled in size after September 11th.
  • And in the WikiLeaks case, much of the media has again found itself on the wrong side of secrecy -- and so much of the reporting about WikiLeaks has served to obscure, to conflate, to mislead. For instance, how many stories have you heard or read about all the cables being "dumped" in "indiscriminate" ways, with no attempt to "vet" and "redact" them first? In truth, only just over 1,200 of the 250,000 cables have been released, and WikiLeaks is now publishing only those cables vetted and redacted by its media partners, which include the New York Times here and the Guardian in England.
  • The establishment media may be part of the media, but they're also part of the establishment. And they're circling the wagons. One method they're using, as Andrew Rasiej put it after the symposium, is to conflate the secrecy that governments use to operate and the secrecy that is used to hide the truth and allow governments to mislead us.
  • Nobody, including WikiLeaks, is promoting the idea that government should exist in total transparency,
  • Assange himself would not disagree. "Secrecy is important for many things," he told Time's Richard Stengel. "We keep secret the identity of our sources, as an example, take great pains to do it." At the same time, however, secrecy "shouldn't be used to cover up abuses."
  • Decentralizing government power, limiting it, and challenging it was the Founders' intent and these have always been core conservative principles. Conservatives should prefer an explosion of whistleblower groups like WikiLeaks to a federal government powerful enough to take them down. Government officials who now attack WikiLeaks don't fear national endangerment, they fear personal embarrassment. And while scores of conservatives have long promised to undermine or challenge the current monstrosity in Washington, D.C., it is now an organization not recognizably conservative that best undermines the political establishment and challenges its very foundations.
  • It is not, as Simon Jenkins put it in the Guardian, the job of the media to protect the powerful from embarrassment. As I said at the symposium, its job is to play the role of the little boy in The Emperor's New Clothes -- brave enough to point out what nobody else is willing to say.
  • When the press trades truth for access, it is WikiLeaks that acts like the little boy. "Power," wrote Jenkins, "loathes truth revealed. When the public interest is undermined by the lies and paranoia of power, it is disclosure that takes sanity by the scruff of its neck and sets it back on its feet."
  • A final aspect of the story is Julian Assange himself. Is he a visionary? Is he an anarchist? Is he a jerk? This is fun speculation, but why does it have an impact on the value of the WikiLeaks revelations?
Weiye Loh

Some Scientists Fear Computer Chips Will Soon Hit a Wall - NYTimes.com - 0 views

  • The problem has the potential to counteract an important principle in computing that has held true for decades: Moore’s Law. It was Gordon Moore, a founder of Intel, who first predicted that the number of transistors that could be nestled comfortably and inexpensively on an integrated circuit chip would double roughly every two years, bringing exponential improvements in consumer electronics.
  • In their paper, Dr. Burger and fellow researchers simulated the electricity used by more than 150 popular microprocessors and estimated that by 2024 computing speed would increase only 7.9 times, on average. By contrast, if there were no limits on the capabilities of the transistors, the maximum potential speedup would be nearly 47 times, the researchers said.
  • Some scientists disagree, if only because new ideas and designs have repeatedly come along to preserve the computer industry’s rapid pace of improvement. Dr. Dally of Nvidia, for instance, is sanguine about the future of chip design. “The good news is that the old designs are really inefficient, leaving lots of room for innovation,” he said.
  • ...3 more annotations...
  • Shekhar Y. Borkar, a fellow at Intel Labs, called Dr. Burger’s analysis “right on the dot,” but added: “His conclusions are a little different than what my conclusions would have been. The future is not as golden as it used to be, but it’s not bleak either.” Dr. Borkar cited a variety of new design ideas that he said would help ease the limits identified in the paper. Intel recently developed a way to vary the power consumed by different parts of a processor, making it possible to have both slower, lower-power transistors as well as faster-switching ones that consume more power. Increasingly, today’s processor chips contain two or more cores, or central processing units, that make it possible to use multiple programs simultaneously. In the future, Intel computers will have different kinds of cores optimized for different kinds of problems, only some of which require high power.
  • And while Intel announced in May that it had found a way to use 3-D design to crowd more transistors onto a single chip, that technology does not solve the energy problem described in the paper about dark silicon. The authors of the paper said they had tried to account for some of the promised innovation, and they argued that the question was how far innovators could go in overcoming the power limits.
  • “It’s one of those ‘If we don’t innovate, we’re all going to die’ papers,” Dr. Patterson said in an e-mail. “I’m pretty sure it means we need to innovate, since we don’t want to die!”
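Moore's prediction quoted above can be written as a simple doubling formula, transistors(t) = transistors(0) × 2^(t / doubling period). The sketch below is only an illustration of that formula and of the gap the paper describes; the function name and the decade-long horizon are my own choices, not the paper's model.

```python
# Illustrative sketch of Moore's Law as a doubling process:
#   transistors(t) = transistors(0) * 2 ** (years / doubling_period)

def moore_projection(initial_count: float, years: float,
                     doubling_period: float = 2.0) -> float:
    """Project a transistor count forward, assuming it doubles
    every `doubling_period` years (Moore's observation)."""
    return initial_count * 2 ** (years / doubling_period)

# Doubling every 2 years for a decade multiplies density 32-fold.
growth = moore_projection(1.0, 10)
print(f"Density growth over 10 years: {growth:.0f}x")

# The paper's point is that density keeps growing while power limits
# cap the usable speedup: of the ~47x potential speedup by 2024, only
# ~7.9x is projected to be realized.
realized_fraction = 7.9 / 47
print(f"Realized fraction of potential speedup: {realized_fraction:.0%}")
```

The last line makes the "dark silicon" gap concrete: under the paper's estimates, less than a fifth of the transistor-count-driven potential would translate into actual speedup.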
Weiye Loh

Miss Malaysia Toy Boy - 7 views

Yes, commodification has led to liberation. After all, capitalism is all about creating new markets for more production and consumption. Beauty has all along been commodified since the oldest trade...

Weiye Loh

Do avatars have digital rights? - 20 views

hi weiye, i agree with you that this brings in the topic of representation. maybe you should try taking media and representation by Dr. Ingrid to discuss more on this. Going back to your questio...

avatars

Weiye Loh

What is the role of the state? | Martin Wolf's Exchange | FT.com - 0 views

  • This question has concerned western thinkers at least since Plato (5th-4th century BCE). It has also concerned thinkers in other cultural traditions: Confucius (6th-5th century BCE); China’s legalist tradition; and India’s Kautilya (4th-3rd century BCE). The perspective here is that of the contemporary democratic west.
  • The core purpose of the state is protection. This view would be shared by everybody, except anarchists, who believe that the protective role of the state is unnecessary or, more precisely, that people can rely on purely voluntary arrangements.
  • Contemporary Somalia shows the horrors that can befall a stateless society. Yet horrors can also befall a society with an over-mighty state. It is evident, because it is the story of post-tribal humanity, that the powers of the state can be abused for the benefit of those who control it.
  • ...9 more annotations...
  • In his final book, Power and Prosperity, the late Mancur Olson argued that the state was a “stationary bandit”. A stationary bandit is better than a “roving bandit”, because the latter has no interest in developing the economy, while the former does. But it may not be much better, because those who control the state will seek to extract the surplus over subsistence generated by those under their control.
  • In the contemporary west, there are three protections against undue exploitation by the stationary bandit: exit, voice (on the first two of these, see this on Albert Hirschman) and restraint. By “exit”, I mean the possibility of escaping from the control of a given jurisdiction, by emigration, capital flight or some form of market exchange. By “voice”, I mean a degree of control over the state, most obviously by voting. By “restraint”, I mean independent courts, division of powers, federalism and entrenched rights.
  • defining what a democratic state, viewed precisely as such a constrained protective arrangement, is entitled to do.
  • There exists a strand in classical liberal or, in contemporary US parlance, libertarian thought which believes the answer is to define the role of the state so narrowly and the rights of individuals so broadly that many political choices (the income tax or universal health care, for example) would be ruled out a priori. In other words, it seeks to abolish much of politics through constitutional restraints. I view this as a hopeless strategy, both intellectually and politically. It is hopeless intellectually, because the values people hold are many and divergent and some of these values do not merely allow, but demand, government protection of weak, vulnerable or unfortunate people. Moreover, such values are not “wrong”. The reality is that people hold many, often incompatible, core values. Libertarians argue that the only relevant wrong is coercion by the state. Others disagree and are entitled to do so. It is hopeless politically, because democracy necessitates debate among widely divergent opinions. Trying to rule out a vast range of values from the political sphere by constitutional means will fail. Under enough pressure, the constitution itself will be changed, via amendment or reinterpretation.
  • So what ought the protective role of the state to include? Again, in such a discussion, classical liberals would argue for the “night-watchman” role. The government’s responsibilities are limited to protecting individuals from coercion, fraud and theft and to defending the country from foreign aggression. Yet once one has accepted the legitimacy of using coercion (taxation) to provide the goods listed above, there is no reason in principle why one should not accept it for the provision of other goods that cannot be provided as well, or at all, by non-political means.
  • Those other measures would include addressing a range of externalities (e.g. pollution), providing information and supplying insurance against otherwise uninsurable risks, such as unemployment, spousal abandonment and so forth. The subsidisation or public provision of childcare and education is a way to promote equality of opportunity. The subsidisation or public provision of health insurance is a way to preserve life, unquestionably one of the purposes of the state. Safety standards are a way to protect people against the carelessness or malevolence of others or (more controversially) themselves. All these, then, are legitimate protective measures. The more complex the society and economy, the greater the range of the protections that will be sought.
  • What, then, are the objections to such actions? The answers might be: the proposed measures are ineffective, compared with what would happen in the absence of state intervention; the measures are unaffordable and might lead to state bankruptcy; the measures encourage irresponsible behaviour; and, at the limit, the measures restrict individual autonomy to an unacceptable degree. These are all, we should note, questions of consequences.
  • The vote is more evenly distributed than wealth and income. Thus, one would expect the tenor of democratic policymaking to be redistributive and so, indeed, it is. Those with wealth and income to protect will then make political power expensive to acquire and encourage potential supporters to focus on common enemies (inside and outside the country) and on cultural values. The more unequal are incomes and wealth and the more determined are the “haves” to avoid being compelled to support the “have-nots”, the more politics will take on such characteristics.
  • In the 1970s, the view that democracy would collapse under the weight of its excessive promises seemed to me disturbingly true. I am no longer convinced of this: as Adam Smith said, “There is a great deal of ruin in a nation”. Moreover, the capacity for learning by democracies is greater than I had realised. The conservative movements of the 1980s were part of that learning. But they went too far in their confidence in market arrangements and their indifference to the social and political consequences of inequality. I would support state pensions, state-funded health insurance and state regulation of environmental and other externalities. I am happy to debate details. The ancient Athenians called someone who had a purely private life “idiotes”. This is, of course, the origin of our word “idiot”. Individual liberty does indeed matter. But it is not the only thing that matters. The market is a remarkable social institution. But it is far from perfect. Democratic politics can be destructive. But it is much better than the alternatives. Each of us has an obligation, as a citizen, to make politics work as well as he (or she) can and to embrace the debate over a wide range of difficult choices that this entails.
  •  
    What is the role of the state?
Weiye Loh

Eben Moglen Is Reshaping Internet With a Freedom Box - NYTimes.com - 0 views

  • As Secretary of State Hillary Rodham Clinton spoke in Washington about the Internet and human liberty, a Columbia law professor in Manhattan, Eben Moglen, was putting together a shopping list to rebuild the Internet — this time, without governments and big companies able to watch every twitch of our fingers.
  • The list begins with “cheap, small, low-power plug servers,” Mr. Moglen said. “A small device the size of a cellphone charger, running on a low-power chip. You plug it into the wall and forget about it.”
  • Almost anyone could have one of these tiny servers, which are now produced for limited purposes but could be adapted to a full range of Internet applications, he said. “They will get very cheap, very quick,” Mr. Moglen said. “They’re $99; they will go to $69. Once everyone is getting them, they will cost $29.”
  • ...5 more annotations...
  • The missing ingredients are software packages, which are available at no cost but have to be made easy to use. “You would have a whole system with privacy and security built in for the civil world we are living in,” he said. “It stores everything you care about.” Put free software into the little plug server in the wall, and you would have a Freedom Box that would decentralize information and power, Mr. Moglen said. This month, he created the Freedom Box Foundation to organize the software.
  • In the first days of the personal computer era, many scoffed at the idea that free software could have an important place in the modern world. Today, it is the digital genome for millions of phones, printers, cameras, MP3 players, televisions, the Pentagon, the New York Stock Exchange and the computers that underpin Google’s empire.
  • Social networking has changed the balance of political power, he said, “but everything we know about technology tells us that the current forms of social network communication, despite their enormous current value for politics, are also intensely dangerous to use. They are too centralized; they are too vulnerable to state retaliation and control.”
  • investors were said to have put a value of about $50 billion on Facebook, the social network founded by Mark Zuckerberg. If revolutions for freedom rest on the shoulders of Facebook, Mr. Moglen said, the revolutionaries will have to count on individuals who have huge stakes in keeping the powerful happy.
  • “It is not hard, when everybody is just in one big database controlled by Mr. Zuckerberg, to decapitate a revolution by sending an order to Mr. Zuckerberg that he cannot afford to refuse,” Mr. Moglen said. By contrast, with tens of thousands of individual encrypted servers, there would be no one place where a repressive government could find out who was publishing or reading “subversive” material.
Weiye Loh

In Japan, a Culture That Promotes Nuclear Dependency - NYTimes.com - 0 views

  • look no further than the Fukada Sports Park, which serves the 7,500 mostly older residents here with a baseball diamond, lighted tennis courts, a soccer field and a $35 million gymnasium with indoor pool and Olympic-size volleyball arena. The gym is just one of several big public works projects paid for with the hundreds of millions of dollars this community is receiving for acce…
  • the aid has enriched rural communities that were rapidly losing jobs and people to the cities. With no substantial reserves of oil or coal, Japan relies on nuclear power for the energy needed to drive its economic machine. But critics contend that the largess has also made communities dependent on central government spending — and thus unwilling to rock the boat by pushing for robust safety measures.
  • Tsuneyoshi Adachi, a 63-year-old fisherman, joined the huge protests in the 1970s and 1980s against the plant’s No. 2 reactor. He said many fishermen were angry then because chlorine from the pumps of the plant’s No. 1 reactor, which began operating in 1974, was killing seaweed and fish in local fishing grounds. However, Mr. Adachi said, once compensation payments from the No. 2 reactor began to flow in, neighbors began to give him cold looks and then ignore him. By the time the No. 3 reactor was proposed in the early 1990s, no one, including Mr. Adachi, was willing to speak out against the plant. He said that there was the same peer pressure even after the accident at Fukushima, which scared many here because they live within a few miles of the Shimane plant. “Sure, we are all worried in our hearts about whether the same disaster could happen at the Shimane nuclear plant,” Mr. Adachi said. However, “the town knows it can no longer survive economically without the nuclear plant.”
  • ...1 more annotation...
  • Much of this flow of cash was the product of the Three Power Source Development Laws, a sophisticated system of government subsidies created in 1974 by Kakuei Tanaka, the powerful prime minister who shaped Japan’s nuclear power landscape and used big public works projects to build postwar Japan’s most formidable political machine. The law required all Japanese power consumers to pay, as part of their utility bills, a tax that was funneled to communities with nuclear plants. Officials at the Ministry of Economy, Trade and Industry, which regulates the nuclear industry, and oversees the subsidies, refused to specify how much communities have come to rely on those subsidies. “This is money to promote the locality’s acceptance of a nuclear plant,” said Tatsumi Nakano of the ministry’s Agency for Natural Resources and Energy.
Weiye Loh

Does "Inclusion" Matter for Open Government? (The Answer Is, Very Much Indeed... - 0 views

  • But in the context of the Open Government Partnership and the 70 or so countries that have already committed themselves to this or are in the process, I’m not sure that the world can afford to wait to see whether this correlation is direct, indirect or spurious, especially if we can recognize that in the world of OGP, the currency of accumulation and concentration is not raw economic wealth but rather raw political power.
  • In the same way as there appears to be an association between the rise of the Internet and increasing concentrations of wealth, one might anticipate that the rise of Internet-enabled structures of government might be associated with the increasing concentration of political power in fewer and fewer hands, and particularly the hands of those most adept at manipulating the artifacts and symbols of the new Internet age.
  • I am struck by the fact that while the OGP over and over talks about the importance and value and need for Open Government, there is no similar or even partial call for Inclusive Government. I’ve argued elsewhere that “Open”, in the absence of attention being paid to ensuring the pre-conditions for the broadest base of participation, will almost inevitably lead to the empowerment of the powerful. What I fear with the OGP is that, by not paying even a modicum of attention to the issue of inclusion or inclusive development and participation, all of the idealism and energy that is displayed today in Brasilia is being directed towards the creation of the governance equivalents of the Internet billionaires, whatever that might look like.
  • ...1 more annotation...
  • crowd sourced public policy
  •  
    Alongside the rise of the Internet and the empowerment of the Internet generation have emerged the greatest inequalities of wealth and privilege that any of the increasingly Internet-enabled economies/societies have experienced, at least since the Great Depression and perhaps since the beginnings of systematic economic record keeping. The association between the rise of inequality and the rise of the Internet has not yet been explained, and it may simply be a coincidence, but somehow I'm doubtful, and we await a newer generation of rather more critical and less dewy-eyed economists to give us the models and explanations for this co-evolution.
Weiye Loh

Paul Crowley's Blog - A survey of anti-cryonics writing - 0 views

  • To its advocates, cryonics offers almost eternal life. To its critics, cryonics is pseudoscience; the idea that we could freeze someone today in such a way that future technology might be able to re-animate them is nothing more than wishful thinking driven by the desire to avoid death. Many who battle nonsense dressed as science have spoken out against it: see for example Nano Nonsense and Cryonics, a 2001 article by celebrated skeptic Michael Shermer; or check the Skeptic’s Dictionary or Quackwatch entries on the subject, or for more detail read the essay Cryonics–A futile desire for everlasting life by “Invisible Flan”.
  • And of course the pro-cryonics people have written reams and reams of material such as Ben Best’s Scientific Justification of Cryonics Practice on why they think this is exactly as plausible as I might think, and going into tremendous technical detail setting out arguments for its plausibility and addressing particular difficulties. It’s almost enough to make you want to sign up on the spot. Except, of course, that plenty of totally unscientific ideas are backed by reams of scientific-sounding documents good enough to fool non-experts like me. Backed by the deep pockets of the oil industry, global warming denialism has produced thousands of convincing-sounding arguments against the scientific consensus on CO2 and AGW.
  • Nano Nonsense and Cryonics goes for the nitty-gritty right away in the opening paragraph: “To see the flaw in this system, thaw out a can of frozen strawberries. During freezing, the water within each cell expands, crystallizes, and ruptures the cell membranes. When defrosted, all the intracellular goo oozes out, turning your strawberries into runny mush. This is your brain on cryonics.” This sounds convincing, but doesn’t address what cryonicists actually claim. Ben Best, President and CEO of the Cryonics Institute, replies in the comments: “Strawberries (and mammalian tissues) are not turned to mush by freezing because water expands and crystallizes inside the cells. Water crystallizes in the extracellular space because more nucleators are found extracellularly. As water crystallizes in the extracellular space, the extracellular salt concentration increases causing cells to lose water osmotically and shrink. Ultimately the cell membranes are broken by crushing from extracellular ice and/or high extracellular salt concentration. […] Cryonics organizations use vitrification perfusion before cooling to cryogenic temperatures. With good brain perfusion, vitrification can reduce ice formation to negligible amounts.”
  • ...6 more annotations...
  • The Skeptic’s Dictionary entry is no advance. Again, it refers erroneously to a “mushy brain”. It points out that the technology to reanimate those in storage does not already exist, but provides no help for us non-experts in assessing whether it is a plausible future technology, like super-fast computers or fusion power, or whether it is as crazy as the sand-powered tank; it simply asserts baldly and to me counterintuitively that it is the latter. Again, perhaps cryonic reanimation is a sand-powered tank, but I can explain to you why a sand-powered tank is implausible if you don’t already know, and if cryonics is in the same league I’d appreciate hearing the explanation.
  • Another part of the article points out the well-known difficulties with whole-body freezing — because the focus is on achieving the best possible preservation of the brain, other parts suffer more. But the reason why the brain is the focus is that you can afford to be a lot bolder in repairing other parts of the body — unlike the brain, if my liver doesn’t survive the freezing, it can be replaced altogether.
  • Further, the article ignores one of the most promising possibilities for reanimation, that of scanning and whole-brain emulation, a route that requires some big advances in computer and scanning technology as well as our understanding of the lowest levels of the brain’s function, but which completely sidesteps any problems with repairing either damage from the freezing process or whatever it was that led to legal death.
  • Sixteen years later, it seems that hasn’t changed; in fact, as far as the issue of technical feasibility goes it is starting to look as if on all the Earth, or at least all the Internet, there is not one person who has ever taken the time to read and understand cryonics claims in any detail, still considers it pseudoscience, and has written a paper, article or even a blog post to rebut anything that cryonics advocates actually say. In fact, the best of the comments on my first blog post on the subject are already a higher standard than anything my searches have turned up.
  • I don’t have anything useful to add, I just wanted to say that I feel exactly as you do about cryonics and living forever. And I thought that this statement: I know that I don’t know enough to judge. shows extreme wisdom. If only people wishing to comment on global warming would apply the same test.
  • WRT global warming, the mistake people make is trying to go direct to the first-order evidence, which is much too complicated and too easy to misrepresent to hope to directly interpret unless you make it your life’s work, and even then only in a particular area. The correct thing to do is to collect second-order evidence, such as that every major scientific academy has backed the IPCC.
    • Weiye Loh
       
      First-order evidence vs second-order evidence...
  •  
    Cryonics
Weiye Loh

How We Know by Freeman Dyson | The New York Review of Books - 0 views

  • Another example illustrating the central dogma is the French optical telegraph.
  • The telegraph was an optical communication system with stations consisting of large movable pointers mounted on the tops of sixty-foot towers. Each station was manned by an operator who could read a message transmitted by a neighboring station and transmit the same message to the next station in the transmission line.
  • The distance between neighbors was about seven miles. Along the transmission lines, optical messages in France could travel faster than drum messages in Africa. When Napoleon took charge of the French Republic in 1799, he ordered the completion of the optical telegraph system to link all the major cities of France from Calais and Paris to Toulon and onward to Milan. The telegraph became, as Claude Chappe had intended, an important instrument of national power. Napoleon made sure that it was not available to private users.
  • ...27 more annotations...
  • Unlike the drum language, which was based on spoken language, the optical telegraph was based on written French. Chappe invented an elaborate coding system to translate written messages into optical signals. Chappe had the opposite problem from the drummers. The drummers had a fast transmission system with ambiguous messages. They needed to slow down the transmission to make the messages unambiguous. Chappe had a painfully slow transmission system with redundant messages. The French language, like most alphabetic languages, is highly redundant, using many more letters than are needed to convey the meaning of a message. Chappe’s coding system allowed messages to be transmitted faster. Many common phrases and proper names were encoded by only two optical symbols, with a substantial gain in speed of transmission. The composer and the reader of the message had code books listing the message codes for eight thousand phrases and names. For Napoleon it was an advantage to have a code that was effectively cryptographic, keeping the content of the messages secret from citizens along the route.
  • After these two historical examples of rapid communication in Africa and France, the rest of Gleick’s book is about the modern development of information technology.
  • The modern history is dominated by two Americans, Samuel Morse and Claude Shannon. Samuel Morse was the inventor of Morse Code. He was also one of the pioneers who built a telegraph system using electricity conducted through wires instead of optical pointers deployed on towers. Morse launched his electric telegraph in 1838 and perfected the code in 1844. His code used short and long pulses of electric current to represent letters of the alphabet.
  • Morse was ideologically at the opposite pole from Chappe. He was not interested in secrecy or in creating an instrument of government power. The Morse system was designed to be a profit-making enterprise, fast and cheap and available to everybody. At the beginning the price of a message was a quarter of a cent per letter. The most important users of the system were newspaper correspondents spreading news of local events to readers all over the world. Morse Code was simple enough that anyone could learn it. The system provided no secrecy to the users. If users wanted secrecy, they could invent their own secret codes and encipher their messages themselves. The price of a message in cipher was higher than the price of a message in plain text, because the telegraph operators could transcribe plain text faster. It was much easier to correct errors in plain text than in cipher.
  • Claude Shannon was the founding father of information theory. For a hundred years after the electric telegraph, other communication systems such as the telephone, radio, and television were invented and developed by engineers without any need for higher mathematics. Then Shannon supplied the theory to understand all of these systems together, defining information as an abstract quantity inherent in a telephone message or a television picture. Shannon brought higher mathematics into the game.
  • When Shannon was a boy growing up on a farm in Michigan, he built a homemade telegraph system using Morse Code. Messages were transmitted to friends on neighboring farms, using the barbed wire of their fences to conduct electric signals. When World War II began, Shannon became one of the pioneers of scientific cryptography, working on the high-level cryptographic telephone system that allowed Roosevelt and Churchill to talk to each other over a secure channel. Shannon’s friend Alan Turing was also working as a cryptographer at the same time, in the famous British Enigma project that successfully deciphered German military codes. The two pioneers met frequently when Turing visited New York in 1943, but they belonged to separate secret worlds and could not exchange ideas about cryptography.
  • In 1945 Shannon wrote a paper, “A Mathematical Theory of Cryptography,” which was stamped SECRET and never saw the light of day. He published in 1948 an expurgated version of the 1945 paper with the title “A Mathematical Theory of Communication.” The 1948 version appeared in the Bell System Technical Journal, the house journal of the Bell Telephone Laboratories, and became an instant classic. It is the founding document for the modern science of information. After Shannon, the technology of information raced ahead, with electronic computers, digital cameras, the Internet, and the World Wide Web.
  • According to Gleick, the impact of information on human affairs came in three installments: first the history, the thousands of years during which people created and exchanged information without the concept of measuring it; second the theory, first formulated by Shannon; third the flood, in which we now live.
  • The event that made the flood plainly visible occurred in 1965, when Gordon Moore stated Moore’s Law. Moore was an electrical engineer, founder of the Intel Corporation, a company that manufactured components for computers and other electronic gadgets. His law said that the price of electronic components would decrease and their numbers would increase by a factor of two every eighteen months. This implied that the price would decrease and the numbers would increase by a factor of a hundred every decade. Moore’s prediction of continued growth has turned out to be astonishingly accurate during the forty-five years since he announced it. In these four and a half decades, the price has decreased and the numbers have increased by a factor of a billion, nine powers of ten. Nine powers of ten are enough to turn a trickle into a flood.
  • Gordon Moore was in the hardware business, making hardware components for electronic machines, and he stated his law as a law of growth for hardware. But the law applies also to the information that the hardware is designed to embody. The purpose of the hardware is to store and process information. The storage of information is called memory, and the processing of information is called computing. The consequence of Moore’s Law for information is that the price of memory and computing decreases and the available amount of memory and computing increases by a factor of a hundred every decade. The flood of hardware becomes a flood of information.
  • In 1949, one year after Shannon published the rules of information theory, he drew up a table of the various stores of memory that then existed. The biggest memory in his table was the US Library of Congress, which he estimated to contain one hundred trillion bits of information. That was at the time a fair guess at the sum total of recorded human knowledge. Today a memory disc drive storing that amount of information weighs a few pounds and can be bought for about a thousand dollars. Information, otherwise known as data, pours into memories of that size or larger, in government and business offices and scientific laboratories all over the world. Gleick quotes the computer scientist Jaron Lanier describing the effect of the flood: “It’s as if you kneel to plant the seed of a tree and it grows so fast that it swallows your whole town before you can even rise to your feet.”
  • On December 8, 2010, Gleick published on The New York Review’s blog an illuminating essay, “The Information Palace.” It was written too late to be included in his book. It describes the historical changes of meaning of the word “information,” as recorded in the latest quarterly online revision of the Oxford English Dictionary. The word first appears in a 1386 parliamentary report with the meaning “denunciation.” The history ends with the modern usage, “information fatigue,” defined as “apathy, indifference or mental exhaustion arising from exposure to too much information.”
  • The consequences of the information flood are not all bad. One of the creative enterprises made possible by the flood is Wikipedia, started ten years ago by Jimmy Wales. Among my friends and acquaintances, everybody distrusts Wikipedia and everybody uses it. Distrust and productive use are not incompatible. Wikipedia is the ultimate open source repository of information. Everyone is free to read it and everyone is free to write it. It contains articles in 262 languages written by several million authors. The information that it contains is totally unreliable and surprisingly accurate. It is often unreliable because many of the authors are ignorant or careless. It is often accurate because the articles are edited and corrected by readers who are better informed than the authors.
  • Jimmy Wales hoped when he started Wikipedia that the combination of enthusiastic volunteer writers with open source information technology would cause a revolution in human access to knowledge. The rate of growth of Wikipedia exceeded his wildest dreams. Within ten years it has become the biggest storehouse of information on the planet and the noisiest battleground of conflicting opinions. It illustrates Shannon’s law of reliable communication. Shannon’s law says that accurate transmission of information is possible in a communication system with a high level of noise. Even in the noisiest system, errors can be reliably corrected and accurate information transmitted, provided that the transmission is sufficiently redundant. That is, in a nutshell, how Wikipedia works.
  • The information flood has also brought enormous benefits to science. The public has a distorted view of science, because children are taught in school that science is a collection of firmly established truths. In fact, science is not a collection of truths. It is a continuing exploration of mysteries. Wherever we go exploring in the world around us, we find mysteries. Our planet is covered by continents and oceans whose origin we cannot explain. Our atmosphere is constantly stirred by poorly understood disturbances that we call weather and climate. The visible matter in the universe is outweighed by a much larger quantity of dark invisible matter that we do not understand at all. The origin of life is a total mystery, and so is the existence of human consciousness. We have no clear idea how the electrical discharges occurring in nerve cells in our brains are connected with our feelings and desires and actions.
  • Even physics, the most exact and most firmly established branch of science, is still full of mysteries. We do not know how much of Shannon’s theory of information will remain valid when quantum devices replace classical electric circuits as the carriers of information. Quantum devices may be made of single atoms or microscopic magnetic circuits. All that we know for sure is that they can theoretically do certain jobs that are beyond the reach of classical devices. Quantum computing is still an unexplored mystery on the frontier of information theory. Science is the sum total of a great multitude of mysteries. It is an unending argument between a great multitude of voices. It resembles Wikipedia much more than it resembles the Encyclopaedia Britannica.
  • The rapid growth of the flood of information in the last ten years made Wikipedia possible, and the same flood made twenty-first-century science possible. Twenty-first-century science is dominated by huge stores of information that we call databases. The information flood has made it easy and cheap to build databases. One example of a twenty-first-century database is the collection of genome sequences of living creatures belonging to various species from microbes to humans. Each genome contains the complete genetic information that shaped the creature to which it belongs. The genome database is rapidly growing and is available for scientists all over the world to explore. Its origin can be traced to the year 1939, when Shannon wrote his Ph.D. thesis with the title “An Algebra for Theoretical Genetics.”
  • Shannon was then a graduate student in the mathematics department at MIT. He was only dimly aware of the possible physical embodiment of genetic information. The true physical embodiment of the genome is the double helix structure of DNA molecules, discovered by Francis Crick and James Watson fourteen years later. In 1939 Shannon understood that the basis of genetics must be information, and that the information must be coded in some abstract algebra independent of its physical embodiment. Without any knowledge of the double helix, he could not hope to guess the detailed structure of the genetic code. He could only imagine that in some distant future the genetic information would be decoded and collected in a giant database that would define the total diversity of living creatures. It took only sixty years for his dream to come true.
  • In the twentieth century, genomes of humans and other species were laboriously decoded and translated into sequences of letters in computer memories. The decoding and translation became cheaper and faster as time went on, the price decreasing and the speed increasing according to Moore’s Law. The first human genome took fifteen years to decode and cost about a billion dollars. Now a human genome can be decoded in a few weeks and costs a few thousand dollars. Around the year 2000, a turning point was reached, when it became cheaper to produce genetic information than to understand it. Now we can pass a piece of human DNA through a machine and rapidly read out the genetic information, but we cannot read out the meaning of the information. We shall not fully understand the information until we understand in detail the processes of embryonic development that the DNA orchestrated to make us what we are.
  • The explosive growth of information in our human society is a part of the slower growth of ordered structures in the evolution of life as a whole. Life has for billions of years been evolving with organisms and ecosystems embodying increasing amounts of information. The evolution of life is a part of the evolution of the universe, which also evolves with increasing amounts of information embodied in ordered structures, galaxies and stars and planetary systems. In the living and in the nonliving world, we see a growth of order, starting from the featureless and uniform gas of the early universe and producing the magnificent diversity of weird objects that we see in the sky and in the rain forest. Everywhere around us, wherever we look, we see evidence of increasing order and increasing information. The technology arising from Shannon’s discoveries is only a local acceleration of the natural growth of information.
  • Lord Kelvin, one of the leading physicists of that time, promoted the heat death dogma, predicting that the flow of heat from warmer to cooler objects will result in a decrease of temperature differences everywhere, until all temperatures ultimately become equal. Life needs temperature differences to avoid being stifled by its waste heat. So life will disappear.
  • Thanks to the discoveries of astronomers in the twentieth century, we now know that the heat death is a myth. The heat death can never happen, and there is no paradox. The best popular account of the disappearance of the paradox is a chapter, “How Order Was Born of Chaos,” in the book Creation of the Universe, by Fang Lizhi and his wife Li Shuxian.2 Fang Lizhi is doubly famous as a leading Chinese astronomer and a leading political dissident. He is now pursuing his double career at the University of Arizona.
  • The belief in a heat death was based on an idea that I call the cooking rule. The cooking rule says that a piece of steak gets warmer when we put it on a hot grill. More generally, the rule says that any object gets warmer when it gains energy, and gets cooler when it loses energy. Humans have been cooking steaks for thousands of years, and nobody ever saw a steak get colder while cooking on a fire. The cooking rule is true for objects small enough for us to handle. If the cooking rule is always true, then Lord Kelvin’s argument for the heat death is correct.
  • the cooking rule is not true for objects of astronomical size, for which gravitation is the dominant form of energy. The sun is a familiar example. As the sun loses energy by radiation, it becomes hotter and not cooler. Since the sun is made of compressible gas squeezed by its own gravitation, loss of energy causes it to become smaller and denser, and the compression causes it to become hotter. For almost all astronomical objects, gravitation dominates, and they have the same unexpected behavior. Gravitation reverses the usual relation between energy and temperature. In the domain of astronomy, when heat flows from hotter to cooler objects, the hot objects get hotter and the cool objects get cooler. As a result, temperature differences in the astronomical universe tend to increase rather than decrease as time goes on. There is no final state of uniform temperature, and there is no heat death. Gravitation gives us a universe hospitable to life. Information and order can continue to grow for billions of years in the future, as they have evidently grown in the past.
  • The vision of the future as an infinite playground, with an unending sequence of mysteries to be understood by an unending sequence of players exploring an unending supply of information, is a glorious vision for scientists. Scientists find the vision attractive, since it gives them a purpose for their existence and an unending supply of jobs. The vision is less attractive to artists and writers and ordinary people. Ordinary people are more interested in friends and family than in science. Ordinary people may not welcome a future spent swimming in an unending flood of information.
  • A darker view of the information-dominated universe was described in a famous story, “The Library of Babel,” by Jorge Luis Borges in 1941.3 Borges imagined his library, with an infinite array of books and shelves and mirrors, as a metaphor for the universe.
  • Gleick’s book has an epilogue entitled “The Return of Meaning,” expressing the concerns of people who feel alienated from the prevailing scientific culture. The enormous success of information theory came from Shannon’s decision to separate information from meaning. His central dogma, “Meaning is irrelevant,” declared that information could be handled with greater freedom if it was treated as a mathematical abstraction independent of meaning. The consequence of this freedom is the flood of information in which we are drowning. The immense size of modern databases gives us a feeling of meaninglessness. Information in such quantities reminds us of Borges’s library extending infinitely in all directions. It is our task as humans to bring meaning back into this wasteland. As finite creatures who think and feel, we can create islands of meaning in the sea of information. Gleick ends his book with Borges’s image of the human condition: “We walk the corridors, searching the shelves and rearranging them, looking for lines of meaning amid leagues of cacophony and incoherence, reading the history of the past and of the future, collecting our thoughts and collecting the thoughts of others, and every so often glimpsing mirrors, in which we may recognize creatures of the information.”
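The claim above that the sun grows hotter as it radiates energy away has a standard textbook derivation via the virial theorem. The sketch below is added for context, not taken from Dyson's review, and assumes an ideal monatomic gas:

```latex
% Virial theorem for a bound, self-gravitating gas:
2K + U = 0 \quad\Longrightarrow\quad E = K + U = -K
% For an ideal monatomic gas, kinetic energy tracks temperature:
K = \tfrac{3}{2} N k_B T
\quad\Longrightarrow\quad
E = -\tfrac{3}{2} N k_B T
\quad\Longrightarrow\quad
\frac{dE}{dT} = -\tfrac{3}{2} N k_B < 0
% Negative heat capacity: radiating energy away (dE < 0) raises T.
```

This negative heat capacity is exactly the reversal of the "cooking rule": for gravity-dominated objects, losing energy means contracting, compressing, and heating up.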
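Morse's short-and-long-pulse alphabet described in the annotations above is easy to sketch in code. The table below is a small excerpt of International Morse (a later standardization, not Morse's original 1844 American code), included purely as an illustration:

```python
# Excerpt of the International Morse table: short pulse ".", long pulse "-".
MORSE = {
    "A": ".-", "B": "-...", "C": "-.-.", "D": "-..", "E": ".",
    "M": "--", "O": "---", "R": ".-.", "S": "...", "T": "-",
}

def to_morse(text):
    """Encode text as pulse patterns: letters separated by spaces,
    words by ' / '."""
    words = text.upper().split()
    return " / ".join(" ".join(MORSE[ch] for ch in word) for word in words)

print(to_morse("SOS"))  # ... --- ...
```

Note how common letters (E, T) get the shortest patterns, the same economy Chappe pursued with his two-symbol phrase codes.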
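The growth figures quoted above from Moore's Law — a doubling every eighteen months compounding to roughly a hundredfold per decade and a billionfold over forty-five years — and Shannon's hundred-trillion-bit estimate for the Library of Congress can be checked with a few lines of arithmetic:

```python
def moores_law_factor(years, doubling_months=18):
    """Growth factor after `years`, doubling once every `doubling_months`."""
    return 2 ** (years * 12 / doubling_months)

per_decade = moores_law_factor(10)     # 2**(120/18), roughly 100
over_45_years = moores_law_factor(45)  # 2**30, about a billion

# Shannon's 1949 Library of Congress estimate: 100 trillion bits,
# i.e. 12.5 terabytes -- a single modest disc drive today.
library_of_congress_tb = 100e12 / 8 / 1e12
```

Nine powers of ten over four and a half decades is exactly the "trickle into a flood" the passage describes.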
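Shannon's law of reliable communication, which the Wikipedia annotation above credits for accurate transmission through noise, can be illustrated with the simplest redundancy scheme: a repetition code with majority-vote decoding. This is an illustrative toy, not anything from Gleick's book:

```python
def encode(bits, r=3):
    """Repetition code: transmit each bit r times."""
    return [bit for bit in bits for _ in range(r)]

def decode(received, r=3):
    """Majority vote over each group of r repeats."""
    return [int(sum(received[i:i + r]) > r // 2)
            for i in range(0, len(received), r)]

message = [1, 0, 1, 1, 0]
sent = encode(message)
# Corrupt one repeat in every group of three -- a noisy channel.
corrupted = [b ^ (i % 3 == 0) for i, b in enumerate(sent)]
assert decode(corrupted) == message  # redundancy corrects every error
```

The cost of reliability is bandwidth: three symbols sent per bit of message. Wikipedia's "redundancy" is its many readers re-checking each article, playing the role of the repeated symbols.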
Weiye Loh

Cancer resembles life 1 billion years ago, say astrobiologists - microbiology, genomics... - 0 views

  • astrobiologists, working with oncologists in the US, have suggested that cancer resembles ancient forms of life that flourished between 600 million and 1 billion years ago.
  • The genes that controlled the behaviour of these early multicellular organisms still reside within our own cells, managed by more recent genes that keep them in check.It's when these newer controlling genes fail that the older mechanisms take over, and the cell reverts to its earlier behaviours and grows out of control.
  • ...11 more annotations...
  • The new theory, published in the journal Physical Biology, has been put forward by two leading figures in the world of cosmology and astrobiology: Paul Davies, director of the Beyond Center for Fundamental Concepts in Science, Arizona State University; and Charles Lineweaver, from the Australian National University.
  • According to Lineweaver, this suggests that cancer is an atavism, or an evolutionary throwback.
  • In the paper, they suggest that a close look at cancer shows similarities with early forms of multicellular life.
  • “Unlike bacteria and viruses, cancer has not developed the capacity to evolve into new forms. In fact, cancer is better understood as the reversion of cells to the way they behaved a little over one billion years ago, when humans were nothing more than loose-knit colonies of only partially differentiated cells. “We think that the tumours that develop in cancer patients today take the same form as these simple cellular structures did more than a billion years ago,” he said.
  • One piece of evidence to support this theory is that cancers appear in virtually all metazoans, with the notable exception of the bizarre naked mole rat. "This quasi-ubiquity suggests that the mechanisms of cancer are deep-rooted in evolutionary history, a conjecture that receives support from both paleontology and genetics," they write.
  • the genes that controlled this early multi-cellular form of life are like a computer operating system's 'safe mode', and when there are failures or mutations in the more recent genes that manage the way cells specialise and interact to form the complex life of today, then the earlier level of programming takes over.
  • Their notion is in contrast to a prevailing theory that cancer cells are 'rogue' cells that evolve rapidly within the body, overcoming the normal slew of cellular defences.
  • However, Davies and Lineweaver point out that cancer cells are highly cooperative with each other, if competing with the host's cells. This suggests a pre-existing complexity that is reminiscent of early multicellular life.
  • cancers' manifold survival mechanisms are predictable, and unlikely to emerge spontaneously through evolution within each individual in such a consistent way.
  • The good news is that this means combating cancer is not necessarily as complex as it would be if cancers were rogue cells evolving new and novel defence mechanisms within the body. Instead, because cancers fall back on the same evolved mechanisms that were used by early life, we can expect them to remain predictable: if they're susceptible to a treatment, it's unlikely they'll evolve new ways to get around it.
  • "If the atavism hypothesis is correct, there are new reasons for optimism," they write.
Weiye Loh

Fukushima: The End of the Nuclear Renaissance? - Ecocentric - TIME.com - 0 views

  •  
    The environmental movement has a strange historical relationship with nuclear power. In many countries, opposition to nuclear reactors in the wake of Chernobyl gave rise to major Green political parties. Many environmentalists still oppose nuclear power--Greenpeace, for example, still calls for the phase-out of all reactors. But as climate change has taken over the Green agenda, other environmentalists have come to favor nuclear as part of a low-carbon energy mix. It was this confluence of factors--fading memories of Chernobyl and increased concern about greenhouse gases--that gave the nuclear industry such confidence just a few years ago. That confidence will have been deeply shaken by events in Japan.
Weiye Loh

Science, Strong Inference -- Proper Scientific Method - 0 views

  • Scientists these days tend to keep up a polite fiction that all science is equal. Except for the work of the misguided opponent whose arguments we happen to be refuting at the time, we speak as though every scientist's field and methods of study are as good as every other scientist's and perhaps a little better. This keeps us all cordial when it comes to recommending each other for government grants.
  • Why should there be such rapid advances in some fields and not in others? I think the usual explanations that we tend to think of - such as the tractability of the subject, or the quality or education of the men drawn into it, or the size of research contracts - are important but inadequate. I have begun to believe that the primary factor in scientific advance is an intellectual one. These rapidly moving fields are fields where a particular method of doing scientific research is systematically used and taught, an accumulative method of inductive inference that is so effective that I think it should be given the name of "strong inference." I believe it is important to examine this method, its use and history and rationale, and to see whether other groups and individuals might learn to adopt it profitably in their own scientific and intellectual work. In its separate elements, strong inference is just the simple and old-fashioned method of inductive inference that goes back to Francis Bacon. The steps are familiar to every college student and are practiced, off and on, by every scientist. The difference comes in their systematic application. Strong inference consists of applying the following steps to every problem in science, formally and explicitly and regularly: 1) Devising alternative hypotheses; 2) Devising a crucial experiment (or several of them), with alternative possible outcomes, each of which will, as nearly as possible, exclude one or more of the hypotheses; 3) Carrying out the experiment so as to get a clean result; 4) Recycling the procedure, making subhypotheses or sequential hypotheses to refine the possibilities that remain, and so on.
  • On any new problem, of course, inductive inference is not as simple and certain as deduction, because it involves reaching out into the unknown. Steps 1 and 2 require intellectual inventions, which must be cleverly chosen so that hypothesis, experiment, outcome, and exclusion will be related in a rigorous syllogism; and the question of how to generate such inventions is one which has been extensively discussed elsewhere (2, 3). What the formal schema reminds us to do is to try to make these inventions, to take the next step, to proceed to the next fork, without dawdling or getting tied up in irrelevancies.
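Platt's four steps amount to a filter that repeatedly discards hypotheses. As a toy sketch of that loop (the hypotheses, the single observation, and every name below are invented for illustration, not taken from the paper):

```python
# Toy model of strong inference: each "clean result" excludes every
# hypothesis it contradicts; whatever survives is recycled into the
# next round. Hypotheses and observations here are invented examples.

def strong_inference(hypotheses, observations):
    """Return the labels of hypotheses consistent with every observation."""
    survivors = dict(hypotheses)
    for obs in observations:
        # Steps 2-3: a crucial result excludes the hypotheses it falsifies.
        survivors = {label: predicts for label, predicts in survivors.items()
                     if predicts(obs)}
    return sorted(survivors)

# Step 1: alternative hypotheses about a hidden integer n.
hypotheses = {
    "n is even":     lambda n: n % 2 == 0,
    "n is odd":      lambda n: n % 2 == 1,
    "n is a square": lambda n: round(n ** 0.5) ** 2 == n,
    "n is prime":    lambda n: n > 1 and all(n % d for d in range(2, n)),
}

# One observed "clean result" (n turned out to be 16) excludes two of four.
print(strong_inference(hypotheses, observations=[16]))
# → ['n is a square', 'n is even']
```

The sketch also shows why step 4 matters: the loop is only as good as each experiment's power to exclude, so the next observation should be chosen to split the surviving set.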
  • It is clear why this makes for rapid and powerful progress. For exploring the unknown, there is no faster method; this is the minimum sequence of steps. Any conclusion that is not an exclusion is insecure and must be rechecked. Any delay in recycling to the next set of hypotheses is only a delay. Strong inference, and the logical tree it generates, are to inductive reasoning what the syllogism is to deductive reasoning in that it offers a regular method for reaching firm inductive conclusions one after the other as rapidly as possible.
  • "But what is so novel about this?" someone will say. This is the method of science and always has been; why give it a special name? The reason is that many of us have almost forgotten it. Science is now an everyday business. Equipment, calculations, lectures become ends in themselves. How many of us write down our alternatives and crucial experiments every day, focusing on the exclusion of a hypothesis? We may write our scientific papers so that it looks as if we had steps 1, 2, and 3 in mind all along. But in between, we do busywork. We become "method-oriented" rather than "problem-oriented." We say we prefer to "feel our way" toward generalizations. We fail to teach our students how to sharpen up their inductive inferences. And we do not realize the added power that the regular and explicit use of alternative hypotheses and sharp exclusion could give us at every step of our research.
  • A distinguished cell biologist rose and said, "No two cells give the same properties. Biology is the science of heterogeneous systems." And he added privately, "You know there are scientists, and there are people in science who are just working with these over-simplified model systems - DNA chains and in vitro systems - who are not doing science at all. We need their auxiliary work: they build apparatus, they make minor studies, but they are not scientists." To which Cy Levinthal replied: "Well, there are two kinds of biologists, those who are looking to see if there is one thing that can be understood and those who keep saying it is very complicated and that nothing can be understood. . . . You must study the simplest system you think has the properties you are interested in."
  • At the 1958 Conference on Biophysics, at Boulder, there was a dramatic confrontation between the two points of view. Leo Szilard said: "The problems of how enzymes are induced, of how proteins are synthesized, of how antibodies are formed, are closer to solution than is generally believed. If you do stupid experiments, and finish one a year, it can take 50 years. But if you stop doing experiments for a little while and think how proteins can possibly be synthesized, there are only about 5 different ways, not 50! And it will take only a few experiments to distinguish these." One of the young men added: "It is essentially the old question: How small and elegant an experiment can you perform?" These comments upset a number of those present. An electron microscopist said, "Gentlemen, this is off the track. This is philosophy of science." Szilard retorted, "I was not quarreling with third-rate scientists: I was quarreling with first-rate scientists."
  • Any criticism or challenge to consider changing our methods strikes of course at all our ego-defenses. But in this case the analytical method offers the possibility of such great increases in effectiveness that it is unfortunate that it cannot be regarded more often as a challenge to learning rather than as challenge to combat. Many of the recent triumphs in molecular biology have in fact been achieved on just such "oversimplified model systems," very much along the analytical lines laid down in the 1958 discussion. They have not fallen to the kind of men who justify themselves by saying "No two cells are alike," regardless of how true that may ultimately be. The triumphs are in fact triumphs of a new way of thinking.
  • the emphasis on strong inference is also partly due to the nature of the fields themselves. Biology, with its vast informational detail and complexity, is a "high-information" field, where years and decades can easily be wasted on the usual type of "low-information" observations or experiments if one does not think carefully in advance about what the most important and conclusive experiments would be. And in high-energy physics, both the "information flux" of particles from the new accelerators and the million-dollar costs of operation have forced a similar analytical approach. It pays to have a top-notch group debate every experiment ahead of time; and the habit spreads throughout the field.
  • Historically, I think, there have been two main contributions to the development of a satisfactory strong-inference method. The first is that of Francis Bacon (13). He wanted a "surer method" of "finding out nature" than either the logic-chopping or all-inclusive theories of the time or the laudable but crude attempts to make inductions "by simple enumeration." He did not merely urge experiments, as some suppose; he showed the fruitfulness of interconnecting theory and experiment so that the one checked the other. Of the many inductive procedures he suggested, the most important, I think, was the conditional inductive tree, which proceeded from alternative hypotheses (possible "causes," as he calls them), through crucial experiments ("Instances of the Fingerpost"), to exclusion of some alternatives and adoption of what is left ("establishing axioms"). His Instances of the Fingerpost are explicitly at the forks in the logical tree, the term being borrowed "from the fingerposts which are set up where roads part, to indicate the several directions."
  • Here was a method that could separate off the empty theories! Bacon said the inductive method could be learned by anybody, just like learning to "draw a straighter line or more perfect circle . . . with the help of a ruler or a pair of compasses." "My way of discovering sciences goes far to level men's wit and leaves but little to individual excellence, because it performs everything by the surest rules and demonstrations." Even occasional mistakes would not be fatal. "Truth will sooner come out from error than from confusion."
  • Nevertheless there is a difficulty with this method. As Bacon emphasizes, it is necessary to make "exclusions." He says, "The induction which is to be available for the discovery and demonstration of sciences and arts, must analyze nature by proper rejections and exclusions, and then, after a sufficient number of negatives come to a conclusion on the affirmative instances." "[To man] it is granted only to proceed at first by negatives, and at last to end in affirmatives after exclusion has been exhausted." Or, as the philosopher Karl Popper says today, there is no such thing as proof in science - because some later alternative explanation may be as good or better - so that science advances only by disproofs. There is no point in making hypotheses that are not falsifiable, because such hypotheses do not say anything; "it must be possible for an empirical scientific system to be refuted by experience" (14).
  • The difficulty is that disproof is a hard doctrine. If you have a hypothesis and I have another hypothesis, evidently one of them must be eliminated. The scientist seems to have no choice but to be either soft-headed or disputatious. Perhaps this is why so many tend to resist the strong analytical approach and why some great scientists are so disputatious.
  • Fortunately, it seems to me, this difficulty can be removed by the use of a second great intellectual invention, the "method of multiple hypotheses," which is what was needed to round out the Baconian scheme. This is a method that was put forward by T.C. Chamberlin (15), a geologist at Chicago at the turn of the century, who is best known for his contribution to the Chamberlin-Moulton hypothesis of the origin of the solar system.
  • Chamberlin says our trouble is that when we make a single hypothesis, we become attached to it. "The moment one has offered an original explanation for a phenomenon which seems satisfactory, that moment affection for his intellectual child springs into existence, and as the explanation grows into a definite theory his parental affections cluster about his offspring and it grows more and more dear to him. . . . There springs up also unwittingly a pressing of the theory to make it fit the facts and a pressing of the facts to make them fit the theory..." "To avoid this grave danger, the method of multiple working hypotheses is urged. It differs from the simple working hypothesis in that it distributes the effort and divides the affections. . . . Each hypothesis suggests its own criteria, its own method of proof, its own method of developing the truth, and if a group of hypotheses encompass the subject on all sides, the total outcome of means and of methods is full and rich."
  • The conflict and exclusion of alternatives that is necessary to sharp inductive inference has been all too often a conflict between men, each with his single Ruling Theory. But whenever each man begins to have multiple working hypotheses, it becomes purely a conflict between ideas. It becomes much easier then for each of us to aim every day at conclusive disproofs - at strong inference - without either reluctance or combativeness. In fact, when there are multiple hypotheses, which are not anyone's "personal property," and when there are crucial experiments to test them, the daily life in the laboratory takes on an interest and excitement it never had, and the students can hardly wait to get to work to see how the detective story will come out. It seems to me that this is the reason for the development of those distinctive habits of mind and the "complex thought" that Chamberlin described, the reason for the sharpness, the excitement, the zeal, the teamwork - yes, even international teamwork - in molecular biology and high- energy physics today. What else could be so effective?
  • Unfortunately, I think, there are other areas of science today that are sick by comparison, because they have forgotten the necessity for alternative hypotheses and disproof. Each man has only one branch - or none - on the logical tree, and it twists at random without ever coming to the need for a crucial decision at any point. We can see from the external symptoms that there is something scientifically wrong. The Frozen Method, The Eternal Surveyor, The Never Finished, The Great Man With a Single Hypothesis, The Little Club of Dependents, The Vendetta, The All-Encompassing Theory Which Can Never Be Falsified.
  • a "theory" of this sort is not a theory at all, because it does not exclude anything. It predicts everything, and therefore does not predict anything. It becomes simply a verbal formula which the graduate student repeats and believes because the professor has said it so often. This is not science, but faith; not theory, but theology. Whether it is hand-waving or number-waving, or equation-waving, a theory is not a theory unless it can be disproved. That is, unless it can be falsified by some possible experimental outcome.
  • the work methods of a number of scientists have been testimony to the power of strong inference. Is success not due in many cases to systematic use of Bacon's "surest rules and demonstrations" as much as to rare and unattainable intellectual power? Faraday's famous diary (16), or Fermi's notebooks (3, 17), show how these men believed in the effectiveness of daily steps in applying formal inductive methods to one problem after another.
  • Surveys, taxonomy, design of equipment, systematic measurements and tables, theoretical computations - all have their proper and honored place, provided they are parts of a chain of precise induction of how nature works. Unfortunately, all too often they become ends in themselves, mere time-serving from the point of view of real scientific advance, a hypertrophied methodology that justifies itself as a lore of respectability.
  • We speak piously of taking measurements and making small studies that will "add another brick to the temple of science." Most such bricks just lie around the brickyard (20). Tables of constants have their place and value, but the study of one spectrum after another, if not frequently re-evaluated, may become a substitute for thinking, a sad waste of intelligence in a research laboratory, and a mistraining whose crippling effects may last a lifetime.
  • Beware of the man of one method or one instrument, either experimental or theoretical. He tends to become method-oriented rather than problem-oriented. The method-oriented man is shackled; the problem-oriented man is at least reaching freely toward what is most important. Strong inference redirects a man to problem-orientation, but it requires him to be willing repeatedly to put aside his last methods and teach himself new ones.
  • anyone who asks the question about scientific effectiveness will also conclude that much of the mathematizing in physics and chemistry today is irrelevant if not misleading. The great value of mathematical formulation is that when an experiment agrees with a calculation to five decimal places, a great many alternative hypotheses are pretty well excluded (though the Bohr theory and the Schrödinger theory both predict exactly the same Rydberg constant!). But when the fit is only to two decimal places, or one, it may be a trap for the unwary; it may be no better than any rule-of-thumb extrapolation, and some other kind of qualitative exclusion might be more rigorous for testing the assumptions and more important to scientific understanding than the quantitative fit.
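The decimal-places point can be made concrete with a toy comparison; the "measured" constant and the four rival predictions below are invented for this sketch, not drawn from the article:

```python
# Toy illustration of the exclusion value of precision: demanding
# agreement to more decimal places excludes more rival predictions.
# All candidate values and the measurement are invented numbers.

def surviving_models(candidates, measurement, decimals):
    """Candidates whose prediction matches the measurement to `decimals` places."""
    tolerance = 0.5 * 10 ** -decimals
    return [name for name, value in candidates.items()
            if abs(value - measurement) < tolerance]

measurement = 1.09737          # pretend constant, "known" to five decimals
candidates = {                 # four hypothetical rival predictions
    "model A": 1.09737,
    "model B": 1.09740,
    "model C": 1.09800,
    "model D": 1.12000,
}

for decimals in (1, 2, 5):
    print(decimals, surviving_models(candidates, measurement, decimals))
```

In this toy setup a one-decimal fit excludes nothing - all four models pass, which is exactly the "trap for the unwary" - while only the five-decimal agreement does real excluding, leaving a single survivor.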
  • Today we preach that science is not science unless it is quantitative. We substitute correlations for causal studies, and physical equations for organic reasoning. Measurements and equations are supposed to sharpen thinking, but, in my observation, they more often tend to make the thinking noncausal and fuzzy. They tend to become the object of scientific manipulation instead of auxiliary tests of crucial inferences.
  • Many - perhaps most - of the great issues of science are qualitative, not quantitative, even in physics and chemistry. Equations and measurements are useful when and only when they are related to proof; but proof or disproof comes first and is in fact strongest when it is absolutely convincing without any quantitative measurement.
  • you can catch phenomena in a logical box or in a mathematical box. The logical box is coarse but strong. The mathematical box is fine-grained but flimsy. The mathematical box is a beautiful way of wrapping up a problem, but it will not hold the phenomena unless they have been caught in a logical box to begin with.
  • Of course it is easy - and all too common - for one scientist to call the others unscientific. My point is not that my particular conclusions here are necessarily correct, but that we have long needed some absolute standard of possible scientific effectiveness by which to measure how well we are succeeding in various areas - a standard that many could agree on and one that would be undistorted by the scientific pressures and fashions of the times and the vested interests and busywork that they develop. It is not public evaluation I am interested in so much as a private measure by which to compare one's own scientific performance with what it might be. I believe that strong inference provides this kind of standard of what the maximum possible scientific effectiveness could be - as well as a recipe for reaching it.
  • The strong-inference point of view is so resolutely critical of methods of work and values in science that any attempt to compare specific cases is likely to sound both smug and destructive. Mainly one should try to teach it by example and by exhorting to self-analysis and self-improvement only in general terms.
  • one severe but useful private test - a touchstone of strong inference - that removes the necessity for third-person criticism, because it is a test that anyone can learn to carry with him for use as needed. It is our old friend the Baconian "exclusion," but I call it "The Question." Obviously it should be applied as much to one's own thinking as to others'. It consists of asking in your own mind, on hearing any scientific explanation or theory put forward, "But sir, what experiment could disprove your hypothesis?"; or, on hearing a scientific experiment described, "But sir, what hypothesis does your experiment disprove?"
  • It is not true that all science is equal; or that we cannot justly compare the effectiveness of scientists by any method other than a mutual-recommendation system. The man to watch, the man to put your money on, is not the man who wants to make "a survey" or a "more detailed study" but the man with the notebook, the man with the alternative hypotheses and the crucial experiments, the man who knows how to answer your Question of disproof and is already working on it.
  •  
    There is so much bad science and bad statistics information in media reports, publications, and shared between conversants that I think it is important to understand about facts and proofs and the associated pitfalls.
Weiye Loh

asahi.com(朝日新聞社):INTERVIEW/Lee Kuan Yew: Nuclear accident hurt Japan's reputa... - 0 views

  •  
    Q: One big question, which has emerged from March 11, is how we should find the right source of energy, and how we satisfy our energy demands without jeopardizing the security of the lives of the people and the region. The dependence on nuclear power generation has been called into question in many countries. It will also have an extensive impact on the regional security order. What is the right way to think through this enormous challenge?
    Lee: This is a difficult question to answer. If no other sources of energy are discovered besides coal, gas and oil, we may have no alternative but nuclear power.
Weiye Loh

Roger Pielke Jr.'s Blog: What Prompted the Decline of Oil Power? - 0 views

  • The figure above comes from the IMF World Economic Outlook released earlier this week in a chapter on "oil scarcity" (PDF). The report explains the figure as follows: Most OECD countries saw a big switch away from oil in electric power generation in the early 1980s. After oil prices rose sharply compared with the prices of other fossil fuels in the 1970s, the power sector switched from oil to other inputs (Figure 3.6): some countries went back to coal (for example, the United States); others increased their nuclear capacity (for example, France) or turned to alternative energy sources.
  •  
    Over about 40 years oil lost about 90% of its role as a source of energy for electricity production (from a 25% share to a 2.5% share). There are a few interesting points to take from this dramatic shift, some of which seem obvious but nonetheless worth highlighting:
    1. Significant energy shifts happen.
    2. They can take many decades.
    3. Such shifts depend upon available substitutes.
    4. The trend was from more expensive energy to less expensive energy, not vice versa.
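Taking the excerpt's figures at face value (25% to 2.5% over about 40 years) and assuming, purely for the sketch, a smooth compound decline:

```python
# Back-of-the-envelope: the constant annual rate that takes oil's share
# of electricity generation from 25% to 2.5% in 40 years. The shares are
# from the excerpt above; the smooth-decline assumption is ours.

start_share, end_share, years = 0.25, 0.025, 40

annual_factor = (end_share / start_share) ** (1 / years)
annual_decline_pct = (1 - annual_factor) * 100

print(f"{annual_decline_pct:.1f}% average relative decline per year")
# → 5.6% average relative decline per year
```

A 90% loss of share needs only a 5-6% relative decline per year, which is one way to see why such shifts "can take many decades" without ever looking abrupt.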
Weiye Loh

Is Assange the "world-spirit embodied"? A Hegel scholar reports fro... - 0 views

  • Although the atmosphere at the Troxy was very genial, and Žižek generally enthusiastic about WikiLeaks (as he was in the London Review of Books article he published about it), there was a distinct tension between the rather standard Enlightenment rhetoric employed by Assange (more facts, a more complete historical record, better educated journalists)  and the significantly more radical conclusions the philosopher was drawing. This is why - whilst it should no doubt be read in a similar light as Žižek’s own remarks on his position during the conversation (I feel now like that Stalinist commentator: the leader has spoken, I provide the deeper meaning) - the ventured analogy nevertheless contains a kernel of truth beyond its bombast: defining the emancipatory significance of phenomena should not be left to the actors alone.
  • in response to Goodman's initial question on the significance of the Iraq war logs, Assange primarily emphasized the concrete revelations WikiLeaks had provided. He mentioned the 400,000 cables leaked, 15,000 previously unreported deaths revealed, a video of an American helicopter mowing down civilians, and so on. In contrast, Žižek went so far as to say that even if WikiLeaks had not revealed a single new thing, it should be considered game-changing. Why? Because of the very way it functions. For the philosopher, our democracies not only have rules regarding what can be revealed, but also rules which regulate the transgression of those first rules (the independent press, NGOs, etc.). The contention then is that WikiLeaks operates outside both these sets of rules, and that there is the source of its power.
  • the reply was firmly anchored in the key trope Žižek has championed since his first major work in English: that ideology in today's "post-ideological" world is not dead, but rather more powerful than ever - alive not so much on the level of knowledge but in the ways it structures social reality itself.
  • Žižek points out, the innocence of the accusers is anything but innocent; they decry the violence of WikiLeaks revelations, themselves oblivious to the military, economic, political and social framework of everyday violence that goes unmentioned in public discourse. The violence of leaks is on a formal level, and precisely this is at the root of the Slovene’s exclamation to Assange: “Yes, you are a terrorist, but by God, then what are they?”
  • WikiLeaks should not be seen as merely another chapter in investigative journalism and free flow of information, but a positive, subversive emancipatory force by virtue of the way it operates outside the system of secrets and allowed revelations. What then remains ahead is the hard task of keeping this subversive strength alive.