New Media Ethics 2009 course / Group items tagged Weather


Weiye Loh

Climate change and extreme flooding linked by new evidence | George Monbiot | Environme... - 0 views

  • Two studies suggest for the first time a clear link between global warming and extreme precipitation
  • There's a sound rule for reporting weather events that may be related to climate change. You can't say that a particular heatwave or a particular downpour – or even a particular freeze – was definitely caused by human emissions of greenhouse gases. But you can say whether these events are consistent with predictions, or that their likelihood rises or falls in a warming world.
  • Weather is a complex system. Long-running trends, natural fluctuations and random patterns are fed into the global weather machine, and it spews out a series of events. All these events will be influenced to some degree by global temperatures, but it's impossible to say with certainty that any of them would not have happened in the absence of man-made global warming.
  • over time, as the data build up, we begin to see trends which suggest that rising temperatures are making a particular kind of weather more likely to occur. One such trend has now become clearer. Two new papers, published by Nature, should make us sit up, as they suggest for the first time a clear link between global warming and extreme precipitation (precipitation means water falling out of the sky in any form: rain, hail or snow).
  • We still can't say that any given weather event is definitely caused by man-made global warming. But we can say, with an even higher degree of confidence than before, that climate change makes extreme events more likely to happen.
  • One paper, by Seung-Ki Min and others, shows that rising concentrations of greenhouse gases in the atmosphere have caused an intensification of heavy rainfall events over some two-thirds of the weather stations on land in the northern hemisphere. The climate models appear to have underestimated the contribution of global warming to extreme rainfall: it's worse than we thought it would be.
  • The other paper, by Pardeep Pall and others, shows that man-made global warming is very likely to have increased the probability of severe flooding in England and Wales, and could well have been behind the extreme events in 2000. The researchers ran thousands of simulations of the weather in autumn 2000 (using idle time on computers made available by a network of volunteers) with and without the temperature rises caused by man-made global warming. They found that, in nine out of 10 cases, man-made greenhouse gases increased the risks of flooding. This is probably as solid a signal as simulations can produce, and it gives us a clear warning that more global heating is likely to cause more floods here.
  • As Richard Allan points out, also in Nature, the warmer the atmosphere is, the more water vapour it can carry. There's even a formula which quantifies this: 6-7% more moisture in the air for every degree of warming near the Earth's surface. But both models and observations also show changes in the distribution of rainfall, with moisture concentrating in some parts of the world and fleeing from others: climate change is likely to produce both more floods and more droughts.
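The 6-7% figure above is the Clausius-Clapeyron scaling of the atmosphere's water-holding capacity with temperature. As a rough check (a sketch, not taken from the article, using the standard Magnus approximation for saturation vapour pressure), a few lines of Python reproduce the number:

```python
import math

def saturation_vapour_pressure(t_celsius):
    """Approximate saturation vapour pressure over water in hPa (Magnus formula)."""
    return 6.112 * math.exp(17.62 * t_celsius / (243.12 + t_celsius))

# Percentage increase in water-holding capacity per 1 degree C of warming,
# evaluated at a few near-surface temperatures.
for t in (0.0, 10.0, 20.0, 30.0):
    e_now = saturation_vapour_pressure(t)
    e_warmer = saturation_vapour_pressure(t + 1.0)
    print(f"{t:4.0f} C -> {t + 1:.0f} C: +{100 * (e_warmer - e_now) / e_now:.1f}%")

# Output is roughly 7.5%, 6.9%, 6.4%, 5.9% - i.e. about 6-7% per degree at
# typical surface temperatures, consistent with the figure Allan cites.
```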
Weiye Loh

Leading climate scientist challenges Mail on Sunday's use of his research | Environment... - 0 views

  • Mojib Latif denies his research supports theory that current cold weather undermines scientific consensus on global warming
  • A leading scientist has hit out at misleading newspaper reports that linked his research to claims that the current cold weather undermines the scientific case for manmade global warming.
  • Mojib Latif, a climate expert at the Leibniz Institute at Kiel University in Germany, said he "cannot understand" reports that used his research to question the scientific consensus on climate change. He told the Guardian: "It comes as a surprise to me that people would try to use my statements to try to dispute the nature of global warming. I believe in manmade global warming. I have said that if my name was not Mojib Latif it would be global warming."
  • A report in the Mail on Sunday said that Latif's results "challenge some of the global warming orthodoxy's most deeply cherished beliefs" and "undermine the standard climate computer models". Monday's Daily Mail and Daily Telegraph repeated the claims. The reports attempted to link the Arctic weather that has enveloped the UK with research published by Latif's team in the journal Nature in 2008. The research said that natural fluctuations in ocean temperature could have a bigger impact on global temperature than expected. In particular, the study concluded that cooling in the oceans could offset global warming, with the average temperature over the decades 2000-2010 and 2005-2015 predicted to be no higher than the average for 1994-2004. Despite clarifications from the scientists at the time, who stressed that the research did not challenge the predicted long-term warming trend, the study was widely misreported as signalling a switch from global warming to global cooling.
  • The Mail on Sunday article said that Latif's research showed that the current cold weather heralds such "a global trend towards cooler weather". It said: "The BBC assured viewers that the big chill was merely short-term 'weather' that had nothing to do with 'climate', which was still warming. The work of Prof Latif and the other scientists refutes that view."
  • Not according to Latif. "They are not related at all," he said. "What we are experiencing now is a weather phenomenon, while we talked about the mean temperature over the next 10 years. You can't compare the two."
Weiye Loh

Office of Science & Technology - Democracy's Open Secret - 0 views

  • there is a deeper issue here that spans political parties across nations:  a lack of recognition among policy makers of their dependence on experts in making wise decisions.  Experts do not, of course, determine how policy decisions ought to be made but they do add considerable value to wise decision making.
  • The deeper issue at work here is an open secret in the practice of democracy, and that is the fact that our elected leaders are chosen from among us, the people.  As such, politicians tend to reflect the views of the general public on many subjects - not just those subjects governed solely by political passions, but also those that are traditionally the province of experts.  Elected officials are not just a lot like us, they are us.
  • For example, perhaps foreshadowing contemporary US politics, in 1996 a freshman member of the US Congress proposed eliminating the US government's National Weather Service, declaring that the agency was not needed because "I get my weather from The Weather Channel."  Of course the weather information found on The Weather Channel comes from a sophisticated scientific and technological infrastructure built by the federal government over many decades, which supports a wide range of economic activity, from agriculture to airlines, as well as the private sector weather services.
  • European politicians have their own blind spots at the interface of science and policy.  For instance, several years ago former German environment minister Sigmar Gabriel claimed, rather implausibly, that "You can build 100 coal-fired power plants and don't have to have higher CO2 emissions."  His explanation was that Germany participates in emissions trading and this would necessarily limit carbon dioxide no matter how much was produced. Obviously, emissions trading cannot make the impossible possible.
  • We should expect policy makers to face difficulties when governance involves considerations of science, technology, and innovation, for the simple reason that they are just like everyone else -- mostly ignorant about mostly everything.
  • in 2010, the US NSF reported that 28% of Americans and 34% of Europeans believed that the sun goes around the earth.  Similarly, 30% of Americans and 41% of Europeans believe that radioactivity results only from human activities.  It should not be so surprising when we learn that policy makers may share such perspectives.
  • A popular view is that more education about science and technology will lead to better decisions.  While education is, of course, important to a healthy democracy, it will never result in a populace (or their representatives) with expertise in everything.  
  • Achieving such heroic levels of expertise is not realistic for anyone.  Instead, we must rely on specialized experts to inform decision making. Just as you and I often need to consult with experts when dealing with our health, home repairs, finances, and other tasks, so too do policy makers need to tap into expertise in order to make good decisions.
  • it should be far less worrisome that the public or policy makers do not understand this or that information that experts may know well.  What should be of more concern is that policy makers appear to lack an understanding of how they can tap into expertise to inform decision making.  This situation is akin to flying blind. Specialized expertise typically does not compel particular decisions, but it does help to make decisions more informed.  This distinction lies behind Winston Churchill's oft-cited advice that science should be "on tap, but not on top." Effective governance does not depend upon philosopher kings in governments or in the populace, but rather on the use of effective mechanisms for bringing expertise into the political process.
  • It is the responsibility - even the special expertise - of policy makers to know how to use the instruments of government to bring experts into the process of governance. The troubling aspect of the statements and actions by the Gummers, Gabriels, and Bachmanns of the political world lies not in their lack of knowledge about science, but in their lack of knowledge about government.
Weiye Loh

The Weather Isn't Getting Weirder - WSJ.com - 0 views

  • you need to understand whether recent weather trends are extreme by historical standards. The Twentieth Century Reanalysis Project is the latest attempt to find out, using super-computers to generate a dataset of global atmospheric circulation from 1871 to the present. As it happens, the project's initial findings, published last month, show no evidence of an intensifying weather trend. "In the climate models, the extremes get more extreme as we move into a doubled CO2 world in 100 years," atmospheric scientist Gilbert Compo, one of the researchers on the project, tells me from his office at the University of Colorado, Boulder. "So we were surprised that none of the three major indices of climate variability that we used show a trend of increased circulation going back to 1871."
  • researchers have yet to find evidence of more-extreme weather patterns over the period, contrary to what the models predict. "There's no data-driven answer yet to the question of how human activity has affected extreme weather," adds Roger Pielke Jr., another University of Colorado climate researcher.
  • We do know that carbon dioxide and other gases trap and re-radiate heat. We also know that humans have emitted ever-more of these gases since the Industrial Revolution. What we don't know is exactly how sensitive the climate is to increases in these gases versus other possible factors—solar variability, oceanic currents, Pacific heating and cooling cycles, planets' gravitational and magnetic oscillations, and so on. Given the unknowns, it's possible that even if we spend trillions of dollars, and forgo trillions more in future economic growth, to cut carbon emissions to pre-industrial levels, the climate will continue to change—as it always has.
  • That's not to say we're helpless. There is at least one climate lesson that we can draw from the recent weather: Whatever happens, prosperity and preparedness help. North Texas's ice storm wreaked havoc and left hundreds of football fans stranded, cold, and angry. But thanks to modern infrastructure, 21st century health care, and stockpiles of magnesium chloride and snow plows, the storm caused no reported deaths and Dallas managed to host the big game on Sunday.
Weiye Loh

Roger Pielke Jr.'s Blog: Donald Boudreaux: I'll Take That Bet - 0 views

  • The fact of the matter is that our vulnerability to extreme weather is increasing, due to a combination of a growing population and especially urbanization in locations prone to extreme weather events.  This means that even with the hard work by many professionals in a range of fields that has contributed to the dramatic decrease in the number of deaths over recent decades, low death totals are unlikely to continue into the future, as this year's tragic tornado season tells us.  Of course, given expected societal trends, a reversal in statistics would not necessarily mean that our disaster policies are failing.  What it means is that our responses to extreme weather require constant vigilance, investment and continued hard work.
Weiye Loh

The Politics of Weather, Issa Investigates Countrywide and More in Capital Eye Opener: ... - 0 views

  • even great weather cannot be fully appreciated without looking at its political slant. In analyzing how meteorologists have contributed money to political candidates and committees over the years, the Center for Responsive Politics' campaign finance data shows that they collectively donated $146,000 during the 2010 election cycle. Meteorologists split their donations exactly in half, giving 50 percent of their contributions to Republicans and 50 percent to Democrats. The National Weather Service Employees PAC also was active during the 2010 cycle, donating slightly more than $67,000 to federal political candidates. This particular PAC, however, significantly favored Democrats over Republicans in its donations, 87 percent to 11 percent.
Weiye Loh

Roger Pielke Jr.'s Blog: Quote Clarification - 0 views

  • Writing in the WSJ Europe this week Anne Jolis had a piece on extreme weather events that quotes me, and unfortunately the terse quote is missing some context that is apparently leading to some confusion.
  • I spoke with Jolis at length and she asked very good questions and expressed a desire to get the science right. She even called me back to confirm how I was to be quoted. Unfortunately the longer quote was abbreviated, which Jolis warned was always possible.  I do not view this as a particularly big deal, but since I am being asked about it via email by a few folks, here is what the quote said and how it should be: "There's no data-driven answer yet to the question of how human activity has affected extreme weather," adds Roger Pielke Jr., another University of Colorado climate researcher. Instead it would be more precise to read: "There's no data-driven answer yet to the question of how human activity has affected extreme weather disasters," adds Roger Pielke Jr., another University of Colorado climate researcher.
  • given the context of the article the implication should be abundantly clear that in the quote I am not referring to daily temperature records, Arctic ice melt or global average surface temperatures or precipitation. The quote refers directly to recent extreme events with large societal impacts around the world that are explicitly mentioned in the piece such as Cyclone Yasi, the Australian floods, Europe's cold winter and the Russian drought.  Of course, in the climate debate, anything that can be misinterpreted usually will be.
Weiye Loh

The Problem with Climate Change | the kent ridge common - 0 views

  • what is climate change? From a scientific point of view, it is simply a statistical change in atmospheric variables (temperature, precipitation, humidity etc). It has been occurring ever since the Earth came into existence, far before humans even set foot on the planet: our climate has been fluctuating between warm periods and ice ages, with further variations within. In fact, we are living in a warm interglacial period in the middle of an ice age.
  • Global warming has often been portrayed in apocalyptic tones, whether from the mouth of the media or environmental groups: the daily news tells of natural disasters happening at a frightening pace, of crop failures due to strange weather, of mass extinctions and coral die-outs. When the devastating tsunami struck Southeast Asia years ago, some said it was the wrath of God against human mistreatment of the environment; when Hurricane Katrina dealt out a catastrophe, others said it was because of (America's) failure to deal with climate change. Science gives the figures and trends, and people take these to extremes.
  • One immediate problem with blaming climate change for every weather-related disaster or phenomenon is that it reduces humans' responsibility for mitigating or preventing it. If natural disasters are already, as their name suggests, natural, adding the tag 'global warming' or 'climate change' emphasizes the dominance of natural forces, and our inability to do anything about it. Surely, humans cannot undo climate change? Even at Cancun, amid the carbon cuts that have been promised, questions are being brought up on whether they are sufficient to reverse our actions and 'save' the planet.  Yet the talk about this remote, omnipotent force known as climate change obscures the fact that we can, and have always been, thinking of ways to reduce the impact of natural hazards. Forecasting, building better infrastructure and coordinating more efficient responses – all these are far more desirable than wading in woe. For example, we will do better at preventing floods in Singapore by tackling the problems than by singing in praise of God.
  • However, a greater concern lies in the notion of climate change itself. Climate change is in essence one kind of nature-society relationship, in which humans influence the climate through greenhouse gas (particularly CO2) emissions, and the climate strikes back by heating up and going crazy at times. This can be further simplified into a battle between humans and CO2: reducing CO2 guards against climate change, and increasing it aggravates the consequences. This view is anchored in scientists’ recommendation that a ‘safe’ level of CO2 should be at 350 parts per million (ppm) instead of the current 390. Already, the need to reduce CO2 is understood, as is evident in the push for greener fuels, more efficient means of production, the proliferation of ‘green’ products and companies, and most recently, the Cancun talks.
  • So can there be anything wrong with reducing CO2? No, there isn't, but singling out CO2 as the culprit of climate change or of the environmental problems we face prevents us from looking within. What do I mean? The enemy, CO2, is an 'other', an externality produced by our economic systems but never an inherent component of the systems. Thus, we can declare war on the gas or on climate change without taking a step back and questioning: is there anything wrong with the way we develop?  Take Singapore for example: the government pledged to reduce carbon emissions by 16% under 'business as usual' standards, which says nothing about how 'business' is going to be changed other than having lower carbon emissions (in fact, it is questionable even that CO2 levels will decrease, as 'business as usual' standards project a steady increase in CO2 emissions each year). With the development of green technologies, decreases in carbon emissions will mainly be brought about by increased energy efficiency and a switch to alternative fuels (including the insidious nuclear energy).
  • Thus, the way we develop will hardly be changed. Nobody questions whether our neoliberal system of development, which relies heavily on consumption to drive economies, needs to be looked into. We assume that it is the right way to develop, and only tweak it for the amount of externalities produced. Whether we should be measuring development by the Gross Domestic Product (GDP), or whether welfare is correlated to the amount of goods and services consumed, is never considered. Even the UN-REDD (Reducing Emissions from Deforestation and Forest Degradation) scheme, which aims to pay forest-rich countries for protecting their forests, ends up putting a price tag on them. The environment is being subsumed under the economy, when it should be the economy that is re-examined to take the environment into consideration.
  • when the world is celebrating after having held at bay the dangerous greenhouse gas, why would anyone bother rethinking the economy? Yet we should, simply because there are alternative nature-society relationships and discourses about nature that are of equal or greater importance than global warming. Annie Leonard's informative videos on The Story of Stuff and specific products like electronics, bottled water and cosmetics shed light on the dangers our 'throw-away culture' poses to the planet and to poorer countries. What if the enemy were instead consumerism? Framing the problem that way would force countries (especially richer ones) to fundamentally question the nature of development, instead of just applying a quick technological fix. This is so much more difficult (and less economically viable), alongside other issues like environmental injustices – e.g. pollution or dumping of waste by Trans-National Corporations in poorer countries and removal of indigenous land rights. It is no wonder that we choose to disregard internal problems and focus instead on an external enemy; when CO2 is the culprit, the solution is too simple and detached from the communities that are affected by changes in their environment.
  • We hence need to allow for a greater politics of the environment. What I am proposing is not to diminish our action to reduce carbon emissions, for I do believe that it is part of the environmental problem that we are facing. What instead should be done is to reduce our fixation on CO2 as the main or only driver of climate change, and on climate change as the most pertinent nature-society issue we are facing. We should understand that there are many other ways of thinking about the environment; 'developing' countries, for example, tend to have a closer relationship with their environment – it is not something 'out there' but constantly interacted with for food, water, regulating services and cultural value. Their views and the impact of the socio-economic forces (often from TNCs and multi-lateral organizations like the IMF) that shape the environment must also be taken into account, as must alternative meanings of sustainable development. Thus, even as we pat ourselves on the back for having achieved something significant at Cancun, our action should not and must not end there. Even if climate change hogs the headlines now, we must embrace more plurality in environmental discourse, for nature is not, and never has been, as simple as climate change alone. And hopefully sometime in the future, alongside a multi-lateral conference on climate change, the world can have one which rethinks the meaning of development.
  • Chen Jinwen
Weiye Loh

Don't Miss this Video: "A link between climate change and Joplin tornadoes? N... - 0 views

  • The video takes Bill McKibben's recent editorial from the Washington Post, sets it to music and powerful video of the past year's weather events. If you haven't seen the editorial, give that a look first. It starts out - Caution: It is vitally important not to make connections. When you see pictures of rubble like this week's shots from Joplin, Mo., you should not wonder: Is this somehow related to the tornado outbreak three weeks ago in Tuscaloosa, Ala., or the enormous outbreak a couple of weeks before that (which, together, comprised the most active April for tornadoes in U.S. history). No, that doesn't mean a thing. It is far better to think of these as isolated, unpredictable, discrete events. It is not advisable to try to connect them in your mind with, say, the fires burning across Texas - fires that have burned more of America at this point this year than any wildfires have in previous years. Texas, and adjoining parts of Oklahoma and New Mexico, are drier than they've ever been - the drought is worse than that of the Dust Bowl. But do not wonder if they're somehow connected.
Weiye Loh

Would You Donate Your Facebook Account to Al Gore For a Day? | The Utopianist - Think B... - 0 views

  • On September 14, Al Gore will launch the Climate Reality Project, or "24 Hours of Reality" - this most recent project will have 24 presentations, done by 24 presenters in 13 languages, each broadcast one hour after the other, representing all the time zones of the world. Al Gore will be connecting the dots of climate change, extreme weather and pollution, among other things - but the innovative thing is that he wants to harness the power of his followers' social media accounts in order to reach more people. Gore is asking his supporters to lend him their accounts - the Project will be posting status updates in their name, trying to reach many more people as well as start a dialogue on the subject.
Weiye Loh

Oxford academic wins right to read UEA climate data | Environment | guardian.co.uk - 0 views

  • Jonathan Jones, physics professor at Oxford University and self-confessed "climate change agnostic", used freedom of information law to demand the data that is the life's work of the head of the University of East Anglia's Climatic Research Unit, Phil Jones. UEA resisted the requests to disclose the data, but this week it was compelled to do so.
  • Graham gave the UEA one month to deliver the data, which includes more than 4m individual thermometer readings taken from 4,000 weather stations over the past 160 years. The commissioner's office said this was his first ruling on demands for climate data made in the wake of the climategate affair.
  • an archive of world temperature records collected jointly with the Met Office.
  • Critics of the UEA's scientists say an independent analysis of the temperature data may reveal that Phil Jones and his colleagues have misinterpreted the evidence of global warming. They may have failed to allow for local temperature influences, such as the growth of cities close to many of the thermometers.
  • when Jonathan Jones and others asked for the data in the summer of 2009, the UEA said legal exemptions applied. It said variously that the temperature data were the property of foreign meteorological offices; were intellectual property that might be valuable if sold to other researchers; and were in any case often publicly available.
  • Jonathan Jones said this week that he took up the cause of data freedom after Steve McIntyre, a Canadian mathematician, had requests for the data turned down. He thought this was an unreasonable response when Phil Jones had already shared the data with academic collaborators, including Prof Peter Webster of the Georgia Institute of Technology in the US. He asked to be given the data already sent to Webster, and was also turned down.
  • An Oxford academic has won the right to read previously secret data on climate change held by the University of East Anglia (UEA). The decision, by the government's information commissioner, Christopher Graham, is being hailed as a landmark ruling that will mean that thousands of British researchers are required to share their data with the public.
Weiye Loh

Should technical science journals have plain language translation? - Capital Weather Ga... - 0 views

  • Given that the future of the Earth depends on the public having a clearer understanding of Earth science, it seems to me there is something unethical in our insular behavior as scientists. Here is my proposal. I suggest authors must submit for review, and scientific societies be obliged to publish, two versions of every journal. One would be the standard journal in scientific English for their scientific club. The second would be a parallel open-access summary translation into plain English of the relevance and significance of each paper for everyone else. A translation that educated citizens, businesses and law-makers can understand. Remember that they are funding this research, and some really want to understand what is happening to the Earth.
  • A short essay in the Bulletin of the American Meteorological Society, entitled “A Proposal for Communicating Science”, caught my attention today. Written by atmospheric scientist Alan Betts, it advocates technical journal articles related to Earth science be complemented by a mandatory non-technical version for the lay public. What a refreshing idea!
Weiye Loh

Experts claim 2006 climate report plagiarized - USATODAY.com - 0 views

  • An influential 2006 congressional report that raised questions about the validity of global warming research was partly based on material copied from textbooks, Wikipedia and the writings of one of the scientists criticized in the report, plagiarism experts say.
  • "It kind of undermines the credibility of your work criticizing others' integrity when you don't conform to the basic rules of scholarship," Virginia Tech plagiarism expert Skip Garner says.
  • Led by George Mason University statistician Edward Wegman, the 2006 report criticized the statistics and scholarship of scientists who found the last century the warmest in 1,000 years.
  • But in March, climate scientist Raymond Bradley of the University of Massachusetts asked GMU, based in Fairfax, Va., to investigate "clear plagiarism" of one of his textbooks. Bradley says he learned of the copying on the Deep Climate website and through a now year-long analysis of the Wegman report made by retired computer scientist John Mashey of Portola Valley, Calif. Mashey's analysis concludes that 35 of the report's 91 pages "are mostly plagiarized text, but often injected with errors, bias and changes of meaning." Copying others' text or ideas without crediting them violates universities' standards, according to Liz Wager of the London-based Committee on Publication Ethics.
Weiye Loh

Monckton takes scientist to brink of madness at climate change talk | John Abraham | En... - 0 views

  • Christopher Monckton, Viscount Monckton of Brenchley, had given a rousing speech to a crowd at Bethel University in Minnesota, near where I live. His speech was on global warming and his style was convincing and irreverent. Anyone listening to him was given the impression that global warming was not happening, or that if it did happen it wouldn't be so bad, and scientists who warned about it were part of a vast conspiracy.
  • Monckton cited scientist after scientist whose work "disproved" global warming. He contended that polar bears are not really at risk (in fact they do better as weather warms); projections of sea level rise are a mere 6cm; Arctic ice has not declined in a decade; Greenland is not melting; sea levels are not rising; ocean temperatures are not increasing; medieval times were warmer than today; ocean acidification is not occurring; and global temperatures are not increasing.
  • I actually tracked down the articles and authors that Monckton cited. What I discovered was incredible, even to a scientist who follows the politics of climate change. I found that he had misrepresented the science.
  • For instance, Monckton's claims that "Arctic sea ice is fine, steady for a decade" made reference to an Alaskan research group (IARC). I wrote to members of IARC and asked whether this was true. Both their chief scientist and director confirmed that Monckton was mistaken. They also pointed me to the National Snow and Ice Data Centre (NSIDC) for a second opinion. A scientist there confirmed Monckton's error, as did Dr Ola Johannessen, whose work has shown ice loss in Greenland (Monckton reported that Johannessen's work showed that Greenland "was just fine".)
  • Next, I investigated Monckton's claim that the medieval period was warmer than today. Monckton showed a slide featuring nine researchers' works which, he claimed, proved that today's warming is not unusual – it was hotter in the past. I wrote to these authors and I read their papers. It turned out that none of the authors or papers made the claims that Monckton attributed to them. This pattern of misinterpretation was becoming chronic.
  • Next, I checked on Monckton's claim that the ocean has not been heating for 50 years. To quote him directly, there has been "no ocean heat buildup for 50 years". On this slide, he referenced a well-known researcher named Dr Catia Domingues. It turns out Domingues said no such thing. What would she know? She only works for the Commonwealth Scientific and Industrial Research Organisation (CSIRO) in Australia.
  • Monckton referred to a 2004 statement by the International Astronomical Union (IAU) which stated that solar activity has caused today's warming and that global warming will end soon.The president of the IAU division on the sun and heliosphere told me that there is no such position of the IAU and that I should pass this information on to whomever "might have used the IAU name to claim otherwise".
Weiye Loh

Roger Pielke Jr.'s Blog: Tall Tales in the New York Times - 0 views

  • It is still amazing to see the newspaper of record publish a statement like the following about Munich Re, one of the world's largest reinsurance companies: Munich Re is already tailoring its offerings to a world of more extreme weather. It is a matter of financial survival: In 2008, heavy snows in China resulted in the collapse of 223,000 homes, according to Chinese government statistics, including $1 billion in insured losses. Munich Re's financial survival? Here Rosenthal makes a leap well beyond the perhaps understandable following along with the delusions of crowds. There are always risks to bringing data to bear on an enjoyable tall tale, but let's look anyway at what is actually going on in Munich Re's business over the past several years.
  • Here is what Munich Re reported on its 2008 company performance, the year in which China suffered the heavy snows: Notwithstanding the most severe financial crisis for generations, Munich Re recorded a clear profit for the financial year 2008, in line with previous announcements. According to preliminary calculations, the consolidated profit amounted to €1.5bn. How about 2009 then? Nikolaus von Bomhard, Chairman of the Board of Management: “We have brought the financial year 2009 to a successful close: with a profit of over €2.5bn, we were even able to surpass expectations and achieve our long-term return target despite the difficult environment.” Sure, 2010 must have seen some evidence of a threat to the company's financial survival?  Guess again: On the basis of preliminary estimates, Munich Re achieved a consolidated result of €2.43bn for 2010 (previous year: €2.56bn), despite substantial major losses. The profit for the fourth quarter totalled €0.48bn (0.78bn). Shareholders are to participate in last year's success through another increase in the dividend: subject to approval by the Supervisory Board and the Annual General Meeting, the dividend will rise by 50 cents to €6.25 (5.75) per share. In addition, Munich Re has announced a further share buy-back programme: shares with a volume of up to €500m are to be repurchased before the Annual General Meeting in 2012.
  • The NYT may be unaware of the fact that not only is Munich Re in the catastrophe reinsurance business, meaning that it pays out variable and large claims for disasters, but that its business actually depends upon those disasters
  • Munich Re explains in the context of recent disasters (emphasis added): Overall, pressure on prices in most lines of business and regions is persisting. Munich Re therefore consistently withdrew from under-rated business. It nevertheless proved possible to expand accounts with individual major clients, so that the business volume grew slightly on balance, despite the difficult environment. Munich Re owes this profitable growth especially to its ability to swiftly offer complex, tailor-made reinsurance solutions to its clients. Besides this, the many large losses resulting from natural hazards and also from man-made events, had a stabilising influence on the lines of business and regions affected. Thus, prices increased markedly for natural catastrophe covers in Australia/New Zealand (Oceania) and in offshore energy business. There were no major changes in conditions in this renewal season. The overall outcome of the reinsurance treaty renewals at 1 January 2011 was again very satisfactory for Munich Re.
  • There is downward pressure on prices in the reinsurance industry because there have not been enough disasters to keep up demand and thus premium prices. The following observation was made just three months ago: Insurance and reinsurance prices have been falling across most business lines for two years, reflecting intense competition between well-capitalised insurers and a comparative dearth of major catastrophe-induced losses.
  • as Munich Re explains, they have been able to overcome the dearth of disasters because recent extreme events have allowed them to increase prices on coverage in a manner that not only counteracts recent losses to some degree, but even allows for "profitable growth."  As with most tall tales, the one about the financial plight of reinsurers dealing with a changed climate isn't going away any time soon. It is just another bit of popular unreality that effective decision making will have to overcome.
Weiye Loh

RealClimate: Going to extremes - 0 views

  • There are two new papers in Nature this week that go right to the heart of the conversation about extreme events and their potential relationship to climate change.
  • Let’s start with some very basic, but oft-confused points: Not all extremes are the same. Discussions of ‘changes in extremes’ in general without specifying exactly what is being discussed are meaningless. A tornado is an extreme event, but one whose causes, sensitivity to change and impacts have nothing to do with those related to an ice storm, or a heat wave or cold air outbreak or a drought. There is no theory or result that indicates that climate change increases extremes in general. This is a corollary of the previous statement – each kind of extreme needs to be looked at specifically – and often regionally as well. Some extremes will become more common in future (and some less so). We will discuss the specifics below. Attribution of extremes is hard. There are limited observational data to start with, insufficient testing of climate model simulations of extremes, and (so far) limited assessment of model projections.
  • The two new papers deal with the attribution of a single flood event (Pall et al), and the attribution of increased intensity of rainfall across the Northern Hemisphere (Min et al). While these issues are linked, they are quite distinct, and the two approaches are very different too.
  • The aim of the Pall et al paper was to examine a specific event – floods in the UK in Oct/Nov 2000. Normally, with a single event there isn't enough information to do any attribution, but Pall et al set up a very large ensemble of runs starting from roughly the same initial conditions to see how often the flooding event occurred. Note that flooding was defined as more than just intense rainfall – the authors tracked runoff and streamflow as part of their modelled setup. Then they repeated the same experiments with pre-industrial conditions (less CO2 and cooler temperatures). If the number of times a flooding event occurred increased in the present-day setup, you can estimate how much more likely the event would have been because of climate change. The results gave varying numbers but in nine out of ten cases the chance increased by more than 20%, and in two out of three cases by more than 90%. This kind of fractional attribution (if an event is 50% more likely with anthropogenic effects, that implies it is 33% attributable) has been applied also to the 2003 European heatwave, and will undoubtedly be applied more often in future; a small worked example of the arithmetic follows this list. One neat and interesting feature of these experiments was that they used the climateprediction.net setup to harness the power of the public's idle screensaver time.
  • The second paper is a more standard detection and attribution study. By looking at the signatures of climate change in precipitation intensity and comparing that to the internal variability and the observations, the researchers conclude that the probability of intense precipitation on any given day has increased by 7 percent over the last 50 years – well outside the bounds of natural variability. This is a result that has been suggested before (i.e. in the IPCC report, Groisman et al., 2005), but this was the first proper attribution study (as far as I know). The signal seen in the data though, while coherent and similar to that seen in the models, was consistently larger, perhaps indicating the models are not sensitive enough, though the El Niño of 1997/8 may have had an outsize effect.
  • Both papers were submitted in March last year, prior to the 2010 floods in Pakistan, Australia, Brazil or the Philippines, and so did not deal with any of the data or issues associated with those floods. However, while questions of attribution come up whenever something weird happens to the weather, these papers demonstrate clearly that the instant pop-attributions we are always being asked for are just not very sensible. It takes an enormous amount of work to do these kinds of tests, and they just can’t be done instantly. As they are done more often though, we will develop a better sense for the kinds of events that we can say something about, and those we can’t.
  • There is always concern that the start and end points for any trend study are not appropriate (both sides are guilty on this IMO). I have read that precipitation studies are more difficult due to sparse data, and it seems we would have seen precipitation trend graphs a lot more often by now if it were straightforward. 7% seems too large a change not to have been noted (vocally) earlier; it seems like there is more to this story.
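The fractional attribution mentioned above ("if an event is 50% more likely with anthropogenic effects, that implies it is 33% attributable") is simple arithmetic on the event frequencies in the two ensembles. Here is a minimal sketch with made-up probabilities, not the paper's actual numbers; the function name is mine:

```python
def fraction_attributable_risk(p_actual, p_counterfactual):
    """Risk ratio and fraction of attributable risk (FAR = 1 - p0/p1)
    for an event with probability p_actual in the simulated real climate
    and p_counterfactual in the pre-industrial counterfactual."""
    risk_ratio = p_actual / p_counterfactual
    far = 1.0 - p_counterfactual / p_actual
    return risk_ratio, far

# Illustrative numbers only: an event seen in 15% of "real climate" runs
# but 10% of counterfactual runs is 50% more likely, hence 33% attributable.
rr, far = fraction_attributable_risk(p_actual=0.15, p_counterfactual=0.10)
print(f"risk ratio = {rr:.2f}, fraction of attributable risk = {far:.0%}")
# -> risk ratio = 1.50, fraction of attributable risk = 33%
```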
Weiye Loh

How We Know by Freeman Dyson | The New York Review of Books - 0 views

  • Another example illustrating the central dogma is the French optical telegraph.
  • The telegraph was an optical communication system with stations consisting of large movable pointers mounted on the tops of sixty-foot towers. Each station was manned by an operator who could read a message transmitted by a neighboring station and transmit the same message to the next station in the transmission line.
  • The distance between neighbors was about seven miles. Along the transmission lines, optical messages in France could travel faster than drum messages in Africa. When Napoleon took charge of the French Republic in 1799, he ordered the completion of the optical telegraph system to link all the major cities of France from Calais and Paris to Toulon and onward to Milan. The telegraph became, as Claude Chappe had intended, an important instrument of national power. Napoleon made sure that it was not available to private users.
  • Unlike the drum language, which was based on spoken language, the optical telegraph was based on written French. Chappe invented an elaborate coding system to translate written messages into optical signals. Chappe had the opposite problem from the drummers. The drummers had a fast transmission system with ambiguous messages. They needed to slow down the transmission to make the messages unambiguous. Chappe had a painfully slow transmission system with redundant messages. The French language, like most alphabetic languages, is highly redundant, using many more letters than are needed to convey the meaning of a message. Chappe’s coding system allowed messages to be transmitted faster. Many common phrases and proper names were encoded by only two optical symbols, with a substantial gain in speed of transmission. The composer and the reader of the message had code books listing the message codes for eight thousand phrases and names. For Napoleon it was an advantage to have a code that was effectively cryptographic, keeping the content of the messages secret from citizens along the route.
  • After these two historical examples of rapid communication in Africa and France, the rest of Gleick's book is about the modern development of information technology.
  • The modern history is dominated by two Americans, Samuel Morse and Claude Shannon. Samuel Morse was the inventor of Morse Code. He was also one of the pioneers who built a telegraph system using electricity conducted through wires instead of optical pointers deployed on towers. Morse launched his electric telegraph in 1838 and perfected the code in 1844. His code used short and long pulses of electric current to represent letters of the alphabet.
  • Morse was ideologically at the opposite pole from Chappe. He was not interested in secrecy or in creating an instrument of government power. The Morse system was designed to be a profit-making enterprise, fast and cheap and available to everybody. At the beginning the price of a message was a quarter of a cent per letter. The most important users of the system were newspaper correspondents spreading news of local events to readers all over the world. Morse Code was simple enough that anyone could learn it. The system provided no secrecy to the users. If users wanted secrecy, they could invent their own secret codes and encipher their messages themselves. The price of a message in cipher was higher than the price of a message in plain text, because the telegraph operators could transcribe plain text faster. It was much easier to correct errors in plain text than in cipher.
  • Claude Shannon was the founding father of information theory. For a hundred years after the electric telegraph, other communication systems such as the telephone, radio, and television were invented and developed by engineers without any need for higher mathematics. Then Shannon supplied the theory to understand all of these systems together, defining information as an abstract quantity inherent in a telephone message or a television picture. Shannon brought higher mathematics into the game.
  • When Shannon was a boy growing up on a farm in Michigan, he built a homemade telegraph system using Morse Code. Messages were transmitted to friends on neighboring farms, using the barbed wire of their fences to conduct electric signals. When World War II began, Shannon became one of the pioneers of scientific cryptography, working on the high-level cryptographic telephone system that allowed Roosevelt and Churchill to talk to each other over a secure channel. Shannon’s friend Alan Turing was also working as a cryptographer at the same time, in the famous British Enigma project that successfully deciphered German military codes. The two pioneers met frequently when Turing visited New York in 1943, but they belonged to separate secret worlds and could not exchange ideas about cryptography.
  • In 1945 Shannon wrote a paper, “A Mathematical Theory of Cryptography,” which was stamped SECRET and never saw the light of day. He published in 1948 an expurgated version of the 1945 paper with the title “A Mathematical Theory of Communication.” The 1948 version appeared in the Bell System Technical Journal, the house journal of the Bell Telephone Laboratories, and became an instant classic. It is the founding document for the modern science of information. After Shannon, the technology of information raced ahead, with electronic computers, digital cameras, the Internet, and the World Wide Web.
  • According to Gleick, the impact of information on human affairs came in three installments: first the history, the thousands of years during which people created and exchanged information without the concept of measuring it; second the theory, first formulated by Shannon; third the flood, in which we now live
  • The event that made the flood plainly visible occurred in 1965, when Gordon Moore stated Moore’s Law. Moore was an electrical engineer, founder of the Intel Corporation, a company that manufactured components for computers and other electronic gadgets. His law said that the price of electronic components would decrease and their numbers would increase by a factor of two every eighteen months. This implied that the price would decrease and the numbers would increase by a factor of a hundred every decade. Moore’s prediction of continued growth has turned out to be astonishingly accurate during the forty-five years since he announced it. In these four and a half decades, the price has decreased and the numbers have increased by a factor of a billion, nine powers of ten. Nine powers of ten are enough to turn a trickle into a flood.
  • Gordon Moore was in the hardware business, making hardware components for electronic machines, and he stated his law as a law of growth for hardware. But the law applies also to the information that the hardware is designed to embody. The purpose of the hardware is to store and process information. The storage of information is called memory, and the processing of information is called computing. The consequence of Moore’s Law for information is that the price of memory and computing decreases and the available amount of memory and computing increases by a factor of a hundred every decade. The flood of hardware becomes a flood of information.
  • In 1949, one year after Shannon published the rules of information theory, he drew up a table of the various stores of memory that then existed. The biggest memory in his table was the US Library of Congress, which he estimated to contain one hundred trillion bits of information. That was at the time a fair guess at the sum total of recorded human knowledge. Today a memory disc drive storing that amount of information weighs a few pounds and can be bought for about a thousand dollars. Information, otherwise known as data, pours into memories of that size or larger, in government and business offices and scientific laboratories all over the world. Gleick quotes the computer scientist Jaron Lanier describing the effect of the flood: “It’s as if you kneel to plant the seed of a tree and it grows so fast that it swallows your whole town before you can even rise to your feet.”
  • On December 8, 2010, Gleick published on The New York Review’s blog an illuminating essay, “The Information Palace.” It was written too late to be included in his book. It describes the historical changes of meaning of the word “information,” as recorded in the latest quarterly online revision of the Oxford English Dictionary. The word first appears in 1386 in a parliamentary report with the meaning “denunciation.” The history ends with the modern usage, “information fatigue,” defined as “apathy, indifference or mental exhaustion arising from exposure to too much information.”
  • The consequences of the information flood are not all bad. One of the creative enterprises made possible by the flood is Wikipedia, started ten years ago by Jimmy Wales. Among my friends and acquaintances, everybody distrusts Wikipedia and everybody uses it. Distrust and productive use are not incompatible. Wikipedia is the ultimate open source repository of information. Everyone is free to read it and everyone is free to write it. It contains articles in 262 languages written by several million authors. The information that it contains is totally unreliable and surprisingly accurate. It is often unreliable because many of the authors are ignorant or careless. It is often accurate because the articles are edited and corrected by readers who are better informed than the authors
  • Jimmy Wales hoped when he started Wikipedia that the combination of enthusiastic volunteer writers with open source information technology would cause a revolution in human access to knowledge. The rate of growth of Wikipedia exceeded his wildest dreams. Within ten years it has become the biggest storehouse of information on the planet and the noisiest battleground of conflicting opinions. It illustrates Shannon’s law of reliable communication. Shannon’s law says that accurate transmission of information is possible in a communication system with a high level of noise. Even in the noisiest system, errors can be reliably corrected and accurate information transmitted, provided that the transmission is sufficiently redundant. That is, in a nutshell, how Wikipedia works.
  • The information flood has also brought enormous benefits to science. The public has a distorted view of science, because children are taught in school that science is a collection of firmly established truths. In fact, science is not a collection of truths. It is a continuing exploration of mysteries. Wherever we go exploring in the world around us, we find mysteries. Our planet is covered by continents and oceans whose origin we cannot explain. Our atmosphere is constantly stirred by poorly understood disturbances that we call weather and climate. The visible matter in the universe is outweighed by a much larger quantity of dark invisible matter that we do not understand at all. The origin of life is a total mystery, and so is the existence of human consciousness. We have no clear idea how the electrical discharges occurring in nerve cells in our brains are connected with our feelings and desires and actions.
  • Even physics, the most exact and most firmly established branch of science, is still full of mysteries. We do not know how much of Shannon’s theory of information will remain valid when quantum devices replace classical electric circuits as the carriers of information. Quantum devices may be made of single atoms or microscopic magnetic circuits. All that we know for sure is that they can theoretically do certain jobs that are beyond the reach of classical devices. Quantum computing is still an unexplored mystery on the frontier of information theory. Science is the sum total of a great multitude of mysteries. It is an unending argument between a great multitude of voices. It resembles Wikipedia much more than it resembles the Encyclopaedia Britannica.
  • The rapid growth of the flood of information in the last ten years made Wikipedia possible, and the same flood made twenty-first-century science possible. Twenty-first-century science is dominated by huge stores of information that we call databases. The information flood has made it easy and cheap to build databases. One example of a twenty-first-century database is the collection of genome sequences of living creatures belonging to various species from microbes to humans. Each genome contains the complete genetic information that shaped the creature to which it belongs. The genome database is rapidly growing and is available for scientists all over the world to explore. Its origin can be traced to the year 1939, when Shannon wrote his Ph.D. thesis with the title “An Algebra for Theoretical Genetics.”
  • Shannon was then a graduate student in the mathematics department at MIT. He was only dimly aware of the possible physical embodiment of genetic information. The true physical embodiment of the genome is the double helix structure of DNA molecules, discovered by Francis Crick and James Watson fourteen years later. In 1939 Shannon understood that the basis of genetics must be information, and that the information must be coded in some abstract algebra independent of its physical embodiment. Without any knowledge of the double helix, he could not hope to guess the detailed structure of the genetic code. He could only imagine that in some distant future the genetic information would be decoded and collected in a giant database that would define the total diversity of living creatures. It took only sixty years for his dream to come true.
  • In the twentieth century, genomes of humans and other species were laboriously decoded and translated into sequences of letters in computer memories. The decoding and translation became cheaper and faster as time went on, the price decreasing and the speed increasing according to Moore’s Law. The first human genome took fifteen years to decode and cost about a billion dollars. Now a human genome can be decoded in a few weeks and costs a few thousand dollars. Around the year 2000, a turning point was reached, when it became cheaper to produce genetic information than to understand it. Now we can pass a piece of human DNA through a machine and rapidly read out the genetic information, but we cannot read out the meaning of the information. We shall not fully understand the information until we understand in detail the processes of embryonic development that the DNA orchestrated to make us what we are.
  • The explosive growth of information in our human society is a part of the slower growth of ordered structures in the evolution of life as a whole. Life has for billions of years been evolving with organisms and ecosystems embodying increasing amounts of information. The evolution of life is a part of the evolution of the universe, which also evolves with increasing amounts of information embodied in ordered structures, galaxies and stars and planetary systems. In the living and in the nonliving world, we see a growth of order, starting from the featureless and uniform gas of the early universe and producing the magnificent diversity of weird objects that we see in the sky and in the rain forest. Everywhere around us, wherever we look, we see evidence of increasing order and increasing information. The technology arising from Shannon’s discoveries is only a local acceleration of the natural growth of information.
  • Lord Kelvin, one of the leading physicists of the nineteenth century, promoted the heat death dogma, predicting that the flow of heat from warmer to cooler objects would erase temperature differences everywhere until all temperatures ultimately become equal. Life needs temperature differences to avoid being stifled by its own waste heat, so in a heat death life would eventually disappear.
  • Thanks to the discoveries of astronomers in the twentieth century, we now know that the heat death is a myth. The heat death can never happen, and there is no paradox. The best popular account of the disappearance of the paradox is a chapter, “How Order Was Born of Chaos,” in the book Creation of the Universe, by Fang Lizhi and his wife Li Shuxian. Fang Lizhi is doubly famous as a leading Chinese astronomer and a leading political dissident. He is now pursuing his double career at the University of Arizona.
  • The belief in a heat death was based on an idea that I call the cooking rule. The cooking rule says that a piece of steak gets warmer when we put it on a hot grill. More generally, the rule says that any object gets warmer when it gains energy, and gets cooler when it loses energy. Humans have been cooking steaks for thousands of years, and nobody ever saw a steak get colder while cooking on a fire. The cooking rule is true for objects small enough for us to handle. If the cooking rule is always true, then Lord Kelvin’s argument for the heat death is correct.
  • But the cooking rule is not true for objects of astronomical size, for which gravitation is the dominant form of energy. The sun is a familiar example. As the sun loses energy by radiation, it becomes hotter and not cooler. Since the sun is made of compressible gas squeezed by its own gravitation, loss of energy causes it to become smaller and denser, and the compression causes it to become hotter. For almost all astronomical objects, gravitation dominates, and they have the same unexpected behavior. Gravitation reverses the usual relation between energy and temperature. In the domain of astronomy, when heat flows from hotter to cooler objects, the hot objects get hotter and the cool objects get cooler. As a result, temperature differences in the astronomical universe tend to increase rather than decrease as time goes on. There is no final state of uniform temperature, and there is no heat death. Gravitation gives us a universe hospitable to life. Information and order can continue to grow for billions of years in the future, as they have evidently grown in the past. (A short virial-theorem derivation of this behaviour appears after this list of annotations.)
  • The vision of the future as an infinite playground, with an unending sequence of mysteries to be understood by an unending sequence of players exploring an unending supply of information, is a glorious vision for scientists. Scientists find the vision attractive, since it gives them a purpose for their existence and an unending supply of jobs. The vision is less attractive to artists and writers and ordinary people. Ordinary people are more interested in friends and family than in science. Ordinary people may not welcome a future spent swimming in an unending flood of information.
  • A darker view of the information-dominated universe was described in a famous story, “The Library of Babel,” by Jorge Luis Borges in 1941. Borges imagined his library, with an infinite array of books and shelves and mirrors, as a metaphor for the universe.
  • Gleick’s book has an epilogue entitled “The Return of Meaning,” expressing the concerns of people who feel alienated from the prevailing scientific culture. The enormous success of information theory came from Shannon’s decision to separate information from meaning. His central dogma, “Meaning is irrelevant,” declared that information could be handled with greater freedom if it was treated as a mathematical abstraction independent of meaning. The consequence of this freedom is the flood of information in which we are drowning. The immense size of modern databases gives us a feeling of meaninglessness. Information in such quantities reminds us of Borges’s library extending infinitely in all directions. It is our task as humans to bring meaning back into this wasteland. As finite creatures who think and feel, we can create islands of meaning in the sea of information. Gleick ends his book with Borges’s image of the human condition: “We walk the corridors, searching the shelves and rearranging them, looking for lines of meaning amid leagues of cacophony and incoherence, reading the history of the past and of the future, collecting our thoughts and collecting the thoughts of others, and every so often glimpsing mirrors, in which we may recognize creatures of the information.”
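Two of the annotations above lend themselves to small worked examples. First, the claim that redundancy defeats noise (see the first annotation in this list) can be made concrete with a toy experiment. The Python sketch below is not Shannon's channel-coding construction, which achieves reliability without driving the transmission rate to zero; it is a crude repetition code sent over a simulated binary symmetric channel, with the 20% flip probability, the message length, and the function names invented purely for illustration. It shows only the qualitative point: the more redundantly a message is sent, the lower the residual error rate after majority-vote decoding.

```python
import random

def transmit(bits, p_flip):
    """Binary symmetric channel: each bit flips independently with probability p_flip."""
    return [b ^ (random.random() < p_flip) for b in bits]

def encode(bits, n):
    """Repetition code: send each bit n times."""
    return [b for b in bits for _ in range(n)]

def decode(received, n):
    """Majority vote over each group of n repeated bits."""
    return [int(sum(received[i:i + n]) > n // 2) for i in range(0, len(received), n)]

if __name__ == "__main__":
    random.seed(0)
    message = [random.randint(0, 1) for _ in range(10_000)]
    for n in (1, 3, 9, 21):  # increasing redundancy
        decoded = decode(transmit(encode(message, n), p_flip=0.2), n)
        errors = sum(a != b for a, b in zip(message, decoded))
        print(f"repetitions={n:2d}  residual error rate={errors / len(message):.4f}")
```

With no redundancy roughly one bit in five arrives corrupted; by 21 repetitions the decoded error rate has fallen by several orders of magnitude, at the cost of sending 21 times as much data. Shannon's deeper result is that far more efficient codes can buy the same reliability.

Second, the annotation on the cooking rule states that a gravitationally bound object such as the sun gets hotter as it loses energy. A compact textbook justification is the virial theorem; the lines below are a standard argument for an idealised star treated as a self-gravitating monatomic ideal gas, not something taken from the excerpt.

```latex
% Negative heat capacity of a self-gravitating gas (virial-theorem sketch).
\begin{align*}
  2K + U &= 0                                  && \text{virial equilibrium} \\
  E = K + U &= -K                              && \text{total energy} \\
  K &= \tfrac{3}{2} N k_B T                    && \text{monatomic ideal gas} \\
  C \equiv \frac{dE}{dT} &= -\tfrac{3}{2} N k_B < 0 && \text{negative heat capacity}
\end{align*}
```

Radiating energy away (dE < 0) therefore raises K, and with it the temperature, which is why temperature differences in a gravitating universe can grow rather than decay.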
Weiye Loh

If climate scientists are in it for the money, they're doing it wrong - 0 views

  • Since it doesn't have a lot of commercial appeal, most of the people working in the area, and the vast majority of those publishing the scientific literature, work in academic departments or at government agencies. Penn State, home of noted climatologists Richard Alley and Michael Mann, has a strong geosciences department and, conveniently, makes the department's salary information available. It's easy to check, and find that the average tenured professor earned about $120,000 last year, and a new hire a bit less than $70,000.
  • That's a pretty healthy salary by many standards, but it's hardly a racket. Penn State appears to be on the low end of similar institutions, and is outdone by two other institutions in its own state (based on this report). But, more significantly for the question at hand, we can see that Earth Sciences faculty aren't paid especially well. Sure, they do much better than the Arts faculty, but they're somewhere in the middle of the pack, and get stomped on by professors in the Business and IT departments.
  • This is all, of course, ignoring what someone who can do the sort of data analysis or modeling of complex systems that climatologists perform might make if they went to Wall Street.
  • It's also worth pointing out what they get that money for, as exemplified by a fairly typical program announcement for NSF grants. Note that it calls for studies of past climate change and its impact on the weather. This sort of research could support the current consensus view, but it just as easily might not. And here's the thing: it's impossible to tell before the work's done. Even a study looking at the flow of carbon into and out of the atmosphere, which would seem to be destined to focus on anthropogenic climate influences, might identify a previously unknown or underestimated sink or feedback. So, even if the granting process were biased (and there's been no indication that it is), there is no way for it to prevent people from obtaining contrary data. The granting system is also set up to induce people to publish it, since a grant that doesn't produce scientific papers can make it impossible for a professor to obtain future funding.
  • Maybe the money is in the perks that come with grants, which provide for travel and lab toys. Unfortunately, there's no indication that there's lots of money out there for the taking, either from the public or private sector. For the US government, spending on climate research across 13 different agencies (from the Department of State to NASA) is tracked by the US Climate Change Science Program. The group has tracked the research budget since 1989, but not everything was brought under its umbrella until 1991. That year, according to CCSP figures, about $1.45 billion was spent on climate research (all figures are in 2007 dollars). Funding peaked back in 1995 at $2.4 billion, then bottomed out in 2006 at only $1.7 billion.
  • Funding has gone up a bit over the last couple of years, and some stimulus money went into related programs. But, in general, the trend has been a downward one for 15 years; it's not an area you'd want to go into if you were looking for a rich source of grant money. If you were, you would target medical research, for which the NIH had a $31 billion budget plus another $10 billion in stimulus money.
  • Not all of this money went to researchers anyway; part of the budget goes to NASA, and includes some of that agency's (rather pricey) hardware. For example, the Orbiting Carbon Observatory cost roughly $200 million, but failed to go into orbit; its replacement is costing another $170 million.
  • Might the private sector make up for the lack of government money? Pretty unlikely. For starters, it's tough to identify many companies that have a vested interest in the scientific consensus. Renewable energy companies would seem to be the biggest winners, but they're still relatively tiny. Neither the largest wind manufacturer (Vestas) nor the largest photovoltaic manufacturer (First Solar) appears in the Financial Times' list of the world's 500 largest companies. In contrast, there are 16 oil companies in the top 100, and they occupy the top two spots. Exxon's profits in 2010 were nearly enough to buy both Vestas and First Solar, given their market valuations in late February.
  • In short, climate researchers are scrambling for a slice of a shrinking government-funded pie, and the resources of the private sector are far, far more likely to go to groups that oppose their conclusions.
  • If you were paying careful attention to that last section, you would have noticed something funny: the industry that seems most likely to benefit from taking climate change seriously produces renewable energy products. However, those companies don't employ any climatologists. They probably have plenty of space for engineers, materials scientists, and maybe a quantum physicist or two, but there's not much that a photovoltaic company would do with a climatologist. Even by convincing the public of their findings—namely, climate change is real, and could have serious impacts—the scientists are not doing themselves any favors in terms of job security or alternative careers.
  • But, surely, by convincing the public, or at least the politicians, that there's something serious here, they ensure their own funding? That's arguably not true either, and the stimulus package demonstrates that nicely. The US CCSP programs, in total, got a few hundred million dollars from the stimulus. In contrast, the Department of Energy got a few billion. Carbon capture and sequestration alone received $2.4 billion, more than the entire CCSP budget.
  • The problem is that climatologists are well equipped to identify potential problems, but very poorly equipped to solve them; it would be a bit like expecting an astronomer to know how to destroy a threatening asteroid.
  • The solutions to problems related to climate change are going to come in areas like renewable energy, carbon sequestration, and efficiency measures; that's where most of the current administration's efforts have focused. None of these are areas where someone studying the climate is likely to have a whole lot to add. So, when they advocate that the public take them seriously, they're essentially asking the public to send money to someone else.
Weiye Loh

Can a group of scientists in California end the war on climate change? | Science | The ... - 0 views

  • Muller calls his latest obsession the Berkeley Earth project. The aim is so simple that the complexity and magnitude of the undertaking are easy to miss. Starting from scratch, with new computer tools and more data than has ever been used, they will arrive at an independent assessment of global warming. The team will also make every piece of data it uses – 1.6bn data points – freely available on a website. It will post its workings alongside, including full information on how more than 100 years of data from thousands of instruments around the world are stitched together to give a historic record of the planet's temperature.
  • Muller is fed up with the politicised row that all too often engulfs climate science. By laying all its data and workings out in the open, where they can be checked and challenged by anyone, the Berkeley team hopes to achieve something remarkable: a broader consensus on global warming. In no other field would Muller's dream seem so ambitious, or perhaps, so naive.
  • "We are bringing the spirit of science back to a subject that has become too argumentative and too contentious," Muller says, over a cup of tea. "We are an independent, non-political, non-partisan group. We will gather the data, do the analysis, present the results and make all of it available. There will be no spin, whatever we find." Why does Muller feel compelled to shake up the world of climate change? "We are doing this because it is the most important project in the world today. Nothing else comes close," he says.
  • There are already three heavyweight groups that could be considered the official keepers of the world's climate data. Each publishes its own figures that feed into the UN's Intergovernmental Panel on Climate Change. Nasa's Goddard Institute for Space Studies in New York City produces a rolling estimate of the world's warming. A separate assessment comes from another US agency, the National Oceanic and Atmospheric Administration (Noaa). The third group is based in the UK and led by the Met Office. They all take readings from instruments around the world to come up with a rolling record of the Earth's mean surface temperature. The numbers differ because each group uses its own dataset and does its own analysis, but they show a similar trend. Since pre-industrial times, all point to a warming of around 0.75C.
  • You might think three groups was enough, but Muller rolls out a list of shortcomings, some real, some perceived, that he suspects might undermine public confidence in global warming records. For a start, he says, warming trends are not based on all the available temperature records. The data that is used is filtered and might not be as representative as it could be. He also cites a poor history of transparency in climate science, though others argue many climate records and the tools to analyse them have been public for years.
  • Then there is the fiasco of 2009 that saw roughly 1,000 emails from a server at the University of East Anglia's Climatic Research Unit (CRU) find their way on to the internet. The fuss over the messages, inevitably dubbed Climategate, gave Muller's nascent project added impetus. Climate sceptics had already attacked James Hansen, head of the Nasa group, for making political statements on climate change while maintaining his role as an objective scientist. The Climategate emails fuelled their protests. "With CRU's credibility undergoing a severe test, it was all the more important to have a new team jump in, do the analysis fresh and address all of the legitimate issues raised by sceptics," says Muller.
  • This latest point is where Muller faces his most delicate challenge. To concede that climate sceptics raise fair criticisms means acknowledging that scientists and government agencies have got things wrong, or at least could do better. But the debate around global warming is so highly charged that open discussion, which science requires, can be difficult to hold in public. At worst, criticising poor climate science can be taken as an attack on science itself, a knee-jerk reaction that has unhealthy consequences. "Scientists will jump to the defence of alarmists because they don't recognise that the alarmists are exaggerating," Muller says.
  • The Berkeley Earth project came together more than a year ago, when Muller rang David Brillinger, a statistics professor at Berkeley and the man Nasa called when it wanted someone to check its risk estimates of space debris smashing into the International Space Station. He wanted Brillinger to oversee every stage of the project. Brillinger accepted straight away. Since the first meeting he has advised the scientists on how best to analyse their data and what pitfalls to avoid. "You can think of statisticians as the keepers of the scientific method," Brillinger told me. "Can scientists and doctors reasonably draw the conclusions they are setting down? That's what we're here for."
  • For the rest of the team, Muller says he picked scientists known for original thinking. One is Saul Perlmutter, the Berkeley physicist who found evidence that the universe is expanding at an ever faster rate, courtesy of mysterious "dark energy" that pushes against gravity. Another is Art Rosenfeld, the last student of the legendary Manhattan Project physicist Enrico Fermi, and something of a legend himself in energy research. Then there is Robert Jacobsen, a Berkeley physicist who is an expert on giant datasets; and Judith Curry, a climatologist at Georgia Institute of Technology, who has raised concerns over tribalism and hubris in climate science.
  • Robert Rohde, a young physicist who left Berkeley with a PhD last year, does most of the hard work. He has written software that trawls public databases, themselves the product of years of painstaking work, for global temperature records. These are compiled, de-duplicated and merged into one huge historical temperature record. The data, by all accounts, are a mess. There are 16 separate datasets in 14 different formats and they overlap, but not completely. Muller likens Rohde's achievement to Hercules's enormous task of cleaning the Augean stables.
  • The wealth of data Rohde has collected so far – and some dates back to the 1700s – makes for what Muller believes is the most complete historical record of land temperatures ever compiled. It will, of itself, Muller claims, be a priceless resource for anyone who wishes to study climate change. So far, Rohde has gathered records from 39,340 individual stations worldwide.
  • Publishing an extensive set of temperature records is the first goal of Muller's project. The second is to turn this vast haul of data into an assessment on global warming.
  • The big three groups – Nasa, Noaa and the Met Office – work out global warming trends by placing an imaginary grid over the planet and averaging temperature records in each square. So for a given month, all the records in England and Wales might be averaged out to give one number. Muller's team will take temperature records from individual stations and weight them according to how reliable they are. (A toy version of this gridding-and-weighting step is sketched after this list of annotations.)
  • This is where the Berkeley group faces its toughest task by far and it will be judged on how well it deals with it. There are errors running through global warming data that arise from the simple fact that the global network of temperature stations was never designed or maintained to monitor climate change. The network grew in a piecemeal fashion, starting with temperature stations installed here and there, usually to record local weather.
  • Among the trickiest errors to deal with are so-called systematic biases, which skew temperature measurements in fiendishly complex ways. Stations get moved around, replaced with newer models, or swapped for instruments that record in celsius instead of fahrenheit. The times at which measurements are taken vary, from, say, 6am to 9pm. The accuracy of individual stations drifts over time, and even changes in the surroundings, such as growing trees, can shield a station more from wind and sun from one year to the next. Each of these interferes with a station's temperature measurements, perhaps making it read too cold, or too hot. And these errors combine and build up.
  • This is the real mess that will take a Herculean effort to clean up. The Berkeley Earth team is using algorithms that automatically correct for some of the errors, a strategy Muller favours because it doesn't rely on human interference. When the team publishes its results, this is where the scrutiny will be most intense.
  • Despite the scale of the task, and the fact that world-class scientific organisations have been wrestling with it for decades, Muller is convinced his approach will lead to a better assessment of how much the world is warming. "I've told the team I don't know if global warming is more or less than we hear, but I do believe we can get a more precise number, and we can do it in a way that will cool the arguments over climate change, if nothing else," says Muller. "Science has its weaknesses and it doesn't have a stranglehold on the truth, but it has a way of approaching technical issues that is a closer approximation of truth than any other method we have."
  • It might not be a good sign that one prominent climate sceptic contacted by the Guardian, Canadian economist Ross McKitrick, had never heard of the project. Another, Stephen McIntyre, whom Muller has defended on some issues, hasn't followed the project either, but said "anything that [Muller] does will be well done". Phil Jones at the University of East Anglia was unclear on the details of the Berkeley project and didn't comment.
  • Elsewhere, Muller has qualified support from some of the biggest names in the business. At Nasa, Hansen welcomed the project, but warned against over-emphasising what he expects to be the minor differences between Berkeley's global warming assessment and those from the other groups. "We have enough trouble communicating with the public already," Hansen says. At the Met Office, Peter Stott, head of climate monitoring and attribution, was in favour of the project if it was open and peer-reviewed.
  • Peter Thorne, who left the Met Office's Hadley Centre last year to join the Co-operative Institute for Climate and Satellites in North Carolina, is enthusiastic about the Berkeley project but raises an eyebrow at some of Muller's claims. The Berkeley group will not be the first to put its data and tools online, he says. Teams at Nasa and Noaa have been doing this for many years. And while Muller may have more data, they add little real value, Thorne says. Most are records from stations installed from the 1950s onwards, and then only in a few regions, such as North America. "Do you really need 20 stations in one region to get a monthly temperature figure? The answer is no. Supersaturating your coverage doesn't give you much more bang for your buck," he says. They will, however, help researchers spot short-term regional variations in climate change, something that is likely to be valuable as climate change takes hold.
  • Despite his reservations, Thorne says climate science stands to benefit from Muller's project. "We need groups like Berkeley stepping up to the plate and taking this challenge on, because it's the only way we're going to move forwards. I wish there were 10 other groups doing this," he says.
  • Muller's project is organised under the auspices of Novim, a Santa Barbara-based non-profit organisation that uses science to find answers to the most pressing issues facing society and to publish them "without advocacy or agenda". Funding has come from a variety of places, including the Fund for Innovative Climate and Energy Research (funded by Bill Gates), and the Department of Energy's Lawrence Berkeley Lab. One donor has had some climate bloggers up in arms: the man behind the Charles G Koch Charitable Foundation owns, with his brother David, Koch Industries, a company Greenpeace called a "kingpin of climate science denial". On this point, Muller says the project has taken money from right and left alike.
  • No one who spoke to the Guardian about the Berkeley Earth project believed it would shake the faith of the minority who have set their minds against global warming. "As new kids on the block, I think they will be given a favourable view by people, but I don't think it will fundamentally change people's minds," says Thorne. Brillinger has reservations too. "There are people you are never going to change. They have their beliefs and they're not going to back away from them."
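To make the gridding-and-weighting step described a few annotations above concrete, here is a deliberately tiny Python sketch. It is not Berkeley Earth's algorithm, nor the method of the "big three" groups; the 5-degree cells, the reliability weights, and the station anomalies are all invented for illustration. The only idea it demonstrates is averaging stations within each grid cell, weighted by how much each record is trusted, and then averaging across cells, so that densely instrumented regions do not dominate the global figure.

```python
import math
from collections import defaultdict

def grid_cell(lat, lon, cell_deg=5.0):
    """Assign a station to a latitude/longitude cell cell_deg degrees on a side."""
    return (math.floor(lat / cell_deg), math.floor(lon / cell_deg))

def gridded_weighted_mean(stations, cell_deg=5.0):
    """Average of per-cell weighted means.

    Each station is a dict with 'lat', 'lon', a temperature 'anomaly' in
    degrees C, and a 'weight' expressing how reliable the record is judged
    to be. Weighting within a cell and then averaging across cells keeps
    one heavily instrumented region from dominating the result.
    """
    cells = defaultdict(list)
    for s in stations:
        cells[grid_cell(s["lat"], s["lon"], cell_deg)].append(s)

    cell_means = []
    for members in cells.values():
        total_weight = sum(m["weight"] for m in members)
        cell_means.append(sum(m["weight"] * m["anomaly"] for m in members) / total_weight)
    return sum(cell_means) / len(cell_means)

if __name__ == "__main__":
    demo = [
        {"lat": 51.5, "lon": -0.1, "anomaly": 0.8, "weight": 1.0},   # London
        {"lat": 51.8, "lon": -1.3, "anomaly": 0.9, "weight": 0.5},   # nearby, noisier record
        {"lat": 40.7, "lon": -74.0, "anomaly": 0.6, "weight": 1.0},  # New York
    ]
    print(f"area-balanced warming estimate: {gridded_weighted_mean(demo):.2f} C")
```

A real pipeline would first have to deal with the systematic biases described above, station moves, instrument changes, and shifting observation times, before any averaging is meaningful; that homogenisation step, not the averaging itself, is where the Berkeley team expects the hardest scrutiny.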
Weiye Loh

Roger Pielke Jr.'s Blog: Flawed Food Narrative in the New York Times - 0 views

  • The article relies heavily on empty appeals to authority. For example, it makes an unsupported assertion about what "scientists believe": "Many of the failed harvests of the past decade were a consequence of weather disasters, like floods in the United States, drought in Australia and blistering heat waves in Europe and Russia. Scientists believe some, though not all, of those events were caused or worsened by human-induced global warming." Completely unmentioned are the many (most?) scientists who believe that evidence is lacking to connect recent floods and heat waves to "human-induced global warming."
  • Some important issues beyond carbon dioxide are raised in the article, but are presented as secondary to the carbon narrative. Other important issues are completely ignored -- for example, wheat rust goes unmentioned, and it probably poses a greater risk to food supplies in the short term than anything to do with carbon dioxide. The carbon dioxide-centric focus of the article provides a nice illustration of how an obsession with "global warming" can serve to distract attention from factors that actually matter more for issues of human and environmental concern.
  • The central thesis of the NYT article is the following statement: "The rapid growth in farm output that defined the late 20th century has slowed to the point that it is failing to keep up with the demand for food, driven by population increases and rising affluence in once-poor countries." But this claim of slowing output is shown to be completely false by the graphic that accompanies the article, shown below. Far from slowing, farm output has increased dramatically over the past half-century (left panel) and on a per capita basis in 2009 was higher than at any point since the early 1980s (right panel).
  • Today's New York Times has an article by Justin Gillis on global food production that strains itself to the breaking point to make a story fit a narrative. The narrative, of course, is that climate change "is helping to destabilize the food system." The problem with the article is that the data that it presents don't support this narrative. Before proceeding, let me reiterate that human-caused climate change is a threat and one that we should be taking seriously. But taking climate change seriously does not mean shoehorning every global concern into that narrative, and especially conflating concerns about the future with what has been observed in the past. The risk, of course, of putting a carbon-centric spin on every issue is that other important dimensions are neglected.