Home/ New Media Ethics 2009 course/ Group items tagged Temperature


Weiye Loh

Adventures in Flay-land: Dealing with Denialists - Delingpole Part III

  • This post is about how one should deal with a denialist of Delingpole's ilk.
  • I saw someone I follow on Twitter retweet an update from another Twitter user called @AGW_IS_A_HOAX, which was this: "NZ #Climate Scientists Admit Faking Temperatures http://bit.ly/fHbdPI RT @admrich #AGW #Climategate #Cop16 #ClimateChange #GlobalWarming".
  • So I click on it. And this is how you deal with a denialist claim. You actually look into it. Here is the text of that article reproduced in full: New Zealand Climate Scientists Admit To Faking Temperatures: The Actual Temps Show Little Warming Over Last 50 Years. Read here and here. Climate "scientists" across the world have been blatantly fabricating temperatures in hopes of convincing the public and politicians that modern global warming is unprecedented and accelerating. The scientists doing the fabrication are usually employed by the government agencies or universities, which thrive and exist on taxpayer research dollars dedicated to global warming research. A classic example of this is the New Zealand climate agency, which is now admitting their scientists produced bogus "warming" temperatures for New Zealand. "NIWA makes the huge admission that New Zealand has experienced hardly any warming during the last half-century. For all their talk about warming, for all their rushed invention of the “Eleven-Station Series” to prove warming, this new series shows that no warming has occurred here since about 1960. Almost all the warming took place from 1940-60, when the IPCC says that the effect of CO2 concentrations was trivial. Indeed, global temperatures were falling during that period.....Almost all of the 34 adjustments made by Dr Jim Salinger to the 7SS have been abandoned, along with his version of the comparative station methodology." A collection of temperature-fabrication charts.
  • ...10 more annotations...
  • I check out the first link, the first "here" where the article says "Read here and here". I can see that there's been some sort of dispute between two New Zealand groups associated with climate change. One is New Zealand’s Climate Science Coalition (NZCSC) and the other is New Zealand’s National Institute of Water and Atmospheric Research (NIWA), but it doesn't tell me a whole lot more than I already got from the other article.
  • I check the second source behind that article. The second article, I now realize, is published on the website of a person called Andrew Montford with whom I've been speaking recently and who is the author of a book titled The Hockey Stick Illusion. I would not label Andrew a denialist. He makes some good points and seems to be a decent guy and genuine sceptic (This is not to suggest all denialists are outwardly dishonest; however, they do tend to be hard to reason with). Again, this article doesn't give me anything that I haven't already seen, except a link to another background source. I go there.
  • From this piece written up on Scoop NZNEWSUK I discover that a coalition group consisting of the NZCSC and the Climate Conversation Group (CCG) has pressured the NIWA into abandoning a set of temperature record adjustments of which the coalition dispute the validity. This was the culmination of a court proceeding in December 2010, last month. In dispute were 34 adjustments that had been made by Dr Jim Salinger to the 7SS temperature series, though I don't know what that is exactly. I also discover that there is a guy called Richard Treadgold, Convenor of the CCG, who is quoted several times. Some of the statements he makes are quoted in the articles I've already seen. They are of a somewhat snide tenor. The CSC object to the methodology used by the NIWA to adjust temperature measurements (one developed as part of a PhD thesis), which they critiqued in a November 2009 paper titled "Are we feeling warmer yet?", and are concerned about how this public agency is spending its money. I'm going to have to dig a bit deeper if I want to find out more. There is a section with links under the heading "Related Stories on Scoop". I click on a few of those.
  • One of these leads me to more. Of particular interest is a fairly neutral article outlining the progress of the court action. I get some more background: For the last ten years, visitors to NIWA’s official website have been greeted by a graph of the “seven-station series” (7SS), under the bold heading “New Zealand Temperature Record”. The graph covers the period from 1853 to the present, and is adorned by a prominent trend-line sloping sharply upwards. Accompanying text informs the world that “New Zealand has experienced a warming trend of approximately 0.9°C over the past 100 years.” The 7SS has been updated and used in every monthly issue of NIWA’s “Climate Digest” since January 1993. Its 0.9°C (sometimes 1.0°C) of warming has appeared in the Australia/NZ Chapter of the IPCC’s 2001 and 2007 Assessment Reports. It has been offered as sworn evidence in countless tribunals and judicial enquiries, and provides the historical base for all of NIWA’s reports to both Central and Local Governments on climate science issues and future projections.
  • Now I can see why this is so important. The temperature record informs the conclusions of the IPCC assessment reports and provides crucial evidence for global warming.
  • Further down we get: NIWA announces that it has now completed a full internal examination of the Salinger adjustments in the 7SS, and has forwarded its “review papers” to its Australian counterpart, the Bureau of Meteorology (BOM) for peer review. And: So the old 7SS has already been repudiated. A replacement NZTR [New Zealand Temperature Record] is being prepared by NIWA – presumably the best effort they are capable of producing. NZCSC is about to receive what it asked for. On the face of it, there’s nothing much left for the Court to adjudicate.
  • NIWA has been forced to withdraw its earlier temperature record and replace it with a new one. Treadgold quite clearly states that "NIWA makes the huge admission that New Zealand has experienced hardly any warming during the last half-century" and that "the new temperature record shows no evidence of a connection with global warming." Earlier in the article he also stresses the role of the CSC in achieving these revisions, saying "after 12 months of futile attempts to persuade the public, misleading answers to questions in the Parliament from ACT and reluctant but gradual capitulation from NIWA, their relentless defence of the old temperature series has simply evaporated. They’ve finally given in, but without our efforts the faulty graph would still be there."
  • All this leads me to believe that if I look at the website of NIWA I will see a retraction of the earlier position and a new position that New Zealand has experienced no unusual warming. This is easy enough to check. I go there. Actually, I search for it to find the exact page. Here is the 7SS page on the NIWA site. Am I surprised that NIWA have retracted nothing and that in fact their revised graph shows similar results? Not really. However, I am somewhat surprised by this page on the Climate Conversation Group website which claims that the 7SS temperature record is as dead as the parrot in the Monty Python sketch. It says "On the eve of Christmas, when nobody was looking, NIWA declared that New Zealand had a new official temperature record (the NZT7) and whipped the 7SS off its website." However, I've already seen that this is not true. Perhaps there was once a 7SS graph and information about the temperature record on the site's homepage that can no longer be seen. I don't know. I can only speculate. I know that there is a section on the NIWA site about the 7SS temperature record that contains a number of graphs and figures and discusses recent revisions. It has been updated as recently as December 2010, last month. The NIWA page talks all about the 7SS series and has a heading that reads "Our new analysis confirms the warming trend".
  • The CCG page claims that the new NZT7 is not in fact a revision but rather a replacement. Although it results in a similar curve, the adjustments that were made are very different. Frankly I can't see how that matters at the end of the day. Now, I don't really know whether I can believe that the NIWA analysis is true, but what I am in no doubt of whatsoever is that the statements made by Richard Treadgold that were quoted in so many places are at best misleading. The NIWA has not changed its position in the slightest. The assertion that the NIWA have admitted that New Zealand has not warmed much since 1960 is a politician's careful argument. Both analyses showed the same result. This is a fact that NIWA have not disputed; however, they still maintain a connection to global warming. A document explaining the revisions talks about why the warming has slowed after 1960: The unusually steep warming in the 1940-1960 period is paralleled by an unusually large increase in northerly flow* during this same period. On a longer timeframe, there has been a trend towards less northerly flow (more southerly) since about 1960. However, New Zealand temperatures have continued to increase over this time, albeit at a reduced rate compared with earlier in the 20th century. This is consistent with a warming of the whole region of the southwest Pacific within which New Zealand is situated.
  • Denialists have taken Treadgold's misleading mantra and spread it far and wide including on Twitter and fringe websites, but it is faulty as I've just demonstrated. Why do people do this? Perhaps they are hoping that others won't check the sources. Most people don't. I hope this serves as a lesson for why you always should.
Weiye Loh

Can a group of scientists in California end the war on climate change? | Science | The ...

  • Muller calls his latest obsession the Berkeley Earth project. The aim is so simple that the complexity and magnitude of the undertaking is easy to miss. Starting from scratch, with new computer tools and more data than has ever been used, they will arrive at an independent assessment of global warming. The team will also make every piece of data it uses – 1.6bn data points – freely available on a website. It will post its workings alongside, including full information on how more than 100 years of data from thousands of instruments around the world are stitched together to give a historic record of the planet's temperature.
  • Muller is fed up with the politicised row that all too often engulfs climate science. By laying all its data and workings out in the open, where they can be checked and challenged by anyone, the Berkeley team hopes to achieve something remarkable: a broader consensus on global warming. In no other field would Muller's dream seem so ambitious, or perhaps, so naive.
  • "We are bringing the spirit of science back to a subject that has become too argumentative and too contentious," Muller says, over a cup of tea. "We are an independent, non-political, non-partisan group. We will gather the data, do the analysis, present the results and make all of it available. There will be no spin, whatever we find." Why does Muller feel compelled to shake up the world of climate change? "We are doing this because it is the most important project in the world today. Nothing else comes close," he says.
  • ...20 more annotations...
  • There are already three heavyweight groups that could be considered the official keepers of the world's climate data. Each publishes its own figures that feed into the UN's Intergovernmental Panel on Climate Change. Nasa's Goddard Institute for Space Studies in New York City produces a rolling estimate of the world's warming. A separate assessment comes from another US agency, the National Oceanic and Atmospheric Administration (Noaa). The third group is based in the UK and led by the Met Office. They all take readings from instruments around the world to come up with a rolling record of the Earth's mean surface temperature. The numbers differ because each group uses its own dataset and does its own analysis, but they show a similar trend. Since pre-industrial times, all point to a warming of around 0.75°C.
  • You might think three groups was enough, but Muller rolls out a list of shortcomings, some real, some perceived, that he suspects might undermine public confidence in global warming records. For a start, he says, warming trends are not based on all the available temperature records. The data that is used is filtered and might not be as representative as it could be. He also cites a poor history of transparency in climate science, though others argue many climate records and the tools to analyse them have been public for years.
  • Then there is the fiasco of 2009 that saw roughly 1,000 emails from a server at the University of East Anglia's Climatic Research Unit (CRU) find their way on to the internet. The fuss over the messages, inevitably dubbed Climategate, gave Muller's nascent project added impetus. Climate sceptics had already attacked James Hansen, head of the Nasa group, for making political statements on climate change while maintaining his role as an objective scientist. The Climategate emails fuelled their protests. "With CRU's credibility undergoing a severe test, it was all the more important to have a new team jump in, do the analysis fresh and address all of the legitimate issues raised by sceptics," says Muller.
  • This latest point is where Muller faces his most delicate challenge. To concede that climate sceptics raise fair criticisms means acknowledging that scientists and government agencies have got things wrong, or at least could do better. But the debate around global warming is so highly charged that open discussion, which science requires, can be difficult to hold in public. At worst, criticising poor climate science can be taken as an attack on science itself, a knee-jerk reaction that has unhealthy consequences. "Scientists will jump to the defence of alarmists because they don't recognise that the alarmists are exaggerating," Muller says.
  • The Berkeley Earth project came together more than a year ago, when Muller rang David Brillinger, a statistics professor at Berkeley and the man Nasa called when it wanted someone to check its risk estimates of space debris smashing into the International Space Station. He wanted Brillinger to oversee every stage of the project. Brillinger accepted straight away. Since the first meeting he has advised the scientists on how best to analyse their data and what pitfalls to avoid. "You can think of statisticians as the keepers of the scientific method," Brillinger told me. "Can scientists and doctors reasonably draw the conclusions they are setting down? That's what we're here for."
  • For the rest of the team, Muller says he picked scientists known for original thinking. One is Saul Perlmutter, the Berkeley physicist who found evidence that the universe is expanding at an ever faster rate, courtesy of mysterious "dark energy" that pushes against gravity. Another is Art Rosenfeld, the last student of the legendary Manhattan Project physicist Enrico Fermi, and something of a legend himself in energy research. Then there is Robert Jacobsen, a Berkeley physicist who is an expert on giant datasets; and Judith Curry, a climatologist at Georgia Institute of Technology, who has raised concerns over tribalism and hubris in climate science.
  • Robert Rohde, a young physicist who left Berkeley with a PhD last year, does most of the hard work. He has written software that trawls public databases, themselves the product of years of painstaking work, for global temperature records. These are compiled, de-duplicated and merged into one huge historical temperature record. The data, by all accounts, are a mess. There are 16 separate datasets in 14 different formats and they overlap, but not completely. Muller likens Rohde's achievement to Hercules's enormous task of cleaning the Augean stables.
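The compile-and-de-duplicate step described above can be sketched in miniature. This is a toy illustration only, with invented station IDs and readings, not Berkeley Earth's actual pipeline; the core idea is simply that overlapping datasets collapse onto a common key:

```python
# Toy illustration of de-duplicating overlapping temperature datasets.
# Records are (station_id, date, temp_C); the same reading may appear
# in several source datasets, so we key on (station_id, date).

dataset_a = [("NZ001", "1950-01", 17.2), ("NZ001", "1950-02", 17.8)]
dataset_b = [("NZ001", "1950-02", 17.8), ("NZ001", "1950-03", 16.9)]

merged = {}
for record in dataset_a + dataset_b:
    station, date, temp = record
    merged[(station, date)] = temp  # duplicate readings collapse onto one key

history = sorted(merged.items())
print(len(history))  # 3 unique station-month readings, not 4
```

The real task is of course far messier — 16 datasets in 14 formats, with overlaps that are only partial — but the merge key is the same in spirit.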
  • The wealth of data Rohde has collected so far – and some dates back to the 1700s – makes for what Muller believes is the most complete historical record of land temperatures ever compiled. It will, of itself, Muller claims, be a priceless resource for anyone who wishes to study climate change. So far, Rohde has gathered records from 39,340 individual stations worldwide.
  • Publishing an extensive set of temperature records is the first goal of Muller's project. The second is to turn this vast haul of data into an assessment on global warming.
  • The big three groups – Nasa, Noaa and the Met Office – work out global warming trends by placing an imaginary grid over the planet and averaging temperatures records in each square. So for a given month, all the records in England and Wales might be averaged out to give one number. Muller's team will take temperature records from individual stations and weight them according to how reliable they are.
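The gridding approach used by the big three can be illustrated with a toy sketch. Grid size, station positions, and temperatures are all invented for the example; real products use finer grids and area weighting:

```python
# Toy grid-cell averaging: bin stations into lat/lon cells and average
# the readings within each cell for one month. Positions and values
# are invented for illustration.
from collections import defaultdict

CELL = 5.0  # grid resolution in degrees

stations = [
    (51.5, -0.1, 6.2),   # (lat, lon, monthly mean temp in degrees C)
    (52.2, -1.2, 5.8),   # a nearby station in the same 5-degree cell
    (40.7, -74.0, 3.1),  # a distant station in a different cell
]

cells = defaultdict(list)
for lat, lon, temp in stations:
    key = (int(lat // CELL), int(lon // CELL))  # floor to cell indices
    cells[key].append(temp)

cell_means = {key: sum(vals) / len(vals) for key, vals in cells.items()}
print(len(cell_means))  # 2 occupied grid cells
```

Muller's alternative — weighting individual stations by reliability rather than averaging within cells — replaces the uniform per-cell mean with a weighted one.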
  • This is where the Berkeley group faces its toughest task by far and it will be judged on how well it deals with it. There are errors running through global warming data that arise from the simple fact that the global network of temperature stations was never designed or maintained to monitor climate change. The network grew in a piecemeal fashion, starting with temperature stations installed here and there, usually to record local weather.
  • Among the trickiest errors to deal with are so-called systematic biases, which skew temperature measurements in fiendishly complex ways. Stations get moved around, replaced with newer models, or swapped for instruments that record in Celsius instead of Fahrenheit. The times at which measurements are taken vary, from say 6am to 9pm. The accuracy of individual stations drifts over time, and even changes in the surroundings, such as growing trees, can shield a station from wind and sun more in one year than the next. Each of these interferes with a station's temperature measurements, perhaps making it read too cold, or too hot. And these errors combine and build up.
  • This is the real mess that will take a Herculean effort to clean up. The Berkeley Earth team is using algorithms that automatically correct for some of the errors, a strategy Muller favours because it doesn't rely on human interference. When the team publishes its results, this is where the scrutiny will be most intense.
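One of the simplest automated corrections imaginable is removing a step change at a known station move. The sketch below is deliberately naive, with invented data and a given breakpoint; it bears no resemblance to the Berkeley team's actual algorithms:

```python
# Naive sketch of one automated bias correction: if a station move at a
# known index introduces a step change, shift the later segment so its
# mean matches the earlier one. Real homogenisation methods must first
# detect the breakpoint and separate steps from genuine trends.

series = [14.0, 14.2, 13.9, 14.1, 15.1, 15.3, 15.0, 15.2]
breakpoint = 4  # station relocated before the 5th reading

before = series[:breakpoint]
after = series[breakpoint:]
offset = sum(after) / len(after) - sum(before) / len(before)

adjusted = before + [t - offset for t in after]
print(round(offset, 2))  # 1.1 : the step introduced by the move
```

Note that this crude version would also erase any genuine warming that happened across the break, which is precisely why homogenisation adjustments attract so much scrutiny.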
  • Despite the scale of the task, and the fact that world-class scientific organisations have been wrestling with it for decades, Muller is convinced his approach will lead to a better assessment of how much the world is warming. "I've told the team I don't know if global warming is more or less than we hear, but I do believe we can get a more precise number, and we can do it in a way that will cool the arguments over climate change, if nothing else," says Muller. "Science has its weaknesses and it doesn't have a stranglehold on the truth, but it has a way of approaching technical issues that is a closer approximation of truth than any other method we have."
  • It might not be a good sign that one prominent climate sceptic contacted by the Guardian, Canadian economist Ross McKitrick, had never heard of the project. Another, Stephen McIntyre, whom Muller has defended on some issues, hasn't followed the project either, but said "anything that [Muller] does will be well done". Phil Jones at the University of East Anglia was unclear on the details of the Berkeley project and didn't comment.
  • Elsewhere, Muller has qualified support from some of the biggest names in the business. At Nasa, Hansen welcomed the project, but warned against over-emphasising what he expects to be the minor differences between Berkeley's global warming assessment and those from the other groups. "We have enough trouble communicating with the public already," Hansen says. At the Met Office, Peter Stott, head of climate monitoring and attribution, was in favour of the project if it was open and peer-reviewed.
  • Peter Thorne, who left the Met Office's Hadley Centre last year to join the Co-operative Institute for Climate and Satellites in North Carolina, is enthusiastic about the Berkeley project but raises an eyebrow at some of Muller's claims. The Berkeley group will not be the first to put its data and tools online, he says. Teams at Nasa and Noaa have been doing this for many years. And while Muller may have more data, the extra records add little real value, Thorne says. Most are records from stations installed from the 1950s onwards, and then only in a few regions, such as North America. "Do you really need 20 stations in one region to get a monthly temperature figure? The answer is no. Supersaturating your coverage doesn't give you much more bang for your buck," he says. They will, however, help researchers spot short-term regional variations in climate change, something that is likely to be valuable as climate change takes hold.
  • Despite his reservations, Thorne says climate science stands to benefit from Muller's project. "We need groups like Berkeley stepping up to the plate and taking this challenge on, because it's the only way we're going to move forwards. I wish there were 10 other groups doing this," he says.
  • Muller's project is organised under the auspices of Novim, a Santa Barbara-based non-profit organisation that uses science to find answers to the most pressing issues facing society and to publish them "without advocacy or agenda". Funding has come from a variety of places, including the Fund for Innovative Climate and Energy Research (funded by Bill Gates), and the Department of Energy's Lawrence Berkeley Lab. One donor has had some climate bloggers up in arms: the man behind the Charles G Koch Charitable Foundation owns, with his brother David, Koch Industries, a company Greenpeace called a "kingpin of climate science denial". On this point, Muller says the project has taken money from right and left alike.
  • No one who spoke to the Guardian about the Berkeley Earth project believed it would shake the faith of the minority who have set their minds against global warming. "As new kids on the block, I think they will be given a favourable view by people, but I don't think it will fundamentally change people's minds," says Thorne. Brillinger has reservations too. "There are people you are never going to change. They have their beliefs and they're not going to back away from them."
Weiye Loh

Skepticblog » Global Warming Skeptic Changes His Tune - by Doing the Science ...

  • To the global warming deniers, Muller had been an important scientific figure with good credentials who had expressed doubt about the temperature data used to track the last few decades of global warming. Muller was influenced by Anthony Watts, a former TV weatherman (not a trained climate scientist) and blogger who has argued that the data set is mostly from large cities, where the “urban heat island” effect might bias the overall pool of worldwide temperature data. Climate scientists have pointed out that they have accounted for this possible effect already, but Watts and Muller were unconvinced. With $150,000 (25% of their funding) from the Koch brothers (the nation’s largest supporters of climate denial research), as well as the Getty Foundation (their wealth largely based on oil money) and other funding sources, Muller set out to reanalyze all the temperature data by setting up the Berkeley Earth Surface Temperature Project.
  • Although only 2% of the data were analyzed by last month, the Republican climate deniers in Congress called him to testify in their March 31 hearing to attack global warming science, expecting him to give them scientific data supporting their biases. To their dismay, Muller behaved like a real scientist and not an ideologue—he followed his data and told them the truth, not what they wanted to hear. Muller pointed out that his analysis of the data set almost exactly tracked what the National Oceanic and Atmospheric Administration (NOAA), the Goddard Institute for Space Studies (GISS), and the Climatic Research Unit at the University of East Anglia in the UK had already published (see figure).
  • Muller testified before the House Committee that: The Berkeley Earth Surface Temperature project was created to make the best possible estimate of global temperature change using as complete a record of measurements as possible and by applying novel methods for the estimation and elimination of systematic biases. We see a global warming trend that is very similar to that previously reported by the other groups. The world temperature data has sufficient integrity to be used to determine global temperature trends. Despite potential biases in the data, methods of analysis can be used to reduce bias effects well enough to enable us to measure long-term Earth temperature changes. Data integrity is adequate. Based on our initial work at Berkeley Earth, I believe that some of the most worrisome biases are less of a problem than I had previously thought.
  • ...4 more annotations...
  • The right-wing ideologues were sorely disappointed, and reacted viciously in the political sphere by attacking their own scientist, but Muller’s scientific integrity overcame any biases he might have harbored at the beginning. He “called ‘em as he saw ‘em” and told truth to power.
  • it speaks well of the scientific process when a prominent skeptic like Muller does his job properly and admits that his original biases were wrong. As reported in the Los Angeles Times : Ken Caldeira, an atmospheric scientist at the Carnegie Institution for Science, which contributed some funding to the Berkeley effort, said Muller’s statement to Congress was “honorable” in recognizing that “previous temperature reconstructions basically got it right…. Willingness to revise views in the face of empirical data is the hallmark of the good scientific process.”
  • This is the essence of the scientific method at its best. There may be biases in our perceptions, and we may want to find data that fits our preconceptions about the world, but if science is done properly, we get a real answer, often one we did not expect or didn’t want to hear. That’s the true test of when science is giving us a reality check: when it tells us “an inconvenient truth”, something we do not like, but is inescapable if one follows the scientific method and analyzes the data honestly.
  • Sit down before fact as a little child, be prepared to give up every preconceived notion, follow humbly wherever and to whatever abysses nature leads, or you shall learn nothing.
Weiye Loh

Adventures in Flay-land: James Delingpole and the "Science" of Denialism

  • Perhaps like me, you watched the BBC Two Horizons program Monday night presented by Sir Paul Nurse, president of the Royal Society and Nobel Prize-winning geneticist, awarded for his discovery of the genes of cell division.
  • James. He really believes there's some kind of mainstream science "warmist" conspiracy against the brave outliers who dare to challenge the consensus. He really believes that "climategate" is a real scandal. He fails to understand that it is a common practice in statistics to splice together two or more datasets where you know that the quality of data is patchy. In the case of "climategate", researchers found that indirect temperature measurements based on tree ring widths (the tree ring temperature proxy) are consistent with other proxy methods of recording temperature from before the start of the instrumental temperature record (around 1950) but begin to show a decline in temperature after that for reasons which are unclear. Actual temperature measurements however show the opposite. The researcher at the head of the climategate affair, Phil Jones, created a graph of the temperature record to include on the cover of a report for policy makers and journalists. For this graph he simply spliced together the tree ring proxy data up until 1950 with the recorded data after that using statistical techniques to bring them into agreement. What made this seem particularly dodgy was an email intercepted by a hacker in which Jones referred to this practice as a "Mike's Nature trick", referring to a paper published by his colleague Michael Mann in the journal Nature. It is however nothing out of the ordinary. Delingpole and others have talked about how this "trick" was used here to "hide the decline" revealed by the other dataset, as though this was some sort of deception. The fact that all parties were found to have behaved ethically is simply further evidence of the global warmist conspiracy. Delingpole takes it further and casts aspersions on scientific consensus and the entire peer review process.
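The splicing practice described above — aligning one series to another over an overlap period and joining them — can be illustrated roughly like this. All numbers are invented, and the actual statistical techniques used to bring the series into agreement are considerably more involved:

```python
# Toy version of splicing a proxy series onto an instrumental series:
# shift the proxy so it matches the instrumental mean over their
# overlap, then use the proxy before the overlap and the instrumental
# record from the overlap onwards. Values are invented for illustration.

proxy = {1900: 13.8, 1920: 13.9, 1940: 14.0, 1950: 14.1}
instrumental = {1940: 14.3, 1950: 14.4, 1960: 14.6, 1980: 14.9}

overlap = sorted(set(proxy) & set(instrumental))
shift = (sum(instrumental[y] for y in overlap) -
         sum(proxy[y] for y in overlap)) / len(overlap)

spliced = {y: t + shift for y, t in proxy.items() if y < min(overlap)}
spliced.update(instrumental)
print(round(shift, 2))  # 0.3 : offset applied to the early proxy years
```

The point of the illustration is simply that the join is an explicit, documented calibration step, not a covert substitution of one dataset for another.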
  • When Nurse asked Delingpole the very straightforward question of whether he would be willing to trust a scientific consensus if he required treatment for cancer, he could have said "Gee, that's an interesting question. Let me think about that and why it's different."
  • ...7 more annotations...
  • Instead, he became defensive and lost his focus. Eventually he would make such regrettable statements as this one: "It is not my job to sit down and read peer-reviewed papers because I simply haven’t got the time, I haven’t got the scientific expertise… I am an interpreter of interpretation."
  • In a parallel universe where James Delingpole is not the "penis" that Ben Goldacre describes him to be, he might have said the following: Gee, that's an interesting question. Let me think about why it's different. (Thinks) Well, it seems to me that when evaluating a scientifically agreed treatment for a disease such as cancer, we have not only all the theory to peruse and the randomized and blinded trials, but also thousands if not millions of case studies where people have undergone the intervention. We have enough data to estimate a person's chances of recovery and know that on average they will do better. When discussing climate change, we really only have the one case study. Just the one earth. And it's a patient that has not undergone any intervention. The scientific consensus is therefore entirely theoretical and intangible. This makes it more difficult for the lay person such as myself to trust it.
  • Sir Paul ended the program saying "Scientists have got to get out there… if we do not do that it will be filled by others who don’t understand the science, and who may be driven by politics and ideology."
  • If the proxy tracks the instrumental record from 1850 to 1960 but then diverges for unknown reasons, how do we know that the proxy is valid for reconstructing temperatures in periods prior to 1850?
  • This is a good question and one I'm not sure I can answer to anyone's satisfaction. We seem to have good agreement among several forms of temperature proxy going back centuries and with direct measurements back to 1880. There is divergence in more recent years and there are several theories as to why that might be. Some possible explanations here: http://www.skepticalscience.com/Tree-ring-proxies-divergence-problem.htm
  • In the physical world we can never be absolutely certain of anything. René Descartes showed it was impossible to prove that everything he sensed wasn't manipulated by some invisible demon.
  • It is necessary to first make certain assumptions about the universe that we observe. After that, we can only go with the best theories available that allow us to make scientific progress.
Weiye Loh

Scientist Beloved by Climate Deniers Pulls Rug Out from Their Argument - Environment - ...

  • One of the scientists was Richard Muller from University of California, Berkeley. Muller has been working on an independent project to better estimate the planet's surface temperatures over time. Because he is willing to say publicly that he has some doubts about the accuracy of the temperature stations that most climate models are based on, he has been embraced by the science denying crowd.
  • A Koch brothers charity, for example, has donated nearly 25 percent of the financial support provided to Muller's project.
  • Skeptics of climate science have been licking their lips waiting for his latest research, which they hoped would undermine the data behind basic theories of anthropogenic climate change. At the hearing today, however, Muller threw them for a loop with this graph:
  • ...3 more annotations...
  • Muller's data (black line) tracks pretty well with the three established data sets. This is just an initial sampling of Muller's data—just 2 percent of the 1.6 billion records he's working with—but these early findings are incredibly consistent with the previous findings
  • In his testimony, Muller made these points (emphasis mine): The Berkeley Earth Surface Temperature project was created to make the best possible estimate of global temperature change using as complete a record of measurements as possible and by applying novel methods for the estimation and elimination of systematic biases. We see a global warming trend that is very similar to that previously reported by the other groups. The world temperature data has sufficient integrity to be used to determine global temperature trends. Despite potential biases in the data, methods of analysis can be used to reduce bias effects well enough to enable us to measure long-term Earth temperature changes. Data integrity is adequate. Based on our initial work at Berkeley Earth, I believe that some of the most worrisome biases are less of a problem than I had previously thought.
  • For the many climate deniers who hang their arguments on Muller's "doubts," this is a severe blow. Of course, when the hard scientific truths are inconvenient, climate denying House leaders can always call a lawyer, a marketing professor, and an economist into the scientific hearing.
  •  
    Today, there was a climate science hearing in the House Committee on Science, Space, and Technology. Of the six "expert" witnesses, only three were scientists. The others were an economist, a lawyer, and a professor of marketing. One of the scientists was Richard Muller from University of California, Berkeley. Muller has been working on an independent project to better estimate the planet's surface temperatures over time. Because he is willing to say publicly that he has some doubts about the accuracy of the temperature stations that most climate models are based on, he has been embraced by the science denying crowd. A Koch brothers charity, for example, has donated nearly 25 percent of the financial support provided to Muller's project.
Weiye Loh

Roger Pielke Jr.'s Blog: Global Temperature Trends - 0 views

  • My concern about the potential effects of human influences on the climate system is not a function of global average warming over a long period of time or of predictions of continued warming into the future.
  • What matters are the effects of human influences on the climate system at human and ecological scales, not at the global scale. No one experiences global average temperature, and it is very poorly correlated with things that we do care about in specific places at specific times.
  • Consider the following thought experiment. Divide the world up into 1,000 grid boxes of equal area. Now imagine that the temperature in each of 500 of those boxes goes up by 20 degrees while the temperature in the other 500 goes down by 20 degrees. The net global change is exactly zero (because I made it so). However, the impacts would be enormous. Let's further say that the changes prescribed in my thought experiment are the direct consequence of human activity. Would we want to address those changes? Or would we say, ho hum, it all averages out globally, so no problem? The answer is obvious and is not a function of what happens at some global average scale, but what happens at human and ecological scales.
  • ...2 more annotations...
  • In the real world, the effects of increasing carbon dioxide on human and ecological scales are well established, and they include a biogeochemical effect on land ecosystems with subsequent effects on water and climate, as well as changes to the chemistry of the oceans. Is it possible that these effects are benign? Sure. Is it also possible that these effects have some negatives? Sure. These two factors alone would be sufficient for one to begin to ask questions about the worth of decarbonizing the global energy system. But greenhouse gas emissions also have a radiative effect that, in the real world, is thought to be a net warming, all else equal and over a global scale. However, if this effect were to be a net cooling, or even no net effect at the global scale, it would not change my views about a need to consider decarbonizing the energy system one bit. There is an effect -- or effects to be more accurate -- and these effects could be negative.
  • The debate over climate change has many people on both sides of the issue wrapped up in discussing global average temperature trends. I understand this as it is an icon with great political symbolism. It has proved a convenient political battleground, but the reality is that it should matter little to the policy case for decarbonization. What matters is that there is a human effect on the climate system and it could be negative with respect to things people care about. That is enough to begin asking whether we want to think about accelerating decarbonization of the global economy.
  •  
    one needs to know only two things about the science of climate change to begin asking whether accelerating decarbonization of the economy might be worth doing: Carbon dioxide has an influence on the climate system. This influence might well be negative for things many people care about. That is it. An actual decision to accelerate decarbonization and at what rate will depend on many other things, like costs and benefits of particular actions unrelated to climate and technological alternatives. In this post I am going to further explain my views, based on an interesting question posed in that earlier thread. What would my position be if it were to be shown, hypothetically, that the global average surface temperature was not warming at all, or in fact even cooling (over any relevant time period)? Would I then change my views on the importance of decarbonizing the global energy system?
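The grid-box thought experiment above reads off directly in code (a minimal sketch; the box count and the ±20-degree figures come from the passage itself):

```python
# Pielke's thought experiment: 1,000 equal-area boxes, half warming by
# 20 degrees and half cooling by 20 degrees. The global mean change is
# zero by construction, yet the typical local change is enormous.
n_boxes = 1000
changes = [20.0] * (n_boxes // 2) + [-20.0] * (n_boxes // 2)

global_mean = sum(changes) / n_boxes
mean_abs = sum(abs(c) for c in changes) / n_boxes

print(global_mean)  # 0.0  -- "no change" at the global scale
print(mean_abs)     # 20.0 -- large change at human and ecological scales
```

The contrast between the two numbers is the whole argument: averaging to the global scale discards exactly the information that matters locally.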
Weiye Loh

Roger Pielke Jr.'s Blog: Global Warming: It's Worse Than You Think - 0 views

  • What happens if you weight the land surface record to account for this bias? Their preliminary result (which they emphasize is preliminary) is that land surface trends would actually increase if properly weighted. If this is the case then it potentially presents a headache for the climate modeling community because it would exacerbate the divergence between land surface and tropospheric trends that we documented in Klotzbach et al. 2009 (see this, this, and this).
  • My favorite climate scientist and several of his colleagues have a new paper out on global land surface temperature trends (Montandon et al. 2011).  They perform an interesting analysis in asking the degree to which the spatial distribution of land surface stations is representative of land surface types found on Earth. They find that the major surface temperature records (i.e., NCDC, GISS, CRU, GHCN) are not spatially representative (see their Figure 2 above).
  •  
    My favorite climate scientist and several of his colleagues have a new paper out on global land surface temperature trends (Montandon et al. 2011).  They perform an interesting analysis in asking the degree to which the spatial distribution of land surface stations is representative of land surface types found on Earth. They find that the major surface temperature records (i.e., NCDC, GISS, CRU, GHCN) are not spatially representative (see their Figure 2 above).
Weiye Loh

Climate cherry pickers: Falling humidity - 0 views

  • Scientific skepticism requires we consider the full body of evidence before coming to conclusions. The antithesis of genuine skepticism is ignoring all the evidence that contradicts a desired conclusion.
  • The article seems to overlook the relative importance of solar radiation and wind as the two main drivers of evaporation, translating as the skin temperature of the evaporating surface rather than ambient temperature, and the airflow over it, which in the case of solar radiation would make water vapour more of a forcing than a feedback. This paper details the calculations and the various inputs that are involved: BUREAU OF METEOROLOGY REFERENCE EVAPOTRANSPIRATION CALCULATIONS
  • This doesn't seem like a particularly relevant or useful start to the discussion of this topic. John's done some nice work looking at humidity trends wrt the water vapor feedback, and it would be a shame to divert the discussion right from the start into a lot of wrangling over minutia.
  • ...3 more annotations...
  • Ned, I feel it is both relevant and important enough to clarify given the statement in the article "Water vapor provides the most powerful feedback in the climate system. When surface temperature warms, this leads to an increase in atmospheric humidity." I feel that is not conveying a sense of the correct drivers that are most relevant to how water vapour enters the atmosphere in the first place. There is a need to be sure that the foundations any discussion is built upon are fully understood and solid.
  • Johnd, are you suggesting that most solar radiation is absorbed by the skin of the ocean, rather than by layers beneath the surface? The citation you refer to is for calculating evapotranspiration on land, where light does not penetrate beneath the "skin", at least not far. Water is actually fairly transparent to light so the very thin "skin" accounts for little of the absorbance, although eventually most incoming light is absorbed at depth. The skin temperature of the ocean (where the vast majority of evaporation on earth happens) is largely a function of mixed water column temperature as a whole, which reflects the balance between inputs (solar radiation, incoming IR radiation) and outputs (outgoing IR radiation, evaporation, convection, mixing) of heat energy. As the earth's temperature increases that heat balance results in higher mixed layer temps, which leads to higher skin temps and greater evaporation.
  • I also want to agree with Ned. This discussion of insolation and skin temperatures is a distraction. All other things being equal (insolation included), evaporation and water vapor should increase if the earth and atmosphere warm.
  •  
    Climate cherry pickers: Falling humidity
Weiye Loh

DenialDepot: A word of caution to the BEST project team - 0 views

  • 1) Any errors, however inconsequential, will be taken Very Seriously and accusations of fraud will be made.
  • 2) If you adjust the raw data we will accuse you of fraudulently fiddling the figures whilst cooking the books.3) If you don't adjust the raw data we will accuse you of fraudulently failing to account for station biases and UHI.
  • 7) By all means publish all your source code, but we will still accuse you of hiding the methodology for your adjustments.
  • ...10 more annotations...
  • 8) If you publish results to your website and errors are found, we will accuse you of a Very Serious Error regardless of severity (see point #1) and bemoan the press release you made about your results even though you won't remember making any press release about your results.
  • 9) With regard to point #8 above, at extra cost and time to yourself you must employ someone to thoroughly check each monthly update before it is published online, even if this delays publication of the results till the end of the month. You might be surprised at this because no-one actually relies on such freshly published data anyway and aren't the many eyes of blog audit better than a single pair of eyes? Well that's irrelevant. See points #1 and #8. 10) If you don't publish results promptly at the start of the month on the public website, but instead, say, publish the results to a private site for checks to be performed before release, we will accuse you of engaging in unscientific-like secrecy and massaging the data behind closed doors.
  • 14) If any region/station shows a warming trend that doesn't match the raw data, and we can't understand why, we will accuse you of fraud and dismiss the entire record. Don't expect us to have to read anything to understand results.
  • 15) You must provide all input datasets on your website. It's no good referencing NOAAs site and saying they "own" the GHCN data for example. I don't want their GHCN raw temperatures file, I want the one on your hard drive which you used for the analysis, even if you claim they are the same. If you don't do this we will accuse you of hiding the data and preventing us checking your results.
  • 24. In the event that you comply with all of the above, we will point out that a mere hundred-odd years of data is irrelevant next to the 4.5 billion year history of Earth. So why do you even bother?
  • 23) In the unlikely event that I haven't wasted enough of your time forcing you to comply with the above rules, I also demand to see all emails you have sent or will send during the period 1950 to 2050 that contain any of these keywords
  • 22) We don't need any scrutiny because our role isn't important.
  • 17) We will treat your record as if no alternative exists. As if your record is the make or break of Something Really Important (see point #1) and we just can't check the results in any other way.
  • 16) You are to blame for any station data your team uses. If we find out that a station you use is next to an AC Unit, we will conclude you personally planted the thermometer there to deliberately get warming.
  • an article today by Roger Pielke Nr. (no relation) that posited the fascinating concept that thermometers are just as capricious and unreliable proxies for temperature as tree rings. In fact probably more so, and re-computing global temperature by gristlecone pines would reveal the true trend of global cooling, which will be in all our best interests and definitely NOT just those of well paying corporate entities.
  •  
    Dear Professor Muller and Team, If you want your Berkeley Earth Surface Temperature project to succeed and become the center of attention you need to learn from the vast number of mistakes Hansen and Jones have made with their temperature records. To aid this task I created a point by point list for you.
Weiye Loh

How We Know by Freeman Dyson | The New York Review of Books - 0 views

  • Another example illustrating the central dogma is the French optical telegraph.
  • The telegraph was an optical communication system with stations consisting of large movable pointers mounted on the tops of sixty-foot towers. Each station was manned by an operator who could read a message transmitted by a neighboring station and transmit the same message to the next station in the transmission line.
  • The distance between neighbors was about seven miles. Along the transmission lines, optical messages in France could travel faster than drum messages in Africa. When Napoleon took charge of the French Republic in 1799, he ordered the completion of the optical telegraph system to link all the major cities of France from Calais and Paris to Toulon and onward to Milan. The telegraph became, as Claude Chappe had intended, an important instrument of national power. Napoleon made sure that it was not available to private users.
  • ...27 more annotations...
  • Unlike the drum language, which was based on spoken language, the optical telegraph was based on written French. Chappe invented an elaborate coding system to translate written messages into optical signals. Chappe had the opposite problem from the drummers. The drummers had a fast transmission system with ambiguous messages. They needed to slow down the transmission to make the messages unambiguous. Chappe had a painfully slow transmission system with redundant messages. The French language, like most alphabetic languages, is highly redundant, using many more letters than are needed to convey the meaning of a message. Chappe’s coding system allowed messages to be transmitted faster. Many common phrases and proper names were encoded by only two optical symbols, with a substantial gain in speed of transmission. The composer and the reader of the message had code books listing the message codes for eight thousand phrases and names. For Napoleon it was an advantage to have a code that was effectively cryptographic, keeping the content of the messages secret from citizens along the route.
  • After these two historical examples of rapid communication in Africa and France, the rest of Gleick’s book is about the modern development of information technology.
  • The modern history is dominated by two Americans, Samuel Morse and Claude Shannon. Samuel Morse was the inventor of Morse Code. He was also one of the pioneers who built a telegraph system using electricity conducted through wires instead of optical pointers deployed on towers. Morse launched his electric telegraph in 1838 and perfected the code in 1844. His code used short and long pulses of electric current to represent letters of the alphabet.
  • Morse was ideologically at the opposite pole from Chappe. He was not interested in secrecy or in creating an instrument of government power. The Morse system was designed to be a profit-making enterprise, fast and cheap and available to everybody. At the beginning the price of a message was a quarter of a cent per letter. The most important users of the system were newspaper correspondents spreading news of local events to readers all over the world. Morse Code was simple enough that anyone could learn it. The system provided no secrecy to the users. If users wanted secrecy, they could invent their own secret codes and encipher their messages themselves. The price of a message in cipher was higher than the price of a message in plain text, because the telegraph operators could transcribe plain text faster. It was much easier to correct errors in plain text than in cipher.
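The scheme described above, letters mapped to short and long pulses, can be sketched in a few lines. The table here covers only the letters needed for the demo; the full International Morse table is of course much larger, and this encoder is illustrative rather than anything from Gleick or Dyson:

```python
# Minimal Morse encoder: "." for a short pulse, "-" for a long one.
# Only a handful of letters are included for the demonstration.
MORSE = {"S": "...", "O": "---", "E": ".", "T": "-"}

def to_morse(text):
    # Encode letter by letter, separating letters with spaces.
    return " ".join(MORSE[c] for c in text.upper())

print(to_morse("SOS"))  # ... --- ...
```

Note that the code is a prefix-free scheme only because of the spaces between letters; without them, "." followed by "." would be indistinguishable from other combinations, which is why timing gaps matter in real Morse transmission.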
  • Claude Shannon was the founding father of information theory. For a hundred years after the electric telegraph, other communication systems such as the telephone, radio, and television were invented and developed by engineers without any need for higher mathematics. Then Shannon supplied the theory to understand all of these systems together, defining information as an abstract quantity inherent in a telephone message or a television picture. Shannon brought higher mathematics into the game.
  • When Shannon was a boy growing up on a farm in Michigan, he built a homemade telegraph system using Morse Code. Messages were transmitted to friends on neighboring farms, using the barbed wire of their fences to conduct electric signals. When World War II began, Shannon became one of the pioneers of scientific cryptography, working on the high-level cryptographic telephone system that allowed Roosevelt and Churchill to talk to each other over a secure channel. Shannon’s friend Alan Turing was also working as a cryptographer at the same time, in the famous British Enigma project that successfully deciphered German military codes. The two pioneers met frequently when Turing visited New York in 1943, but they belonged to separate secret worlds and could not exchange ideas about cryptography.
  • In 1945 Shannon wrote a paper, “A Mathematical Theory of Cryptography,” which was stamped SECRET and never saw the light of day. He published in 1948 an expurgated version of the 1945 paper with the title “A Mathematical Theory of Communication.” The 1948 version appeared in the Bell System Technical Journal, the house journal of the Bell Telephone Laboratories, and became an instant classic. It is the founding document for the modern science of information. After Shannon, the technology of information raced ahead, with electronic computers, digital cameras, the Internet, and the World Wide Web.
  • According to Gleick, the impact of information on human affairs came in three installments: first the history, the thousands of years during which people created and exchanged information without the concept of measuring it; second the theory, first formulated by Shannon; third the flood, in which we now live.
  • The event that made the flood plainly visible occurred in 1965, when Gordon Moore stated Moore’s Law. Moore was an electrical engineer, founder of the Intel Corporation, a company that manufactured components for computers and other electronic gadgets. His law said that the price of electronic components would decrease and their numbers would increase by a factor of two every eighteen months. This implied that the price would decrease and the numbers would increase by a factor of a hundred every decade. Moore’s prediction of continued growth has turned out to be astonishingly accurate during the forty-five years since he announced it. In these four and a half decades, the price has decreased and the numbers have increased by a factor of a billion, nine powers of ten. Nine powers of ten are enough to turn a trickle into a flood.
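The compounding in that passage checks out arithmetically. Assuming an exact doubling every eighteen months:

```python
# Verify the passage's arithmetic: a factor of 2 every 18 months
# compounds to roughly 100x per decade, and to 2**30 -- about a billion,
# nine powers of ten -- over 45 years.
def moore_factor(years, doubling_period_months=18):
    return 2 ** (years * 12 / doubling_period_months)

print(round(moore_factor(10)))  # ~102, i.e. roughly a factor of 100 per decade
print(int(moore_factor(45)))    # 1073741824, about 10**9
```

So "a factor of a hundred every decade" is a slight rounding down of the true compounded figure, and forty-five years of doublings is exactly thirty doublings.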
  • Gordon Moore was in the hardware business, making hardware components for electronic machines, and he stated his law as a law of growth for hardware. But the law applies also to the information that the hardware is designed to embody. The purpose of the hardware is to store and process information. The storage of information is called memory, and the processing of information is called computing. The consequence of Moore’s Law for information is that the price of memory and computing decreases and the available amount of memory and computing increases by a factor of a hundred every decade. The flood of hardware becomes a flood of information.
  • In 1949, one year after Shannon published the rules of information theory, he drew up a table of the various stores of memory that then existed. The biggest memory in his table was the US Library of Congress, which he estimated to contain one hundred trillion bits of information. That was at the time a fair guess at the sum total of recorded human knowledge. Today a memory disc drive storing that amount of information weighs a few pounds and can be bought for about a thousand dollars. Information, otherwise known as data, pours into memories of that size or larger, in government and business offices and scientific laboratories all over the world. Gleick quotes the computer scientist Jaron Lanier describing the effect of the flood: “It’s as if you kneel to plant the seed of a tree and it grows so fast that it swallows your whole town before you can even rise to your feet.”
  • On December 8, 2010, Gleick published on The New York Review’s blog an illuminating essay, “The Information Palace.” It was written too late to be included in his book. It describes the historical changes of meaning of the word “information,” as recorded in the latest quarterly online revision of the Oxford English Dictionary. The word first appears in 1386 in a parliamentary report with the meaning “denunciation.” The history ends with the modern usage, “information fatigue,” defined as “apathy, indifference or mental exhaustion arising from exposure to too much information.”
  • The consequences of the information flood are not all bad. One of the creative enterprises made possible by the flood is Wikipedia, started ten years ago by Jimmy Wales. Among my friends and acquaintances, everybody distrusts Wikipedia and everybody uses it. Distrust and productive use are not incompatible. Wikipedia is the ultimate open source repository of information. Everyone is free to read it and everyone is free to write it. It contains articles in 262 languages written by several million authors. The information that it contains is totally unreliable and surprisingly accurate. It is often unreliable because many of the authors are ignorant or careless. It is often accurate because the articles are edited and corrected by readers who are better informed than the authors.
  • Jimmy Wales hoped when he started Wikipedia that the combination of enthusiastic volunteer writers with open source information technology would cause a revolution in human access to knowledge. The rate of growth of Wikipedia exceeded his wildest dreams. Within ten years it has become the biggest storehouse of information on the planet and the noisiest battleground of conflicting opinions. It illustrates Shannon’s law of reliable communication. Shannon’s law says that accurate transmission of information is possible in a communication system with a high level of noise. Even in the noisiest system, errors can be reliably corrected and accurate information transmitted, provided that the transmission is sufficiently redundant. That is, in a nutshell, how Wikipedia works.
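Shannon's point about redundancy enabling reliable communication over a noisy channel can be illustrated with the simplest possible code. The 5x repetition scheme and the deterministic error pattern below are illustrative choices, not anything from the source:

```python
# Repetition code: each bit is sent 5 times; majority vote at the
# receiver corrects up to 2 flipped bits per 5-bit block.
def encode(bits, r=5):
    return [b for b in bits for _ in range(r)]

def corrupt(bits, r=5):
    # A deterministic "noisy channel": flip the first bit of every block.
    return [b ^ (i % r == 0) for i, b in enumerate(bits)]

def decode(bits, r=5):
    return [int(sum(bits[i:i + r]) > r // 2) for i in range(0, len(bits), r)]

message = [1, 0, 1, 1, 0, 0, 1, 0]
received = decode(corrupt(encode(message)))
print(received == message)  # True: redundancy corrected every error
```

Repetition is the crudest way to buy reliability with redundancy; Shannon's theorem says far more efficient codes exist, but the principle, that enough redundancy defeats noise, is the same one the annotation credits for Wikipedia's accuracy.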
  • The information flood has also brought enormous benefits to science. The public has a distorted view of science, because children are taught in school that science is a collection of firmly established truths. In fact, science is not a collection of truths. It is a continuing exploration of mysteries. Wherever we go exploring in the world around us, we find mysteries. Our planet is covered by continents and oceans whose origin we cannot explain. Our atmosphere is constantly stirred by poorly understood disturbances that we call weather and climate. The visible matter in the universe is outweighed by a much larger quantity of dark invisible matter that we do not understand at all. The origin of life is a total mystery, and so is the existence of human consciousness. We have no clear idea how the electrical discharges occurring in nerve cells in our brains are connected with our feelings and desires and actions.
  • Even physics, the most exact and most firmly established branch of science, is still full of mysteries. We do not know how much of Shannon’s theory of information will remain valid when quantum devices replace classical electric circuits as the carriers of information. Quantum devices may be made of single atoms or microscopic magnetic circuits. All that we know for sure is that they can theoretically do certain jobs that are beyond the reach of classical devices. Quantum computing is still an unexplored mystery on the frontier of information theory. Science is the sum total of a great multitude of mysteries. It is an unending argument between a great multitude of voices. It resembles Wikipedia much more than it resembles the Encyclopaedia Britannica.
  • The rapid growth of the flood of information in the last ten years made Wikipedia possible, and the same flood made twenty-first-century science possible. Twenty-first-century science is dominated by huge stores of information that we call databases. The information flood has made it easy and cheap to build databases. One example of a twenty-first-century database is the collection of genome sequences of living creatures belonging to various species from microbes to humans. Each genome contains the complete genetic information that shaped the creature to which it belongs. The genome database is rapidly growing and is available for scientists all over the world to explore. Its origin can be traced to the year 1939, when Shannon wrote his Ph.D. thesis with the title “An Algebra for Theoretical Genetics.”
  • Shannon was then a graduate student in the mathematics department at MIT. He was only dimly aware of the possible physical embodiment of genetic information. The true physical embodiment of the genome is the double helix structure of DNA molecules, discovered by Francis Crick and James Watson fourteen years later. In 1939 Shannon understood that the basis of genetics must be information, and that the information must be coded in some abstract algebra independent of its physical embodiment. Without any knowledge of the double helix, he could not hope to guess the detailed structure of the genetic code. He could only imagine that in some distant future the genetic information would be decoded and collected in a giant database that would define the total diversity of living creatures. It took only sixty years for his dream to come true.
  • In the twentieth century, genomes of humans and other species were laboriously decoded and translated into sequences of letters in computer memories. The decoding and translation became cheaper and faster as time went on, the price decreasing and the speed increasing according to Moore’s Law. The first human genome took fifteen years to decode and cost about a billion dollars. Now a human genome can be decoded in a few weeks and costs a few thousand dollars. Around the year 2000, a turning point was reached, when it became cheaper to produce genetic information than to understand it. Now we can pass a piece of human DNA through a machine and rapidly read out the genetic information, but we cannot read out the meaning of the information. We shall not fully understand the information until we understand in detail the processes of embryonic development that the DNA orchestrated to make us what we are.
  • The explosive growth of information in our human society is a part of the slower growth of ordered structures in the evolution of life as a whole. Life has for billions of years been evolving with organisms and ecosystems embodying increasing amounts of information. The evolution of life is a part of the evolution of the universe, which also evolves with increasing amounts of information embodied in ordered structures, galaxies and stars and planetary systems. In the living and in the nonliving world, we see a growth of order, starting from the featureless and uniform gas of the early universe and producing the magnificent diversity of weird objects that we see in the sky and in the rain forest. Everywhere around us, wherever we look, we see evidence of increasing order and increasing information. The technology arising from Shannon’s discoveries is only a local acceleration of the natural growth of information.
  • Lord Kelvin, one of the leading physicists of that time, promoted the heat death dogma, predicting that the flow of heat from warmer to cooler objects will result in a decrease of temperature differences everywhere, until all temperatures ultimately become equal. Life needs temperature differences, to avoid being stifled by its waste heat. So life will disappear.
  • Thanks to the discoveries of astronomers in the twentieth century, we now know that the heat death is a myth. The heat death can never happen, and there is no paradox. The best popular account of the disappearance of the paradox is a chapter, “How Order Was Born of Chaos,” in the book Creation of the Universe, by Fang Lizhi and his wife Li Shuxian.2 Fang Lizhi is doubly famous as a leading Chinese astronomer and a leading political dissident. He is now pursuing his double career at the University of Arizona.
  • The belief in a heat death was based on an idea that I call the cooking rule. The cooking rule says that a piece of steak gets warmer when we put it on a hot grill. More generally, the rule says that any object gets warmer when it gains energy, and gets cooler when it loses energy. Humans have been cooking steaks for thousands of years, and nobody ever saw a steak get colder while cooking on a fire. The cooking rule is true for objects small enough for us to handle. If the cooking rule is always true, then Lord Kelvin’s argument for the heat death is correct.
  • the cooking rule is not true for objects of astronomical size, for which gravitation is the dominant form of energy. The sun is a familiar example. As the sun loses energy by radiation, it becomes hotter and not cooler. Since the sun is made of compressible gas squeezed by its own gravitation, loss of energy causes it to become smaller and denser, and the compression causes it to become hotter. For almost all astronomical objects, gravitation dominates, and they have the same unexpected behavior. Gravitation reverses the usual relation between energy and temperature. In the domain of astronomy, when heat flows from hotter to cooler objects, the hot objects get hotter and the cool objects get cooler. As a result, temperature differences in the astronomical universe tend to increase rather than decrease as time goes on. There is no final state of uniform temperature, and there is no heat death. Gravitation gives us a universe hospitable to life. Information and order can continue to grow for billions of years in the future, as they have evidently grown in the past.
  • The vision of the future as an infinite playground, with an unending sequence of mysteries to be understood by an unending sequence of players exploring an unending supply of information, is a glorious vision for scientists. Scientists find the vision attractive, since it gives them a purpose for their existence and an unending supply of jobs. The vision is less attractive to artists and writers and ordinary people. Ordinary people are more interested in friends and family than in science. Ordinary people may not welcome a future spent swimming in an unending flood of information.
  • A darker view of the information-dominated universe was described in a famous story, “The Library of Babel,” by Jorge Luis Borges in 1941. Borges imagined his library, with an infinite array of books and shelves and mirrors, as a metaphor for the universe.
  • Gleick’s book has an epilogue entitled “The Return of Meaning,” expressing the concerns of people who feel alienated from the prevailing scientific culture. The enormous success of information theory came from Shannon’s decision to separate information from meaning. His central dogma, “Meaning is irrelevant,” declared that information could be handled with greater freedom if it was treated as a mathematical abstraction independent of meaning. The consequence of this freedom is the flood of information in which we are drowning. The immense size of modern databases gives us a feeling of meaninglessness. Information in such quantities reminds us of Borges’s library extending infinitely in all directions. It is our task as humans to bring meaning back into this wasteland. As finite creatures who think and feel, we can create islands of meaning in the sea of information. Gleick ends his book with Borges’s image of the human condition: “We walk the corridors, searching the shelves and rearranging them, looking for lines of meaning amid leagues of cacophony and incoherence, reading the history of the past and of the future, collecting our thoughts and collecting the thoughts of others, and every so often glimpsing mirrors, in which we may recognize creatures of the information.”
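The reversed relation between energy and temperature for gravitationally bound objects, described in the excerpts above (the sun getting hotter as it radiates), can be made concrete with the virial theorem: a bound self-gravitating system in equilibrium satisfies 2K + U = 0, so its total energy is E = K + U = -K, and temperature scales with the kinetic energy K. A minimal sketch, with purely illustrative numbers not taken from the text:

```python
# Negative heat capacity of a self-gravitating gas, via the virial theorem.
# For a virialized system: 2K + U = 0, so E = K + U = -K.
# Temperature is proportional to mean kinetic energy K, hence T rises as E falls.

def kinetic_energy(total_energy):
    """Kinetic energy K = -E for a virialized, bound system (requires E < 0)."""
    assert total_energy < 0, "a bound self-gravitating system has E < 0"
    return -total_energy

E = -1.0e41        # joules; illustrative total energy of a bound star
radiated = 1.0e40  # energy carried away by radiation

K_before = kinetic_energy(E)
K_after = kinetic_energy(E - radiated)  # radiating makes E more negative

assert K_after > K_before  # the star ends up hotter after losing energy
```

In this picture, half of the gravitational energy released by contraction heats the gas while half is radiated away, which is why the cooking rule fails once gravity dominates.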
Weiye Loh

Berkeley Earth Surface Temperature (© 2010) - 0 views

  •  
    A transparent approach Based on data analysis Our aim is to resolve current criticism of the former temperature analyses, and to prepare an open record that will allow rapid response to further criticism or suggestions. Our results will include not only our best estimate for the global temperature change, but estimates of the uncertainties in the record.
Weiye Loh

Leading climate scientist challenges Mail on Sunday's use of his research | Environment... - 0 views

  • Mojib Latif denies his research supports theory that current cold weather undermines scientific consensus on global warming
  • A leading scientist has hit out at misleading newspaper reports that linked his research to claims that the current cold weather undermines the scientific case for manmade global warming.
  • Mojib Latif, a climate expert at the Leibniz Institute at Kiel University in Germany, said he "cannot understand" reports that used his research to question the scientific consensus on climate change. He told the Guardian: "It comes as a surprise to me that people would try to use my statements to try to dispute the nature of global warming. I believe in manmade global warming. I have said that if my name was not Mojib Latif, my name would be global warming."
  • ...3 more annotations...
  • A report in the Mail on Sunday said that Latif's results "challenge some of the global warming orthodoxy's most deeply cherished beliefs" and "undermine the standard climate computer models". Monday's Daily Mail and Daily Telegraph repeated the claims. The reports attempted to link the Arctic weather that has enveloped the UK with research published by Latif's team in the journal Nature in 2008. The research said that natural fluctuations in ocean temperature could have a bigger impact on global temperature than expected. In particular, the study concluded that cooling in the oceans could offset global warming, with the average temperature over the decades 2000-2010 and 2005-2015 predicted to be no higher than the average for 1994-2004. Despite clarifications from the scientists at the time, who stressed that the research did not challenge the predicted long-term warming trend, the study was widely misreported as signalling a switch from global warming to global cooling.
  • The Mail on Sunday article said that Latif's research showed that the current cold weather heralds such "a global trend towards cooler weather". It said: "The BBC assured viewers that the big chill was merely short-term 'weather' that had nothing to do with 'climate', which was still warming. The work of Prof Latif and the other scientists refutes that view."
  • Not according to Latif. "They are not related at all," he said. "What we are experiencing now is a weather phenomenon, while we talked about the mean temperature over the next 10 years. You can't compare the two."
Weiye Loh

Oxford academic wins right to read UEA climate data | Environment | guardian.co.uk - 0 views

  • Jonathan Jones, physics professor at Oxford University and self-confessed "climate change agnostic", used freedom of information law to demand the data that is the life's work of the head of the University of East Anglia's Climatic Research Unit, Phil Jones. UEA resisted the requests to disclose the data, but this week it was compelled to do so.
  • Graham gave the UEA one month to deliver the data, which includes more than 4m individual thermometer readings taken from 4,000 weather stations over the past 160 years. The commissioner's office said this was his first ruling on demands for climate data made in the wake of the climategate affair.
  • an archive of world temperature records collected jointly with the Met Office.
  • ...3 more annotations...
  • Critics of the UEA's scientists say an independent analysis of the temperature data may reveal that Phil Jones and his colleagues have misinterpreted the evidence of global warming. They may have failed to allow for local temperature influences, such as the growth of cities close to many of the thermometers.
  • when Jonathan Jones and others asked for the data in the summer of 2009, the UEA said legal exemptions applied. It said variously that the temperature data were the property of foreign meteorological offices; were intellectual property that might be valuable if sold to other researchers; and were in any case often publicly available.
  • Jonathan Jones said this week that he took up the cause of data freedom after Steve McIntyre, a Canadian mathematician, had requests for the data turned down. He thought this was an unreasonable response when Phil Jones had already shared the data with academic collaborators, including Prof Peter Webster of the Georgia Institute of Technology in the US. He asked to be given the data already sent to Webster, and was also turned down.
  •  
    An Oxford academic has won the right to read previously secret data on climate change held by the University of East Anglia (UEA). The decision, by the government's information commissioner, Christopher Graham, is being hailed as a landmark ruling that will mean that thousands of British researchers are required to share their data with the public.
Weiye Loh

Climate change and extreme flooding linked by new evidence | George Monbiot | Environme... - 0 views

  • Two studies suggest for the first time a clear link between global warming and extreme precipitation
  • There's a sound rule for reporting weather events that may be related to climate change. You can't say that a particular heatwave or a particular downpour – or even a particular freeze – was definitely caused by human emissions of greenhouse gases. But you can say whether these events are consistent with predictions, or that their likelihood rises or falls in a warming world.
  • Weather is a complex system. Long-running trends, natural fluctuations and random patterns are fed into the global weather machine, and it spews out a series of events. All these events will be influenced to some degree by global temperatures, but it's impossible to say with certainty that any of them would not have happened in the absence of man-made global warming.
  • ...5 more annotations...
  • over time, as the data build up, we begin to see trends which suggest that rising temperatures are making a particular kind of weather more likely to occur. One such trend has now become clearer. Two new papers, published by Nature, should make us sit up, as they suggest for the first time a clear link between global warming and extreme precipitation (precipitation means water falling out of the sky in any form: rain, hail or snow).
  • We still can't say that any given weather event is definitely caused by man-made global warming. But we can say, with an even higher degree of confidence than before, that climate change makes extreme events more likely to happen.
  • One paper, by Seung-Ki Min and others, shows that rising concentrations of greenhouse gases in the atmosphere have caused an intensification of heavy rainfall events over some two-thirds of the weather stations on land in the northern hemisphere. The climate models appear to have underestimated the contribution of global warming on extreme rainfall: it's worse than we thought it would be.
  • The other paper, by Pardeep Pall and others, shows that man-made global warming is very likely to have increased the probability of severe flooding in England and Wales, and could well have been behind the extreme events in 2000. The researchers ran thousands of simulations of the weather in autumn 2000 (using idle time on computers made available by a network of volunteers) with and without the temperature rises caused by man-made global warming. They found that, in nine out of 10 cases, man-made greenhouse gases increased the risks of flooding. This is probably as solid a signal as simulations can produce, and it gives us a clear warning that more global heating is likely to cause more floods here.
  • As Richard Allan points out, also in Nature, the warmer the atmosphere is, the more water vapour it can carry. There's even a formula which quantifies this: 6-7% more moisture in the air for every degree of warming near the Earth's surface. But both models and observations also show changes in the distribution of rainfall, with moisture concentrating in some parts of the world and fleeing from others: climate change is likely to produce both more floods and more droughts.
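The "formula which quantifies this" in the last excerpt is the Clausius–Clapeyron relation. A quick check of the 6-7%-per-degree figure, using the Magnus approximation to saturation vapour pressure (the constants below are one standard parameterisation, assumed here rather than taken from the article):

```python
import math

def saturation_vapour_pressure(t_celsius):
    """Magnus approximation to saturation vapour pressure, in hPa."""
    return 6.112 * math.exp(17.62 * t_celsius / (243.12 + t_celsius))

# Fractional gain in moisture-holding capacity per 1 degree C of warming
for t in (10.0, 15.0, 25.0):
    gain = saturation_vapour_pressure(t + 1.0) / saturation_vapour_pressure(t) - 1.0
    print(f"{t:4.1f} C: +{gain * 100:.1f}% per degree")
```

Across typical surface temperatures this gives roughly 6-7% per degree, consistent with the figure Allan cites; where that extra moisture ends up falling is the separate distribution question the article raises.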
Weiye Loh

Hiding the Decline | Climate Etc. - 0 views

  • we need to understand the magnitude and characteristics and causes of natural climate variability over the current interglacial, particularly the last 2000 years.  I’m more interested in the handle than the blade of the hockey stick.  I also view understanding regional climate variations as much more important than trying to use some statistical model to create global average anomalies (which I personally regard as pointless, given the sampling issue).
  • I am really hoping that the AR5 will do a better job of providing a useful analysis and assessment of the paleodata for the last millennium.  However I am not too optimistic. There was another Workshop in Lisbon this past year (Sept 2010), on the Medieval Warm Period.  The abstracts for the presentations are found here.  No surprises, many of the usual people doing the usual things.
  • This raises the issue as to whether there is any value at all in the tree ring analyses for this application, and whether these paleoreconstructions can tell us anything.  Apart from the issue of the proxies not matching the observations from the current period of warming (which is also the period of best historical data), there is the further issue as to whether these hemispheric or global temperature analyses make any sense at all because of the sampling issue.  I am personally having a difficult time in seeing how this stuff has any credibility at the level of “likely” confidence levels reported in the TAR and AR4.
  • ...5 more annotations...
  • There is no question that the diagrams and accompanying text in the IPCC TAR, AR4 and WMO 1999 are misleading.  I was misled.  Upon considering the material presented in these reports, it did not occur to me that recent paleo data was not consistent with the historical record.  The one statement in AR4 (put in after McIntyre’s insistence as a reviewer) that mentions the divergence problem is weak tea.
  • It is obvious that there has been deletion of adverse data in figures shown in the IPCC TAR and AR4, and the 1999 WMO document.  Not only is this misleading, but it is dishonest (I agree with Muller on this one).  The authors defend themselves by stating that there has been no attempt to hide the divergence problem in the literature, and that the relevant paper was referenced.  I infer then that there is something in the IPCC process or the authors’ interpretation of the IPCC process (i.e. don’t dilute the message) that corrupted the scientists into deleting the adverse data in these diagrams.
  • McIntyre’s analysis is sufficiently well documented that it is difficult to imagine that his analysis is incorrect in any significant way.  If his analysis is incorrect, it should be refuted.  I would like to know what the heck Mann, Briffa, Jones et al. were thinking when they did this and why they did this, and how they can defend this, although the emails provide pretty strong clues.  Does the IPCC regard this as acceptable?  I sure don’t.
  • paleoproxies are outside the arena of my personal research expertise, and I find my eyes glaze over when I start reading about bristlecones, etc.  However, two things this week have changed my mind, and I have decided to take on one aspect of this issue: the infamous “hide the decline.” The first thing that contributed to my mind change was this post at Bishop Hill entitled “Will Sir John condemn hide the decline?”, related to Sir John Beddington’s statement: “It is time the scientific community became proactive in challenging misuse of scientific evidence.”
  • The second thing was this youtube clip of physicist Richard Muller (Director of the Berkeley Earth Project), where he discusses “hide the decline” and vehemently refers to this as “dishonest,” and says “you are not allowed to do this,” and further states that he intends not to read further papers by these authors (note “hide the decline” appears around minute 31 into the clip).  While most of his research is in physics, Muller has also published important papers on paleoclimate, including a controversial paper that supported McIntyre and McKitrick’s analysis.
Weiye Loh

Climategate: Hiding the Decline? - 0 views

  • Regarding the “hide the decline” email, Jones has explained that when he used the word “trick”, he simply meant “a mathematical approach brought to bear to solve a problem”. The inquiry made the following criticism of the resulting graph (its emphasis): [T]he figure supplied for the WMO Report was misleading. We do not find that it is misleading to curtail reconstructions at some point per se, or to splice data, but we believe that both of these procedures should have been made plain — ideally in the figure but certainly clearly described in either the caption or the text. [1.3.2] But this was one isolated instance that occurred more than a decade ago. The Review did not find anything wrong with the overall picture painted about divergence (or uncertainties generally) in the literature and in IPCC reports. The Review notes that the WMO report in question “does not have the status or importance of the IPCC reports”, and concludes that divergence “is not hidden” and “the subject is openly and extensively discussed in the literature, including CRU papers.” [1.3.2]
  • As for the treatment of uncertainty in the AR4’s paleoclimate chapter, the Review concludes that the central Figure 6.10 is not misleading, that “[t]he variation within and between lines, as well as the depiction of uncertainty is quite apparent to any reader”, that “there has been no exclusion of other published temperature reconstructions which would show a very different picture”, and that “[t]he general discussion of sources of uncertainty in the text is extensive, including reference to divergence”. [7.3.1]
  • Regarding CRU’s selections of tree ring series, the Review does not presume to say whether one series is better than another, though it does point out that CRU have responded to the accusation that Briffa misused the Yamal data on their website. The Review found no evidence that CRU scientists knowingly promoted non-representative series or that their input cast doubt on the IPCC’s conclusions. The much-maligned Yamal series was included in only 4 of the 12 temperature reconstructions in the AR4 (and not at all in the TAR).
  • ...1 more annotation...
  • What about the allegation that CRU withheld the Yamal data? The Review found that “CRU did not withhold the underlying raw data (having correctly directed the single request to the owners)”, although “we believe that CRU should have ensured that the data they did not own, but on which their publications relied, was archived in a more timely way.” [1.3.2]
Weiye Loh

Monckton takes scientist to brink of madness at climate change talk | John Abraham | En... - 0 views

  • Christopher Monckton, Viscount Monckton of Brenchley, had given a rousing speech to a crowd at Bethel University in Minnesota, near where I live. His speech was on global warming and his style was convincing and irreverent. Anyone listening to him was given the impression that global warming was not happening, or that if it did happen it wouldn't be so bad, and scientists who warned about it were part of a vast conspiracy.
  • Monckton cited scientist after scientist whose work "disproved" global warming. He contended that polar bears are not really at risk (in fact they do better as weather warms); projections of sea level rise are a mere 6cm; Arctic ice has not declined in a decade; Greenland is not melting; sea levels are not rising; ocean temperatures are not increasing; medieval times were warmer than today; ocean acidification is not occurring; and global temperatures are not increasing.
  • I actually tracked down the articles and authors that Monckton cited. What I discovered was incredible, even to a scientist who follows the politics of climate change. I found that he had misrepresented the science.
  • ...4 more annotations...
  • For instance, Monckton's claims that "Arctic sea ice is fine, steady for a decade" made reference to an Alaskan research group (IARC). I wrote to members of IARC and asked whether this was true. Both their chief scientist and director confirmed that Monckton was mistaken. They also pointed me to the National Snow and Ice Data Centre (NSIDC) for a second opinion. A scientist there confirmed Monckton's error, as did Dr Ola Johannessen, whose work has shown ice loss in Greenland (Monckton reported that Johannessen's work showed that Greenland "was just fine".)
  • Next, I investigated Monckton's claim that the medieval period was warmer than today. Monckton showed a slide featuring nine researchers' works which, he claimed, proved that today's warming is not unusual – it was hotter in the past. I wrote to these authors and I read their papers. It turned out that none of the authors or papers made the claims that Monckton attributed to them. This pattern of misinterpretation was becoming chronic.
  • Next, I checked on Monckton's claim that the ocean has not been heating for 50 years. To quote him directly, there has been "no ocean heat buildup for 50 years". On this slide, he referenced a well-known researcher named Dr Catia Domingues. It turns out Domingues said no such thing. What would she know? She only works for the Commonwealth Scientific and Industrial Research Organisation (CSIRO) in Australia.
  • Monckton referred to a 2004 statement by the International Astronomical Union (IAU) which stated that solar activity has caused today's warming and that global warming will end soon.The president of the IAU division on the sun and heliosphere told me that there is no such position of the IAU and that I should pass this information on to whomever "might have used the IAU name to claim otherwise".
Weiye Loh

Himalayan glaciers not melting because of climate change, report finds - Telegraph - 0 views

  • Himalayan glaciers are actually advancing rather than retreating, claims the first major study since a controversial UN report said they would be melted within a quarter of a century.
  • Researchers have discovered that contrary to popular belief half of the ice flows in the Karakoram range of the mountains are actually growing rather than shrinking.
  • The discovery adds a new twist to the row over whether global warming is causing the world's highest mountain range to lose its ice cover.
  • ...13 more annotations...
  • It further challenges claims made in a 2007 report by the UN's Intergovernmental Panel on Climate Change that the glaciers would be gone by 2035.
  • Although the head of the panel Dr Rajendra Pachauri later admitted the claim was an error gleaned from unchecked research, he maintained that global warming was melting the glaciers at "a rapid rate", threatening floods throughout north India.
  • The new study by scientists at the Universities of California and Potsdam has found that half of the glaciers in the Karakoram range, in the northwestern Himalaya, are in fact advancing and that global warming is not the deciding factor in whether a glacier survives or melts.
  • Dr Bodo Bookhagen, Dirk Scherler and Manfred Strecker studied 286 glaciers between the Hindu Kush on the Afghan-Pakistan border and Bhutan, taking in six areas. Their report, published in the journal Nature Geoscience, found the key factor affecting their advance or retreat is the amount of debris – rocks and mud – strewn on their surface, not the general nature of climate change.
  • Glaciers surrounded by high mountains and covered with more than two centimetres of debris are protected from melting. Debris-covered glaciers are common in the rugged central Himalaya, but they are almost absent in subdued landscapes on the Tibetan Plateau, where retreat rates are higher.
  • In contrast, more than 50 per cent of observed glaciers in the Karakoram region in the northwestern Himalaya are advancing or stable.
  • "Our study shows that there is no uniform response of Himalayan glaciers to climate change and highlights the importance of debris cover for understanding glacier retreat, an effect that has so far been neglected in predictions of future water availability or global sea level," the authors concluded.
  • Dr Bookhagen said their report had shown "there is no stereotypical Himalayan glacier" in contrast to the UN's climate change report which, he said, "lumps all Himalayan glaciers together."
  • Dr Pachauri, head of the Nobel prize-winning UN Intergovernmental Panel on Climate Change, has remained silent on the matter since he was forced to admit his report's claim that the Himalayan glaciers would melt by 2035 was an error and had not been sourced from a peer-reviewed scientific journal. It came from a World Wildlife Fund report.
  • this latest tawdry addition to the pathetic lies of the Reality Deniers. If you go to a proper source which quotes the full study, such as: http://www.sciencedaily.com/re... you discover that the findings of this study are rather different to those portrayed here.
  • only way to consistently maintain a lie is to refuse point-blank to publish ALL the findings of a study, but to cherry-pick the bits which are consistent with the ongoing lie, while ignoring the rest.
  • Bookhagen noted that glaciers in the Karakoram region of Northwestern Himalaya are mostly stagnating. However, glaciers in the Western, Central, and Eastern Himalaya are retreating, with the highest retreat rates -- approximately 8 meters per year -- in the Western Himalayan Mountains. The authors found that half of the studied glaciers in the Karakoram region are stable or advancing, whereas about two-thirds are in retreat elsewhere throughout High Asia
  • glaciers in the steep Himalaya are not only affected by temperature and precipitation, but also by debris coverage, and have no uniform and less predictable response, explained the authors. The debris coverage may be one of the missing links to creating a more coherent picture of glacial behavior throughout all mountains. The scientists contrast this Himalayan glacial study with glaciers from the gently dipping, low-relief Tibetan Plateau that have no debris coverage. Those glaciers behave in a different way, and their frontal changes can be explained by temperature and precipitation changes.
Weiye Loh

Debris on certain Himalayan glaciers may prevent melting - 0 views

  • ScienceDaily (Jan. 25, 2011) — A new scientific study shows that debris coverage -- pebbles, rocks, and debris from surrounding mountains -- may be a missing link in the understanding of the decline of glaciers. Debris is distinct from soot and dust, according to the scientists.
  • Experts have stated that global warming is a key element in the melting of glaciers worldwide.
  • "With the aid of new remote-sensing methods and satellite images, we identified debris coverage to be an important contributor to glacial advance and retreat behaviors," said Bookhagen. "This parameter has been almost completely neglected in previous Himalayan and other mountainous region studies, although its impact has been known for some time."
  • ...4 more annotations...
  • "There is no 'stereotypical' Himalayan glacier," said Bookhagen. "This is in clear contrast to the IPCC reports that lumps all Himalayan glaciers together."
  • Bookhagen noted that glaciers in the Karakoram region of Northwestern Himalaya are mostly stagnating. However, glaciers in the Western, Central, and Eastern Himalaya are retreating, with the highest retreat rates -- approximately 8 meters per year -- in the Western Himalayan Mountains. The authors found that half of the studied glaciers in the Karakoram region are stable or advancing, whereas about two-thirds are in retreat elsewhere throughout High Asia. This is in contrast to the prevailing notion that all glaciers in the tropics are retreating.
  • debris cover has the opposite effect of soot and dust on glaciers. Debris coverage thickness above 2 centimeters, or about a half an inch, 'shields' the glacier and prevents melting. This is the case for many Himalayan glaciers that are surrounded by towering mountains that almost continuously shed pebbles, debris, and rocks onto the glacier.
Weiye Loh

Roger Pielke Jr.'s Blog: Quote Clarification - 0 views

  • Writing in the WSJ Europe this week Anne Jolis had a piece on extreme weather events that quotes me, and unfortunately the terse quote is missing some context that is apparently leading to some confusion.
  • I spoke with Jolis at length and she asked very good questions and expressed a desire to get the science right. She even called me back to confirm how I was to be quoted. Unfortunately the longer quote was abbreviated, which Jolis warned was always possible.  I do not view this as a particularly big deal, but since I am being asked about it via email by a few folks, here is what the quote said and how it should be: "There's no data-driven answer yet to the question of how human activity has affected extreme weather," adds Roger Pielke Jr., another University of Colorado climate researcher. Instead it would be more precise to read: "There's no data-driven answer yet to the question of how human activity has affected extreme weather disasters," adds Roger Pielke Jr., another University of Colorado climate researcher.
  • given the context of the article the implication should be abundantly clear that in the quote I am not referring to daily temperature records, Arctic ice melt or global average surface temperatures or precipitation. The quote refers directly to recent extreme events with large societal impacts around the world that are explicitly mentioned in the piece such as Cyclone Yasi, the Australian floods, Europe's cold winter and the Russian drought.  Of course, in the climate debate, anything that can be misinterpreted usually will be.