
New Media Ethics 2009 course: Group items tagged Flood


Weiye Loh

Roger Pielke Jr.'s Blog: Blind Spots in Australian Flood Policies - 0 views

  • better management of flood risks in Australia will depend upon better data on flood risk. However, collecting such data has proven problematic.
  • As many Queenslanders affected by January’s floods are realising, riverine flood damage is commonly excluded from household insurance policies. And this is unlikely to change until councils – especially in Queensland – stop dragging their feet and actively assist in developing comprehensive data insurance companies can use.
  • Why? Because there is often little available information that would allow an insurer to adequately price this flood risk. Without this, there is little economic incentive for insurers to accept this risk. It would be irresponsible for insurers to cover riverine flood without quantifying and pricing the risk accordingly.
  • The first step in establishing risk-adjusted premiums is to know the likelihood of the depth of flooding at each address. This information has to be address-specific because the severity of flooding can vary widely over small distances, for example, from one side of a road to the other.
  • A litany of reasons is given for withholding data. At times it seems that refusal stems from a view that insurance is innately evil. This is ironic in view of the gratuitous advice sometimes offered by politicians and commentators in the aftermath of extreme events, exhorting insurers to pay claims even when no legal liability exists and riverine flood is explicitly excluded from policies.
  • Risk Frontiers is involved in jointly developing the National Flood Information Database (NFID) for the Insurance Council of Australia with Willis Re, a reinsurance broking intermediary. NFID is a five year project aiming to integrate flood information from all city councils in a consistent insurance-relevant form. The aim of NFID is to help insurers understand and quantify their risk. Unfortunately, obtaining the base data for NFID from some local councils is difficult and sometimes impossible despite the support of all state governments for the development of NFID. Councils have an obligation to assess their flood risk and to establish rules for safe land development. However, many are antipathetic to the idea of insurance. Some states and councils have been very supportive – in New South Wales and Victoria, particularly. Some states have a central repository – a library of all flood studies and digital terrain models (digital elevation data). Council reluctance to release data is most prevalent in Queensland, where, unfortunately, no central repository exists.
  • Second, models of flood risk are sometimes misused:
  • many councils only undertake flood modelling in order to create a single design flood level, usually the so-called one-in-100-year flood. (For reasons given later, a better term is the flood with a 1% annual likelihood of being exceeded.)
  • Inundation maps showing the extent of the flood with a 1% annual likelihood of exceedance are increasingly common on council websites, even in Queensland. Unfortunately these maps say little about the depth of water at an address or, importantly, how depth varies for less probable floods. Insurance claims usually begin when the ground is flooded and increase rapidly as water rises above the floor level. At Windsor in NSW, for example, the difference in water depth between the flood with a 1% annual chance of exceedance and the maximum possible flood is nine metres. In other catchments this difference may be as small as ten centimetres. The risk of damage differs greatly between the two cases, and insurers need this information if they are to provide coverage in these areas.
  • The ‘one-in-100 year flood’ term is misleading. To many it is something that happens regularly once every 100 years — with the reliability of a bus timetable. It is still possible, though unlikely, that a flood of similar or even greater magnitude could happen twice in one year, or three times in successive years.
  • The calculations underpinning this are not straightforward, but the probability that an address exposed to a 1-in-100-year flood will experience such an event or greater over the lifetime of the house – say 50 years – is around 40%. Over the lifetime of a typical home mortgage – 25 years – the probability of occurrence is 22%. These are not good odds. (See the sketch at the end of this entry.)
  • John McAneney of Risk Frontiers at Macquarie University in Sydney identifies some opportunities for better flood policies in Australia.
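
The 40% and 22% figures follow from standard annual-exceedance arithmetic. A minimal sketch, assuming independence between years (my illustration, not code from the post):

```python
# Probability that a flood with annual exceedance probability p occurs
# at least once in n years, assuming years are independent:
#   P = 1 - (1 - p)**n
def prob_at_least_once(p: float, n: int) -> float:
    return 1.0 - (1.0 - p) ** n

print(prob_at_least_once(0.01, 50))  # ~0.395, i.e. ~40% over a 50-year house lifetime
print(prob_at_least_once(0.01, 25))  # ~0.222, i.e. ~22% over a 25-year mortgage
```

The same formula is why 'one-in-100-year' is a statement about annual probability, not a schedule.
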
Weiye Loh

Roger Pielke Jr.'s Blog: Flood Disasters and Human-Caused Climate Change - 0 views

  • [UPDATE: Gavin Schmidt at RealClimate has a post on this subject that -- surprise, surprise -- is perfectly consonant with what I write below.] [UPDATE 2: Andy Revkin has a great post on scientists' representations of the precipitation paper discussed below, and on the related media coverage.]
  • Nature published two papers yesterday that discuss increasing precipitation trends and a 2000 flood in the UK.  I have been asked by many people whether these papers mean that we can now attribute some fraction of the global trend in disaster losses to greenhouse gas emissions, or even recent disasters such as in Pakistan and Australia.
  • I hate to pour cold water on a really good media frenzy, but the answer is "no."  Neither paper actually discusses global trends in disasters (one doesn't even discuss floods) or even individual events beyond a single flood event in the UK in 2000.  But still, can't we just connect the dots?  Isn't it just obvious?  And only deniers deny the obvious, right?
  • What seems obvious is sometimes just wrong.  This of course is why we actually do research.  So why is it that we shouldn't make what seems to be an obvious connection between these papers and recent disasters, as so many have already done?
  • First, the Min et al. paper seeks to identify a GHG signal in global precipitation over the period 1950-1999.  They focus on one-day and five-day measures of precipitation.  They do not discuss streamflow or damage.  For many years, an upwards trend in precipitation has been documented, and attributed to GHGs, even back to the 1990s (I co-authored a paper on precipitation and floods in 1999 that assumed a human influence on precipitation, PDF), so I am unsure what is actually new in this paper's conclusions.
  • However, even accepting that precipitation has increased and can be attributed in some part to GHG emissions, corresponding increases in streamflow (floods) or damage have not been shown. How can this be?  Think of it like this -- precipitation is to flood damage as wind is to windstorm damage.  It is not enough to say that it has become windier to make a connection to increased windstorm damage -- you need to show an increase in the specific wind events that actually cause damage. There are a lot of days that could be windier with no increase in damage; the same goes for precipitation.
  • My understanding of the literature on streamflow is that increases in peak streamflow commensurate with the increases in precipitation have not been shown, and this is a robust finding across the literature.  For instance, one recent review concludes: Floods are of great concern in many areas of the world, with the last decade seeing major fluvial events in, for example, Asia, Europe and North America. This has focused attention on whether or not these are a result of a changing climate. River flows calculated from outputs from global models often suggest that high river flows will increase in a warmer, future climate. However, the future projections are not necessarily in tune with the records collected so far – the observational evidence is more ambiguous. A recent study of trends in long time series of annual maximum river flows at 195 gauging stations worldwide suggests that the majority of these flow records (70%) do not exhibit any statistically significant trends. Trends in the remaining records are almost evenly split between having a positive and a negative direction.
  • Absent an increase in peak streamflows, it is impossible to connect the dots between increasing precipitation and increasing floods.  There are of course good reasons why a linkage between increasing precipitation and peak streamflow would be difficult to make, such as the seasonality of the increase in rain or snow, the large variability of flooding and the human influence on river systems.  Those difficulties of course translate directly to a difficulty in connecting the effects of increasing GHGs to flood disasters.
  • Second, the Pall et al. paper seeks to quantify the increased risk of a specific flood event in the UK in 2000 due to greenhouse gas emissions.  It applies a methodology that was previously used with respect to the 2003 European heatwave. Taking the paper at face value, it clearly states that in England and Wales there has not been an increasing trend in precipitation or floods.  Thus, floods in this region are not a contributor to the global increase in disaster costs.  Further, there has been no increase in Europe in normalized flood losses (PDF).  The Pall et al. paper is therefore focused on attribution in the context of a single event, not on trend detection in the region it examines, much less any broader context.
  • More generally, the paper utilizes a seasonal forecast model to assess risk probabilities.  Given the performance of seasonal forecast models in actual prediction mode, I would expect many scientists to remain skeptical of this approach to attribution. Of course, if this group can show an improvement in the skill of actual seasonal forecasts by using greenhouse gas emissions as a predictor, they will have a very convincing case.  That is a high hurdle.
  • In short, the new studies are interesting and add to our knowledge.  But they do not change the state of knowledge related to trends in global disasters and how they might be related to greenhouse gases.  But even so, I expect that many will still want to connect the dots between greenhouse gas emissions and recent floods.  Connecting the dots is fun, but it is not science.
  • Jessica Weinkle said...
  • The thing about the Nature articles is that Nature itself made the leap from the science findings to damages in the News piece by Q. Schiermeier through the decision to bring up the topic of insurance. (Not to mention what is symbolically represented by the journal’s cover this week.) In what I (maybe naively) believe to be a particularly ballsy move, the article quoted Muir-Wood, an industry scientist. However, what he is quoted as saying is admirably clever. Initially it is stated that Dr. Muir-Wood backs the notion that one cannot put the blame for increased losses on climate change. Then, the article ends with a quote from him: “If there’s evidence that risk is changing, then this is something we need to incorporate in our models.”
  • This is a very slippery slope and a brilliant double-dog dare. Without doing anything but sitting back and watching the headlines, one can form the argument that “science” supports remodeling the hazard risk above the climatological average, and that this matters more than the risks stemming from socioeconomic factors. The reinsurance industry itself has published that socioeconomic factors far outweigh changes in the hazard where losses are concerned. The point (and that which has particularly gotten my knickers in a knot) is that Nature et al. may wish to consider what it is that they want to accomplish. Is it greater involvement of federal governments in the insurance/reinsurance industry on the premise that climate change is too great a loss risk for private industry alone, regardless of the financial burden it imposes? The move of insurance mechanisms into all corners of the earth under the auspices of climate change adaptation? Or simply a move to bolster prominence, regardless of whose back it breaks - including their own, if any of them are proud owners of a home mortgage? How much faith does one have in one's own model when one is told that hundreds of millions of dollars in the global economy are being bet against the odds that the model produces?
  • What Nature says matters to the world; what scientists say matters to the world - whether they care for the responsibility or not. That is, after all, the game of fame and fortune (aka prestige).
Weiye Loh

Roger Pielke Jr.'s Blog: A Decrease in Floods Around the World? - 0 views

  • Bouziotas et al. presented a paper at the EGU a few weeks ago (PDF) and concluded: Analysis of trends and of aggregated time series on the climatic (30-year) scale does not indicate consistent trends worldwide. Despite common perception, in general the detected trends are more negative (less intense floods in most recent years) than positive. Similarly, Svensson et al. (2005) and Di Baldassarre et al. (2010) found no systematic change in either the number of floods (increasing or decreasing) or in flood magnitudes in their analyses.
  • This finding is largely consistent with Kundzewicz et al. (2005), who find: Out of more than a thousand long time series made available by the Global Runoff Data Centre (GRDC) in Koblenz, Germany, a worldwide data set consisting of 195 long series of daily mean flow records was selected, based on such criteria as length of series, currency, lack of gaps and missing values, adequate geographical distribution, and priority to smaller catchments. The analysis of annual maximum flows does not support the hypothesis of ubiquitous growth of high flows. Although 27 cases of strong, statistically significant increase were identified by the Mann-Kendall test, there are 31 decreases as well, and most (137) time series do not show any significant changes (at the 10% level). Caution is advised in interpreting these results as flooding is a complex phenomenon, caused by a number of factors that can be associated with local, regional, and hemispheric climatic processes. Moreover, river flow has strong natural variability and exhibits long-term persistence which can confound the results of trend and significance tests. (A minimal sketch of the Mann-Kendall test appears after the references below.)
  • Destructive floods observed in the last decade all over the world have led to record high material damage. The conventional belief is that the increasing cost of floods is associated with increasing human development on flood plains (Pielke & Downton, 2000). However, the question remains as to whether or not the frequency and/or magnitude of flooding is also increasing and, if so, whether it is in response to climate variability and change. Several scenarios of future climate indicate a likelihood of increased intense precipitation and flood hazard. However, observations to date provide no conclusive and general proof as to how climate change affects flood behaviour.
  • References:
    Bouziotas, D., G. Deskos, N. Mastrantonas, D. Tsaknias, G. Vangelidis, S.M. Papalexiou, and D. Koutsoyiannis, 2011: Long-term properties of annual maximum daily river discharge worldwide. European Geosciences Union General Assembly 2011, Geophysical Research Abstracts, Vol. 13, Vienna, EGU2011-1439.
    Kundzewicz, Z.W., D. Graczyk, T. Maurer, I. Przymusińska, M. Radziejewski, C. Svensson and M. Szwed, 2005(a): Trend detection in river flow time-series: 1. Annual maximum flow. Hydrol. Sci. J., 50(5): 797-810.
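
For readers unfamiliar with the Mann-Kendall test cited by Kundzewicz et al., here is a minimal sketch of the idea (my illustration, ignoring ties and autocorrelation, which the published studies handle more carefully):

```python
import math

def mann_kendall_z(x):
    # Mann-Kendall trend test. S counts concordant minus discordant pairs,
    # i.e. the sum of sign(x[j] - x[i]) over all pairs j > i. Under the null
    # hypothesis of no trend, S is approximately normal with variance
    # n(n-1)(2n+5)/18 (ties and autocorrelation ignored here for brevity).
    n = len(x)
    s = sum((x[j] > x[i]) - (x[j] < x[i])
            for i in range(n - 1) for j in range(i + 1, n))
    if s == 0:
        return 0.0
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    correction = -1 if s > 0 else 1  # continuity correction
    return (s + correction) / math.sqrt(var_s)

# A station shows a significant trend at the 10% level (two-sided) if |z| > 1.645.
annual_max_flow = [210, 180, 250, 230, 260, 240, 300, 270, 310, 290]  # made-up series
print(mann_kendall_z(annual_max_flow))
```

Applied station by station, a test like this is how a study can report counts of significant increases, significant decreases, and series with no significant change.
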
Weiye Loh

How We Know by Freeman Dyson | The New York Review of Books - 0 views

  • Another example illustrating the central dogma is the French optical telegraph.
  • The telegraph was an optical communication system with stations consisting of large movable pointers mounted on the tops of sixty-foot towers. Each station was manned by an operator who could read a message transmitted by a neighboring station and transmit the same message to the next station in the transmission line.
  • The distance between neighbors was about seven miles. Along the transmission lines, optical messages in France could travel faster than drum messages in Africa. When Napoleon took charge of the French Republic in 1799, he ordered the completion of the optical telegraph system to link all the major cities of France from Calais and Paris to Toulon and onward to Milan. The telegraph became, as Claude Chappe had intended, an important instrument of national power. Napoleon made sure that it was not available to private users.
  • Unlike the drum language, which was based on spoken language, the optical telegraph was based on written French. Chappe invented an elaborate coding system to translate written messages into optical signals. Chappe had the opposite problem from the drummers. The drummers had a fast transmission system with ambiguous messages. They needed to slow down the transmission to make the messages unambiguous. Chappe had a painfully slow transmission system with redundant messages. The French language, like most alphabetic languages, is highly redundant, using many more letters than are needed to convey the meaning of a message. Chappe’s coding system allowed messages to be transmitted faster. Many common phrases and proper names were encoded by only two optical symbols, with a substantial gain in speed of transmission. The composer and the reader of the message had code books listing the message codes for eight thousand phrases and names. For Napoleon it was an advantage to have a code that was effectively cryptographic, keeping the content of the messages secret from citizens along the route.
  • After these two historical examples of rapid communication in Africa and France, the rest of Gleick’s book is about the modern development of information technology.
  • The modern history is dominated by two Americans, Samuel Morse and Claude Shannon. Samuel Morse was the inventor of Morse Code. He was also one of the pioneers who built a telegraph system using electricity conducted through wires instead of optical pointers deployed on towers. Morse launched his electric telegraph in 1838 and perfected the code in 1844. His code used short and long pulses of electric current to represent letters of the alphabet.
  • Morse was ideologically at the opposite pole from Chappe. He was not interested in secrecy or in creating an instrument of government power. The Morse system was designed to be a profit-making enterprise, fast and cheap and available to everybody. At the beginning the price of a message was a quarter of a cent per letter. The most important users of the system were newspaper correspondents spreading news of local events to readers all over the world. Morse Code was simple enough that anyone could learn it. The system provided no secrecy to the users. If users wanted secrecy, they could invent their own secret codes and encipher their messages themselves. The price of a message in cipher was higher than the price of a message in plain text, because the telegraph operators could transcribe plain text faster. It was much easier to correct errors in plain text than in cipher.
  • Claude Shannon was the founding father of information theory. For a hundred years after the electric telegraph, other communication systems such as the telephone, radio, and television were invented and developed by engineers without any need for higher mathematics. Then Shannon supplied the theory to understand all of these systems together, defining information as an abstract quantity inherent in a telephone message or a television picture. Shannon brought higher mathematics into the game.
  • When Shannon was a boy growing up on a farm in Michigan, he built a homemade telegraph system using Morse Code. Messages were transmitted to friends on neighboring farms, using the barbed wire of their fences to conduct electric signals. When World War II began, Shannon became one of the pioneers of scientific cryptography, working on the high-level cryptographic telephone system that allowed Roosevelt and Churchill to talk to each other over a secure channel. Shannon’s friend Alan Turing was also working as a cryptographer at the same time, in the famous British Enigma project that successfully deciphered German military codes. The two pioneers met frequently when Turing visited New York in 1943, but they belonged to separate secret worlds and could not exchange ideas about cryptography.
  • In 1945 Shannon wrote a paper, “A Mathematical Theory of Cryptography,” which was stamped SECRET and never saw the light of day. He published in 1948 an expurgated version of the 1945 paper with the title “A Mathematical Theory of Communication.” The 1948 version appeared in the Bell System Technical Journal, the house journal of the Bell Telephone Laboratories, and became an instant classic. It is the founding document for the modern science of information. After Shannon, the technology of information raced ahead, with electronic computers, digital cameras, the Internet, and the World Wide Web.
  • According to Gleick, the impact of information on human affairs came in three installments: first the history, the thousands of years during which people created and exchanged information without the concept of measuring it; second the theory, first formulated by Shannon; third the flood, in which we now live
  • The event that made the flood plainly visible occurred in 1965, when Gordon Moore stated Moore’s Law. Moore was an electrical engineer, founder of the Intel Corporation, a company that manufactured components for computers and other electronic gadgets. His law said that the price of electronic components would decrease and their numbers would increase by a factor of two every eighteen months. This implied that the price would decrease and the numbers would increase by a factor of a hundred every decade. Moore’s prediction of continued growth has turned out to be astonishingly accurate during the forty-five years since he announced it. In these four and a half decades, the price has decreased and the numbers have increased by a factor of a billion, nine powers of ten. Nine powers of ten are enough to turn a trickle into a flood. (The arithmetic behind these factors is sketched at the end of these annotations.)
  • Gordon Moore was in the hardware business, making hardware components for electronic machines, and he stated his law as a law of growth for hardware. But the law applies also to the information that the hardware is designed to embody. The purpose of the hardware is to store and process information. The storage of information is called memory, and the processing of information is called computing. The consequence of Moore’s Law for information is that the price of memory and computing decreases and the available amount of memory and computing increases by a factor of a hundred every decade. The flood of hardware becomes a flood of information.
  • In 1949, one year after Shannon published the rules of information theory, he drew up a table of the various stores of memory that then existed. The biggest memory in his table was the US Library of Congress, which he estimated to contain one hundred trillion bits of information. That was at the time a fair guess at the sum total of recorded human knowledge. Today a memory disc drive storing that amount of information weighs a few pounds and can be bought for about a thousand dollars. Information, otherwise known as data, pours into memories of that size or larger, in government and business offices and scientific laboratories all over the world. Gleick quotes the computer scientist Jaron Lanier describing the effect of the flood: “It’s as if you kneel to plant the seed of a tree and it grows so fast that it swallows your whole town before you can even rise to your feet.”
  • On December 8, 2010, Gleick published on The New York Review’s blog an illuminating essay, “The Information Palace.” It was written too late to be included in his book. It describes the historical changes in meaning of the word “information,” as recorded in the latest quarterly online revision of the Oxford English Dictionary. The word first appears in 1386 in a parliamentary report with the meaning “denunciation.” The history ends with the modern usage, “information fatigue,” defined as “apathy, indifference or mental exhaustion arising from exposure to too much information.”
  • The consequences of the information flood are not all bad. One of the creative enterprises made possible by the flood is Wikipedia, started ten years ago by Jimmy Wales. Among my friends and acquaintances, everybody distrusts Wikipedia and everybody uses it. Distrust and productive use are not incompatible. Wikipedia is the ultimate open source repository of information. Everyone is free to read it and everyone is free to write it. It contains articles in 262 languages written by several million authors. The information that it contains is totally unreliable and surprisingly accurate. It is often unreliable because many of the authors are ignorant or careless. It is often accurate because the articles are edited and corrected by readers who are better informed than the authors.
  • Jimmy Wales hoped when he started Wikipedia that the combination of enthusiastic volunteer writers with open source information technology would cause a revolution in human access to knowledge. The rate of growth of Wikipedia exceeded his wildest dreams. Within ten years it has become the biggest storehouse of information on the planet and the noisiest battleground of conflicting opinions. It illustrates Shannon’s law of reliable communication. Shannon’s law says that accurate transmission of information is possible in a communication system with a high level of noise. Even in the noisiest system, errors can be reliably corrected and accurate information transmitted, provided that the transmission is sufficiently redundant. That is, in a nutshell, how Wikipedia works.
  • The information flood has also brought enormous benefits to science. The public has a distorted view of science, because children are taught in school that science is a collection of firmly established truths. In fact, science is not a collection of truths. It is a continuing exploration of mysteries. Wherever we go exploring in the world around us, we find mysteries. Our planet is covered by continents and oceans whose origin we cannot explain. Our atmosphere is constantly stirred by poorly understood disturbances that we call weather and climate. The visible matter in the universe is outweighed by a much larger quantity of dark invisible matter that we do not understand at all. The origin of life is a total mystery, and so is the existence of human consciousness. We have no clear idea how the electrical discharges occurring in nerve cells in our brains are connected with our feelings and desires and actions.
  • Even physics, the most exact and most firmly established branch of science, is still full of mysteries. We do not know how much of Shannon’s theory of information will remain valid when quantum devices replace classical electric circuits as the carriers of information. Quantum devices may be made of single atoms or microscopic magnetic circuits. All that we know for sure is that they can theoretically do certain jobs that are beyond the reach of classical devices. Quantum computing is still an unexplored mystery on the frontier of information theory. Science is the sum total of a great multitude of mysteries. It is an unending argument between a great multitude of voices. It resembles Wikipedia much more than it resembles the Encyclopaedia Britannica.
  • The rapid growth of the flood of information in the last ten years made Wikipedia possible, and the same flood made twenty-first-century science possible. Twenty-first-century science is dominated by huge stores of information that we call databases. The information flood has made it easy and cheap to build databases. One example of a twenty-first-century database is the collection of genome sequences of living creatures belonging to various species from microbes to humans. Each genome contains the complete genetic information that shaped the creature to which it belongs. The genome database is rapidly growing and is available for scientists all over the world to explore. Its origin can be traced to the year 1939, when Shannon wrote his Ph.D. thesis with the title “An Algebra for Theoretical Genetics.”
  • Shannon was then a graduate student in the mathematics department at MIT. He was only dimly aware of the possible physical embodiment of genetic information. The true physical embodiment of the genome is the double helix structure of DNA molecules, discovered by Francis Crick and James Watson fourteen years later. In 1939 Shannon understood that the basis of genetics must be information, and that the information must be coded in some abstract algebra independent of its physical embodiment. Without any knowledge of the double helix, he could not hope to guess the detailed structure of the genetic code. He could only imagine that in some distant future the genetic information would be decoded and collected in a giant database that would define the total diversity of living creatures. It took only sixty years for his dream to come true.
  • In the twentieth century, genomes of humans and other species were laboriously decoded and translated into sequences of letters in computer memories. The decoding and translation became cheaper and faster as time went on, the price decreasing and the speed increasing according to Moore’s Law. The first human genome took fifteen years to decode and cost about a billion dollars. Now a human genome can be decoded in a few weeks and costs a few thousand dollars. Around the year 2000, a turning point was reached, when it became cheaper to produce genetic information than to understand it. Now we can pass a piece of human DNA through a machine and rapidly read out the genetic information, but we cannot read out the meaning of the information. We shall not fully understand the information until we understand in detail the processes of embryonic development that the DNA orchestrated to make us what we are.
  • The explosive growth of information in our human society is a part of the slower growth of ordered structures in the evolution of life as a whole. Life has for billions of years been evolving with organisms and ecosystems embodying increasing amounts of information. The evolution of life is a part of the evolution of the universe, which also evolves with increasing amounts of information embodied in ordered structures, galaxies and stars and planetary systems. In the living and in the nonliving world, we see a growth of order, starting from the featureless and uniform gas of the early universe and producing the magnificent diversity of weird objects that we see in the sky and in the rain forest. Everywhere around us, wherever we look, we see evidence of increasing order and increasing information. The technology arising from Shannon’s discoveries is only a local acceleration of the natural growth of information.
  • Lord Kelvin, one of the leading physicists of that time, promoted the heat death dogma, predicting that the flow of heat from warmer to cooler objects will result in a decrease of temperature differences everywhere, until all temperatures ultimately become equal. Life needs temperature differences, to avoid being stifled by its waste heat. So life will disappear.
  • Thanks to the discoveries of astronomers in the twentieth century, we now know that the heat death is a myth. The heat death can never happen, and there is no paradox. The best popular account of the disappearance of the paradox is a chapter, “How Order Was Born of Chaos,” in the book Creation of the Universe, by Fang Lizhi and his wife Li Shuxian. Fang Lizhi is doubly famous as a leading Chinese astronomer and a leading political dissident. He is now pursuing his double career at the University of Arizona.
  • The belief in a heat death was based on an idea that I call the cooking rule. The cooking rule says that a piece of steak gets warmer when we put it on a hot grill. More generally, the rule says that any object gets warmer when it gains energy, and gets cooler when it loses energy. Humans have been cooking steaks for thousands of years, and nobody ever saw a steak get colder while cooking on a fire. The cooking rule is true for objects small enough for us to handle. If the cooking rule is always true, then Lord Kelvin’s argument for the heat death is correct.
  • the cooking rule is not true for objects of astronomical size, for which gravitation is the dominant form of energy. The sun is a familiar example. As the sun loses energy by radiation, it becomes hotter and not cooler. Since the sun is made of compressible gas squeezed by its own gravitation, loss of energy causes it to become smaller and denser, and the compression causes it to become hotter. For almost all astronomical objects, gravitation dominates, and they have the same unexpected behavior. Gravitation reverses the usual relation between energy and temperature. In the domain of astronomy, when heat flows from hotter to cooler objects, the hot objects get hotter and the cool objects get cooler. As a result, temperature differences in the astronomical universe tend to increase rather than decrease as time goes on. There is no final state of uniform temperature, and there is no heat death. Gravitation gives us a universe hospitable to life. Information and order can continue to grow for billions of years in the future, as they have evidently grown in the past.
  • The vision of the future as an infinite playground, with an unending sequence of mysteries to be understood by an unending sequence of players exploring an unending supply of information, is a glorious vision for scientists. Scientists find the vision attractive, since it gives them a purpose for their existence and an unending supply of jobs. The vision is less attractive to artists and writers and ordinary people. Ordinary people are more interested in friends and family than in science. Ordinary people may not welcome a future spent swimming in an unending flood of information.
  • A darker view of the information-dominated universe was described in a famous story, “The Library of Babel,” by Jorge Luis Borges in 1941. Borges imagined his library, with an infinite array of books and shelves and mirrors, as a metaphor for the universe.
  • Gleick’s book has an epilogue entitled “The Return of Meaning,” expressing the concerns of people who feel alienated from the prevailing scientific culture. The enormous success of information theory came from Shannon’s decision to separate information from meaning. His central dogma, “Meaning is irrelevant,” declared that information could be handled with greater freedom if it was treated as a mathematical abstraction independent of meaning. The consequence of this freedom is the flood of information in which we are drowning. The immense size of modern databases gives us a feeling of meaninglessness. Information in such quantities reminds us of Borges’s library extending infinitely in all directions. It is our task as humans to bring meaning back into this wasteland. As finite creatures who think and feel, we can create islands of meaning in the sea of information. Gleick ends his book with Borges’s image of the human condition: We walk the corridors, searching the shelves and rearranging them, looking for lines of meaning amid leagues of cacophony and incoherence, reading the history of the past and of the future, collecting our thoughts and collecting the thoughts of others, and every so often glimpsing mirrors, in which we may recognize creatures of the information.
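
As a check on the Moore's Law arithmetic quoted in the annotations above (one doubling every eighteen months), the factors of a hundred per decade and a billion over forty-five years fall out directly. A small sketch of the arithmetic, mine rather than Gleick's or Dyson's:

```python
# Growth factor after t months, at one doubling per 18 months: 2 ** (t / 18)
print(2 ** (120 / 18))       # one decade -> ~102, i.e. roughly a factor of a hundred
print(2 ** (45 * 12 / 18))   # 45 years   -> ~1.07e9, i.e. about nine powers of ten
```
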
Weiye Loh

Climate change and extreme flooding linked by new evidence | George Monbiot | Environme... - 0 views

  • Two studies suggest for the first time a clear link between global warming and extreme precipitation
  • There's a sound rule for reporting weather events that may be related to climate change. You can't say that a particular heatwave or a particular downpour – or even a particular freeze – was definitely caused by human emissions of greenhouse gases. But you can say whether these events are consistent with predictions, or that their likelihood rises or falls in a warming world.
  • Weather is a complex system. Long-running trends, natural fluctuations and random patterns are fed into the global weather machine, and it spews out a series of events. All these events will be influenced to some degree by global temperatures, but it's impossible to say with certainty that any of them would not have happened in the absence of man-made global warming.
  • over time, as the data build up, we begin to see trends which suggest that rising temperatures are making a particular kind of weather more likely to occur. One such trend has now become clearer. Two new papers, published by Nature, should make us sit up, as they suggest for the first time a clear link between global warming and extreme precipitation (precipitation means water falling out of the sky in any form: rain, hail or snow).
  • We still can't say that any given weather event is definitely caused by man-made global warming. But we can say, with an even higher degree of confidence than before, that climate change makes extreme events more likely to happen.
  • One paper, by Seung-Ki Min and others, shows that rising concentrations of greenhouse gases in the atmosphere have caused an intensification of heavy rainfall events over some two-thirds of the weather stations on land in the northern hemisphere. The climate models appear to have underestimated the contribution of global warming to extreme rainfall: it's worse than we thought it would be.
  • The other paper, by Pardeep Pall and others, shows that man-made global warming is very likely to have increased the probability of severe flooding in England and Wales, and could well have been behind the extreme events in 2000. The researchers ran thousands of simulations of the weather in autumn 2000 (using idle time on computers made available by a network of volunteers) with and without the temperature rises caused by man-made global warming. They found that, in nine out of 10 cases, man-made greenhouse gases increased the risks of flooding. This is probably as solid a signal as simulations can produce, and it gives us a clear warning that more global heating is likely to cause more floods here.
  • As Richard Allan points out, also in Nature, the warmer the atmosphere is, the more water vapour it can carry. There's even a formula which quantifies this: 6-7% more moisture in the air for every degree of warming near the Earth's surface. But both models and observations also show changes in the distribution of rainfall, with moisture concentrating in some parts of the world and fleeing from others: climate change is likely to produce both more floods and more droughts.
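
The 6-7% figure Allan cites comes from the Clausius-Clapeyron relation. A rough sketch of that calculation, using typical textbook values (my illustration, not from the Nature pieces):

```python
# Clausius-Clapeyron: d(ln e_s)/dT = L / (R_v * T**2), where e_s is the
# saturation vapour pressure of water. Near-surface values:
L = 2.5e6    # latent heat of vaporisation of water, J/kg
R_v = 461.0  # specific gas constant for water vapour, J/(kg K)
T = 288.0    # typical near-surface temperature, K

print(L / (R_v * T**2))  # ~0.065 -> roughly 6-7% more moisture per degree of warming
```

The fractional increase per degree falls slightly as temperature rises, which is why the figure is usually quoted as a 6-7% range.
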
Weiye Loh

RealClimate: Going to extremes - 0 views

  • There are two new papers in Nature this week that go right to the heart of the conversation about extreme events and their potential relationship to climate change.
  • Let’s start with some very basic, but oft-confused points: (1) Not all extremes are the same. Discussions of ‘changes in extremes’ in general, without specifying exactly what is being discussed, are meaningless. A tornado is an extreme event, but one whose causes, sensitivity to change and impacts have nothing to do with those related to an ice storm, or a heat wave or cold air outbreak or a drought. (2) There is no theory or result that indicates that climate change increases extremes in general. This is a corollary of the previous statement – each kind of extreme needs to be looked at specifically, and often regionally as well. (3) Some extremes will become more common in future (and some less so). We will discuss the specifics below. (4) Attribution of extremes is hard. There are limited observational data to start with, insufficient testing of climate model simulations of extremes, and (so far) limited assessment of model projections.
  • The two new papers deal with the attribution of a single flood event (Pall et al), and the attribution of increased intensity of rainfall across the Northern Hemisphere (Min et al). While these issues are linked, they are quite distinct, and the two approaches are very different too.
  • The aim of the Pall et al paper was to examine a specific event – floods in the UK in Oct/Nov 2000. Normally, with a single event there isn’t enough information to do any attribution, but Pall et al set up a very large ensemble of runs starting from roughly the same initial conditions to see how often the flooding event occurred. Note that flooding was defined as more than just intense rainfall – the authors tracked runoff and streamflow as part of their modelled setup. Then they repeated the same experiments with pre-industrial conditions (less CO2 and cooler temperatures). If the number of times a flooding event occurred increased in the present-day setup, you can estimate how much more likely the event was made by climate change. The results gave varying numbers, but in nine out of ten cases the chance increased by more than 20%, and in two out of three cases by more than 90%. This kind of fractional attribution (if an event is 50% more likely with anthropogenic effects, that implies it is 33% attributable; see the sketch after these annotations) has also been applied to the 2003 European heatwave, and will undoubtedly be applied more often in future. One neat and interesting feature of these experiments was that they used the climateprediction.net setup to harness the power of the public’s idle screensaver time.
  • The second paper is a more standard detection and attribution study. By looking at the signatures of climate change in precipitation intensity and comparing them to the internal variability and the observations, the researchers conclude that the probability of intense precipitation on any given day has increased by 7 percent over the last 50 years – well outside the bounds of natural variability. This is a result that has been suggested before (e.g. in the IPCC report and in Groisman et al., 2005), but this was the first proper attribution study (as far as I know). The signal seen in the data, though, while coherent and similar to that seen in the models, was consistently larger, perhaps indicating the models are not sensitive enough, though the El Niño of 1997/8 may have had an outsize effect.
  • Both papers were submitted in March last year, prior to the 2010 floods in Pakistan, Australia, Brazil or the Philippines, and so did not deal with any of the data or issues associated with those floods. However, while questions of attribution come up whenever something weird happens to the weather, these papers demonstrate clearly that the instant pop-attributions we are always being asked for are just not very sensible. It takes an enormous amount of work to do these kinds of tests, and they just can’t be done instantly. As they are done more often though, we will develop a better sense for the kinds of events that we can say something about, and those we can’t.
  • There is always concern that the start and end points for any trend study are not appropriate (both sides are guilty of this, IMO). I have read that precipitation studies are more difficult due to sparse data, and it seems we would have seen precipitation trend graphs a lot more often by now if it were straightforward. A 7% change seems too large not to have been noted (vocally) earlier; it seems like there is more to this story.
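
A sketch of the fractional-attribution arithmetic mentioned above, using the standard fraction-of-attributable-risk formula (my illustration, not code from either paper):

```python
# If anthropogenic forcing makes an event RR times as likely (the risk ratio),
# the fraction of the risk attributable to that forcing is FAR = 1 - 1/RR.
def attributable_fraction(rr: float) -> float:
    return 1.0 - 1.0 / rr

print(attributable_fraction(1.5))  # 0.33... -> "50% more likely" = ~33% attributable
print(attributable_fraction(1.9))  # ~0.47   -> "90% more likely" = ~47% attributable
```
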
Weiye Loh

FT.com / FT Magazine - A disastrous truth - 0 views

  • Every time a disaster strikes, some environmentalists blame it on climate change. “It’s been such a part of the narrative of the public and political debate, particularly after Hurricane Katrina,” Roger Pielke Jr, an expert on the politics of climate change at the University of Colorado, told me.
  • But nothing in the scientific literature indicates that this is true. A host of recent peer-reviewed studies agree: there’s no evidence that climate change has increased the damage from natural disasters. Most likely, climate change will make disasters worse some day, but not yet.
  • Laurens Bouwer, of Amsterdam’s Vrije Universiteit, has recently reviewed 22 “disaster loss studies” and concludes: “Anthropogenic climate change so far has not had a significant impact on losses from natural disasters.”
  • Eric Neumayer and Fabian Barthel of the London School of Economics found likewise in their recent “global analysis” of natural disasters.
  • in his book The Climate Fix: What Scientists and Politicians Won’t Tell You About Global Warming, Pielke writes that there’s no upward trend in the landfalls of tropical cyclones. Even floods in Brisbane aren’t getting worse – just check out the city’s 19th-century floods. Pielke says the consensus of peer-reviewed research on this point – that climate change is not yet worsening disasters – is as strong as any consensus in climate science.
  • It’s true that floods and hurricanes do more damage every decade. However, that’s because ever more people, owning ever more “stuff”, live in vulnerable spots.
  • When it comes to preventing today’s disasters, the squabble about climate change is just a distraction. The media usually has room for only one environmental argument: is climate change happening? This pits virtually all climate scientists against a band of self-taught freelance sceptics, many of whom think the “global warming hoax” is a ruse got up by 1960s radicals as a trick to bring in socialism. (I know, I get the sceptics’ e-mails.) Sometimes in this squabble, climate scientists are tempted to overstate their case, and to say that the latest disaster proves that the climate is changing. This is bad science. It also gives the sceptics something dubious to attack. Better to ignore the sceptics, and have more useful debates about disasters and climate change – which, for now, are two separate problems.
Weiye Loh

Roger Pielke Jr.'s Blog: Bringing it Home - 0 views

  • Writing at MIT's Knight Science Journalism Tracker, Charles Petit breathlessly announces to journalists that the scientific community has now given a green light to blaming contemporary disasters on the emissions of greenhouse gases
  • We recently published a paper showing that the media overall has done an excellent job on its reporting of scientific projections of sea level rise. I suspect that a similar analysis of the issue of disasters and climate change would not result in such favorable results. Of course, looking at the cover of Nature above, it might be understandable why this would be the case.
  • An official shift may just have occurred not only in news coverage of climate change, but in the way that careful scientists talk about it. Till now, blaming specific storms on climate change has been frowned upon. And it still is, if one is speaking of an isolated event. But something very much like blaming global warming for what is happening today, right now, outside the window has just gotten an endorsement on the cover of Nature. Its photo of a flooded European village has "THE HUMAN FACTOR" splashed across it. Extreme rain in many regions, it tells the scientific community, is not merely consistent with what to expect from global warming, but heralds its arrival. This is a good deal more immediate than saying, as people have for some time, that glaciers are shrinking and seas are rising due to the effects of greenhouse gases. This brings it home.
Weiye Loh

Roger Pielke Jr.'s Blog: Flawed Food Narrative in the New York Times - 0 views

  • The article relies heavily on empty appeals to authority.  For example, it makes an unsupported assertion about what "scientists believe": Many of the failed harvests of the past decade were a consequence of weather disasters, like floods in the United States, drought in Australia and blistering heat waves in Europe and Russia. Scientists believe some, though not all, of those events were caused or worsened by human-induced global warming.  Completely unmentioned are the many (most?) scientists who believe that evidence is lacking to connect recent floods and heat waves to "human-induced global warming."
  • Some important issues beyond carbon dioxide are raised in the article, but are presented as secondary to the carbon narrative.  Other important issues are completely ignored -- for example, wheat rust goes unmentioned, and it probably poses a greater risk to food supplies in the short term than anything to do with carbon dioxide. The carbon dioxide-centric focus of the article provides a nice illustration of how an obsession with "global warming" can serve to distract attention from factors that actually matter more for issues of human and environmental concern.
  • The central thesis of the NYT article is the following statement: The rapid growth in farm output that defined the late 20th century has slowed to the point that it is failing to keep up with the demand for food, driven by population increases and rising affluence in once-poor countries. But this claim of slowing output is shown to be completely false by the graphic that accompanies the article, shown below.  Far from slowing, farm output has increased dramatically over the past half-century (left panel) and on a per capita basis in 2009 was higher than at any point since the early 1980s (right panel).  
  • Today's New York Times has an article by Justin Gillis on global food production that strains itself to the breaking point to make a story fit a narrative.  The narrative, of course, is that climate change "is helping to destabilize the food system."  The problem with the article is that the data that it presents don't support this narrative. Before proceeding, let me reiterate that human-caused climate change is a threat and one that we should be taking seriously. But taking climate change seriously does not mean shoehorning every global concern into that narrative, and especially conflating concerns about the future with what has been observed in the past. The risk of course of putting a carbon-centric spin on every issue is that other important dimensions are neglected.
Weiye Loh

Skepticblog » The Value of Vertigo - 1 views

  • But Ruse’s moment of vertigo is not as surprising as it may appear. Indeed, he put effort into achieving this immersion: “I am atypical, I took about three hours to go through [the creation museum] but judging from my students most people don’t read the material as obsessively as I and take about an hour.” Why make this meticulous effort, when he could have dismissed creationism’s well-known scientific problems from the parking lot, or from his easy chair at home?
  • According to Ruse, the vertiginous “what if?” feeling has a practical value. After all, it’s easy to find problems with a pseudoscientific belief; what’s harder is understanding how and why other people believe. “It is silly just to dismiss this stuff as false,” Ruse argues (although it is false, and although Ruse has fought against “this stuff” for decades). “A lot of people believe Creationism so we on the other side need to get a feeling not just for the ideas but for the psychology too.”
  • In June of 2009, philosopher of biology Michael Ruse took a group of grad students to the Answers in Genesis Creation Museum in Kentucky (and also some mainstream institutions) as part of a course on how museums present science. In a critical description of his visit, Ruse reflected upon "the extent to which the Creationist museum uses modern science to its own ends, melding it in seamlessly with its own Creationist message." Continental drift, the Big Bang, and even natural selection are all presented as evidence in support of Young Earth cosmology and flood geology. While immersing himself in the museum's pitch, Ruse wrote, Just for one moment about half way through the exhibit…I got that Kuhnian flash that it could all be true - it was only a flash (rather like thinking that Freudianism is true or that the Republicans are right on anything whatsoever) but it was interesting nevertheless to get a sense of how much sense this whole display and paradigm can make to people.
Weiye Loh

Roger Pielke Jr.'s Blog: Every Relatively Affluent White Guy for Himself - 0 views

  • one of the big arguments that environmentalists have used about the need to stop climate change is that those who will suffer most are the little brown poor people in far-off lands who will, for instance, experience increased incidence of malaria and exposure to floods and other disasters. (Of course the fact that they are already burdened by such things in huge disproportion to the privileged minority doesn’t seem to enter into the argument).
  • But I raise this point because when it comes to climate survivalism, the little brown folks are nowhere to be seen, and apparently it’s every relatively affluent white guy (and his nuclear family, of course) for himself.
  • Dan Sarewitz takes the Washington Post to task for publishing a bizarre commentary on the coming climate apocalypse: Check out the article by a climate survivalist from the February 27, 2011 Washington Post. (I’m going to go out on a limb and treat the article as if it’s not a satire or hoax, but maybe the joke’s on me.) The author describes how he’s buying solar panels and generators and laying in food and supplies and putting extra locks on his doors and windows in anticipation of the coming climate apocalypse, much in the way that in the 1960s certain nuts were digging shelters in their backyard to provide protection against hydrogen bombs, and in the ‘80s (and probably to this day) right-wing crazies were building up small arsenals to protect themselves against the time when the government tried to take away their right to be bigots.
  • ...1 more annotation...
  • fear of the coming apocalypse seems to be an honorable tradition among some factions of the human race, and besides in this case it’s probably good for the beleaguered economy that this guy is spending what must be lots of money on hardware, both high-tech and low. But there are some elements of climate survivalism that are truly troubling. The fact that the Washington Post chose to put this article on the front page of its Sunday opinion section is an editorial judgment that the author, who is executive director of the Chesapeake Climate Action Committee, is someone whose perspective deserves to be taken seriously.
Weiye Loh

Skepticblog » Litigation gone wild! A geologist's take on the Italian seismol... - 0 views

  • Apparently, an Italian lab technician named Giampaolo Giuliani made a prediction about a month before the quake, based on elevated levels of radon gas. However, seismologists have known for a long time that radon levels, like any other “magic bullet” precursor, are unreliable because no two quakes are alike, and no two quakes give the same precursors. Nevertheless, his prediction caused a furor before the quake actually happened. The Director of the Civil Defence, Guido Bertolaso, forced him to remove his findings from the Internet (old versions are still online). Giuliani was also reported to the police for “causing fear” with his predictions about a quake near Sulmona, which was far from where the quake actually struck. Enzo Boschi, the head of the Italian National Geophysics Institute, declared: “Every time there is an earthquake there are people who claim to have predicted it. As far as I know nobody predicted this earthquake with precision. It is not possible to predict earthquakes.” Most of the geological and geophysical organizations around the world made similar statements in support of the proper scientific procedures adopted by the Italian geophysical community. They condemned Giuliani for scaring people using a method that has not been shown to be reliable.
  • most of the press coverage I have read (including many cited above) took the sensationalist approach, and cast Giuliani as the little “David” fighting against the “Goliath” of “Big Science”.
  • none of the reporters bothered to do any real background research, or consult with any other legitimate seismologist who would confirm that there is no reliable way to predict earthquakes in the short term and that Giuliani misleads people when he claims otherwise. Giuliani’s “prediction” was sheer luck, and if he had failed, no one would have mentioned it again.
  • ...4 more annotations...
  • Even though he believes in his method, he ignores the huge body of evidence that shows radon gas is no more reliable than any other “predictor”.
  • If the victims insist on suing someone, they should leave the seismologists alone and look into the construction of some of those buildings. The stories out of L’Aquila suggest that the death toll was much higher because of official corruption and shoddy construction, as happens in many countries both before and after big quakes.
  • much of the construction is apparently Mafia-controlled in that area—good luck suing them! Sadly, the ancient medieval buildings that crumbled were the most vulnerable because they were made of unreinforced masonry, the worst possible construction material in earthquake country
  • what does this imply for scientists who are working in a field that might have predictive power? In a litigious society like Italy or the U.S., this is a serious question. If a reputable seismologist does make a prediction and fails, he’s liable, because people will panic and make foolish decisions and then blame the seismologist for their losses. Now the Italian courts are saying that (despite world scientific consensus) seismologists are liable if they don’t predict quakes. They’re damned if they do, and damned if they don’t. In some societies where seismologists work hard at prediction and preparation (such as China and Japan), there is no precedent for suing scientists for doing their jobs properly, and the society and court system do not encourage people to file frivolous suits. But in litigious societies, the system is counterproductive, and stifles research that we would like to see developed. What seismologist would want to work on earthquake prediction if they can be sued? I know of many earth scientists with brilliant ideas not only about earthquake prediction but even ways to defuse earthquakes, slow down global warming, or many other incredible but risky brainstorms—but they dare not propose the idea seriously or begin to implement it for fear of being sued.
  •  
    In the case of most natural disasters, people usually regard such events as "acts of God" and try to get on with their lives as best they can. No human cause is responsible for great earthquakes, tsunamis, volcanic eruptions, tornadoes, hurricanes, or floods. But in the bizarre world of the Italian legal system, six seismologists and a public official have been charged with manslaughter for NOT predicting the quake! My colleagues in the earth science community were incredulous and staggered at this news. Seismologists and geologists have been saying for decades (at least since the 1970s) that short-term earthquake prediction (within minutes to hours of the event) is impossible, and anyone who claims otherwise is lying. As Charles Richter himself said, "Only fools, liars, and charlatans predict earthquakes." How could anyone then go to court and sue seismologists for following proper scientific procedures?
Weiye Loh

Roger Pielke Jr.'s Blog: Neville Nicholls on Australia's Extreme Rainfall - 0 views

  •  
    The record La Niña event was the fundamental cause of the heavy rains and floods, i.e. it was a natural fluctuation of the climate system. There may be a global warming signal enhancing this natural variability, but if so then this effect has been quite subtle, at least thus far.
Weiye Loh

Your Brain on Computers - Attached to Technology and Paying a Price - NYTimes.com - 0 views

  • The message had slipped by him amid an electronic flood: two computer screens alive with e-mail, instant messages, online chats, a Web browser and the computer code he was writing.
  • Even after he unplugs, he craves the stimulation he gets from his electronic gadgets. He forgets things like dinner plans, and he has trouble focusing on his family.
  • “It seems like he can no longer be fully in the moment.”
  • ...4 more annotations...
  • juggling e-mail, phone calls and other incoming information can change how people think and behave. They say our ability to focus is being undermined by bursts of information.
  • These play to a primitive impulse to respond to immediate opportunities and threats. The stimulation provokes excitement — a dopamine squirt — that researchers say can be addictive.
  • While many people say multitasking makes them more productive, research shows otherwise. Heavy multitaskers actually have more trouble focusing and shutting out irrelevant information, scientists say, and they experience more stress.
  • even after the multitasking ends, fractured thinking and lack of focus persist. In other words, this is also your brain off computers.
  •  
    YOUR BRAIN ON COMPUTERS Hooked on Gadgets, and Paying a Mental Price
Weiye Loh

Our ever-changing English | Alison Flood | Comment is free | guardian.co.uk - 0 views

  • Perhaps the Daily Mail should take a leaf out of Jonathan Swift's book and instead of blaming changes in English on "a tidal wave of mindless Americanisms", start calling those damned poets to book.
  • We've been whining on about the deterioration in English for years and years and years, and perhaps we need to get over ourselves. Looking at Swift's 300-year-old plea to keep things the same I'm minded to think that, actually, part of the glory of English, from Shakespeare's insults to Bombaugh's txt speak to the ever-expanding dictionaries of today, is its constantly changing nature, its adaptability, its responsiveness.
  •  
    Our ever-changing English I get grumpy about crimes against language. But we Brits have been lamenting declining standards of English for centuries
Weiye Loh

P2P Foundation » Blog Archive » Crowdsourced curation, reputation systems, an... - 0 views

  • A good example of manual curation vs. crowdsourced curation is the competing app markets on the Apple iPhone and Google Android phone operating systems.
  • Apple is a monarchy, albeit with a wise and benevolent king. Android is a burgeoning democracy, inefficient and messy, but free. Apple is the last, best example of the Industrial Age and its top-down, mass market/mass production paradigm.
  • They manufacture cool. They rely on “consumers”, and they protect those consumers from too many choices by selecting what is worthy, and what is not.
  • ...8 more annotations...
  • systems that allow crowdsourced judgment to be tweaked, not to the taste of the general mass, which produces lowest common denominator effects, but to people and experts that you can trust for their judgment.
  • these systems are now implemented by Buzz and Digg 4
  • Important for me though, is that they don’t just take your social graph as is, because that mixes many different people for different reasons, but that you can tweak the groups.
  • “This is the problem with the internet! It’s full of crap!” Many would argue that without professional producers, editors, publishers, and the natural scarcity that we became accustomed to, there’s a flood of low-quality material that we can’t possibly sift through on our own. From blogs to music to software to journalism, one of the biggest fears of the established order is how to handle the oncoming glut of mediocrity. Who shall tell us The Good from The Bad? “We need gatekeepers, and they need to be paid!”
  • The Internet has enabled us to build our social graph, and in turn, that social graph acts as an aggregate gatekeeper. The better that these systems for crowdsourcing the curation of content become, the more accurate the results will be.
  • This social-graph-as-curation is still relatively new, even by Internet standards. However, with tools like Buzz and Digg 4 (which allows you to see the aggregate ratings for content based on your social graph, and not the whole wide world) this technique is catching up to human publishers fast. For those areas where we don’t have strong social ties, we can count on reputation systems to help us “rate the raters”. These systems allow strangers to rate each other’s content, giving users some idea of who to trust, without having to know them personally. Yelp has a fairly mature reputation system, where locations are rated by users, but the users are rated, in turn, by each other.
  • Reputation systems and the social graph allow us to crowdsource curation (a minimal sketch of this weighting idea follows at the end of this entry).
  • Can you imagine if Apple had to approve your videos for posting on Youtube, where every minute, 24 hours of footage are uploaded? There’s no way humans could keep up! The traditional forms of curation and gatekeeping simply cannot scale to meet the increase in production and transmission that the Internet allows. Crowdsourcing is the only curatorial/editorial mechanism that can scale to match the increased ability to produce that the Internet has given us.
  •  
    Crowdsourced curation, reputation systems, and the social graph
Weiye Loh
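
The "rate the raters" mechanism quoted above lends itself to a short sketch. The code below is a minimal illustration, not the actual algorithm of Buzz, Digg 4, or Yelp; the names (CurationGraph, score_for), the neutral default reputation of 0.5, and the doubling of weight for trusted raters are all assumptions made for the example.

```python
# Minimal sketch of social-graph-weighted curation (illustrative only;
# not the real Buzz/Digg/Yelp algorithm). A rating counts for more when
# the rater sits in the viewer's trusted circle and has a good reputation,
# where reputation itself comes from how other users rate the rater.
from collections import defaultdict

class CurationGraph:
    def __init__(self):
        self.trusted = defaultdict(set)        # viewer -> raters they trust
        self.rater_votes = defaultdict(list)   # rater -> scores others gave them
        self.item_ratings = defaultdict(list)  # item -> [(rater, score)]

    def trust(self, viewer, rater):
        self.trusted[viewer].add(rater)

    def rate_rater(self, rater, score):
        # "Rate the raters": strangers score each other's judgment (0..1).
        self.rater_votes[rater].append(score)

    def rate_item(self, rater, item, score):
        self.item_ratings[item].append((rater, score))

    def reputation(self, rater):
        votes = self.rater_votes[rater]
        return sum(votes) / len(votes) if votes else 0.5  # assumed neutral default

    def score_for(self, viewer, item, trust_boost=2.0):
        # Weight each rating by the rater's reputation; boost (an assumed
        # factor) raters the viewer has explicitly chosen to trust.
        total = weight = 0.0
        for rater, score in self.item_ratings[item]:
            w = self.reputation(rater)
            if rater in self.trusted[viewer]:
                w *= trust_boost
            total += w * score
            weight += w
        return total / weight if weight else None

graph = CurationGraph()
graph.trust("alice", "expert_bob")
graph.rate_rater("expert_bob", 0.9)
graph.rate_rater("random_carol", 0.3)
graph.rate_item("expert_bob", "article-42", 5.0)
graph.rate_item("random_carol", "article-42", 1.0)
print(graph.score_for("alice", "article-42"))  # ~4.4, leaning toward the trusted expert
```

Setting trust_boost to 1 collapses this back to a plain reputation-weighted average over "the whole wide world"; raising it narrows curation to one's own circle, the same trade-off between lowest-common-denominator effects and echo chambers that recurs in the comments-rating entry below.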

In Europe, sharp criticism of US reaction to WikiLeaks - The Boston Globe - 0 views

  • Washington’s fierce reaction to the flood of secret diplomatic cables released by WikiLeaks displays imperial arrogance and hypocrisy, indicating a post-9/11 obsession with secrecy that contradicts American principles.
  • John Naughton, writing in the same British paper, deplored the attack on the openness of the Internet and the pressure on companies such as Amazon and eBay to evict the WikiLeaks site. “The response has been vicious, coordinated and potentially comprehensive,’’ he said, and presents a “delicious irony’’ that “it is now the so-called liberal democracies that are clamoring to shut WikiLeaks down.’’
  • A year ago, he noted, Clinton made a major speech about Internet freedom, interpreted as a rebuke to China’s cyberattack on Google. “Even in authoritarian countries,’’ she said, “information networks are helping people to discover new facts and making governments more accountable.’’ To Naughton now, “that Clinton speech reads like a satirical masterpiece.’’
  • ...4 more annotations...
  • The Russians seemed to take a special delight in tweaking Washington over its reaction to the leaks, suggesting the Americans are being hypocritical. “If it is a full-fledged democracy, then why have they put Assange away in jail? You call that democracy?’’ Prime Minister Vladimir V. Putin said during a news briefing with the French prime minister, Francois Fillon.
  • Even The Financial Times Deutschland (independent of the English-language Financial Times), said that “the already damaged reputation of the United States will only be further tattered with Assange’s new martyr status.’’ It added that “the openly embraced hope of the US government that along with Assange, WikiLeaks will disappear from the scene, is questionable.’’
  • Assange is being hounded, the paper said, “even though no one can explain what crimes Assange allegedly committed with the publication of the secret documents, or why publication by WikiLeaks was an offense when publication in The New York Times was not.’’
  • But Renaud Girard, a respected reporter for the center-right Le Figaro, said he was impressed by the generally high quality of the American diplomatic corps. “What is most fascinating is that we see no cynicism in US diplomacy,’’ he said. “They really believe in human rights in Africa and China and Russia and Asia. They really believe in democracy and human rights. People accuse the Americans of double standards all the time. But it’s not true here. If anything, the diplomats are almost naive.’
Weiye Loh

The Problem with Climate Change | the kent ridge common - 0 views

  • what is climate change? From a scientific point of view, it is simply a statistical change in atmospheric variables (temperature, precipitation, humidity etc). It has been occurring ever since the Earth came into existence, far before humans even set foot on the planet: our climate has been fluctuating between warm periods and ice ages, with further variations within. In fact, we are living in a warm interglacial period in the middle of an ice age.
  • Global warming has often been portrayed in apocalyptic tones, whether from the mouth of the media or environmental groups: the daily news tells of natural disasters happening at a frightening pace, of crop failures due to strange weather, of mass extinctions and coral die-outs. When the devastating tsunami struck Southeast Asia years ago, some said it was the wrath of God against human mistreatment of the environment; when hurricane Katrina dealt out a catastrophe, others said it was because of (America’s) failure to deal with climate change. Science gives the figures and trends, and people take these to extremes.
  • One immediate problem with blaming climate change for every weather-related disaster or phenomenon is that it reduces humans’ responsibility for mitigating or preventing it. If natural disasters are already, as their name suggests, natural, adding the tag ‘global warming’ or ‘climate change’ emphasizes the dominance of natural forces, and our inability to do anything about it. Surely, humans cannot undo climate change? Even at Cancun, amid the carbon cuts that have been promised, questions are being brought up on whether they are sufficient to reverse our actions and ‘save’ the planet. Yet the talk about this remote, omnipotent force known as climate change obscures the fact that we can think of, and have always been thinking of, ways to reduce the impact of natural hazards. Forecasting, building better infrastructure and coordinating more efficient responses – all these are far more desirable than wading in woe. For example, we will do better at preventing floods in Singapore by tackling the problems rather than singing in praise of God.
  • ...5 more annotations...
  • However, a greater concern lies in the notion of climate change itself. Climate change is in essence one kind of nature-society relationship, in which humans influence the climate through greenhouse gas (particularly CO2) emissions, and the climate strikes back by heating up and going crazy at times. This can be further simplified into a battle between humans and CO2: reducing CO2 guards against climate change, and increasing it aggravates the consequences. This view is anchored in scientists’ recommendation that a ‘safe’ level of CO2 should be at 350 parts per million (ppm) instead of the current 390. Already, the need to reduce CO2 is understood, as is evident in the push for greener fuels, more efficient means of production, the proliferation of ‘green’ products and companies, and most recently, the Cancun talks.
  • So can there be anything wrong with reducing CO2? No, there isn’t, but singling out CO2 as the culprit of climate change or of the environmental problems we face prevents us from looking within. What do I mean? The enemy, CO2, is an ‘other’, an externality produced by our economic systems but never an inherent component of the systems. Thus, we can declare war on the gas or on climate change without taking a step back and questioning: is there anything wrong with the way we develop? Take Singapore for example: the government pledged to reduce carbon emissions by 16% under ‘business as usual’ standards, which says nothing about how ‘business’ is going to be changed other than having less carbon emissions (in fact, it is questionable whether CO2 levels will even decrease, as ‘business as usual’ standards project a steady increase in CO2 emissions each year). With the development of green technologies, the decrease in carbon emissions will mainly be brought about by increased energy efficiency and a switch to alternative fuels (including the insidious nuclear energy).
  • Thus, the way we develop will hardly be changed. Nobody questions whether our neoliberal system of development, which relies heavily on consumption to drive economies, needs to be looked into. We assume that it is the right way to develop, and only tweak it for the amount of externalities produced. Whether we should be measuring development by the Gross Domestic Product (GDP), or whether welfare is correlated to the amount of goods and services consumed, is never considered. Even the UN-REDD (Reducing Emissions from Deforestation and Forest Degradation) scheme, which aims to pay forest-rich countries for protecting their forests, ends up putting a price tag on them. The environment is being subsumed under the economy, when it should be the economy that is re-examined to take the environment into consideration.
  • when the world is celebrating after having held at bay the dangerous greenhouse gas, why would anyone bother rethinking the economy? Yet we should, simply because there are alternative nature-society relationships and discourses about nature that are of equal or greater importance than global warming. Annie Leonard’s informative videos on The Story of Stuff and specific products like electronics, bottled water and cosmetics shed light on the dangers of our ‘throw-away culture’ on the planet and poorer countries. What if the enemy were instead consumerism? Framing it that way would force countries (especially richer ones) to fundamentally question the nature of development, instead of just applying a quick technological fix. This is so much more difficult (and less economically viable), alongside other issues like environmental injustices – e.g. pollution or dumping of waste by Trans-National Corporations in poorer countries and removal of indigenous land rights. It is no wonder that we choose to disregard internal problems and focus instead on an external enemy; when CO2 is the culprit, the solution is too simple and detached from the communities that are affected by changes in their environment.
  • We hence need to allow for a greater politics of the environment. What I am proposing is not to diminish our action to reduce carbon emissions, for I do believe that it is part of the environmental problem that we are facing. What instead should be done is to reduce our fixation on CO2 as the main or only driver of climate change, and on climate change as the most pertinent nature-society issue we are facing. We should understand that there are many other ways of thinking about the environment; ‘developing’ countries, for example, tend to have a closer relationship with their environment – it is not something ‘out there’ but constantly interacted with for food, water, regulating services and cultural value. Their views and the impact of the socio-economic forces (often from TNCs and multi-lateral organizations like IMF) that shape the environment must also be taken into account, as must alternative meanings of sustainable development. Thus, even as we pat ourselves on the back for having achieved something significant at Cancun, our action should not and must not end there. Even if climate change hogs the headlines now, we must embrace more plurality in environmental discourse, for nature is not, and never was, as simple as climate change alone. And hopefully sometime in the future, alongside a multi-lateral conference on climate change, the world can have one which rethinks the meaning of development.
  •  
    Chen Jinwen
Weiye Loh

On newspapers' online comments « Yawning Bread Sampler 2 - 0 views

  • Assistant Professor Mark Cenite of Nanyang Technological University’s Wee Kim Wee School of Communication and Information said: ‘This approach allows users to moderate themselves, and the news site is seen as being sensitive to readers’ values.’
  • But Mr Alex Au, who runs socio-political blog Yawning Bread, cautioned that this could lead to astroturfing. The term, derived from a brand of fake grass, refers to a fake grassroots movement in which a group wishing to push its agenda sends out manipulated and replicated online messages in support of a certain policy or issue. His suggestion: user tiers, in which comments by users with verified identities are displayed visibly and anonymous comments less conspicuously. He said: ‘This approach does not bar people from speaking up, but weighs in by signalling the path towards responsible participation.’
  • what is astroturfing? It is when a few people do one or both of two things: create multiple identities for each of themselves and flood a forum or topic with similar opinions, or get their friends to post boilerplate letters (expressing similar opinions of course) even if they do not totally share them to the same degree. The intent is to create an impression that a certain opinion is more widely held than is actually the case.
  • ...4 more annotations...
  • user-rating will have the tendency of giving prominence to widely-shared opinion. Comments expressing unpopular opinions will get fewer “stars” from other readers and sink in display priority. In theory, it doesn’t have to be so. People may very well give “stars” to well-thought-out comments that argue cogently for a view they don’t agree with, lauding the quality of expression rather than the conclusion, but let’s get real. Most people like to hear what they already believe. That being the case, the effect of such a scheme would be to crowd out unpopular opinions even if they have merit; it produces a majoritarian effect in newspapers’ comments sections.
  • it is open to abuse in that a small group of people wanting to push a particular opinion could repeatedly vote for a certain comment, thereby giving it increased ranking and more prominent display. Such action would be akin to astroturfing.
  • The value of discussion lies not in hearing what we already know or what we already believe in. It lies in hearing alternative arguments and learning new facts. Structuring a discussion forum by giving prominence to merely popular opinion just makes it an echo chamber. The greater public purpose is better served when contrary opinion is aired. That is why I disagree with a scheme whereby users apply ratings and prominence is given to highly-rated comments. (Both the rating-only scheme and the tiered alternative are sketched after this entry.)
    • Weiye Loh
       
      But the majority of users who participate in online activism/slacktivism are very much the young, Western-educated folks. This in itself already makes the online social sphere an echo chamber, doesn't it?
  • Anonymous comments have their uses. Most obviously, there will be times when whistle-blowing serves the public purpose, and so, even if displayed less prominently, they should still be allowed.
  •  
    A popular suggestion among media watchers interviewed is to let users rate the comments and display the highly ranked ones prominently.
Weiye Loh
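
Alex Au's tiering proposal, and the rating-only scheme he criticizes, can both be expressed in a few lines. The sketch below is illustrative only: the verified and stars fields and the sort keys are assumptions for the example, not any newspaper's actual implementation.

```python
# Sketch of the two comment-display schemes discussed above (illustrative;
# not a real news site's code). Pure star-sorting produces the majoritarian
# effect described in the excerpt; tiering by verified identity still shows
# anonymous comments, just less conspicuously.
from dataclasses import dataclass

@dataclass
class Comment:
    author: str
    verified: bool  # identity checked by the site (assumed flag)
    stars: int      # aggregate reader rating
    text: str

comments = [
    Comment("anon123",  False, 48, "Agree completely!"),
    Comment("jane_tan", True,   5, "A dissenting view, argued at length..."),
    Comment("anon456",  False, 47, "Agree completely!!"),  # astroturf-like pile-on
]

# Scheme 1: rating-only ranking -- popular opinion crowds out dissent.
by_stars = sorted(comments, key=lambda c: -c.stars)

# Scheme 2: tiered ranking -- verified identities first, then rating within
# each tier; anonymous voices are demoted, not barred.
tiered = sorted(comments, key=lambda c: (not c.verified, -c.stars))

for c in tiered:
    tier = "VERIFIED " if c.verified else "anonymous"
    print(f"[{tier}] {c.stars:>2}* {c.author}: {c.text}")
```

Under the first scheme the two near-identical "agree" comments occupy the top slots; under the second, the verified dissenter displays first despite far fewer stars, which is precisely the signalling effect the tiering proposal aims for.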

Himalayan glaciers not melting because of climate change, report finds - Telegraph - 0 views

  • Himalayan glaciers are actually advancing rather than retreating, claims the first major study since a controversial UN report said they would be melted within a quarter of a century.
  • Researchers have discovered that contrary to popular belief half of the ice flows in the Karakoram range of the mountains are actually growing rather than shrinking.
  • The discovery adds a new twist to the row over whether global warming is causing the world's highest mountain range to lose its ice cover.
  • ...13 more annotations...
  • It further challenges claims made in a 2007 report by the UN's Intergovernmental Panel on Climate Change that the glaciers would be gone by 2035.
  • Although the head of the panel Dr Rajendra Pachauri later admitted the claim was an error gleaned from unchecked research, he maintained that global warming was melting the glaciers at "a rapid rate", threatening floods throughout north India.
  • The new study by scientists at the Universities of California and Potsdam has found that half of the glaciers in the Karakoram range, in the northwestern Himalaya, are in fact advancing and that global warming is not the deciding factor in whether a glacier survives or melts.
  • Dr Bodo Bookhagen, Dirk Scherler and Manfred Strecker studied 286 glaciers from the Hindu Kush on the Afghan-Pakistan border to Bhutan, taking in six areas. Their report, published in the journal Nature Geoscience, found the key factor affecting their advance or retreat is the amount of debris – rocks and mud – strewn on their surface, not the general nature of climate change.
  • Glaciers surrounded by high mountains and covered with more than two centimetres of debris are protected from melting. Debris-covered glaciers are common in the rugged central Himalaya, but they are almost absent in subdued landscapes on the Tibetan Plateau, where retreat rates are higher.
  • In contrast, more than 50 per cent of observed glaciers in the Karakoram region in the northwestern Himalaya are advancing or stable.
  • "Our study shows that there is no uniform response of Himalayan glaciers to climate change and highlights the importance of debris cover for understanding glacier retreat, an effect that has so far been neglected in predictions of future water availability or global sea level," the authors concluded.
  • Dr Bookhagen said their report had shown "there is no stereotypical Himalayan glacier" in contrast to the UN's climate change report which, he said, "lumps all Himalayan glaciers together."
  • Dr Pachauri, head of the Nobel prize-winning UN Intergovernmental Panel on Climate Change, has remained silent on the matter since he was forced to admit his report's claim that the Himalayan glaciers would melt by 2035 was an error and had not been sourced from a peer-reviewed scientific journal. It came from a World Wildlife Fund report.
  • this latest tawdry addition to the pathetic lies of the Reality Deniers. If you go to a proper source which quotes the full study, such as http://www.sciencedaily.com/re... you discover that the findings of this study are rather different to those portrayed here.
  • The only way to consistently maintain a lie is to refuse point-blank to publish ALL the findings of a study, but to cherry-pick the bits which are consistent with the ongoing lie, while ignoring the rest.
  • Bookhagen noted that glaciers in the Karakoram region of Northwestern Himalaya are mostly stagnating. However, glaciers in the Western, Central, and Eastern Himalaya are retreating, with the highest retreat rates -- approximately 8 meters per year -- in the Western Himalayan Mountains. The authors found that half of the studied glaciers in the Karakoram region are stable or advancing, whereas about two-thirds are in retreat elsewhere throughout High Asia
  • glaciers in the steep Himalaya are affected not only by temperature and precipitation but also by debris coverage, and so show no uniform, and hence less predictable, response, the authors explained. The debris coverage may be one of the missing links to creating a more coherent picture of glacial behavior throughout all mountains. The scientists contrast this Himalayan glacial study with glaciers from the gently dipping, low-relief Tibetan Plateau that have no debris coverage. Those glaciers behave in a different way, and their frontal changes can be explained by temperature and precipitation changes.