
New Media Ethics 2009 course / Group items tagged: flow


Weiye Loh

Stock and flow « Snarkmarket - 0 views

  • There are two kinds of quantities in the world. Stock is a static value: money in the bank, or trees in the forest. Flow is a rate of change: fifteen dollars an hour, or three-thousand toothpicks a day.
  • stock and flow is the master metaphor for media today. Here’s what I mean: Flow is the feed. It’s the posts and the tweets. It’s the stream of daily and sub-daily updates that remind people that you exist. Stock is the durable stuff. It’s the content you produce that’s as interesting in two months (or two years) as it is today. It’s what people discover via search. It’s what spreads slowly but surely, building fans over time.
  • I feel like flow is ascendant these days, for obvious reasons—but we neglect stock at our own peril.
  • Flow is a treadmill, and you can’t spend all of your time running on the treadmill. Well, you can. But then one day you’ll get off and look around and go: Oh man. I’ve got nothing here.
  • But I’m not saying you should ignore flow!
  • this is no time to hole up and work in isolation, emerging after long months or years with your perfectly-polished opus. Everybody will go: huh?
  • if you don’t have flow to plug your new fans into, you’re suffering a huge (here it is!) opportunity cost. You’ll have to find them all again next time you emerge from your cave.
  • we all got really good at flow, really fast. But flow is ephemeral. Stock sticks around. Stock is capital. Stock is protein.
  • And the real magic trick in 2010 is to put them both together. To keep the ball bouncing with your flow—to maintain that open channel of communication—while you work on some kick-ass stock in the background.
  • all these super-successful artists and media people today who don’t really think about flow. Like, Wes Anderson
  • the secret is that somebody else does his flow for him. I mean, what are PR and advertising? Flow, bought and paid for.
  • Today I’m still always asking myself: Is this stock? Is this flow? How’s my mix? Do I have enough of both?
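
The stock-and-flow distinction the excerpt opens with echoes the standard one in economics and system dynamics: stock is a level, flow is a rate that accumulates into it. The minimal Python sketch below makes that relationship literal; the fifteen-dollars-an-hour figure comes from the excerpt, while the eight-hour horizon and everything else are assumptions added purely for illustration.

    def accumulate(stock: float, flow_per_hour: float, hours: int) -> float:
        """Return the stock level after `hours` of a constant flow."""
        for _ in range(hours):
            stock += flow_per_hour  # each hour of flow adds to the stock
        return stock

    money_in_bank = 0.0   # stock: a static value at any moment in time
    wage = 15.0           # flow: fifteen dollars an hour (from the excerpt)

    # After an assumed eight-hour day, the stock has grown by the accumulated flow.
    print(accumulate(money_in_bank, wage, hours=8))  # -> 120.0
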
Weiye Loh

Roger Pielke Jr.'s Blog: A Decrease in Floods Around the World? - 0 views

  • Bouziotas et al. presented a paper at the EGU a few weeks ago (PDF) and concluded: Analysis of trends and of aggregated time series on climatic (30-year) scale does not indicate consistent trends worldwide. Despite common perception, in general, the detected trends are more negative (less intense floods in most recent years) than positive. Similarly, Svensson et al. (2005) and Di Baldassarre et al. (2010) found no systematic change in either the number of floods (increasing or decreasing) or their magnitudes in their analyses.
  • This finding is largely consistent with Kundzewicz et al. (2005) who find: Out of more than a thousand long time series made available by the Global Runoff Data Centre (GRDC) in Koblenz, Germany, a worldwide data set consisting of 195 long series of daily mean flow records was selected, based on such criteria as length of series, currency, lack of gaps and missing values, adequate geographical distribution, and priority to smaller catchments. The analysis of annual maximum flows does not support the hypothesis of ubiquitous growth of high flows. Although 27 cases of strong, statistically significant increase were identified by the Mann-Kendall test, there are 31 decreases as well, and most (137) time series do not show any significant changes (at the 10% level). Caution is advised in interpreting these results as flooding is a complex phenomenon, caused by a number of factors that can be associated with local, regional, and hemispheric climatic processes. Moreover, river flow has strong natural variability and exhibits long-term persistence which can confound the results of trend and significance tests.
  • Destructive floods observed in the last decade all over the world have led to record high material damage. The conventional belief is that the increasing cost of floods is associated with increasing human development on flood plains (Pielke & Downton, 2000). However, the question remains as to whether or not the frequency and/or magnitude of flooding is also increasing and, if so, whether it is in response to climate variability and change. Several scenarios of future climate indicate a likelihood of increased intense precipitation and flood hazard. However, observations to date provide no conclusive and general proof as to how climate change affects flood behaviour.
  • References: Bouziotas, D., G. Deskos, N. Mastrantonas, D. Tsaknias, G. Vangelidis, S.M. Papalexiou, and D. Koutsoyiannis, Long-term properties of annual maximum daily river discharge worldwide, European Geosciences Union General Assembly 2011, Geophysical Research Abstracts, Vol. 13, Vienna, EGU2011-1439, European Geosciences Union, 2011. Kundzewicz, Z.W., D. Graczyk, T. Maurer, I. Przymusińska, M. Radziejewski, C. Svensson and M. Szwed, 2005(a): Trend detection in river flow time-series: 1. annual maximum flow. Hydrol. Sci. J., 50(5): 797-810.
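
The Kundzewicz et al. excerpt above classifies each flow record with the Mann-Kendall trend test at the 10% significance level. As a rough illustration of what that test does, here is a minimal Python sketch of the basic two-sided Mann-Kendall test (no correction for ties or serial correlation) run on a synthetic annual-maximum-flow series; the synthetic data and the 0.10 threshold are assumptions for illustration, not the GRDC records used in the study.

    import numpy as np
    from scipy.stats import norm

    def mann_kendall(x, alpha=0.10):
        """Basic two-sided Mann-Kendall trend test (assumes no tied values)."""
        x = np.asarray(x, dtype=float)
        n = len(x)
        # S statistic: sum of the signs of all pairwise differences
        s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
        # Variance of S under the null hypothesis of no trend
        var_s = n * (n - 1) * (2 * n + 5) / 18.0
        # Continuity-corrected standard normal score
        if s > 0:
            z = (s - 1) / np.sqrt(var_s)
        elif s < 0:
            z = (s + 1) / np.sqrt(var_s)
        else:
            z = 0.0
        p = 2 * (1 - norm.cdf(abs(z)))  # two-sided p-value
        if p >= alpha:
            verdict = "no significant trend"
        else:
            verdict = "significant increase" if z > 0 else "significant decrease"
        return s, z, p, verdict

    # Synthetic 60-year annual-maximum-flow series (illustrative only)
    rng = np.random.default_rng(0)
    flows = 100 + rng.normal(0, 15, size=60)
    print(mann_kendall(flows, alpha=0.10))

Applied record by record, the three possible verdicts are what the excerpt's counts of 27 increases, 31 decreases and 137 series with no significant change refer to.
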
Weiye Loh

Roger Pielke Jr.'s Blog: Flood Disasters and Human-Caused Climate Change - 0 views

  • [UPDATE: Gavin Schmidt at Real Climate has a post on this subject that  -- surprise, surprise -- is perfectly consonant with what I write below.] [UPDATE 2: Andy Revkin has a great post on the representations of the precipitation paper discussed below by scientists and related coverage by the media.]  
  • Nature published two papers yesterday that discuss increasing precipitation trends and a 2000 flood in the UK.  I have been asked by many people whether these papers mean that we can now attribute some fraction of the global trend in disaster losses to greenhouse gas emissions, or even recent disasters such as in Pakistan and Australia.
  • I hate to pour cold water on a really good media frenzy, but the answer is "no."  Neither paper actually discusses global trends in disasters (one doesn't even discuss floods) or even individual events beyond a single flood event in the UK in 2000.  But still, can't we just connect the dots?  Isn't it just obvious?  And only deniers deny the obvious, right?
  • What seems obvious is sometimes just wrong.  This of course is why we actually do research.  So why is it that we shouldn't make what seems to be an obvious connection between these papers and recent disasters, as so many have already done?
  • First, the Min et al. paper seeks to identify a GHG signal in global precipitation over the period 1950-1999.  They focus on one-day and five-day measures of precipitation.  They do not discuss streamflow or damage.  For many years, an upwards trend in precipitation has been documented, and attributed to GHGs, even back to the 1990s (I co-authored a paper on precipitation and floods in 1999 that assumed a human influence on precipitation, PDF), so I am unsure what is actually new in this paper's conclusions.
  • However, accepting that precipitation has increased and can be attributed in some part to GHG emissions, there have not been shown corresponding increases in streamflow (floods)  or damage. How can this be?  Think of it like this -- Precipitation is to flood damage as wind is to windstorm damage.  It is not enough to say that it has become windier to make a connection to increased windstorm damage -- you need to show a specific increase in those specific wind events that actually cause damage. There are a lot of days that could be windier with no increase in damage; the same goes for precipitation.
  • My understanding of the literature on streamflow is that there have not been shown increasing peak streamflow commensurate with increases in precipitation, and this is a robust finding across the literature.  For instance, one recent review concludes: Floods are of great concern in many areas of the world, with the last decade seeing major fluvial events in, for example, Asia, Europe and North America. This has focused attention on whether or not these are a result of a changing climate. River flows calculated from outputs from global models often suggest that high river flows will increase in a warmer, future climate. However, the future projections are not necessarily in tune with the records collected so far – the observational evidence is more ambiguous. A recent study of trends in long time series of annual maximum river flows at 195 gauging stations worldwide suggests that the majority of these flow records (70%) do not exhibit any statistically significant trends. Trends in the remaining records are almost evenly split between having a positive and a negative direction.
  • Absent an increase in peak streamflows, it is impossible to connect the dots between increasing precipitation and increasing floods.  There are of course good reasons why a linkage between increasing precipitation and peak streamflow would be difficult to make, such as the seasonality of the increase in rain or snow, the large variability of flooding and the human influence on river systems.  Those difficulties of course translate directly to a difficulty in connecting the effects of increasing GHGs to flood disasters.
  • Second, the Pall et al. paper seeks to quantify the increased risk of a specific flood event in the UK in 2000 due to greenhouse gas emissions.  It applies a methodology that was previously used with respect to the 2003 European heatwave. Taking the paper at face value, it clearly states that in England and Wales, there has not been an increasing trend in precipitation or floods.  Thus, floods in this region are not a contributor to the global increase in disaster costs.  Further, there has been no increase in Europe in normalized flood losses (PDF).  Thus, the Pall et al. paper is focused on attribution in the context of a single event, and not trend detection in the region that it focuses on, much less any broader context.
  • More generally, the paper utilizes a seasonal forecast model to assess risk probabilities.  Given the performance of seasonal forecast models in actual prediction mode, I would expect many scientists to remain skeptical of this approach to attribution. Of course, if this group can show an improvement in the skill of actual seasonal forecasts by using greenhouse gas emissions as a predictor, they will have a very convincing case.  That is a high hurdle.
  • In short, the new studies are interesting and add to our knowledge.  But they do not change the state of knowledge related to trends in global disasters and how they might be related to greenhouse gases.  But even so, I expect that many will still want to connect the dots between greenhouse gas emissions and recent floods.  Connecting the dots is fun, but it is not science.
  • Jessica Weinkle said...
  • The thing about the Nature articles is that Nature itself made the leap from the science findings to damages in the News piece by Q. Schiermeier through the decision to bring up the topic of insurance. (Not to mention that which is symbolically represented merely by the journal’s cover this week). With what I (maybe, naively) believe to be a particularly ballsy move, the article quoted Muir-Wood, an industry scientist. However, what he is quoted as saying is admirably clever. Initially it is stated that Dr. Muir-Wood backs the notion that one cannot put the blame of increased losses on climate change. Then, the article ends with a quote from him, “If there’s evidence that risk is changing, then this is something we need to incorporate in our models.”
  • This is a very slippery slope and a brilliant double-dog dare. Without doing anything but sitting back and watching the headlines, one can form the argument that “science” supports the remodeling of the hazard risk above the climatological average and is more important than the risks stemming from socioeconomic factors. The reinsurance industry itself has published that socioeconomic factors far outweigh changes in the hazard where losses are concerned. The point (and that which has particularly gotten my knickers in a knot) is that Nature, et al. may wish to consider what it is that they want to accomplish. Is it greater involvement of federal governments in the insurance/reinsurance industry on the premise that climate change is too great a loss risk for private industry alone regardless of the financial burden it imposes? The move of insurance mechanisms into all corners of the earth under the auspices of climate change adaptation? Or simply a move to bolster prominence, regardless of whose back it breaks, including their own, if any of them are proud owners of a home mortgage? How much faith does one have in their own model when they are told that hundreds of millions of dollars in the global economy is being bet against the odds that their models produce?
  • What Nature says matters to the world; what scientists say matters to the world- whether they care for the responsibility or not. That is after all, the game of fame and fortune (aka prestige).
Weiye Loh

News Clips: Pinning down acupuncture: It's a placebo - 0 views

  • some doctors seem to have embraced even disproven remedies. Take, for instance, a review of acupuncture research that appeared last July in the New England Journal of Medicine. This highly respected journal is one of the most widely read by doctors across specialities. In Acupuncture For Chronic Low Back Pain, the authors reviewed clinical trials done to assess if acupuncture actually helps in chronic low back pain. The most important meta-analysis available was a 2008 study involving 6,359 patients, which 'showed that real acupuncture treatments were no more effective than sham acupuncture treatments'.
  • The authors then editorialised: 'There was nevertheless evidence that both real acupuncture and sham acupuncture were more effective than no treatment and that acupuncture can be a useful supplement to other forms of conventional therapy for low back pain.'
  • First, they admit that pooled clinical trials of the best sort show that real acupuncture does no better than sham acupuncture. This should mean that acupuncture does not work - full stop. But then they say that sham and real acupuncture each work as well as the other and thus are useful. Translation: Please use acupuncture as a placebo on your patients; just don't let them know it is a placebo.
  • I should add that I am not criticising TCM per se. Only acupuncture, a facet of TCM, albeit its most dramatic, is being scrutinised here. Chinese herbology must be analysed on its own merits. Interestingly, although acupuncture may be TCM's poster boy today, the Chinese physician in days of yore would have looked askance at it. Instead, his practice and prestige were based upon his grasp of the Chinese pharmacopoeia.
  • Acupuncture was left to the shamans and blood letters. After all, it was grounded, not in the knowledge of which herbs were best for what conditions, but in astrology.
  • In Giovanni Maciocia's 2005 book, The Foundations Of Chinese Medicine: A Comprehensive Text For Acupuncturists And Herbalists, there is a chart showing the astrological provenance of acupuncture. The chart shows how the 12 main acupuncture meridians and the 12 main body segments correspond to the 12 Houses of the Chinese zodiac.
  • In Chinese cosmology, all life is animated by a numinous force called qi, the flow of which mirrors the sun's apparent 'movement' during the year through the ecliptic. (The ecliptic is the imaginary plane of the earth's orbit around the sun). Moreover, everything in the Chinese zodiac is mirrored on Earth and in Man. This was taught even in the earliest systematised TCM text, the Yellow Emperor's Canon Of Medicine, thus: 'Heaven is covered with constellations, Earth with waterways, and man with channels.' This 'as above, so below' doctrine means that if there is qi flowing around in the imaginary closed loop of the zodiac, there is qi flowing correspondingly in the body's closed loop of imaginary meridians as well.
  • Note that not only is acupuncture astrological in origin but also the astrology is based on a model of the universe which has the earth at its centre. This geocentric model was an erroneous idea widely accepted before the Copernican revolution.
  • So should doctors check the daily horoscopes of their patients?
Weiye Loh

Balderdash: Links - 26th March 2012 - 0 views

  •  
    "We are in the midst of a technological upheaval; and financial rewards are flowing to the elites who create and control the new machines. Almost everybody else is threatened - including sophisticated bank executives at Citi and WellPoint's healthcare analysts... Even if displaced workers do find jobs, will these jobs be dignified? A society divided between master-programmers and servants may not appeal to the servants, who will be the majority. But neo-Marxist visions of burger flippers on the barricades seem a touch too paranoid. In the 19th century, the shift from farm to factory was decried as dehumanising. But the supposedly alienated proletariat soon morphed into proud welders and machinists and today it is the decline of factories that is perceived as a problem. The lesson is that attitudes adjust and job status is elastic. There may be real unhappiness during the adjustment phase, but eventually the nannies of yesterday will be the respected childcare professionals of tomorrow. Cooks will turn into executive chefs. And so, in the last analysis, Watson's most enduring impact will be to accentuate the trade-off between equity and growth"
Weiye Loh

Adventures in Flay-land: Dealing with Denialists - Delingpole Part III - 0 views

  • This post is about how one should deal with a denialist of Delingpole's ilk.
  • I saw someone I follow on Twitter retweet an update from another Twitter user called @AGW_IS_A_HOAX, which was this: "NZ #Climate Scientists Admit Faking Temperatures http://bit.ly/fHbdPI RT @admrich #AGW #Climategate #Cop16 #ClimateChange #GlobalWarming".
  • So I click on it. And this is how you deal with a denialist claim. You actually look into it. Here is the text of that article reproduced in full: New Zealand Climate Scientists Admit To Faking Temperatures: The Actual Temps Show Little Warming Over Last 50 Years. Read here and here. Climate "scientists" across the world have been blatantly fabricating temperatures in hopes of convincing the public and politicians that modern global warming is unprecedented and accelerating. The scientists doing the fabrication are usually employed by the government agencies or universities, which thrive and exist on taxpayer research dollars dedicated to global warming research. A classic example of this is the New Zealand climate agency, which is now admitting their scientists produced bogus "warming" temperatures for New Zealand. "NIWA makes the huge admission that New Zealand has experienced hardly any warming during the last half-century. For all their talk about warming, for all their rushed invention of the “Eleven-Station Series” to prove warming, this new series shows that no warming has occurred here since about 1960. Almost all the warming took place from 1940-60, when the IPCC says that the effect of CO2 concentrations was trivial. Indeed, global temperatures were falling during that period. ... Almost all of the 34 adjustments made by Dr Jim Salinger to the 7SS have been abandoned, along with his version of the comparative station methodology." A collection of temperature-fabrication charts.
  • I check out the first link, the first "here" where the article says "Read here and here". I can see that there's been some sort of dispute between two New Zealand groups associated with climate change. One is New Zealand’s Climate Science Coalition (NZCSC) and the other is New Zealand’s National Institute of Water and Atmospheric Research (NIWA), but it doesn't tell me a whole lot more than I already got from the other article.
  • I check the second source behind that article. The second article, I now realize, is published on the website of a person called Andrew Montford with whom I've been speaking recently and who is the author of a book titled The Hockey Stick Illusion. I would not label Andrew a denialist. He makes some good points and seems to be a decent guy and a genuine sceptic (This is not to suggest all denialists are outwardly dishonest; however, they do tend to be hard to reason with). Again, this article doesn't give me anything that I haven't already seen, except a link to another background source. I go there.
  • From this piece written up on Scoop NZNEWSUK I discover that a coalition group consisting of the NZCSC and the Climate Conversation Group (CCG) has pressured the NIWA into abandoning a set of temperature record adjustments of which the coalition dispute the validity. This was the culmination of a court proceeding in December 2010, last month. In dispute were 34 adjustments that had been made by Dr Jim Salinger to the 7SS temperature series, though I don't know what that is exactly. I also discover that there is a guy called Richard Treadgold, Convenor of the CCG, who is quoted several times. Some of the statements he makes are quoted in the articles I've already seen. They are of a somewhat snide tenor. The CSC object to the methodology used by the NIWA to adjust temperature measurements (one developed as part of a PhD thesis), which they critique in a paper in November 2009 with the title "Are we feeling warmer yet?", and are concerned about how this public agency is spending its money. I'm going to have to dig a bit deeper if I want to find out more. There is a section with links under the heading "Related Stories on Scoop". I click on a few of those.
  • One of these leads me to more. Of particular interest is a fairly neutral article outlining the progress of the court action. I get some more background: For the last ten years, visitors to NIWA’s official website have been greeted by a graph of the “seven-station series” (7SS), under the bold heading “New Zealand Temperature Record”. The graph covers the period from 1853 to the present, and is adorned by a prominent trend-line sloping sharply upwards. Accompanying text informs the world that “New Zealand has experienced a warming trend of approximately 0.9°C over the past 100 years.” The 7SS has been updated and used in every monthly issue of NIWA’s “Climate Digest” since January 1993. Its 0.9°C (sometimes 1.0°C) of warming has appeared in the Australia/NZ Chapter of the IPCC’s 2001 and 2007 Assessment Reports. It has been offered as sworn evidence in countless tribunals and judicial enquiries, and provides the historical base for all of NIWA’s reports to both Central and Local Governments on climate science issues and future projections.
  • now I can see why this is so important. The temperature record informs the conclusions of the IPCC assessment reports and provides crucial evidence for global warming.
  • Further down we get: NIWA announces that it has now completed a full internal examination of the Salinger adjustments in the 7SS, and has forwarded its “review papers” to its Australian counterpart, the Bureau of Meteorology (BOM) for peer review. And: So the old 7SS has already been repudiated. A replacement NZTR [New Zealand Temperature Record] is being prepared by NIWA – presumably the best effort they are capable of producing. NZCSC is about to receive what it asked for. On the face of it, there’s nothing much left for the Court to adjudicate.
  • NIWA has been forced to withdraw its earlier temperature record and replace it with a new one. Treadgold quite clearly states that "NIWA makes the huge admission that New Zealand has experienced hardly any warming during the last half-century" and that "the new temperature record shows no evidence of a connection with global warming." Earlier in the article he also stresses the role of the CSC in achieving these revisions, saying "after 12 months of futile attempts to persuade the public, misleading answers to questions in the Parliament from ACT and reluctant but gradual capitulation from NIWA, their relentless defence of the old temperature series has simply evaporated. They’ve finally given in, but without our efforts the faulty graph would still be there."
  • All this leads me to believe that if I look at the website of NIWA I will see a retraction of the earlier position and a new position that New Zealand has experienced no unusual warming. This is easy enough to check. I go there. Actually, I search for it to find the exact page. Here is the 7SS page on the NIWA site. Am I surprised that NIWA have retracted nothing and that in fact their revised graph shows similar results? Not really. However, I am somewhat surprised by this page on the Climate Conversation Group website which claims that the 7SS temperature record is as dead as the parrot in the Monty Python sketch. It says "On the eve of Christmas, when nobody was looking, NIWA declared that New Zealand had a new official temperature record (the NZT7) and whipped the 7SS off its website." However, I've already seen that this is not true. Perhaps there was once a 7SS graph and information about the temperature record on the site's homepage that can no longer be seen. I don't know. I can only speculate. I know that there is a section on the NIWA site about the 7SS temperature record that contains a number of graphs and figures and discusses recent revisions. It has been updated as recently as December 2010, last month. The NIWA page talks all about the 7SS series and has a heading that reads "Our new analysis confirms the warming trend".
  • The CCG page claims that the new NZT7 is not in fact a revision but rather a replacement. Although it results in a similar curve, the adjustments that were made are very different. Frankly I can't see how that matters at the end of the day. Now, I don't really know whether I can believe that the NIWA analysis is true, but what I am in no doubt of whatsoever is that the statements made by Richard Treadgold that were quoted in so many places are at best misleading. The NIWA has not changed its position in the slightest. The assertion that the NIWA have admitted that New Zealand has not warmed much since 1960 is a politician's careful argument. Both analyses showed the same result. This is a fact that NIWA have not disputed; however, they still maintain a connection to global warming. A document explaining the revisions talks about why the warming has slowed after 1960: The unusually steep warming in the 1940-1960 period is paralleled by an unusually large increase in northerly flow* during this same period. On a longer timeframe, there has been a trend towards less northerly flow (more southerly) since about 1960. However, New Zealand temperatures have continued to increase over this time, albeit at a reduced rate compared with earlier in the 20th century. This is consistent with a warming of the whole region of the southwest Pacific within which New Zealand is situated.
  • Denialists have taken Treadgold's misleading mantra and spread it far and wide including on Twitter and fringe websites, but it is faulty as I've just demonstrated. Why do people do this? Perhaps they are hoping that others won't check the sources. Most people don't. I hope this serves as a lesson for why you always should.
Weiye Loh

How We Know by Freeman Dyson | The New York Review of Books - 0 views

  • Another example illustrating the central dogma is the French optical telegraph.
  • The telegraph was an optical communication system with stations consisting of large movable pointers mounted on the tops of sixty-foot towers. Each station was manned by an operator who could read a message transmitted by a neighboring station and transmit the same message to the next station in the transmission line.
  • The distance between neighbors was about seven miles. Along the transmission lines, optical messages in France could travel faster than drum messages in Africa. When Napoleon took charge of the French Republic in 1799, he ordered the completion of the optical telegraph system to link all the major cities of France from Calais and Paris to Toulon and onward to Milan. The telegraph became, as Claude Chappe had intended, an important instrument of national power. Napoleon made sure that it was not available to private users.
  • Unlike the drum language, which was based on spoken language, the optical telegraph was based on written French. Chappe invented an elaborate coding system to translate written messages into optical signals. Chappe had the opposite problem from the drummers. The drummers had a fast transmission system with ambiguous messages. They needed to slow down the transmission to make the messages unambiguous. Chappe had a painfully slow transmission system with redundant messages. The French language, like most alphabetic languages, is highly redundant, using many more letters than are needed to convey the meaning of a message. Chappe’s coding system allowed messages to be transmitted faster. Many common phrases and proper names were encoded by only two optical symbols, with a substantial gain in speed of transmission. The composer and the reader of the message had code books listing the message codes for eight thousand phrases and names. For Napoleon it was an advantage to have a code that was effectively cryptographic, keeping the content of the messages secret from citizens along the route.
  • After these two historical examples of rapid communication in Africa and France, the rest of Gleick’s book is about the modern development of information technology.
  • The modern history is dominated by two Americans, Samuel Morse and Claude Shannon. Samuel Morse was the inventor of Morse Code. He was also one of the pioneers who built a telegraph system using electricity conducted through wires instead of optical pointers deployed on towers. Morse launched his electric telegraph in 1838 and perfected the code in 1844. His code used short and long pulses of electric current to represent letters of the alphabet.
  • Morse was ideologically at the opposite pole from Chappe. He was not interested in secrecy or in creating an instrument of government power. The Morse system was designed to be a profit-making enterprise, fast and cheap and available to everybody. At the beginning the price of a message was a quarter of a cent per letter. The most important users of the system were newspaper correspondents spreading news of local events to readers all over the world. Morse Code was simple enough that anyone could learn it. The system provided no secrecy to the users. If users wanted secrecy, they could invent their own secret codes and encipher their messages themselves. The price of a message in cipher was higher than the price of a message in plain text, because the telegraph operators could transcribe plain text faster. It was much easier to correct errors in plain text than in cipher.
  • Claude Shannon was the founding father of information theory. For a hundred years after the electric telegraph, other communication systems such as the telephone, radio, and television were invented and developed by engineers without any need for higher mathematics. Then Shannon supplied the theory to understand all of these systems together, defining information as an abstract quantity inherent in a telephone message or a television picture. Shannon brought higher mathematics into the game.
  • When Shannon was a boy growing up on a farm in Michigan, he built a homemade telegraph system using Morse Code. Messages were transmitted to friends on neighboring farms, using the barbed wire of their fences to conduct electric signals. When World War II began, Shannon became one of the pioneers of scientific cryptography, working on the high-level cryptographic telephone system that allowed Roosevelt and Churchill to talk to each other over a secure channel. Shannon’s friend Alan Turing was also working as a cryptographer at the same time, in the famous British Enigma project that successfully deciphered German military codes. The two pioneers met frequently when Turing visited New York in 1943, but they belonged to separate secret worlds and could not exchange ideas about cryptography.
  • In 1945 Shannon wrote a paper, “A Mathematical Theory of Cryptography,” which was stamped SECRET and never saw the light of day. He published in 1948 an expurgated version of the 1945 paper with the title “A Mathematical Theory of Communication.” The 1948 version appeared in the Bell System Technical Journal, the house journal of the Bell Telephone Laboratories, and became an instant classic. It is the founding document for the modern science of information. After Shannon, the technology of information raced ahead, with electronic computers, digital cameras, the Internet, and the World Wide Web.
  • According to Gleick, the impact of information on human affairs came in three installments: first the history, the thousands of years during which people created and exchanged information without the concept of measuring it; second the theory, first formulated by Shannon; third the flood, in which we now live
  • The event that made the flood plainly visible occurred in 1965, when Gordon Moore stated Moore’s Law. Moore was an electrical engineer, founder of the Intel Corporation, a company that manufactured components for computers and other electronic gadgets. His law said that the price of electronic components would decrease and their numbers would increase by a factor of two every eighteen months. This implied that the price would decrease and the numbers would increase by a factor of a hundred every decade. Moore’s prediction of continued growth has turned out to be astonishingly accurate during the forty-five years since he announced it. In these four and a half decades, the price has decreased and the numbers have increased by a factor of a billion, nine powers of ten. Nine powers of ten are enough to turn a trickle into a flood.
  • Gordon Moore was in the hardware business, making hardware components for electronic machines, and he stated his law as a law of growth for hardware. But the law applies also to the information that the hardware is designed to embody. The purpose of the hardware is to store and process information. The storage of information is called memory, and the processing of information is called computing. The consequence of Moore’s Law for information is that the price of memory and computing decreases and the available amount of memory and computing increases by a factor of a hundred every decade. The flood of hardware becomes a flood of information.
  • In 1949, one year after Shannon published the rules of information theory, he drew up a table of the various stores of memory that then existed. The biggest memory in his table was the US Library of Congress, which he estimated to contain one hundred trillion bits of information. That was at the time a fair guess at the sum total of recorded human knowledge. Today a memory disc drive storing that amount of information weighs a few pounds and can be bought for about a thousand dollars. Information, otherwise known as data, pours into memories of that size or larger, in government and business offices and scientific laboratories all over the world. Gleick quotes the computer scientist Jaron Lanier describing the effect of the flood: “It’s as if you kneel to plant the seed of a tree and it grows so fast that it swallows your whole town before you can even rise to your feet.”
  • On December 8, 2010, Gleick published on The New York Review’s blog an illuminating essay, “The Information Palace.” It was written too late to be included in his book. It describes the historical changes of meaning of the word “information,” as recorded in the latest quarterly online revision of the Oxford English Dictionary. The word first appears in 1386 in a parliamentary report with the meaning “denunciation.” The history ends with the modern usage, “information fatigue,” defined as “apathy, indifference or mental exhaustion arising from exposure to too much information.”
  • The consequences of the information flood are not all bad. One of the creative enterprises made possible by the flood is Wikipedia, started ten years ago by Jimmy Wales. Among my friends and acquaintances, everybody distrusts Wikipedia and everybody uses it. Distrust and productive use are not incompatible. Wikipedia is the ultimate open source repository of information. Everyone is free to read it and everyone is free to write it. It contains articles in 262 languages written by several million authors. The information that it contains is totally unreliable and surprisingly accurate. It is often unreliable because many of the authors are ignorant or careless. It is often accurate because the articles are edited and corrected by readers who are better informed than the authors
  • Jimmy Wales hoped when he started Wikipedia that the combination of enthusiastic volunteer writers with open source information technology would cause a revolution in human access to knowledge. The rate of growth of Wikipedia exceeded his wildest dreams. Within ten years it has become the biggest storehouse of information on the planet and the noisiest battleground of conflicting opinions. It illustrates Shannon’s law of reliable communication. Shannon’s law says that accurate transmission of information is possible in a communication system with a high level of noise. Even in the noisiest system, errors can be reliably corrected and accurate information transmitted, provided that the transmission is sufficiently redundant. That is, in a nutshell, how Wikipedia works.
  • The information flood has also brought enormous benefits to science. The public has a distorted view of science, because children are taught in school that science is a collection of firmly established truths. In fact, science is not a collection of truths. It is a continuing exploration of mysteries. Wherever we go exploring in the world around us, we find mysteries. Our planet is covered by continents and oceans whose origin we cannot explain. Our atmosphere is constantly stirred by poorly understood disturbances that we call weather and climate. The visible matter in the universe is outweighed by a much larger quantity of dark invisible matter that we do not understand at all. The origin of life is a total mystery, and so is the existence of human consciousness. We have no clear idea how the electrical discharges occurring in nerve cells in our brains are connected with our feelings and desires and actions.
  • Even physics, the most exact and most firmly established branch of science, is still full of mysteries. We do not know how much of Shannon’s theory of information will remain valid when quantum devices replace classical electric circuits as the carriers of information. Quantum devices may be made of single atoms or microscopic magnetic circuits. All that we know for sure is that they can theoretically do certain jobs that are beyond the reach of classical devices. Quantum computing is still an unexplored mystery on the frontier of information theory. Science is the sum total of a great multitude of mysteries. It is an unending argument between a great multitude of voices. It resembles Wikipedia much more than it resembles the Encyclopaedia Britannica.
  • The rapid growth of the flood of information in the last ten years made Wikipedia possible, and the same flood made twenty-first-century science possible. Twenty-first-century science is dominated by huge stores of information that we call databases. The information flood has made it easy and cheap to build databases. One example of a twenty-first-century database is the collection of genome sequences of living creatures belonging to various species from microbes to humans. Each genome contains the complete genetic information that shaped the creature to which it belongs. The genome database is rapidly growing and is available for scientists all over the world to explore. Its origin can be traced to the year 1939, when Shannon wrote his Ph.D. thesis with the title “An Algebra for Theoretical Genetics.”
  • Shannon was then a graduate student in the mathematics department at MIT. He was only dimly aware of the possible physical embodiment of genetic information. The true physical embodiment of the genome is the double helix structure of DNA molecules, discovered by Francis Crick and James Watson fourteen years later. In 1939 Shannon understood that the basis of genetics must be information, and that the information must be coded in some abstract algebra independent of its physical embodiment. Without any knowledge of the double helix, he could not hope to guess the detailed structure of the genetic code. He could only imagine that in some distant future the genetic information would be decoded and collected in a giant database that would define the total diversity of living creatures. It took only sixty years for his dream to come true.
  • In the twentieth century, genomes of humans and other species were laboriously decoded and translated into sequences of letters in computer memories. The decoding and translation became cheaper and faster as time went on, the price decreasing and the speed increasing according to Moore’s Law. The first human genome took fifteen years to decode and cost about a billion dollars. Now a human genome can be decoded in a few weeks and costs a few thousand dollars. Around the year 2000, a turning point was reached, when it became cheaper to produce genetic information than to understand it. Now we can pass a piece of human DNA through a machine and rapidly read out the genetic information, but we cannot read out the meaning of the information. We shall not fully understand the information until we understand in detail the processes of embryonic development that the DNA orchestrated to make us what we are.
  • The explosive growth of information in our human society is a part of the slower growth of ordered structures in the evolution of life as a whole. Life has for billions of years been evolving with organisms and ecosystems embodying increasing amounts of information. The evolution of life is a part of the evolution of the universe, which also evolves with increasing amounts of information embodied in ordered structures, galaxies and stars and planetary systems. In the living and in the nonliving world, we see a growth of order, starting from the featureless and uniform gas of the early universe and producing the magnificent diversity of weird objects that we see in the sky and in the rain forest. Everywhere around us, wherever we look, we see evidence of increasing order and increasing information. The technology arising from Shannon’s discoveries is only a local acceleration of the natural growth of information.
  • Lord Kelvin, one of the leading physicists of that time, promoted the heat death dogma, predicting that the flow of heat from warmer to cooler objects will result in a decrease of temperature differences everywhere, until all temperatures ultimately become equal. Life needs temperature differences, to avoid being stifled by its waste heat. So life will disappear.
  • Thanks to the discoveries of astronomers in the twentieth century, we now know that the heat death is a myth. The heat death can never happen, and there is no paradox. The best popular account of the disappearance of the paradox is a chapter, “How Order Was Born of Chaos,” in the book Creation of the Universe, by Fang Lizhi and his wife Li Shuxian.2 Fang Lizhi is doubly famous as a leading Chinese astronomer and a leading political dissident. He is now pursuing his double career at the University of Arizona.
  • The belief in a heat death was based on an idea that I call the cooking rule. The cooking rule says that a piece of steak gets warmer when we put it on a hot grill. More generally, the rule says that any object gets warmer when it gains energy, and gets cooler when it loses energy. Humans have been cooking steaks for thousands of years, and nobody ever saw a steak get colder while cooking on a fire. The cooking rule is true for objects small enough for us to handle. If the cooking rule is always true, then Lord Kelvin’s argument for the heat death is correct.
  • the cooking rule is not true for objects of astronomical size, for which gravitation is the dominant form of energy. The sun is a familiar example. As the sun loses energy by radiation, it becomes hotter and not cooler. Since the sun is made of compressible gas squeezed by its own gravitation, loss of energy causes it to become smaller and denser, and the compression causes it to become hotter. For almost all astronomical objects, gravitation dominates, and they have the same unexpected behavior. Gravitation reverses the usual relation between energy and temperature. In the domain of astronomy, when heat flows from hotter to cooler objects, the hot objects get hotter and the cool objects get cooler. As a result, temperature differences in the astronomical universe tend to increase rather than decrease as time goes on. There is no final state of uniform temperature, and there is no heat death. Gravitation gives us a universe hospitable to life. Information and order can continue to grow for billions of years in the future, as they have evidently grown in the past.
  • The vision of the future as an infinite playground, with an unending sequence of mysteries to be understood by an unending sequence of players exploring an unending supply of information, is a glorious vision for scientists. Scientists find the vision attractive, since it gives them a purpose for their existence and an unending supply of jobs. The vision is less attractive to artists and writers and ordinary people. Ordinary people are more interested in friends and family than in science. Ordinary people may not welcome a future spent swimming in an unending flood of information.
  • A darker view of the information-dominated universe was described in a famous story, “The Library of Babel,” by Jorge Luis Borges in 1941.3 Borges imagined his library, with an infinite array of books and shelves and mirrors, as a metaphor for the universe.
  • Gleick’s book has an epilogue entitled “The Return of Meaning,” expressing the concerns of people who feel alienated from the prevailing scientific culture. The enormous success of information theory came from Shannon’s decision to separate information from meaning. His central dogma, “Meaning is irrelevant,” declared that information could be handled with greater freedom if it was treated as a mathematical abstraction independent of meaning. The consequence of this freedom is the flood of information in which we are drowning. The immense size of modern databases gives us a feeling of meaninglessness. Information in such quantities reminds us of Borges’s library extending infinitely in all directions. It is our task as humans to bring meaning back into this wasteland. As finite creatures who think and feel, we can create islands of meaning in the sea of information. Gleick ends his book with Borges’s image of the human condition: We walk the corridors, searching the shelves and rearranging them, looking for lines of meaning amid leagues of cacophony and incoherence, reading the history of the past and of the future, collecting our thoughts and collecting the thoughts of others, and every so often glimpsing mirrors, in which we may recognize creatures of the information.
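
Dyson's gloss on Shannon's law above (accurate transmission is possible over a noisy channel provided the transmission is sufficiently redundant) can be made concrete with the simplest redundant scheme: repeat each bit several times and decode by majority vote. The Python sketch below is a toy illustration of that idea, not Shannon's actual construction; the 10% flip probability and the repetition factor of five are arbitrary assumptions.

    import random

    def noisy_channel(bits, flip_prob):
        """Flip each transmitted bit independently with probability flip_prob."""
        return [b ^ 1 if random.random() < flip_prob else b for b in bits]

    def encode(bits, r=5):
        """Redundant (repetition) encoding: send each bit r times."""
        return [b for b in bits for _ in range(r)]

    def decode(received, r=5):
        """Majority vote over each group of r received bits."""
        return [1 if sum(received[i:i + r]) > r // 2 else 0
                for i in range(0, len(received), r)]

    random.seed(1)
    message = [1, 0, 1, 1, 0, 0, 1, 0]
    received = noisy_channel(encode(message, r=5), flip_prob=0.10)
    print(decode(received, r=5) == message)  # usually True despite the noise

Dyson's point about Wikipedia is the same trade-off: the crowd of readers correcting each article plays the role of the repeated bits here.
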
Weiye Loh

In Japan, a Culture That Promotes Nuclear Dependency - NYTimes.com - 0 views

  • look no further than the Fukada Sports Park, which serves the 7,500 mostly older residents here with a baseball diamond, lighted tennis courts, a soccer field and a $35 million gymnasium with indoor pool and Olympic-size volleyball arena. The gym is just one of several big public works projects paid for with the hundreds of millions of dollars this community is receiving for accepting the nearby nuclear plant.
  • the aid has enriched rural communities that were rapidly losing jobs and people to the cities. With no substantial reserves of oil or coal, Japan relies on nuclear power for the energy needed to drive its economic machine. But critics contend that the largess has also made communities dependent on central government spending — and thus unwilling to rock the boat by pushing for robust safety measures.
  • Tsuneyoshi Adachi, a 63-year-old fisherman, joined the huge protests in the 1970s and 1980s against the plant’s No. 2 reactor. He said many fishermen were angry then because chlorine from the pumps of the plant’s No. 1 reactor, which began operating in 1974, was killing seaweed and fish in local fishing grounds. However, Mr. Adachi said, once compensation payments from the No. 2 reactor began to flow in, neighbors began to give him cold looks and then ignore him. By the time the No. 3 reactor was proposed in the early 1990s, no one, including Mr. Adachi, was willing to speak out against the plant. He said that there was the same peer pressure even after the accident at Fukushima, which scared many here because they live within a few miles of the Shimane plant. “Sure, we are all worried in our hearts about whether the same disaster could happen at the Shimane nuclear plant,” Mr. Adachi said. However, “the town knows it can no longer survive economically without the nuclear plant.”
  • Much of this flow of cash was the product of the Three Power Source Development Laws, a sophisticated system of government subsidies created in 1974 by Kakuei Tanaka, the powerful prime minister who shaped Japan’s nuclear power landscape and used big public works projects to build postwar Japan’s most formidable political machine. The law required all Japanese power consumers to pay, as part of their utility bills, a tax that was funneled to communities with nuclear plants. Officials at the Ministry of Economy, Trade and Industry, which regulates the nuclear industry, and oversees the subsidies, refused to specify how much communities have come to rely on those subsidies. “This is money to promote the locality’s acceptance of a nuclear plant,” said Tatsumi Nakano of the ministry’s Agency for Natural Resources and Energy.
Weiye Loh

Egypt: Timeline of Communication Shutdown during the Revolution - Global Voices Advocacy - 0 views

  •  
    This diagram represents the sequence of communication shutdowns implemented by security agencies and telecommunications companies in Egypt from 25 January to 6 February, to control the flow of information between people.
Weiye Loh

China calls out US human rights abuses: laptop searches, 'Net porn - 0 views

  • The report makes no real attempt to provide context to a huge selection of news articles about bad things happening in the US, piled up one against each other in almost random fashion.
  • As the UK's Guardian paper noted, "While some of the data cited in the report is derived from official or authoritative sources, other sections are composed from a mishmash of online material. One figure on crime rates is attributed to '10 Facts About Crime in the United States that Will Blow Your Mind, Beforitsnews.com'." The opening emphasis on US crime is especially odd; crime rates in the US are the lowest they have been in decades; the drop-off has been so dramatic that books have been written in attempts to explain it.
  • But the report does provide an interesting perspective on the US, especially when it comes to technology, and it's not all off base. China points to US laptop border searches as a problem (and they are): According to figures released by the American Civil Liberties Union (ACLU) in September 2010, more than 6,600 travelers had been subject to electronic device searches between October 1, 2008 and June 2, 2010, nearly half of them American citizens. A report on The Wall Street Journal on September 7, 2010, said the Department of Homeland Security (DHS) was sued over its policies that allegedly authorize the search and seizure of laptops, cellphones and other electronic devices without a reasonable suspicion of wrongdoing. The policies were claimed to leave no limit on how long the DHS can keep a traveler's devices or on the scope of private information that can be searched, copied, or detained. There is no provision for judicial approval or supervision. When Colombian journalist Hollman Morris sought a US student visa so he could take a fellowship for journalists at Harvard University, his application was denied on July 17, 2010, as he was ineligible under the "terrorist activities" section of the USA Patriot Act. An Arab American named Yasir Afifi, living in California, found the FBI attached an electronic GPS tracking device near the right rear wheel of his car.
  • China also sees hypocrisy in American discussions of Internet freedom. China comes in regularly for criticism over its "Great Firewall," but it suggests that the US government also restricts the Internet. While advocating Internet freedom, the US in fact imposes fairly strict restriction on cyberspace. On June 24, 2010, the US Senate Committee on Homeland Security and Governmental Affairs approved the Protecting Cyberspace as a National Asset Act, which will give the federal government "absolute power" to shut down the Internet under a declared national emergency. Handing government the power to control the Internet will only be the first step towards a greatly restricted Internet system, whereby individual IDs and government permission would be required to operate a website. The United States applies double standards on Internet freedom by requesting unrestricted "Internet freedom" in other countries, which becomes an important diplomatic tool for the United States to impose pressure and seek hegemony, and imposing strict restriction within its territory. An article on BBC on February 16, 2011 noted the US government wants to boost Internet freedom to give voices to citizens living in societies regarded as "closed" and questions those governments' control over information flow, although within its borders the US government tries to create a legal frame to fight the challenge posed by WikiLeaks. The US government might be sensitive to the impact of the free flow of electronic information on its territory for which it advocates, but it wants to practice diplomacy by other means, including the Internet, particularly the social networks. (The cyberspace bill never became law, and a revised version is still pending in Congress.)
  • Finally, there's pornography, which China bans. Pornographic content is rampant on the Internet and severely harms American children. Statistics show that seven in 10 children have accidentally accessed pornography on the Internet and one in three has done so intentionally. And the average age of exposure is 11 years old - some start at eight years old (The Washington Times, June 16, 2010). According to a survey commissioned by the National Campaign to Prevent Teen and Unplanned Pregnancy, 20 percent of American teens have sent or posted nude or seminude pictures or videos of themselves. (www.co.jefferson.co.us, March 23, 2010). At least 500 profit-oriented nude chat websites were set up by teens in the United States, involving tens of thousands of pornographic pictures.
  •  
    Upset over the US State Department's annual human rights report, China publishes a report of its own on various US ills. This year, it calls attention to America's border laptop searches, its attitude toward WikiLeaks, and the prevalence of online pornography. In case the report's purpose wasn't clear, China Foreign Ministry spokesman Hong Lei said this weekend, "We advise the US side to reflect on its own human rights issue, stop acting as a preacher of human rights as well as interfering in other countries' internal affairs by various means including issuing human rights reports."
Weiye Loh

The Origins of "Basic Research" - 0 views

  • For many scientists, "basic research" means "fundamental" or "pure" research conducted without consideration of practical applications. At the same time, policy makers see "basic research" as that which leads to societal benefits including economic growth and jobs.
  • The mechanism that has allowed such divergent views to coexist is of course the so-called "linear model" of innovation, which holds that investments in "basic research" are but the first step in a sequence that progresses through applied research, development, and application. As recently explained in a major report of the US National Academy of Sciences: "[B]asic research ... has the potential to be transformational to maintain the flow of new ideas that fuel the economy, provide security, and enhance the quality of life" (Rising Above the Gathering Storm).
  • A closer look at the actual history of Google reveals how history becomes mythology. The 1994 NSF project that funded the scientific work underpinning the search engine that became Google (as we know it today) was conducted from the start with commercialization in mind: "The technology developed in this project will provide the 'glue' that will make this worldwide collection usable as a unified entity, in a scalable and economically viable fashion." In this case, the scientist following his curiosity had at least one eye simultaneously on commercialization.
  • ...1 more annotation...
  • In their appeal for more funding for scientific research, Leshner and Cooper argued that: "Across society, we don't have to look far for examples of basic research that paid off." They cite the creation of Google as a prime example of such payoffs: "Larry Page and Sergey Brin, then a National Science Foundation [NSF] fellow, did not intend to invent the Google search engine. Originally, they were intrigued by a mathematical challenge ..." The appealing imagery of a scientist who simply follows his curiosity and then makes a discovery with a large societal payoff is part of the core mythology of post-World War II science policies. The mythology shapes how governments around the world organize, account for, and fund research. A large body of scholarship has critiqued postwar science policies and found that, despite many notable successes, policies that may have made sense in the middle of the last century may need updating for the 21st century. In short, investments in "basic research" are not enough. Benoit Godin has asserted (PDF) that: "The problem is that the academic lobby has successfully claimed a monopoly on the creation of new knowledge, and that policy makers have been persuaded to confuse the necessary with the sufficient condition that investment in basic research would by itself necessarily lead to successful applications." Or as Leshner and Cooper declare in The Washington Post: "Federal investments in R&D have fueled half of the nation's economic growth since World War II."
Reseena Abdullah

China Requires Censorship Software on New PCs - 3 views

The article discusses the software that the Chinese government has mandated be installed on all PCs from July 1st onwards. The software is designed to "filter out pornography and other 'unhe...

censorship surveillance

started by Reseena Abdullah on 07 Sep 09 no follow-up yet
joanne ye

Democracy Project to Fill Gap in Online Politics - 3 views

Reference: Democracy Project to Fill Gap in Online Politics (2000, June 8). PR Newswire. Retrieved 23 September, 2009, from Factiva. (Article can be found at bottom of the post) Summary: The D...

human rights digital freedom democracy

started by joanne ye on 24 Sep 09 no follow-up yet
Low Yunying

China's Green Dam Internet Filter - 6 views

Article: http://edition.cnn.com/2009/TECH/06/30/china.green.dam/index.html Summary: China has passed a mandate requiring all personal computers sold in the country to be accompanied by a contro...

China pornography filter

started by Low Yunying on 02 Sep 09 no follow-up yet
Weiye Loh

Ellsberg: "EVERY attack now made on WikiLeaks and Julian Assange was made against me an... - 0 views

  • Ex-Intelligence Officers, Others See Plusses in WikiLeaks Disclosures
  • The following statement was released today, signed by Daniel Ellsberg, Frank Grevil, Katharine Gun, David MacMichael, Ray McGovern, Craig Murray, Coleen Rowley and Larry Wilkerson; all are associated with Sam Adams Associates for Integrity in Intelligence.
  • How far down the U.S. has slid can be seen, ironically enough, in a recent commentary in Pravda (that’s right, Russia’s Pravda): “What WikiLeaks has done is make people understand why so many Americans are politically apathetic … After all, the evils committed by those in power can be suffocating, and the sense of powerlessness that erupts can be paralyzing, especially when … government evildoers almost always get away with their crimes. …”
  • ...6 more annotations...
  • shame on Barack Obama, Eric Holder, and all those who spew platitudes about integrity, justice and accountability while allowing war criminals and torturers to walk freely upon the earth. … the American people should be outraged that their government has transformed a nation with a reputation for freedom, justice, tolerance and respect for human rights into a backwater that revels in its criminality, cover-ups, injustices and hypocrisies.
  • As part of their attempt to blacken WikiLeaks and Assange, pundit commentary over the weekend has tried to portray Assange’s exposure of classified materials as very different from — and far less laudable than — what Daniel Ellsberg did in releasing the Pentagon Papers in 1971. Ellsberg strongly rejects the mantra “Pentagon Papers good; WikiLeaks material bad.” He continues: “That’s just a cover for people who don’t want to admit that they oppose any and all exposure of even the most misguided, secretive foreign policy. The truth is that EVERY attack now made on WikiLeaks and Julian Assange was made against me and the release of the Pentagon Papers at the time.”
  • WikiLeaks’ reported source, Army Pvt. Bradley Manning, having watched Iraqi police abuses, and having read of similar and worse incidents in official messages, reportedly concluded, “I was actively involved in something that I was completely against.” Rather than simply go with the flow, Manning wrote: “I want people to see the truth … because without information you cannot make informed decisions as a public,” adding that he hoped to provoke worldwide discussion, debates, and reform.
  • The media: again, the media is key. No one has said it better than Monseñor Romero of El Salvador, who just before he was assassinated 25 years ago warned, “The corruption of the press is part of our sad reality, and it reveals the complicity of the oligarchy.” Sadly, that is also true of the media situation in America today.
  • The big question is not whether Americans can “handle the truth.” We believe they can. The challenge is to make the truth available to them in a straightforward way so they can draw their own conclusions — an uphill battle given the dominance of the mainstream media, most of which have mounted a hateful campaign to discredit Assange and WikiLeaks.
  • So far, the question of whether Americans can “handle the truth” has been an academic rather than an experience-based one, because Americans have had very little access to the truth. Now, however, with the WikiLeaks disclosures, they do. Indeed, the classified messages from the Army and the State Department released by WikiLeaks are, quite literally, “ground truth.”
Weiye Loh

Why dummies ONLY use statistics to make a point « - 0 views

  • look at stuff that statisticians usually dismiss as noise
  • I happen to be such a great fan of weirdonomics that I actually believe weird statistics sometimes work even better than the conventional approach – for one, they provide a more timely snapshot of what's really happening in the world than official numbers and what we would usually term market-driven aggregates.
  • I don't need to look at the trading volume or, for that matter, refer to data generated by the EPFR, which tracks global money flows
  • ...3 more annotations...
  • I could just as well get the same feel or texture of the prevailing sentiment by counting how many times people search for the words "market downturn", "recession" or "unemployment" in Google, or by counting how many times those words appear in the Herald Tribune and the WSJ – in either case, my point is that resorting to unconventional methods to make sense of our world may actually hold out more prospects than resorting to conventional methods. (A minimal keyword-counting sketch along these lines appears after this list.)
  • we need to be mindful whenever we talk about productivity in the context of statistics, because it can skew the picture
  • Statistics can even tell you what your chances are of living beyond 70 years if you don't smoke and drink like me. But it can't tell you really simple things like how much of life you lived in those years. In fact, it can tell you very little about the human condition – it can't tell you, for example, why a parent believes so much in a disabled child that, to them, he's god-given; it can't tell you why people choose to fight and even die for their country – it can tell you even less about your story, love & hate, war & peace… or for that matter anything about the human condition. Man, statistics has got no soul… It's machine language. And less of a heart. It's good for calculating how big manhole covers should be or how many nuts you need to hold up a bridge under X, Y and Z conditions – other than that, it's good for nothing else – that's why whenever I see a man spouting statistics, I just know he is full of shit; doesn't matter who he is – could well be a man on TV, a bent pastor who thinks Jesus asked him to build another shopping mall, or even someone who just wants to sell you something you don't need – they're all full of invisible and odorless shit – and that's the deadliest type of shit, because you can be neck deep in it and still not know that you are in shit.
  •  
    Why dummies ONLY use statistics to make a point
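To make the keyword-counting idea above concrete, here is a minimal sketch in Python. It assumes you already have a handful of headlines or article snippets in hand; the term list and the sample headlines are purely illustrative rather than data from the original post, and a real search-volume version of the idea would draw on a service such as Google Trends rather than hand-counted text.

```python
from collections import Counter
import re

# Illustrative recession-related terms (the author's examples, normalized)
TERMS = ["market downturn", "recession", "unemployment"]

def term_counts(texts, terms=TERMS):
    """Tally non-overlapping occurrences of each term across a collection of texts."""
    counts = Counter()
    for text in texts:
        lowered = text.lower()
        for term in terms:
            counts[term] += len(re.findall(re.escape(term), lowered))
    return counts

if __name__ == "__main__":
    # Hypothetical headlines standing in for a day's worth of newspaper copy
    headlines = [
        "Fears of a market downturn grow as unemployment claims rise",
        "Economists split on whether a recession has already begun",
        "Unemployment rate holds steady despite market jitters",
    ]
    for term, n in sorted(term_counts(headlines).items()):
        print(f"{term}: {n}")
```

Counting on a single day says little by itself; it is the change in these tallies over time that would serve as the rough sentiment gauge the post describes.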
Weiye Loh

If suspect Jared Lee Loughner has schizophrenia, would that make him more likely to go ... - 0 views

  • Shortly after Jared Lee Loughner had been identified as the alleged shooter of Arizona Rep. Gabrielle Giffords, online sleuths turned up pages of rambling text and videos he had created. A wave of amateur diagnoses soon followed, most of which concluded that Loughner was not so much a political extremist as a man suffering from "paranoid schizophrenia."
  • For many, the investigation will stop there. No need to explore personal motives, out-of-control grievances or distorted political anger. The mere mention of mental illness is explanation enough. This presumed link between psychiatric disorders and violence has become so entrenched in the public consciousness that the entire weight of the medical evidence is unable to shift it. Severe mental illness, on its own, is not an explanation for violence, but don't expect to hear that from the media in the coming weeks.
  • Seena Fazel is an Oxford University psychiatrist who has led the most extensive scientific studies to date of the links between violence and two of the most serious psychiatric diagnoses—schizophrenia and bipolar disorder, either of which can lead to delusions, hallucinations, or some other loss of contact with reality. Rather than looking at individual cases, or even single studies, Fazel's team analyzed all the scientific findings they could find. As a result, they can say with confidence that psychiatric diagnoses tell us next to nothing about someone's propensity or motive for violence.
  • ...1 more annotation...
  • The fact that mental illness is so often used to explain violent acts despite the evidence to the contrary almost certainly flows from how such cases are handled in the media. Numerous studies show that crimes by people with psychiatric problems are over-reported, usually with gross inaccuracies that give a false impression of risk. With this constant misrepresentation, it's not surprising that the public sees mental illness as an easy explanation for heartbreaking events. We haven't yet learned all the details of the tragic shooting in Arizona, but I suspect mental illness will be falsely accused many times over.
Weiye Loh

Why Do Intellectuals Oppose Capitalism? - 0 views

  • Not all intellectuals are on the "left."
  • But in their case, the curve is shifted and skewed to the political left.
  • By intellectuals, I do not mean all people of intelligence or of a certain level of education, but those who, in their vocation, deal with ideas as expressed in words, shaping the word flow others receive. These wordsmiths include poets, novelists, literary critics, newspaper and magazine journalists, and many professors. It does not include those who primarily produce and transmit quantitatively or mathematically formulated information (the numbersmiths) or those working in visual media, painters, sculptors, cameramen. Unlike the wordsmiths, people in these occupations do not disproportionately oppose capitalism. The wordsmiths are concentrated in certain occupational sites: academia, the media, government bureaucracy.
  • ...6 more annotations...
  • Wordsmith intellectuals fare well in capitalist society; there they have great freedom to formulate, encounter, and propagate new ideas, to read and discuss them. Their occupational skills are in demand, their income much above average. Why then do they disproportionately oppose capitalism? Indeed, some data suggest that the more prosperous and successful the intellectual, the more likely he is to oppose capitalism. This opposition to capitalism is mainly "from the left" but not solely so. Yeats, Eliot, and Pound opposed market society from the right.
  • can distinguish two types of explanation for the relatively high proportion of intellectuals in opposition to capitalism. One type finds a factor unique to the anti-capitalist intellectuals. The second type of explanation identifies a factor applying to all intellectuals, a force propelling them toward anti-capitalist views. Whether it pushes any particular intellectual over into anti-capitalism will depend upon the other forces acting upon him. In the aggregate, though, since it makes anti-capitalism more likely for each intellectual, such a factor will produce a larger proportion of anti-capitalist intellectuals. Our explanation will be of this second type. We will identify a factor which tilts intellectuals toward anti-capitalist attitudes but does not guarantee it in any particular case.
  • Intellectuals now expect to be the most highly valued people in a society, those with the most prestige and power, those with the greatest rewards. Intellectuals feel entitled to this. But, by and large, a capitalist society does not honor its intellectuals. Ludwig von Mises explains the special resentment of intellectuals, in contrast to workers, by saying they mix socially with successful capitalists and so have them as a salient comparison group and are humiliated by their lesser status.
  • Why then do contemporary intellectuals feel entitled to the highest rewards their society has to offer and resentful when they do not receive this? Intellectuals feel they are the most valuable people, the ones with the highest merit, and that society should reward people in accordance with their value and merit. But a capitalist society does not satisfy the principle of distribution "to each according to his merit or value." Apart from the gifts, inheritances, and gambling winnings that occur in a free society, the market distributes to those who satisfy the perceived market-expressed demands of others, and how much it so distributes depends on how much is demanded and how great the alternative supply is. Unsuccessful businessmen and workers do not have the same animus against the capitalist system as do the wordsmith intellectuals. Only the sense of unrecognized superiority, of entitlement betrayed, produces that animus.
  • What factor produced feelings of superior value on the part of intellectuals? I want to focus on one institution in particular: schools. As book knowledge became increasingly important, schooling--the education together in classes of young people in reading and book knowledge--spread. Schools became the major institution outside of the family to shape the attitudes of young people, and almost all those who later became intellectuals went through schools. There they were successful. They were judged against others and deemed superior. They were praised and rewarded, the teacher's favorites. How could they fail to see themselves as superior? Daily, they experienced differences in facility with ideas, in quick-wittedness. The schools told them, and showed them, they were better.
  • We have refined the hypothesis somewhat. It is not simply formal schools but formal schooling in a specified social context that produces anti-capitalist animus in (wordsmith) intellectuals. No doubt, the hypothesis requires further refining. But enough. It is time to turn the hypothesis over to the social scientists, to take it from armchair speculations in the study and give it to those who will immerse themselves in more particular facts and data. We can point, however, to some areas where our hypothesis might yield testable consequences and predictions. First, one might predict that the more meritocratic a country's school system, the more likely its intellectuals are to be on the left. (Consider France.) Second, those intellectuals who were "late bloomers" in school would not have developed the same sense of entitlement to the very highest rewards; therefore, a lower percentage of the late-bloomer intellectuals will be anti-capitalist than of the early bloomers. Third, we limited our hypothesis to those societies (unlike Indian caste society) where the successful student plausibly could expect further comparable success in the wider society. In Western society, women have not heretofore plausibly held such expectations, so we would not expect the female students who constituted part of the academic upper class yet later underwent downward mobility to show the same anti-capitalist animus as male intellectuals. We might predict, then, that the more a society is known to move toward equality in occupational opportunity between women and men, the more its female intellectuals will exhibit the same disproportionate anti-capitalism its male intellectuals show.
Weiye Loh

Spatially variable response of Himalayan glaciers to climate change affected by debris ... - 0 views

  • Controversy about the current state and future evolution of Himalayan glaciers has been stirred up by erroneous statements in the fourth report by the Intergovernmental Panel on Climate Change [1, 2].
  • Variable retreat rates [3, 4, 5, 6] and a paucity of glacial mass-balance data [7, 8] make it difficult to develop a coherent picture of regional climate-change impacts in the region.
  • we report remotely-sensed frontal changes and surface velocities from glaciers in the greater Himalaya between 2000 and 2008 that provide evidence for strong spatial variations in glacier behaviour which are linked to topography and climate.
  • ...2 more annotations...
  • More than 65% of the monsoon-influenced glaciers that we observed are retreating, but heavily debris-covered glaciers with stagnant low-gradient terminus regions typically have stable fronts. Debris-covered glaciers are common in the rugged central Himalaya, but they are almost absent in subdued landscapes on the Tibetan Plateau, where retreat rates are higher. In contrast, more than 50% of observed glaciers in the westerlies-influenced Karakoram region in the northwestern Himalaya are advancing or stable.
  • Our study shows that there is no uniform response of Himalayan glaciers to climate change and highlights the importance of debris cover for understanding glacier retreat, an effect that has so far been neglected in predictions of future water availability [9, 10] or global sea level [11].
Weiye Loh

Himalayan glaciers not melting because of climate change, report finds - Telegraph - 0 views

  • Himalayan glaciers are actually advancing rather than retreating, claims the first major study since a controversial UN report said they would be melted within a quarter of a century.
  • Researchers have discovered that contrary to popular belief half of the ice flows in the Karakoram range of the mountains are actually growing rather than shrinking.
  • The discovery adds a new twist to the row over whether global warming is causing the world's highest mountain range to lose its ice cover.
  • ...13 more annotations...
  • It further challenges claims made in a 2007 report by the UN's Intergovernmental Panel on Climate Change that the glaciers would be gone by 2035.
  • Although the head of the panel Dr Rajendra Pachauri later admitted the claim was an error gleaned from unchecked research, he maintained that global warming was melting the glaciers at "a rapid rate", threatening floods throughout north India.
  • The new study by scientists at the Universities of California and Potsdam has found that half of the glaciers in the Karakoram range, in the northwestern Himalaya, are in fact advancing and that global warming is not the deciding factor in whether a glacier survives or melts.
  • Dr Bodo Bookhagen, Dirk Scherler and Manfred Strecker studied 286 glaciers in six areas stretching from the Hindu Kush on the Afghan-Pakistan border to Bhutan. Their report, published in the journal Nature Geoscience, found that the key factor affecting their advance or retreat is the amount of debris – rocks and mud – strewn on their surface, not the general nature of climate change.
  • Glaciers surrounded by high mountains and covered with more than two centimetres of debris are protected from melting. Debris-covered glaciers are common in the rugged central Himalaya, but they are almost absent in subdued landscapes on the Tibetan Plateau, where retreat rates are higher.
  • In contrast, more than 50 per cent of observed glaciers in the Karakoram region in the northwestern Himalaya are advancing or stable.
  • "Our study shows that there is no uniform response of Himalayan glaciers to climate change and highlights the importance of debris cover for understanding glacier retreat, an effect that has so far been neglected in predictions of future water availability or global sea level," the authors concluded.
  • Dr Bookhagen said their report had shown "there is no stereotypical Himalayan glacier" in contrast to the UN's climate change report which, he said, "lumps all Himalayan glaciers together."
  • Dr Pachauri, head of the Nobel prize-winning UN Intergovernmental Panel on Climate Change, has remained silent on the matter since he was forced to admit his report's claim that the Himalayan glaciers would melt by 2035 was an error and had not been sourced from a peer-reviewed scientific journal. It came from a World Wildlife Fund report.
  • this latest tawdry addition to the pathetic lies of the Reality Deniers. If you go to a proper source which quotes the full study, such as: http://www.sciencedaily.com/re... you discover that the findings of this study are rather different to those portrayed here.
  • only way to consistently maintain a lie is to refuse point-blank to publish ALL the findings of a study, but to cherry-pick the bits which are consistent with the ongoing lie, while ignoring the rest.
  • Bookhagen noted that glaciers in the Karakoram region of the northwestern Himalaya are mostly stagnating. However, glaciers in the Western, Central, and Eastern Himalaya are retreating, with the highest retreat rates -- approximately 8 meters per year -- in the Western Himalayan Mountains. The authors found that half of the studied glaciers in the Karakoram region are stable or advancing, whereas about two-thirds are in retreat elsewhere throughout High Asia.
  • glaciers in the steep Himalaya are affected not only by temperature and precipitation but also by debris coverage, and so their response is neither uniform nor easily predictable, the authors explained. The debris coverage may be one of the missing links to creating a more coherent picture of glacial behavior throughout all mountains. The scientists contrast this Himalayan glacial study with glaciers on the gently dipping, low-relief Tibetan Plateau, which have no debris coverage. Those glaciers behave in a different way, and their frontal changes can be explained by temperature and precipitation changes.